Approaches to Asking Users for Their Location Information
KEN-18-1403
-
Abstract
PxD operates the MoA-INFO platform in collaboration with Kenya’s Ministry of Agriculture to provide free agricultural recommendations to farmers via SMS.
Kenya’s 2010 Constitution replaced the former provincial and district administrative structures with a devolved system of 47 counties. PxD therefore tested four different approaches to asking platform users for their location information. We tested whether the approach affected the response rate of users and the corroboration of responses with known administrative boundaries.
Asking about counties yielded higher response and corroboration rates than asking about districts. Filtered lists classified more responses than unfiltered text questions did, albeit at a slightly lower corroboration rate. Lastly, increasing the number of location questions reduced overall data quality, which suggests that collecting more granular administrative data has diminishing returns.
-
Status
Completed
-
Start date
Q4 Nov 2018
-
End date
Q4 Nov 2018
-
Experiment Location
Kenya
-
Partner Organization
Kenya Ministry of Agriculture
-
Agricultural season
Short Rains
-
Experiment type
A/B test
-
Sample frame / target population
MoA-INFO Platform Users (farmers)
-
Sample size
7,998
-
Outcome type
Information sharing
-
Mode of data collection
PxD administrative data
-
Research question(s)/hypotheses
What is the best method for asking platform users for their location information?
-
Research theme
Message framing, Service design
-
Research design notes
Treatment 1 (county text, then filtered lists; n = 1,999): Each recipient was first asked to enter their county as free text, then to choose their constituency and ward from lists filtered by that county. If the county response could not be classified, they were instead asked to enter their constituency and ward as text. (A sketch of this flow appears after the treatment list.)
Treatment 2 (starting with county; n = 1,999): Each recipient received text questions about their county, constituency, and ward.
Treatment 3 (starting with district, excluding ward; n = 2,000): Each recipient received text questions about their district, division, location, and sublocation.
Treatment 4 (starting with district, including ward; n = 2,000): Each recipient received text questions about their district, division, ward, location, and sublocation.
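As an illustration of the Treatment 1 flow, the minimal sketch below shows how a free-text county response might be matched against a known county list and then used to filter the follow-up options, with a fallback to unfiltered text questions when the county cannot be classified. The gazetteer contents, function names, and prompt wording are hypothetical and are not drawn from the MoA-INFO implementation.

```python
from typing import Iterator, Optional

# Illustrative gazetteer: county -> constituency -> list of wards.
# Names below the county level are placeholders, not real administrative units.
GAZETTEER = {
    "nakuru": {
        "constituency a": ["ward a1", "ward a2"],
        "constituency b": ["ward b1", "ward b2"],
    },
    "kakamega": {
        "constituency c": ["ward c1", "ward c2"],
    },
}


def classify_county(text_response: str) -> Optional[str]:
    """Match a free-text county response against the known list of counties."""
    cleaned = text_response.strip().lower()
    return cleaned if cleaned in GAZETTEER else None


def followup_questions(county_text: str,
                       chosen_constituency: Optional[str] = None) -> Iterator[str]:
    """Yield the follow-up prompts a user would receive under Treatment 1."""
    county = classify_county(county_text)
    if county is None:
        # County could not be classified: fall back to unfiltered text questions.
        yield "Please type the name of your constituency."
        yield "Please type the name of your ward."
        return
    # County classified: offer lists filtered to that county.
    constituencies = sorted(GAZETTEER[county])
    yield "Choose your constituency: " + ", ".join(constituencies)
    if chosen_constituency in GAZETTEER[county]:
        wards = sorted(GAZETTEER[county][chosen_constituency])
        yield "Choose your ward: " + ", ".join(wards)


if __name__ == "__main__":
    # A classified county produces filtered lists.
    for prompt in followup_questions("Nakuru", chosen_constituency="constituency a"):
        print(prompt)
    # An unclassified response (e.g. an old province name) falls back to text questions.
    for prompt in followup_questions("Rift Valley"):
        print(prompt)
```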
-
Results
Fewer users answered the district question than the county question, and a smaller proportion of district responses could be corroborated against known administrative boundaries. Filtered lists were the best method overall: substantially more responses were classified using filtered lists than using unfiltered text questions, although the corroboration rate was slightly higher with unfiltered text. Increasing the number of questions lowered the corroboration rate, as users were more likely to enter the administrative levels in the wrong order. In this context it is preferable to ask about location using the county (the current administrative structure) rather than the district (the older structure), and to use a limited number of filtered lists so that more responses can be classified.
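The corroboration check referred to above can be illustrated with a short sketch: a response counts as corroborated only if the reported units nest correctly within one another (ward within constituency within county). The reference table, field names, and function names below are hypothetical; the source does not describe PxD's actual matching logic.

```python
# Illustrative reference table: ward -> (constituency, county) it belongs to.
WARD_PARENTS = {
    "ward a": ("constituency x", "county 1"),
    "ward b": ("constituency y", "county 1"),
    "ward c": ("constituency z", "county 2"),
}


def is_corroborated(county: str, constituency: str, ward: str) -> bool:
    """A response is corroborated if the reported levels nest correctly."""
    parents = WARD_PARENTS.get(ward.strip().lower())
    if parents is None:
        return False
    parent_constituency, parent_county = parents
    return (constituency.strip().lower() == parent_constituency
            and county.strip().lower() == parent_county)


def corroboration_rate(responses) -> float:
    """Share of complete responses whose levels agree with the reference table."""
    answered = [r for r in responses if all(r)]
    if not answered:
        return 0.0
    corroborated = sum(is_corroborated(*r) for r in answered)
    return corroborated / len(answered)


if __name__ == "__main__":
    sample = [
        ("County 1", "Constituency X", "Ward A"),  # nests correctly: corroborated
        ("County 2", "Constituency X", "Ward A"),  # levels mismatched: not corroborated
        ("", "", ""),                              # no response: excluded from the rate
    ]
    print(f"Corroboration rate: {corroboration_rate(sample):.0%}")
```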