SMS Survey Design Tweaks
KEN-20-1536
-
Abstract
PxD operates the MoA-INFO platform in collaboration with Kenya’s Ministry of Agriculture to provide free agricultural recommendations to farmers via SMS. This experiment tests how subtle changes in the framing or format of SMS survey questions and answer options affect users’ response rates. We aimed to identify behavioral design elements that can improve the data quality and user experience of SMS-based surveys deployed via the MoA-INFO platform. We find that small tweaks in message design, from the opening-message framing to the answer-option format, make meaningful differences in survey response rates, completion rates, and the time users take to complete the survey.
-
Status
Completed
-
Start date
Q3 2020 (July)
-
Experiment Location
Kenya
-
Partner Organization
Kenya Ministry of Agriculture
-
Agricultural season
Short Rains
-
Experiment type
A/B test
-
Sample frame / target population
MoA-INFO Platform Users (farmers)
-
Sample size
5,000
-
Outcome type
Service engagement
-
Mode of data collection
Phone survey
-
Research question(s)
Do tweaks in message design for an SMS survey—specifically answer format, introduction message framing, and response order—improve response rates, survey duration, and survey completion rates?
-
Research theme
Message framing
-
Research Design
We sent an SMS survey to 5,000 randomly selected users between July 16 and July 23, 2020. Respondents were asked who the primary user of the phone is (themselves or another person) and the gender of that primary user. The survey incorporated three randomized elements:
- Answer-Option Format: 50% of users received answer choices as letters (A/B/C). The other 50% received numeric options (1/2/3).
- Introductory Message Framing: 50% of users received an initial message asking, “Would you like to start?” The other 50% received, “Can we borrow one minute of your time?”
- Gender Answer Order: 50% of users who reached the gender question received the answer options in the sequence: I don’t know; Female; Male. The other 50% received the sequence: I don’t know; Male; Female.
Randomization was done at the individual level, with each user randomly assigned to one combination of the above elements, as sketched below.
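For concreteness, here is a minimal sketch of this kind of individual-level factorial assignment, assuming an independent coin flip per element; the user IDs, arm labels, and seed are illustrative rather than taken from the experiment.

```python
import random

random.seed(0)  # illustrative seed, for reproducibility of the sketch only

# Independently assign each of the 5,000 users to one arm of each of the
# three randomized elements (a 2 x 2 x 2 factorial design).
assignments = {
    user_id: {
        "answer_format": random.choice(["A/B/C", "1/2/3"]),
        "intro_framing": random.choice(["would_you_like_to_start",
                                        "borrow_one_minute"]),
        "gender_order": random.choice(["female_first", "male_first"]),
    }
    for user_id in range(1, 5001)
}

# Coin flips yield roughly, not exactly, 50% per arm; an exact 50/50 split
# would instead shuffle the user list and divide it in half.
```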
The measured outcomes were:
- Primary: Response rates to the initial message.
- Secondary: Time spent on surveys, completion rates, and gender response patterns.
-
Results
The experiment yielded several statistically significant findings regarding survey engagement and completion behavior. The introductory message framing had a substantial impact on participation:
Response rate: Users who received the message “Can we borrow one minute of your time?” responded at a higher rate (24.22%) than those who received “Would you like to start?” (19.38%), a 25% relative increase in response rate.
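As a check on this arithmetic, the sketch below reproduces the roughly 25% relative lift and runs a standard two-proportion z-test. The per-arm sample sizes (an even 2,500/2,500 split of the 5,000 users) are an assumption; the brief reports only the rates.

```python
from math import sqrt

# Reported response rates; per-arm sample sizes are assumed, not reported.
p_borrow, n_borrow = 0.2422, 2500  # "Can we borrow one minute of your time?"
p_start, n_start = 0.1938, 2500    # "Would you like to start?"

relative_lift = (p_borrow - p_start) / p_start
print(f"Relative increase in response rate: {relative_lift:.1%}")  # ~25%

# Two-proportion z-test with pooled variance (a standard check, not
# necessarily the authors' method).
pooled = (p_borrow * n_borrow + p_start * n_start) / (n_borrow + n_start)
se = sqrt(pooled * (1 - pooled) * (1 / n_borrow + 1 / n_start))
print(f"z = {(p_borrow - p_start) / se:.2f}")
```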
Survey duration: Respondents who received the “borrow one minute” framing also completed the survey more quickly. After excluding extreme outliers (users who took more than 10 hours to finish), the average survey duration was 35.4 minutes for the “borrow one minute” group versus 45.0 minutes for the “would you like to start?” group, a reduction of nearly 10 minutes in completion time.
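The outlier rule described here (dropping respondents who took more than 10 hours) can be written as a small helper; the function name, threshold constant, and sample data are hypothetical.

```python
TEN_HOURS_IN_MINUTES = 10 * 60

def mean_duration(minutes):
    """Average completion time in minutes, excluding >10-hour outliers."""
    kept = [m for m in minutes if m <= TEN_HOURS_IN_MINUTES]
    return sum(kept) / len(kept) if kept else float("nan")

# Illustrative call with made-up durations (the 900-minute outlier is dropped):
print(round(mean_duration([12.0, 40.5, 38.0, 900.0]), 2))  # 30.17
```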
Completion rate by answer format: For users who opted to proceed after receiving the “borrow one minute” message, the answer-choice format influenced completion. Users assigned to the A/B/C format achieved a higher completion rate (95.86%) than those assigned to 1/2/3 (86.96%), which suggests that lettered options may improve navigability or comprehension.
We find no notable differences in the key outcomes from the randomization that varied the order of the gender answer options.