
10 Tips for Better Remote Data Collection with Mobile Phone Surveys


Many international development organizations invest heavily in phone-based surveys to gather high-quality feedback and project evaluation data. When done correctly, phone surveys can deliver fast constituent data at a fraction of the cost of traditional in-person surveys.

With nearly all face-to-face research suspended during the COVID-19 response, many organizations conducting remote monitoring research are shifting some or all of their work to phone-based surveys.

10 Tips for Remote Phone Surveys

The Remote Phone Survey Toolkit captures many of the most important lessons 60 Decibels has learned conducting phone-based surveys since 2014. While it is not exhaustive, it serves as a useful resource for computer-assisted telephone interviewing (CATI).

1. Shorten your survey to 15 minutes

In our experience, the optimal length for a phone survey is less than 15 minutes. When we go much beyond this, respondents’ attention drops off considerably and at times they will refuse to complete the survey. This translates to a written survey of at most 30-40 questions. (Note: when numbering questions, be sure that each question and its follow-up are labeled as separate questions.)
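
As a rough sanity check on that arithmetic, the sketch below shows how a 15-minute ceiling maps to roughly 30-40 questions. The per-question timings are illustrative assumptions, not figures from the toolkit.

```python
# Rough arithmetic behind the 15-minute / 30-40 question rule of thumb.
# Per-question pacing figures are illustrative assumptions.

TARGET_MINUTES = 15
SECONDS_PER_CLOSED_QUESTION = 20   # assumed average for a multiple-choice item
SECONDS_PER_OPEN_QUESTION = 45     # assumed average for an open-ended item

def max_questions(closed_share: float = 0.8) -> int:
    """Estimate how many questions fit in the target call length."""
    avg_seconds = (closed_share * SECONDS_PER_CLOSED_QUESTION
                   + (1 - closed_share) * SECONDS_PER_OPEN_QUESTION)
    return int(TARGET_MINUTES * 60 / avg_seconds)

print(max_questions())      # ~36 questions with a mostly closed-ended mix
print(max_questions(0.5))   # ~27 questions with a heavier open-ended mix
```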

2. Get clear on your primary survey goals

First step: agree on your priorities. Do this before you start drafting your survey questions. For example, you might want to find out:

  1. Who am I reaching?
  2. What impact am I having on quality of life?
  3. Are respondents satisfied with the service I am providing?

It’s helpful to use the constraint of a shorter survey as an opportunity to have a tougher conversation with relevant stakeholders about must-have versus nice-to-have data. Then use this moment to force-rank the data you absolutely must get from this survey and cut any questions that do not directly align with your priorities.

3. Ensure high-quality translation

Use experienced translators for your survey, and make sure they understand local context as well as the language. Always cross-check the translation with someone who understands the perspective of the customers / beneficiaries you will be surveying. Without the benefit of being in-person, the language in your questionnaire needs to be as accurate as possible to avoid confusion or misunderstanding.

4. Put respondents in control

People should never be surprised by your call. Always take steps to inform people that you are going to contact them. This could be an SMS message in advance of the call, or a notification through other channels (e.g. a microfinance loan officer could communicate directly with her clients). Ideally, take this a step further by allowing the respondent to schedule the date and time of the phone call she’ll receive.
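
If your survey platform does not send pre-notifications itself, a short script against an SMS gateway can do it. The sketch below assumes Twilio as the gateway; the credentials, sender number, and message wording are placeholders to adapt to your own setup.

```python
# A minimal pre-notification sketch using the Twilio SMS API.
# Credentials, phone numbers, and wording are placeholders; any SMS
# gateway your organization already uses would work the same way.
from twilio.rest import Client

ACCOUNT_SID = "your_account_sid"   # placeholder
AUTH_TOKEN = "your_auth_token"     # placeholder

client = Client(ACCOUNT_SID, AUTH_TOKEN)

def notify_respondent(phone_number: str, org_name: str, survey_day: str) -> None:
    """Send a short heads-up SMS before the enumerator calls."""
    client.messages.create(
        to=phone_number,
        from_="+15551234567",  # your registered sender number (placeholder)
        body=(f"Hello! {org_name} will call you on {survey_day} for a short "
              f"15-minute feedback survey. Reply STOP to opt out."),
    )

# Example: notify_respondent("+254700000000", "Acme Microfinance", "Tuesday")
```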

5. Establish context and rapport from the outset

Trust is the building block of any good survey, and establishing it over the phone is not easy. The challenge is that more than half of what we communicate comes through body language, and that is missing in a phone interview. To overcome this, put extra time into training your enumerators in building rapport with the people they’ll be speaking to.

In addition, invest heavily in the opening script they read to ensure that it provides adequate context behind why they are calling and what the interview subject should expect from the call. Where possible, be specific about how engaging in the interview could ultimately benefit the respondent, for example: “your feedback will help [organization] improve its service to customers like you”.

6. Use high quality survey software

There are many CATI survey deployment and mobile data capture software solutions available, and you’ll need to pick the one that best suits your needs. Some criteria to consider:

  • Will your enumerators always be online and with a stable internet connection?
  • How sophisticated is your tech capability for writing scripts and software customization?
  • Do you want to pay for software, or do you need a free solution?
  • Will you use the software just to place calls, or will you do any data analysis in the software?

Answers to these questions will help you zero in on providers that best suit your needs. Here are some companies to research further: Aircall, Five9, Callminer, and Zendesk Talk.

7. Conduct detailed training for each survey

For all 60 Decibels enumerators, we offer a 6-7 hour online training course that we developed in-house. We also provide another two hours of training for each project they work on. This project-specific training focuses on the specifics of the engagement, its context, and the purpose of the survey.

In addition, we go through the survey instrument in detail, highlighting any questions they’ve not asked before and identifying questions we believe will prove particularly thorny.

8. Always pilot test your survey questions

One of the great things about phone surveys is that it’s easy to check on progress in real time and quickly adjust survey questions as needed. We recommend completing ~20 surveys as a formal pilot before rolling out a full survey. When checking on the success of your pilot, make sure you look at the quality of the data—both qualitative and quantitative responses.
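
One way to make that pilot check concrete: if you export the ~20 pilot interviews to a CSV, a few lines of analysis can surface confusing questions and thin open-ended answers. The column names in the sketch below are hypothetical.

```python
# A quick pilot-quality check, assuming the ~20 pilot interviews were
# exported to a CSV with one row per respondent and one column per question.
import pandas as pd

pilot = pd.read_csv("pilot_responses.csv")

# 1. Missing-data rate per question: high rates can signal a confusing item.
missing_rate = pilot.isna().mean().sort_values(ascending=False)
print(missing_rate.head(10))

# 2. Depth of open-ended answers: very short answers suggest weak probing.
open_ended_cols = ["q12_improvement_suggestion", "q15_quality_of_life_change"]
for col in open_ended_cols:
    words = pilot[col].fillna("").str.split().str.len()
    print(col, "median words:", words.median())

# 3. Range check on numeric answers, e.g. household size.
print(pilot["q3_household_size"].describe())
```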

9. Track progress with live dashboards

When you have multiple enumerators on one project, it is essential to set up the right data dashboards and management policies to run the project. This also helps you keep track of response rates and ensure robust sampling.

We ask researchers to update a shared database as they conduct calls, indicating when they made the call and the call status – i.e. completed, wrong number, need to call back, etc. From this data, we can estimate when we’ll reach the target sample and can better understand the main barriers to completing surveys so that we can help researchers maximize response rates.
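
A minimal sketch of the logic behind such a tracker is below, assuming a simple list of call statuses pulled from the shared database; the status labels and sample target are illustrative.

```python
# A lightweight sketch of the call-tracking logic behind a shared dashboard.
from collections import Counter

call_log = [
    "completed", "wrong number", "need to call back", "completed",
    "no answer", "completed", "refused", "need to call back",
]  # in practice this comes from your shared database

TARGET_COMPLETES = 250  # illustrative sample target

counts = Counter(call_log)
attempts = len(call_log)
completes = counts["completed"]
response_rate = completes / attempts

calls_still_needed = (TARGET_COMPLETES - completes) / response_rate

print(f"Response rate so far: {response_rate:.0%}")
print(f"Estimated additional calls to reach {TARGET_COMPLETES} completes: "
      f"{round(calls_still_needed)}")
print("Main barriers:", counts.most_common(3))
```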

10. Assess open- and closed-ended question quality

A quality open-ended response should be a verbatim transcription of what the respondent said. Enumerators should encourage detailed responses, probing where needed.

When quality checking open-ended responses, be sure to look out for responses that: do not directly answer the question asked; are vague or too general; or are repetitive or too similar across different respondents. If you notice any of these issues, flag it immediately with the enumerator, provide detailed feedback, and monitor data from their subsequent interviews.
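
A first-pass automated screen can surface some of these issues before a human review. The sketch below assumes responses sit in a pandas DataFrame; the column name is hypothetical.

```python
# A first-pass screen for weak open-ended responses, assuming one verbatim
# answer per row in a CSV export; column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_responses.csv")
answers = df["q12_improvement_suggestion"].fillna("").str.strip().str.lower()

# Very short answers are often vague or incompletely transcribed.
too_short = df[answers.str.split().str.len() < 4]
print(f"{len(too_short)} answers with fewer than 4 words")

# Identical answers across different respondents can signal an enumerator
# summarizing rather than transcribing verbatim.
repeats = answers[answers.duplicated(keep=False) & (answers != "")]
print(repeats.value_counts().head(10))
```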

As you check closed-ended or multiple-choice responses, check the frequency of “other” responses (a high frequency might warrant adding a new answer option for better accuracy), check for outliers in numerical values, and check for formatting errors.
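
These closed-ended checks are also easy to script. The sketch below, again assuming a pandas DataFrame with hypothetical column names, covers the “other” frequency, numeric outliers, and a simple formatting check.

```python
# Routine checks on closed-ended responses; all column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_responses.csv")

# 1. How often is "Other" chosen? A high share may justify a new answer option.
other_share = (df["q7_main_benefit"] == "Other").mean()
print(f"'Other' selected in {other_share:.0%} of responses")

# 2. Outliers in numeric values, using a simple IQR rule.
values = df["q9_monthly_income"]
q1, q3 = values.quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(values < q1 - 1.5 * iqr) | (values > q3 + 1.5 * iqr)]
print(f"{len(outliers)} potential income outliers to review")

# 3. Formatting errors, e.g. phone numbers captured without a country code.
bad_format = df[~df["respondent_phone"].astype(str).str.match(r"^\+\d{9,15}$")]
print(f"{len(bad_format)} phone numbers with unexpected formatting")
```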


