WASHINGTON, D.C. — Response rates — the percentage of invited respondents who participate in a survey — are one of many measures researchers use to help evaluate sample quality, survey performance and cost efficiency. While there are other important methodological considerations, improving a study’s response rate is often a key goal of a survey’s design.
One of the most effective and efficient methods for improving response rates in surveys conducted by mail (also referred to as address-based sample or ABS surveys) is the use of prepaid cash incentives: A small amount of money (as little as $1) is included with the survey invitation. It is given freely to potential respondents as a thank-you for considering the request to participate, with no requirement to complete the survey.
Although some study funders may be skeptical about mailing prepaid cash incentives, research consistently shows that prepaid incentives can help boost response rates.[1] In addition, research has found that pre-incentives can reduce nonresponse bias by convincing reluctant or resistant sample members, people with lower levels of education, and those less familiar with the survey topic to respond.[2][3]
Prepaid cash incentives can also reduce overall survey costs. For example, if it costs $5 per mailing to send a survey to 1,000 people, and 100 people complete the survey (a 10% response rate), the total cost per completed survey is $50.
Using this example, let’s say the researcher improves the mailing design (such as by adding a $1 pre-incentive), which increases the cost to $6 per mailing, and this improves the response rate to 15% (150 people complete the survey). The cost per completed survey falls to $40. Put another way, because of the higher response rate, fewer survey invitations need to be mailed to achieve the same number of completed surveys, thereby reducing costs.
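The arithmetic in this example can be sketched in a few lines of Python. The figures below (1,000 mailings, $5 vs. $6 per mailing, 10% vs. 15% response) are the article's illustrative numbers, not actual survey costs:

```python
def cost_per_complete(n_mailed, cost_per_mailing, response_rate):
    """Total mailing cost divided by the number of completed surveys."""
    completes = n_mailed * response_rate
    return (n_mailed * cost_per_mailing) / completes

# Baseline mailing: $5 per mailing, 10% response rate
print(cost_per_complete(1000, 5.00, 0.10))  # → 50.0

# With a $1 pre-incentive: $6 per mailing, 15% response rate
print(cost_per_complete(1000, 6.00, 0.15))  # → 40.0
```

The extra dollar per mailing pays for itself because the denominator (completed surveys) grows faster than the numerator (total mailing cost).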
Over the past 15 years, 优蜜传媒has conducted its own extensive testing of prepaid cash incentives in mail and mail push-to-web surveys (surveys sent through the mail that include a link to an online survey). Our findings are consistent with the survey research literature: Small token cash incentives improve response rates and reduce costs. Later in this article, we share the results of two experiments that summarize our findings and recommendations on prepaid cash incentives.
More on Prepaid Incentives: Why They Work
Small token prepaid incentives are not offered as payment for a respondent’s time. Rather, pre-incentives are communicated as a token thank-you gesture for considering the survey request. When done well, this exchange improves response at a lower overall cost than omitting the pre-incentive, as we demonstrate in this blog. Offering the incentive without requiring respondents to complete the survey is precisely what makes the social exchange between the survey organization and the potential respondent effective.
A pre-incentive builds goodwill and trust and can elicit a sense of obligation. When people receive something of value up front, they have already been thanked for considering the task and feel a sense of obligation to reciprocate by completing the survey.[4]
Pre-incentives are even more effective when they are visible through a clear window in the mailing envelope.[5] Seeing money inside an envelope greatly increases the chance that people open it. Getting a sampled household to open a mailing with a survey invitation is the first barrier to achieving a response, and nothing works better at overcoming this barrier than visible cash. Once the envelope is opened, people can read and process the survey invitation and decide whether to participate.
Postincentives (incentives that are contingent on completing the survey) have not been found to improve survey response rates in mail-based surveys of randomly selected households as effectively as pre-incentives.[6] Respondents (and Institutional Review Boards or IRBs) may see postpaid incentives as payment for a respondent’s time, and survey organizations rarely pay people enough money to justify the time they spend completing a survey task. Further, respondents must trust that they will receive the postpaid incentive after they have completed the survey.
There may also be a delay of days or weeks between survey completion and the time the postincentive is received, which delays gratification and may not be as motivating as a small prepaid cash incentive. Receiving a postpaid reward may also require providing personally identifiable information (PII), such as a name or an email address. Some mail-based projects that are long or complicated may benefit from a postincentive, but only in addition to a pre-incentive. Including a small token pre-incentive increases trust that the promised postincentive will be delivered, improving the effectiveness of the recruitment overall.
Cash Is King
Cash incentives are more effective than other forms of incentives, such as raffles[7] (i.e., complete the survey for a chance to win a prize). Research consistently finds that raffles perform no better than offering no incentive at all, likely for a variety of reasons, including respondents’ assumption that they are unlikely to win. From a legal perspective, sweepstakes and raffles can also be difficult to execute: state laws governing sweepstakes must be carefully followed and often require that people be allowed to enter without completing the survey.
Experimental Results
Experiment 1: Comparing a $1 Cash Incentive to No Incentive
In 2010, 优蜜传媒tested pre-incentives with a four-page mail survey sent to 29,980 households randomly selected from an address-based sample frame. Before this experiment, prior waves of the survey had been sent with no prepaid incentive, and the goal of the experiment was to see how the inclusion of the $1 incentive would affect response rates and costs.
In the survey packet, respondents received a cover letter, survey booklet and business reply envelope to return the paper survey. A total of 9,989 respondents received no cash incentive in their initial survey invitation packet, while 19,991 respondents received a $1 bill. All respondents received two reminder postcards after the initial survey mailing, encouraging them to return the survey. The postcard was the same for both treatment groups and did not mention an incentive.
Households that received no cash incentive ($0) had a response rate of 11.8%, compared with a 26.3% response rate for the group that received a $1 incentive.
Because of the significant improvement in the response rate, sending a $1 incentive was more cost effective per completed survey than sending no incentive. The average cost per completed survey when $1 was sent was $5.50 lower than the cost per complete when sending no incentive ($0).[8] In other words, using a $1 incentive was a win-win: The response rate increased, and in future survey administrations, including the $1 incentive meant we were able to mail fewer survey invitations to achieve the required number of survey completes.
Experiment 2: Comparing $1, $2 and $5
In October 2022, 优蜜传媒tested the use of $1, $2 and $5 incentives for a mail push-to-web survey. In this experiment we randomly assigned 15,000 households, randomly selected from an ABS frame, to one of the three incentive groups. All households received an initial survey invitation packet in the mail, which included a cover letter, the randomly assigned prepaid cash incentive amount and a link to complete the survey online.
Everyone also received a reminder postcard, and households that had not yet responded after two weeks in the field received a final reminder mailing, which included a reminder letter with a link to the online survey, a four-page paper survey and a business reply envelope (with no additional cash incentive). All materials were the same across treatment groups, except for the cash pre-incentive amount.
As expected, response rates were higher when a $5 incentive was included (21.4%), compared with $1 (16.8%) and $2 (17.4%) incentives. Although the $5 incentive resulted in higher response rates, the $1 incentive was the most cost effective per complete. The $1 incentive cost approximately $6 less per complete than the $2 incentive, and the $1 incentive was roughly $17 less per complete than the $5 incentive.
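The trade-off in Experiment 2 can be illustrated with the same cost-per-complete arithmetic. The base (non-incentive) cost per mailing below is a hypothetical placeholder, since the article does not report Gallup's actual printing, postage or reminder costs, so the dollar gaps will not match the figures above exactly, but the sketch shows why the lowest incentive can still win on cost per complete despite its lower response rate:

```python
def cost_per_complete(n_mailed, base_cost, incentive, response_rate):
    """(Base mailing cost + incentive) per household, divided by
    the number of completed surveys."""
    total_cost = n_mailed * (base_cost + incentive)
    completes = n_mailed * response_rate
    return total_cost / completes

BASE = 3.00  # hypothetical non-incentive cost per mailing (postage, printing)

# Response rates reported in Experiment 2 for $1, $2 and $5 incentives
for incentive, rate in [(1, 0.168), (2, 0.174), (5, 0.214)]:
    cpc = cost_per_complete(5000, BASE, incentive, rate)
    print(f"${incentive} incentive: ${cpc:.2f} per complete")
```

Under these assumed costs, the $1 group remains the cheapest per complete, mirroring the pattern 优蜜传媒reports: the $5 incentive raises the response rate, but not by enough to offset its higher per-mailing cost.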
We also compared the demographics of survey respondents by incentive treatment group. There were no significant differences by age, race, ethnicity, education level, gender, income, children in the household or political party ID. The only demographic where we observed a significant difference by incentive group was marital status. Respondents who received a $5 prepaid incentive were slightly more likely than the $1 and $2 incentive groups to report being married. Taken as a whole, our findings indicate that the incentive worked similarly across demographic groups and didn’t introduce concerning differences in sample composition.
Conclusion
Small prepaid token cash incentives play a vital role in improving survey response rates and reducing overall survey costs for mail and mail push-to-web surveys. While some survey funders or respondents may perceive the cash incentives as an unnecessary expense, the evidence clearly shows that they can lead to higher response rates and lower overall costs, making them a valuable tool for mail-based surveys. By strategically using token prepaid cash incentives, survey researchers can achieve higher participation rates, better data quality and more cost-effective survey operations.
For mail projects that are budget conscious, 优蜜传媒has consistently found that a $1 prepaid cash incentive is the most cost effective, which is consistent with other survey research literature. For projects in which maximizing response rates is the primary concern, and budget is of lesser concern, incentives larger than $1 can lead to improvements. As methodology and technology for delivering incentives continue to evolve, 优蜜传媒will continue to test which strategies can help improve the overall quality of results.
[1] Singer, E., & Ye, C. (2013). “The Use and Effects of Incentives in Surveys.” The Annals of the American Academy of Political and Social Science, 645: 112-141. http://www.jstor.org/stable/23479084.
[2] Groves, R.M., Couper, M.P., Presser, S., Singer, E., Tourangeau, R., Piani Acosta, G., & Nelson, L. (2006). “Experiments in Producing Nonresponse Bias.” Public Opinion Quarterly, 70(5): 720-736.
[3] Petrolia, D.E. & Bhattacherjee, S. (2009). “Revisiting Incentive Effects: Evidence from a Random-Sample Mail Survey on Consumer Preferences for Fuel Ethanol.” Public Opinion Quarterly, 73(3): 537-550.
[4] Dillman, D.A., Smyth, J.D., & Christian, L.M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th ed. Hoboken, NJ: Wiley.
[5] DeBell, M. (2023). “The Visible Cash Effect with Prepaid Incentives: Evidence for Data Quality, Response Rates, Generalizability, and Cost.” Journal of Survey Statistics and Methodology 11(5), 991-1010, https://doi.org/10.1093/jssam/smac032
[6] Rao, N. (2020). “Cost Effectiveness of Pre- and Post-Paid Incentives for Mail Survey Response.” Survey Practice 13(1). https://doi.org/10.29115/SP-2020-0004.
[7] LaRose, R., & Tsai, H.S. (2014). “Completion rates and non-response error in online surveys: Comparing sweepstakes and pre-paid cash incentives in studies of online behavior.” Computers in Human Behavior, 34: 110-11.
[8] Cost per complete = total cost of all mailings within a treatment group divided by the total number of completes within that group.