Survey response rates by requestor’s characteristics

Authors

McKibben, J. D., Clemons, C. A., & Blythe, J. M.

DOI:

https://doi.org/10.37433/aad.v6i1.525

Keywords:

SDG 17: Partnerships, survey research methods, agriculture, nonresponse bias, response optimization

Abstract

Response rates are crucial for the effectiveness of survey-based behavioral science and the validity of research conclusions. This study investigates the impact of the characteristics of individuals requesting participation on response rates in online surveys within the agricultural sector. Using social exchange theory and Coleman’s social capital theory as guiding frameworks, we examined whether the personal characteristics of the requestor influence response rates. Data were collected from a sample of 1,452 agricultural development personnel using four different request formats varying by the gender and position of the requestor. Following Dillman’s tailored design method, participants received a pre-email, a request email with a link to the survey, and four waves of follow-up emails. Response rates were analyzed based on the four treatment groups, the gender of the requestor, and the position of the requestor. The findings indicate no significant differences in response rates based on the requestor's gender or position. Analysis of variance (ANOVA) and independent t-tests revealed that neither the respondents' highest level of education, years of teaching experience, nor the wave of response was significantly affected by the requestor's characteristics. These results suggest that the established trust and social capital within the agricultural community do not significantly influence survey participation. The study highlights the need for researchers to address declining response rates in survey research. It recommends building and maintaining community trust by providing clear, concise, and accessible research findings. Researchers should also consider more targeted sampling methods to reduce survey fatigue and improve response rates. The implications of these findings extend to the broader field of social science research, emphasizing that the gender and position of the requestor do not increase response rates or reduce selection bias. Future research should explore alternative methods to enhance survey participation and address the challenges of non-response bias in agricultural education research.
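
As an illustrative aside, the sketch below (not drawn from the authors' materials) shows how response rates across four requestor-characteristic treatment groups could be compared with a one-way ANOVA and an independent t-test in Python using scipy.stats. The group labels, response probabilities, and effect-size calculation are hypothetical assumptions; only the overall sample size of 1,452, split evenly across four groups, follows the abstract.

```python
# Minimal sketch, assuming hypothetical data: binary response indicators for
# four request formats varying the requestor's gender and position. This is
# not the study's data or analysis code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2025)

# Four hypothetical treatment groups of 363 each (1,452 total, per the abstract).
# Group labels and response probabilities are illustrative assumptions.
groups = {
    "female_requestor_senior": rng.binomial(1, 0.22, 363),
    "male_requestor_senior":   rng.binomial(1, 0.21, 363),
    "female_requestor_junior": rng.binomial(1, 0.20, 363),
    "male_requestor_junior":   rng.binomial(1, 0.23, 363),
}

# One-way ANOVA: do response rates differ across the four request formats?
f_stat, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA across treatments: F = {f_stat:.3f}, p = {p_anova:.3f}")

# Welch's independent t-test: collapse treatments by requestor gender.
female = np.concatenate([groups["female_requestor_senior"],
                         groups["female_requestor_junior"]])
male = np.concatenate([groups["male_requestor_senior"],
                       groups["male_requestor_junior"]])
t_stat, p_t = stats.ttest_ind(female, male, equal_var=False)
print(f"t-test by requestor gender: t = {t_stat:.3f}, p = {p_t:.3f}")

# Cohen's d (pooled-SD form) as a simple effect-size check.
pooled_sd = np.sqrt((female.var(ddof=1) + male.var(ddof=1)) / 2)
print(f"Cohen's d (gender): {(female.mean() - male.mean()) / pooled_sd:.3f}")
```

Coding each invitation as a 0/1 response and comparing group means is one common way to operationalize response rate for these tests; a logistic regression on the same indicator would be a reasonable alternative.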

References

Baruch, Y. (1999). Response rate in academic studies: A comparative analysis. Human Relations, 52(4), 421–438. https://doi.org/10.1177/001872679905200401

Baruch, Y., & Holtom, B. C. (2008). Survey response rate levels and trends in organizational research. Human Relations, 61(8), 1139–1160. https://doi.org/10.1177/0018726708094863

Bauer, P. J., & Coyne, M. J. (1997). When the name says it all: Preschoolers' recognition and use of the gendered nature of common proper names. Social Development, 6(3), 271–291. https://doi.org/10.1111/j.1467-9507.1997.tb00106.x

Billiet, J., & Matsuo, H. (2012). Non-response and measurement error. In L. Gideon (Ed.), Handbook of survey methodology for the social sciences (pp. 149–178). Springer. https://doi.org/10.1007/978-1-4614-3876-2_10

Blau, P. M. (1964). Exchange and power in social life. Wiley.

Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.), Handbook of theory and research for the sociology of education (pp. 46–58). Greenwood.

Cochran, W. G. (1977). Sampling techniques (3rd ed.). Wiley & Sons.

Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science, 1(3), 98–101. https://doi.org/10.1111/1467-8721.ep10768783

Coleman, J. S. (1988). Social capital in the creation of human capital. American Journal of Sociology, 94, S95–S120. https://doi.org/10.1086/228943

Coleman, J. S. (1990). Foundations of social theory. Harvard University Press.

Dillman, D. A. (1978). Mail and telephone surveys: The total design method. Wiley & Sons.

Dillman, D. A. (1983). Mail and self-administered surveys. In P. H. Rossi, J. D. Wright, & A. B. Anderson (Eds.), Handbook of survey research (pp. 359–377). Academic Press.

Dillman, D. A. (1991). The design and administration of mail surveys. Annual Review of Sociology, 17(1), 225–249. https://doi.org/10.1146/annurev.so.17.080191.001301

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed mode surveys: The tailored design method (4th ed.). John Wiley & Sons Inc.

Dooley, L. M., & Lindner, J. R. (2003). The handling of nonresponse error. Human Resource Development Quarterly, 14(1), 99–110. https://doi.org/10.1002/hrdq.1052

Eggleston, J. (2020). Frequent survey requests and declining response rates: Evidence from the 2020 census and household surveys. Journal of Survey Statistics and Methodology, 12(5), 1138–1156. https://doi.org/10.1093/jssam/smae022

Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132–139. https://doi.org/10.1016/j.chb.2009.10.015

Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Sage.

Fraze, S. D., Hardin, K. K., Brashears, M. T., Haygood, J. L., & Smith, J. H. (2003). The effects of delivery mode upon survey response rate and perceived attitudes of Texas agri-science teachers. Journal of Agricultural Education, 44(2), 27–37. https://doi.org/10.5032/jae.2003.02027

Greenberg, P., & Dillman, D. (2021). Mail communications and survey response: A test of social exchange versus pre-suasion theory for improving response rates and data quality. Journal of Survey Statistics and Methodology, 11(1), 1–22. https://doi.org/10.1093/jssam/smab020

Groves, R. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646–675. https://doi.org/10.1093/poq/nfl033

Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G. P., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720–736. https://doi.org/10.1093/poq/nfl036

Grammarly, Inc. (2025). Grammarly (Continuously updated) [Online writing tool]. https://www.grammarly.com

Groves, R. M., & Peytcheva, E. (2008). The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly, 72(2), 167–189. https://doi.org/10.1093/poq/nfn011

Hansen, M. H., Hurwitz, W. N., Marks, E. S., & Mauldin, W. P. (1951). Response errors in surveys. Journal of the American Statistical Association, 46(254), 147–190. https://doi.org/10.1080/01621459.1951.10500779

Heberlein, T. A., & Baumgartner, R. (1978). Factors affecting response rates to mailed questionnaires: A quantitative analysis of the published literature. American Sociological Review, 43(4), 447–462. https://www.jstor.org/stable/2094771

Homans, G. C. (1961). Social behavior: Its elementary forms. Harcourt, Brace & World.

Koen, B., Loosveldt, G., Vandenplas, C., & Stoop, I. (2018). Response rates in the European social survey: Increasing, decreasing, or a matter of fieldwork efforts? Survey Methods: Insights from the Field. https://doi.org/10.13094/SMIF-2018-00003

Lindner, J. R. (2002). Handling of nonresponse error in the Journal of International Agricultural and Extension Education. Journal of International Agricultural and Extension Education, 9(3), 55–60. https://doi.org/10.5191/jiaee.2002.09307

Lindner, J. R., & Lindner, N. (2024). Interpreting Likert type, summated, unidimensional, and attitudinal scales: I neither agree nor disagree, Likert or not. Advancements in Agricultural Development, 5(2), 152–163. https://doi.org/10.37433/aad.v5i2.351

Lindner, J. R., Murphy, T. H., & Briers, G. E. (2001). Handling nonresponse in social research. Journal of Agricultural Education, 42(4), 43–53. https://doi.org/10.5032/jae.2001.04043

Loury, G. C. (1977). A dynamic theory of racial income differences. In P. A. Wallace & A. M. LaMond (Eds.), Women, minorities, and employment discrimination (pp. 153–186). Lexington Books, D.C. Heath and Co.

Loury, G. C. (1987). Why should we care about group inequality? Social Philosophy and Policy, 5(1), 249–271. https://doi.org/10.1017/S0265052500001345

McKibben, J. D., Clemons, C. A., & Nurradin, M. (2022). Hybrid vigor: A quantitative analysis of job satisfaction of United States school based secondary agricultural education classrooms. Journal of Agricultural Education, 63(2), 238–250. https://doi.org/10.5032/jae.2022.02238

Roberts, T. G., & Dyer, J. E. (2005). A summary of distance education in university agricultural education departments. Journal of Agricultural Education, 46(2), 70–82. https://doi.org/10.5032/jae.2005.02070

Stedman, R. C., Connelly, N. A., Heberlein, T. A., Decker, D. J., & Allred, S. B. (2019). The end of the (research) world as we know it? Understanding and coping with declining response rates to mail surveys. Society & Natural Resources, 32(10), 1139–1154. https://doi.org/10.1080/08941920.2019.1587127

Thibaut, J. W., & Kelley, H. H. (1986). The social psychology of groups. Transaction Books.

Tomaskovic-Devey, D., Leiter, J., & Thompson, S. (1994). Organizational survey nonresponse. Administrative Science Quarterly, 39(3), 439–457. https://doi.org/10.2307/2393298

Zahl-Thanem, A., Burton, R. J., & Vik, J. (2021). Should we use email for farm surveys? A comparative study of email and postal survey response rate and non-response bias. Journal of Rural Studies, 87, 352–360. https://doi.org/10.1016/j.jrurstud.2021.09.029

Published

2025-02-21

How to Cite

McKibben, J. D., Clemons, C. A., & Blythe, J. M. (2025). Survey response rates by requestor’s characteristics. Advancements in Agricultural Development, 6(1), 43–54. https://doi.org/10.37433/aad.v6i1.525
