Decision-making and data quality: Applying fraud response strategies to clean survey panel data

Authors

Harder, A., Narine, L. K., & Stearns, S.

DOI:

https://doi.org/10.37433/aad.v6i1.573

Keywords:

SDG 4: Quality Education, online surveys, survey straightlining, low-quality indicators, extension professionals

Abstract

Survey panels provide extension professionals with a valuable tool for collecting data on a wide range of topics without overburdening their program participants. Paid data panels are particularly useful for gathering unbiased feedback about Extension programs. However, some survey participants in these panels engage in satisficing or straightlining behaviors to earn rewards with minimal effort, which compromises data quality. This study explored whether survey panelists’ perceptions of online survey items varied based on response quality. It compared normal and low-quality responses across broad issue areas and investigated whether age, education, income, or gender identity influenced these differences. Analysis of 94 respondents in each group revealed no significant data quality differences based on age, education, income, or gender identity. There was a statistically significant difference in data quality when using an open-ended question requiring greater cognitive effort. We recommend adopting more conservative data cleaning strategies. While this approach has limitations, its benefits are particularly valuable when the data informs an organization’s strategic priorities.
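The conservative data-cleaning approach the abstract recommends can be illustrated with a short sketch. The flags below (straightlining across a Likert battery, implausibly fast completion, and low-effort open-ended answers) reflect indicators discussed in the survey-quality literature; the thresholds, field names, and function are illustrative assumptions, not the authors' actual screening procedure.

```python
# Hypothetical sketch of conservative quality flags for panel survey
# responses. Cutoffs below are assumed values for illustration only.

SPEED_CUTOFF_SECONDS = 120   # assumed minimum plausible completion time
MIN_OPEN_ENDED_WORDS = 5     # assumed floor for open-ended effort

def flag_low_quality(response):
    """Return a list of quality flags for one survey response.

    `response` is a dict with:
      - "likert": list of numeric Likert-scale answers for one battery
      - "seconds": total completion time in seconds
      - "open_ended": free-text answer to a high-effort question
    """
    flags = []
    # Straightlining: identical answers across an entire item battery.
    if len(set(response["likert"])) == 1:
        flags.append("straightlining")
    # Speeding: finishing faster than a plausible reading rate allows.
    if response["seconds"] < SPEED_CUTOFF_SECONDS:
        flags.append("speeding")
    # Low-effort text: too few words to be a substantive answer.
    if len(response["open_ended"].split()) < MIN_OPEN_ENDED_WORDS:
        flags.append("low_effort_text")
    return flags

careless = {"likert": [4, 4, 4, 4, 4], "seconds": 75, "open_ended": "idk"}
normal = {"likert": [4, 2, 5, 3, 4], "seconds": 410,
          "open_ended": "Water quality is the most pressing issue in my county."}
```

Under a conservative strategy, any response raising one or more flags would be reviewed or excluded before analysis, trading some sample size for higher data quality.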


References

Arndt, A. D., Ford, J. B., Babin, B. J., & Luong, V. (2022). Collecting samples from online services: how to use screeners to improve data quality. International Journal of Research in Marketing, 39(1), 117–133. https://doi.org/10.1016/j.ijresmar.2021.05.001

Belliveau, J., Soucy, K. I., & Yakovenko, I. (2022). The validity of Qualtrics panel data for research on video gaming and gaming disorder. Experimental and Clinical Psychopharmacology, 30(4), 424–431. https://doi.org/10.1037/pha0000575

Belliveau, J., & Yakovenko, I. (2022). Evaluating and improving the quality of survey data from panel and crowd-sourced samples: A practical guide for psychological research. Experimental and Clinical Psychopharmacology, 30(4), 400–408. https://doi.org/10.1037/pha0000564

Carifio, J., & Perla, R. (2008). Resolving the 50-year debate around using and misusing Likert scales. Medical Education, 42(12), 1150–1152. https://doi.org/10.1111/j.1365-2923.2008.03172.x

Carver, R. P. (1992). Reading rate: Theory, research, and practical implications. Journal of Reading, 36(2), 84–95. https://api.semanticscholar.org/CorpusID:143405844

Chandler, J. J., & Paolacci, G. (2017). Lie for a dime: When most prescreening responses are honest but most study participants are imposters. Social Psychological and Personality Science, 8(5), 500–508. https://doi.org/10.1177/1948550617698203

Fortunato, D., Hibbing, M. V., & Hibbing, T. P. (2022). Hurdles to inference: The demographic correlates of survey breakoff and shirking. Social Science Quarterly, 103(2), 455–465. https://doi.org/10.1111/ssqu.13128

Hamby, T., & Taylor, W. (2016). Survey satisficing inflates reliability and validity measures: An experimental comparison of college and Amazon Mechanical Turk samples. Educational and Psychological Measurement, 76(6), 912–932. https://doi.org/10.1177/0013164415627349

Harder, A., Craig, D., Israel, G., Benge, M., & Caillouet, O. (2023). Exploring the possibilities of a standardized questionnaire for assessing residents’ needs. Journal of Extension, 61(2), Article 1. https://doi.org/10.34068/joe.61.02.01

Holt, J., Rumble, J. N., Telg, R., & Lamm, A. (2015). The message or the channel: An experimental design of consumers' perceptions of a local food message and the media channels used to deliver the information. Journal of Applied Communications, 99(4). https://doi.org/10.4148/1051-0834.1053

Johnson, M. S., Adams, V. M., & Byrne, J. (2023). Addressing fraudulent responses in online surveys: Insights from a web-based participatory mapping study. People and Nature, 6(1), 147–164. https://doi.org/10.1002/pan3.10557

Jolliffe, I. T. (2002). Principal component analysis (2nd ed.). Springer.

Kaminska, O., McCutcheon, A. L., & Billiet, J. (2010). Satisficing among reluctant respondents in a cross-national context. Public Opinion Quarterly, 74(5), 956–984. https://doi.org/10.1093/poq/nfq062

Kelly, M. R., Getchis, T., Concepcion, A., & Bovay, J. (2019). Using online panels to inform extension programming. Journal of Extension, 57(5), Article 24. https://doi.org/10.34068/joe.57.05.24

Lawlor, J., Thomas, C., Guhin, A. T., Kenyon, K., Lerner, M. D., & Drahota, A. (2021). Suspicious and fraudulent online survey participation: Introducing the REAL framework. Methodological Innovations, 14(3). https://doi.org/10.1177/20597991211050467

Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 22(140), 1–55. https://psycnet.apa.org/record/1933-01885-001

Narine, L. K., Ali, A. D., & Hill, P. A. (2020). Application of a three-phase needs assessment framework to identify priority issue areas for extension programming. Journal of Extension, 58(4), Article 24. https://open.clemson.edu/joe/vol58/iss4/24/

Pozzar, R., Hammer, M. J., Underhill-Blazey, M., Wright, A. A., Tulsky, J. A., Hong, F., Gunderson, D. A., & Berry, D. L. (2020). Threats of bots and other bad actors to data quality following research participant recruitment through social media: Cross-sectional questionnaire. Journal of Medical Internet Research, 22(10), e23021. https://doi.org/10.2196/23021

Pratt-Chapman, M., Moses, J., & Arem, H. (2021). Strategies for the identification and prevention of survey fraud: Data analysis of a web-based survey. JMIR Cancer, 7(3), e30730. https://doi.org/10.2196/30730

Qualtrics. (2023, October 1). Fraud detection. https://www.qualtrics.com/support/survey-platform/survey-module/survey-checker/fraud-detection/

Revilla, M., & Ochoa, C. (2015). What are the links in a web survey among response time, quality, and auto-evaluation of the efforts done? Social Science Computer Review, 33(1), 97–114. https://doi.org/10.1177/0894439314531214

Roberts, C., Gilbert, E., Allum, N., & Eisner, L. (2019). Research synthesis: Satisficing in surveys: A systematic review of the literature. Public Opinion Quarterly, 83(3), 598–626. https://doi.org/10.1093/poq/nfz035

Schmidt, K., Gummer, T., & Roßmann, J. (2020). Effects of respondent and survey characteristics on the response quality of an open-ended attitude question in web surveys. methods, data, analyses, 14(1), 3–34. https://doi.org/10.12758/mda.2019.05

Spreen, T. L., House, L. A., & Gao, Z. (2020). The impact of varying financial incentives on data quality in web panel surveys. Journal of Survey Statistics and Methodology, 8(5), 832–850. https://doi.org/10.1093/jssam/smz030

Warner, L. A., & Diaz, J. M. (2021). Amplifying the theory of planned behavior with connectedness to water to inform impactful water conservation program planning and evaluation. The Journal of Agricultural Education and Extension, 27(2), 229–253. https://doi.org/10.1080/1389224X.2020.1844771

Warner, R. M. (2012). Applied statistics: From bivariate through multivariate techniques. Sage Publications.

Yarrish, C., Groshon, L., Mitchell, J. D., Appelbaum, A., Klock, S., Winternitz, T., & Friedman-Wheeler, D. G. (2019). Finding the signal in the noise: Minimizing responses from bots and inattentive humans in online research. The Behavior Therapist, 42(7), 235–242.

Zhang, C., & Conrad, F. G. (2014). Speeding in Web surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods, 8(2), 127–135. https://doi.org/10.18148/srm/2014.v8i2.5453

Published

2025-03-08

How to Cite

Harder, A., Narine, L. K., & Stearns, S. (2025). Decision-making and data quality: Applying fraud response strategies to clean survey panel data. Advancements in Agricultural Development, 6(1), 97–108. https://doi.org/10.37433/aad.v6i1.573

Issue

Vol. 6 No. 1 (2025)

Section

Articles
