Video vs. Text Responses: Navigating the Challenges and Opportunities For Your Survey

Published on Apr 24, 2024 by Birgit Eschmann

Understanding The Complexities of Video and Written Survey Responses

In the quest to understand consumer behavior and experiences, researchers continually seek innovative methods to gather more authentic and expressive data. As technology weaves deeper into the fabric of research methodologies, choosing between video and written responses has become not just a logistical decision but a strategic one that may reshape how we understand and interact with data. Here, we compare video responses in surveys against the tried-and-true written open-end to see how effectively each captures vivid personal narratives.

Video Response for Survey Question

Last November, Socratic conducted a short online survey of 302 consumers sourced from a general-population panel. After answering a few closed-ended questions about the Thanksgiving holiday, participants were asked to share a special memory of celebrating Thanksgiving. Right before that question, they were asked whether they were willing to give their next answer as a short video recording; those who declined were asked to type in their answer instead.

Obstacles In The Use Of Video Responses

Two-thirds of participants (n=202) did not agree to record themselves on video; 100 agreed. This high rejection rate is probably the biggest problem with collecting video responses in survey research, and in studies with hard-to-reach audiences, screening out people who are unwilling to participate by video may present a serious obstacle to achieving quotas.

Video rejectors in our survey were significantly older (on average 57 years vs. 50 years in the video group) and more likely to take the survey on a desktop rather than a mobile device (54% on desktop vs. 40% of the video group). The groups did not differ significantly by gender or census region. The top reasons given for rejecting video were disliking pictures/videos of themselves (51%), not feeling camera-ready (48%), and finding it intrusive (32%). Other reasons ranged from simple technicalities (“I don't have a webcam or microphone,” “I am at work”) to profound security concerns (“It makes it easier for someone to steal your identity or use your voice/face for nefarious purposes”), which is no surprise in this age of deepfakes and other abuses of people’s likeness.

After data collection, we transcribed the video responses about Thanksgiving memories with our proprietary JSON-based software tool and compared them to the written open-ends. Another issue arose at this stage: 15% of the videos were unusable due to missing audio, even though we had offered video participants the option of reviewing and re-recording their answer. If a written open-end answer is too short, we can prompt for further elaboration (in this survey, we programmed a check for the open-end: if the answer was shorter than 50 characters, the participant saw the error message “Please be more specific in your response”). Assessing the quality of video/audio during a live survey, however, remains technologically far more challenging.
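A length check like the one described above can be sketched in a few lines. This is an illustrative example only, not the actual survey software; the function name is hypothetical, while the 50-character threshold and the error message mirror the check used in the survey:

```python
MIN_CHARS = 50  # threshold used in the survey described above

def validate_open_end(answer):
    """Return an error message if the open-end answer is too short, else None.

    Hypothetical helper for illustration; a real survey platform's
    validation hook will have its own API.
    """
    if len(answer.strip()) < MIN_CHARS:
        return "Please be more specific in your response"
    return None
```

In practice, such a check runs before the survey page is submitted, so the participant can revise their answer on the spot.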

But There Is An Upside To The Video Responses

Comparing the 85 usable video answers to the 202 written open-ends, the quality is on par from a linguistic standpoint: the Flesch-Kincaid grade level, assessed in Microsoft Word, averages 7.4 for the written open-ends vs. 7.3 for the videos. Yet video responses are substantially longer, with 3.2 sentences or 45 words on average, compared to 1.7 sentences or 25 words in the written open-ends; a cursory examination with the AI-based text analysis application Luminoso Daylight also indicates richer themes and greater multi-dimensionality in the videos.

Final Thoughts And Stay Tuned!

While answering by video can provide valuable insights and is fun for people who enjoy that kind of thing (generation TikTok), as survey researchers, we need to be cautious about implementing video on a greater scale, given the high rejection rate. Instead, let’s continue working on ways to improve the quality of written open-ends while making it easy and engaging for our survey participants to express their thoughts and feelings in writing. In this spirit, stay tuned for one of our upcoming blog posts, in which we will present insights from a collaboration with Reveal.AI where we look at open-ended responses that are refined with the help of an AI-powered chatbot.

Birgit Eschmann, Research Director at Socratic, has more than 20 years of experience managing large-scale marketing research projects for clients in diverse industries, helping them turn data into compelling insights and stories that address key business questions. Clients rely on Birgit's adeptness at all stages of the research process, from study design through questionnaire writing and field management to advanced data analysis, report writing, and final presentation.