To collect high-quality data, it is critical to test questionnaires before a survey is launched. This is especially important for questionnaires which will be administered online, as respondents must complete these without any support from a trained interviewer.
Over the last few years there has been an increasing trend for surveys which have traditionally been administered face-to-face to move to web-first or mixed/multi-mode designs. This process has been accelerated by the pandemic. In these instances, questions may need to be adapted for the new mode(s), both in the way in which they are framed and how they are visually presented to respondents. Thorough testing is essential to understand how questions perform in the new mode(s).
NatCen’s Methodology and Innovation Hub conducts hybrid cognitive and usability testing of online questionnaires. These ‘cogability’ interviews aim to explore how participants understand questions and to establish the user-friendliness of the visual design. It is important to test questions on both larger screens (laptops or desktops) and on smaller screens (smartphones). Due to the disruption to face-to-face fieldwork during the COVID-19 pandemic, we were required to adapt our approach to this type of testing.
Prior to the pandemic, interviews were carried out face-to-face in participants’ homes, at their places of work or sometimes in community venues such as libraries. The interviews involved observing participants completing web questionnaires, typically on their own devices. Participants would be encouraged to ‘think aloud’ i.e. to articulate their thought process, when answering the questions. The interviewer would record observations, such as hesitations and changes to answers, to capture their implicit reactions to the materials. They would then ask probes to collect further details about how participants found the questions and their views on the overall look and feel of the survey.
During face-to-face interviews we used a document camera for our testing. These cameras provide a video recording of a laptop or mobile screen to allow in-person observation of survey completion. As a result of COVID-19, the face-to-face testing transitioned to remote video conferencing interviews (via Zoom or Teams). In remote interviews participants are asked to use the screen sharing feature as an alternative to the document camera. However, when testing mobile touchscreen devices remotely, the ability to see participants’ hand movements, which provides an additional layer of observational data, is lost.
During the pandemic, all recruitment was conducted by an agency from an existing database of participants, rather than street or doorstep recruitment which was prohibited due to the lockdown measures. This approach had both advantages and disadvantages. Whilst recruitment from such databases excludes access to ‘offline’ participants, it opens testing to a wider population in terms of geographic coverage. Participants were offered the same incentive for taking part in remote interviews, although e-vouchers sent by email replaced the physical vouchers that had been used previously.
The benefit of this transition was a reduced burden for both participants and interviewers. A perceived lack of common technological knowledge and software had previously been seen as a barrier; however, during the pandemic many people gained experience of using Zoom. Generally, interviews went smoothly without technical problems. Instructions were sent out in advance of interviews so that participants were able to share screens and join the call quickly. Audio and screen recording were easy to achieve, and clients were also able to observe remotely, which is potentially less intrusive than being physically in the room with the respondent.
Using online testing for web surveys has certainly been a success; however, there is some doubt as to whether it will become the sole method for all questionnaire testing. Remote testing puts a greater reliance on participants in terms of using technology. The best way forward may be to combine online modes of interviewing with face-to-face interactions, depending on the survey’s content and target audience. For example, face-to-face interviewing is likely to prove beneficial when testing questions with people who are less digitally literate.
In terms of cost-efficiency, there are potential savings to be made by conducting remote interviews, as no travel time is involved. However, with remote interviews we have noted higher numbers of no-shows and rescheduled appointments, which adds to study costs and lost time for researchers.
Post-pandemic, NatCen will continue to adopt a mixture of remote and face-to-face testing, particularly for mixed-mode surveys where we need to engage with offline communities, or test physical materials such as paper questionnaires.
NatCen’s Methodology and Innovation Hub, alongside its Centre of Social Survey Transformation, brings together a wealth of experience in testing and trialling different survey methods and designs. As more surveys look to switch from one mode to another, or to incorporate additional modes within their current design, this type of testing will become more important than ever.
This blog is authored by researchers Sophie Pilley and Natasha Phillips in NatCen's Questionnaire Development and Testing Hub.