
Time Use Surveys in the UK: A Methodological Review

A review exploring the current state of online time use surveys in the UK, with the aim of improving efficacy for both respondents and data users.

About the study

Examining how people allocate their time provides a distinctive perspective on society, offering insights beyond traditional survey-based research or economic indicators like GDP. But what is the most effective way to collect high-quality time use data?

As part of the Economic Statistics Centre of Excellence (ESCoE) research programme, NatCen was asked to review online time use data collection conducted in the UK between 2020 and 2023 using two tools: ‘ELiDDI’, developed by the Centre for Time Use Research at University College London, and ‘OTUS’, developed by the Office for National Statistics.

The review assessed whether the surveys were collecting the best possible quality of data and meeting a range of economic measurement needs, with the aim of informing decision-making and guiding future developments in the field of time use surveys.

Findings

Below are some key recommendations drawn from the review for creating an optimally designed online time use study.

Respondent journey: 

  • When creating a time use diary, consideration should be given to the cognitive processes respondents go through when thinking about their time and the corresponding burden placed on them.
  • Answer categories (activity codes) should be kept as distinct as possible. Where activities could fit into multiple categories, clear guidance should be provided on what code to select. 
  • Asking respondents to calculate an activity’s duration is cognitively burdensome; an activity’s start and finish times are more salient and should be collected instead.
  • Clear guidance, whether through video or text-based instructions, is essential to ensure participants can navigate the tool and enter data independently, as well as resolve any encountered issues.

Online tools and supporting materials: 

  • In terms of layout and visualisation, online tools should be user-friendly and visually appealing, adopting a mobile-first approach.
  • Paper materials like a memory aid booklet can help to simplify the experience for respondents and improve data quality.
  • Experimental data, from a parallel run conducted on the NatCen Opinion Panel, showed that providing participants with paper support material increases the likelihood of respondents taking notes and referring to them. See this ESCoE blog for more details. 

Accessibility: 

  • Accessibility requirements should be considered in the design. Methods of data entry should be compliant with the latest Web Content Accessibility Guidelines (WCAG).
  • Different users may find certain data entry methods (e.g., drag and drop, drop-down lists) easier or more difficult to use. Time use tools should therefore support both keyboard-only and mouse-only entry to accommodate those with motor or dexterity impairments (a brief illustration follows this list).
  • Online tools should be compatible with screen readers to support those with visual impairments.
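
A minimal sketch of what such accessible data entry might look like is given below. It is illustrative only and is not drawn from ELiDDI or OTUS: all element names and IDs are assumptions. It shows an explicitly labelled, native time input that is operable by keyboard alone and is announced by screen readers, with hint text linked through aria-describedby.

```typescript
// Hypothetical sketch only; not code from ELiDDI or OTUS.
// Builds one accessible "start time" field for a diary entry form:
// an explicit <label>, a native time input (usable by keyboard alone
// and announced by screen readers), and hint text linked via aria-describedby.
function createStartTimeField(container: HTMLElement): HTMLInputElement {
  const label = document.createElement("label");
  label.htmlFor = "activity-start";
  label.textContent = "Start time of activity";

  const input = document.createElement("input");
  input.type = "time"; // native control, so keyboard and screen reader support are built in
  input.id = "activity-start";
  input.required = true;
  input.setAttribute("aria-describedby", "activity-start-hint");

  const hint = document.createElement("p");
  hint.id = "activity-start-hint";
  hint.textContent = "Enter the time the activity began, for example 09:30.";

  container.append(label, input, hint);
  return input;
}

// Example usage on a page with a diary form:
// createStartTimeField(document.querySelector("form")!);
```

Native form controls with explicit label associations cover much of the keyboard and screen reader support described above; more bespoke widgets, such as drag-and-drop diary grids, need equivalent keyboard interactions and ARIA roles added explicitly.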

Representativeness and fieldwork efficiency: 

  • Allowing respondents to switch between completion modes is desirable but can be impractical given budgetary constraints and tight turnarounds for diary completion.
  • The use of telephone fieldwork can improve response rates and the sample profile.
  • Offering higher incentives can be highly effective at increasing response rates, but with diminishing returns.
  • Re-issue fieldwork periods, which give respondents another chance to participate if they missed their initially allocated days, can boost response rates. However, this approach complicates analysis, since data are collected in different weeks, and it delays delivery.
  • Providing a longer completion window (72 hours rather than 48) and opening the diary at the start of the diary day both have a positive effect on response rates.

Methodology

The review consisted of three strands of work:

  1. A desk review of both time use surveys, covering respondent communications, paper supporting materials and the entire online tool, to identify barriers and enablers to participation, as well as factors which may impact data quality. The review assessed the surveys in relation to the holistic respondent journey, language, usability and accessibility.
  2. Analysis of data from collections conducted by NatCen using the two online tools between 2020 and 2023, including a parallel run, to identify ways to optimise fieldwork efficiency in terms of both costs and data quality. The data evaluated included response rates, sample profiles, experimental data, paradata, and feedback from respondents and telephone interviewers.
  3. A timings exercise to evaluate how long, on average, each study’s time diaries take to complete. Fourteen participants were purposively selected from NatCen staff to take part, ensuring a mix of people aged 40 or above and below 40, and of those with and without day-to-day experience of survey development work.