Blog

The 2nd annual ABS webinar

Webinar Q&A

On 26 June 2024, the A Better Start (ABS) national evaluation team hosted its second annual webinar. The evaluation team shared interim findings, which are presented in the second annual report, and outlined the next steps for the national evaluation.

A Better Start is a ten-year (2015-2025), £215 million programme set up by The National Lottery Community Fund, the largest funder of community activity in the UK. Working with local parents, the ABS partnerships are developing and testing ways to improve children’s diet and nutrition, social and emotional development, and speech, language, and communication. ABS is place-based and aims to enable systems change: it seeks to improve the way that organisations work together and with families, shifting attitudes and spending towards preventing problems that can start in early life. The national evaluation is delivered by a consortium led by NatCen in partnership with the University of Sussex, RSM, the National Children’s Bureau, and Research in Practice.

You can watch a recording of the webinar below. Questions were submitted during the presentations, and we have provided responses to them below the recording.


Who delivers A Better Start?

A Better Start is delivered through five partnerships based in Blackpool, Bradford, Lambeth, Nottingham, and Southend-on-Sea. Within the partnerships, activities are delivered through a wide network of partners in their local areas. Each partnership’s website provides details of the services offered.

How were the evaluation questions developed? 

The four evaluation objectives were developed by The Fund and presented in the invitation to tender for the national evaluation. During phase one of the national evaluation, the evaluation team tested the feasibility of those objectives. Learning from phase one informed the methodology presented in the Phase 2 Protocol, which is currently being delivered. The evaluation questions relating to each of the four objectives are:

  1. What contribution has A Better Start made to the life chances of children?
  2. How have the services in A Better Start sites contributed to improving children’s diet and nutrition, social and emotional skills, language and communication skills and systems change?
  3. How have families from different backgrounds experienced A Better Start?
  4. How has A Better Start reduced public costs that relate to primary school-aged children?

The child-level outcomes are from the Common Outcomes Framework for the A Better Start programme. This was developed by the Personal Social Services Research Unit (PSSRU) at the London School of Economics for The Fund to support the ABS delivery partnerships and anyone interested in improving children’s outcomes through investment in early years provision. You can find the framework online here. The framework makes use of routinely collected data and indicators on child development in the three ABS outcome domains: social and emotional development, communication and language development, and diet and nutrition.

What is the timeline for findings on each objective?

Findings for Objectives 2 (ABS implementation evaluation) and 3 (experiences of families) are reported annually as they emerge from the qualitative data collected over the course of each year of the evaluation.

Findings for Objective 1 (child-level outcomes evaluation) will be reported in the final report in 2026. This is because we are relying on administrative data that is not available until towards the end of ABS delivery.

Findings relating to ABS reach and grant spend for Objective 4 (cost evaluation) are reported annually. The cost-consequence analysis will be reported in 2026 along with the findings from Objective 1.

The Phase 2 Protocol presents the detailed timelines for each evaluation objective.

What will the contribution analysis be able to say about ABS? 

The contribution analysis will synthesise the mosaic of evidence that is being developed by the evaluation activity across the four evaluation objectives. The contribution narrative will aim to provide a robust account of the link between programme implementation processes, intended and unintended intermediary and later stage outcomes, independent contextual features, and the development of causal mechanisms that can explain how and why outcomes have (or have not) been achieved. 

The approach to contribution analysis that the national evaluation is taking can be found in the Phase 2 Protocol and in the second annual report. Draft contribution claims are presented in Chapter 6 (p. 16) of the second annual report, and throughout the report we discuss how emerging evidence is feeding into the contribution narrative.

Is there any more detail about the costs and types of activities that contribute to systems change in each area and across the programme?

The Objective 4 technical appendix shows ABS grant spend per partnership and can be found on pages 108–115 of the second annual report. In the annual reports, only the projects contributing to outcomes that represent more than 10% of total spend for each partnership are listed. Projects representing less than 10% of total spend are therefore not published in the interim annual reports.

A typology of activities that contribute to ABS outcomes, including systems change, will be presented in the final report that will be published in Spring 2026. 

Is there information on the numbers of children and families reached in each area?

Unique primary beneficiary (UPB) data is presented in Chapter 10 of the second annual report, starting on p. 88. The ABS programme supported up to 19,200 UPBs per year between 1st April 2015 and 31st March 2023. The majority of UPBs were children aged 0–3 (90%); the other 10% were pregnant people. As discussed in the report, it is worth noting some limitations in the monitoring data on UPB numbers reported by the ABS partnerships.

  • Firstly, partnerships collect data from different sources. Locally commissioned services may have different data collection processes compared to centrally commissioned services, meaning that differences in the number of UPBs across services and partnerships may not be entirely attributable to uptake.
  • Secondly, the degree of resource requirement, engagement, quality, and experience differ across services. For example, some ABS funded services offer intensive, bespoke support to a small number of families with acute needs, whereas other ABS funded services offer less resource intensive, universal provision to the entire eligible population. This nuance will be lost in the partnership-level UPB data analysis, where any participant from any ABS service is counted as one UPB.
  • Thirdly, any beneficiaries who accessed more than one service within the same year will only be counted once, whereas beneficiaries who accessed support in more than one year will be counted once in each year they accessed support, regardless of how many services they used. This means that the cost per beneficiary analysis presented should be used to inform the overall picture of implementation rather than for assessing the performance of the ABS programme or individual partnerships.