Project Spark: Attention, Attention!

Continuing our spotlight on wave 1 of Project Spark, this week we turn to the Department for Work and Pensions, the third team to secure the Dragons' backing and partner with the GCS Innovation Lab.

Chris Terry and Mark Storey, from the Department for Work and Pensions, explain how the team tested attention-based metrics in campaigns, with the ambition of establishing effective new ways to improve GCS campaign performance.

Following the Dragons' Den in late 2022, our team immediately began engaging with media agencies, industry trade bodies, and big tech companies, with the aim of understanding in depth the current maturity and market adoption of attention-based approaches.

It soon became clear that attention-based approaches are still an emerging field within marketing and communications, and that GCS is ahead of the curve in exploring their application!

Through close partnership with the government's media buying agency, OmniGOV, we identified two techniques we could potentially use to pilot and evaluate attention-based approaches: one from Amplified Intelligence and another from Adelaide, two agencies at the forefront of applying attention-based metrics.

In early 2023, we worked with OmniGOV to explore both options, working through their different methodologies, requirements, and limitations to understand how to test effectively with a GCS campaign.

After thorough exploration, Adelaide's proposed technique was identified as the most viable: an 'off the shelf' option ready to be implemented and tested within a campaign immediately, whereas Amplified Intelligence's approach would have required many more months for OmniGOV to understand the underlying data and build a test around it.

We then immediately designed a pilot with the Help for Households campaign during April. This tested optimising the campaign's display ad placements towards attention, rather than towards current metrics such as viewability or click-through rate (CTR). This was the first pilot in the public sector to harness attention-based metrics!

How did the pilot work?

The pilot used JavaScript tags to collect data on the environment around each display impression, for example: data on onsite ad clutter, placement position, scroll time and duration. This was then analysed by continuously trained Machine Learning (ML) models to generate and assign an 'Attention Unit' (AU) score between 0 and 100 for every impression. AU scores could then be monitored in real time by OmniGOV, enabling live reporting and optimisation over the four-week test.
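To make the scoring step concrete, here is a minimal sketch of how environment signals might be combined into a 0-100 AU-style score. This is purely illustrative: Adelaide's real system is a continuously trained ML model, and the feature names, scalings, and weights below are hypothetical stand-ins, not their methodology.

```python
# Illustrative sketch only: a real AU score comes from a continuously
# trained ML model; these features and weights are hypothetical.

def attention_unit_score(ad_clutter, placement_position, scroll_dwell_seconds):
    """Assign a 0-100 'Attention Unit' (AU) style score to one impression.

    ad_clutter: number of competing ads on the page (0 = none)
    placement_position: 0.0 (top of page) to 1.0 (bottom of page)
    scroll_dwell_seconds: time the placement stayed in the viewport
    """
    # Normalise each signal to the 0..1 range (hypothetical scaling choices)
    clutter_signal = 1.0 / (1.0 + ad_clutter)             # fewer ads -> more attention
    position_signal = 1.0 - placement_position            # higher on page -> more attention
    dwell_signal = min(scroll_dwell_seconds / 10.0, 1.0)  # cap the dwell signal at 10s

    # Hypothetical weights; a real model would learn these from panel data
    score = 100 * (0.3 * clutter_signal + 0.3 * position_signal + 0.4 * dwell_signal)
    return round(score, 1)

# Example: a top-of-page placement with little clutter, in view for 5 seconds
print(attention_unit_score(ad_clutter=1, placement_position=0.1, scroll_dwell_seconds=5.0))
```

Because every impression gets a single comparable number, a buying team can monitor scores in flight and shift spend towards higher-scoring placements, which is what made live optimisation possible during the four-week test.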

Initial pilot results:

  • Early indications are that high AU-scoring media could drive uplifts across some KPIs. 
  • For the Help for Households pilot, a brand lift study conducted by a third-party measurement agency measured a 6-8% uplift in favourability and main message take-out, while 15 other measures did not show statistically significant uplifts. 
  • After exploring the data in depth, the Innovation Lab team identified that subsequent brand uplift studies would need a significantly larger panel size for the results to be robust enough to draw definitive conclusions on effectiveness across a greater range of KPIs. This was a key learning from the pilot which can be taken forward in future tests. 
  • Early indications are that in-flight AU measurement and optimisation could unlock potential re-investment opportunities (up to 8% of the total pilot media spend) by redeploying low-scoring AU media to higher-scoring placements. 
  • Aside from the very highest quality placements, CPMs were similar throughout the range of AU-scored media, indicating there is currently no cost premium for optimising towards attention metrics. The attention agency expects this to change over the next 2-3 years, as the market moves towards transacting increasingly on attention-based metrics. 
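The re-investment finding can be illustrated with a simple calculation: sum the spend sitting in placements below some AU threshold, since that is the candidate pool for redeployment. The threshold and spend figures below are hypothetical examples, not the pilot's actual media data.

```python
# Illustrative sketch only: the AU threshold and the spend figures are
# hypothetical examples, not the pilot's actual media data.

def reinvestable_spend(placements, au_threshold=30):
    """Return (amount, share) of spend sitting in low-AU placements.

    placements: list of (au_score, spend) tuples for scored media.
    Spend in placements scoring below au_threshold is the candidate
    pool for redeployment to higher-scoring placements.
    """
    total = sum(spend for _, spend in placements)
    low = sum(spend for au, spend in placements if au < au_threshold)
    return low, (low / total if total else 0.0)

# Hypothetical buy: most spend already sits in higher-attention placements
buy = [(72, 40_000), (55, 30_000), (41, 22_000), (18, 5_000), (9, 3_000)]
amount, share = reinvestable_spend(buy)
print(amount, f"{share:.0%}")  # 8000 of 100,000 total -> 8%
```

In-flight AU reporting makes this kind of check possible while a campaign is still running, so the low-scoring spend can be redeployed rather than only diagnosed afterwards.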

What’s next?

The pilot with the Help for Households campaign has demonstrated that whilst attention-based approaches are in their infancy, there is value in continuing to test robustly and at a larger scale. 

OmniGOV and the central GCS team are exploring further testing across a wider variety of campaigns. This will increase our understanding of effectiveness across a greater number of channels and KPIs. A key part of this will be a greater emphasis on making sure the sample sizes and data analysis of subsequent brand uplift studies build on our learnings, as well as on identifying best-practice approaches for future tests. 

Alongside the recent pilot, we have also developed a 'State of Attention' blog for GCS members which shares everything we have learned so far. This shines a spotlight on the current maturity of attention-based approaches in the market, their current applications, and how they work, to guide your own exploration of this innovative approach!