Learning culture key to measuring marketing effectiveness

'Making effectiveness work' launched at the IPA Effectiveness Conference

As data and advertising become ever more fragmented, a learning culture is essential to measuring marketing effectiveness. A learning culture reduces silos, empowers marketers and ultimately allows them to ask better questions. Crucially, it should be driven by a new Model-Experiment-Simulate-Implement (MESI) approach.

This is the call to action, accompanied by detailed advice, in a new IPA report, Making effectiveness work, which was launched today (9 October) at the professional body’s flagship IPA Effectiveness Conference.

According to the comprehensive report, authored by Simeon Duckworth, Neil Charles and Duncan Stoddard of the Melt Collective, marketers should feel bullish about the approaches they have at their disposal to demonstrate value, and optimistic about measuring ad effectiveness. Despite this, as the report makes clear, there is no silver bullet: no single approach can address all measurement challenges across strategic, campaign and tactical use cases. Most advertisers will therefore need to stitch together multiple approaches and data sources, which is where a learning culture and the MESI approach are vital.

The importance of Learning Agendas

According to the report, making advertising effectiveness work is less about chasing the perfect evaluation technique or finding a single universal ROI, and more about establishing a decisive effectiveness culture: “a culture that is evidence-based – optimistic and enthusiastic about data and analytics – but designed to manage its blind spots. One that fosters a commitment to learning, innovation and evidence-based decisions, while working to address how misaligned incentives impact 'what we know'.”

The report explains the importance of formal Learning Agendas to deepen the commitment to active learning. A Learning Agenda:

  • Is more focused than a programme of research and analytics.
  • Provides the pivotal information that changes minds and decisions, creating clear long-term learning goals and actions that cut across silos.
  • Recognises that there are key marketing questions which brands cannot answer. And others that can only be answered by combining information from multiple sources and, importantly, by trying something new.
  • Understands that real effectiveness is more than a collection of modelling or research debriefs.
  • Is a commitment to experimentation, innovation and (hopefully) better understanding and better results.
  • For modelling, presses for the progressive development of innovative approaches.

A MESI approach – how to combine modelling and experiments

To drive a learning culture, the report’s authors advocate a measurement approach that they dub MESI – Model-Experiment-Simulate-Implement. It draws on a “causal ladder”, can be used for tactical, campaign and strategic decisions, and involves the following steps, summarised below and detailed fully in chapters within the report:

Start with a model

Start with a model – this could be MMM, data-driven attribution or consumer modelling – to map marketing effectiveness. Use this model to highlight where there is evidence that changing the communications plan may be more effective.
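To make the modelling step concrete, here is a minimal sketch of the idea. All numbers, channel names and the log-response form are illustrative assumptions, not figures from the report: fit a simple diminishing-returns response model to weekly sales, then compare marginal returns across channels to spot where a plan change may be more effective.

```python
import numpy as np

# Hypothetical weekly data: sales vs. spend in two media channels (£k).
rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(10, 100, weeks)
search = rng.uniform(5, 50, weeks)
sales = 200 + 40 * np.log1p(tv) + 25 * np.log1p(search) + rng.normal(0, 10, weeks)

# Fit a log-linear response model: diminishing returns via log1p(spend).
X = np.column_stack([np.ones(weeks), np.log1p(tv), np.log1p(search)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
base, b_tv, b_search = coef

# Marginal return at average spend shows where an extra £k buys the most,
# i.e. where the model suggests a plan change could pay off.
marginal_tv = b_tv / (1 + tv.mean())
marginal_search = b_search / (1 + search.mean())
print(f"marginal sales per extra £k: TV={marginal_tv:.2f}, search={marginal_search:.2f}")
```

In practice an MMM would add adstock, seasonality and controls; the point here is only that the fitted curves rank channels by marginal return.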

Use experiments to discover and innovate

Design an experiment to analyse the change you are considering, one that will help you learn what you need to know to move forward (e.g. the incremental value of search; removing all promotions in a single region; running new brand activity alongside performance campaigns; increasing price in selected markets). Use the model to determine what scale of experiment is required (e.g. reduce spend on paid search by 20%; double video spend to accommodate longer brand ads) and implement this with aggressive and imaginative experimentation.
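Using the model to size an experiment can be sketched with a standard two-sample power calculation. The noise level and expected effect below are hypothetical values a model might supply, not figures from the report:

```python
import math

# Hypothetical model outputs: residual std dev of weekly sales, and the
# predicted weekly sales change from (say) a 20% paid-search cut.
sigma = 10.0            # residual std dev of weekly sales
expected_effect = 6.0   # predicted weekly sales change

# Weeks needed per arm (test vs. control) for a two-sided test at 5%
# significance with 80% power, via the standard two-sample approximation.
z_alpha, z_beta = 1.96, 0.84
weeks_per_arm = 2 * ((z_alpha + z_beta) * sigma / expected_effect) ** 2
print(f"weeks needed per arm: {math.ceil(weeks_per_arm)}")
```

If the required duration is impractical, the calculation says the intervention is too small to measure – which is exactly why the report suggests using the model to pick the scale of the experiment.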

Simulate impact of new plan

Combine this new evidence (from MMM and the experiments) in a planning/simulation tool. These are often offered by agencies or by independent econometrics and analytics companies. Ideally, the simulation tool allows strategic as well as tactical simulation.
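At its simplest, a planning/simulation tool of the kind described is a function that predicts sales under candidate spend plans from fitted response curves. A toy sketch, with assumed (not report-derived) coefficients and a log-response shape:

```python
import math

# Hypothetical fitted response curves from an MMM, updated with
# experimental evidence: sales uplift = beta * log1p(spend) per channel.
BETAS = {"tv": 40.0, "search": 25.0, "online_video": 18.0}

def simulate(plan: dict[str, float], base: float = 200.0) -> float:
    """Predict weekly sales for a candidate spend plan (£k per channel)."""
    return base + sum(BETAS[ch] * math.log1p(spend) for ch, spend in plan.items())

current = {"tv": 60.0, "search": 30.0, "online_video": 10.0}
# Candidate: shift £10k from search into online video, total budget unchanged.
candidate = {"tv": 60.0, "search": 20.0, "online_video": 20.0}

print(f"current plan:   {simulate(current):.1f}")
print(f"candidate plan: {simulate(candidate):.1f}")
```

A strategic (not just tactical) tool would let you vary total budget, channel mix and timing together, but the core mechanic – evaluate plans against fitted curves before spending real money – is the same.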

Implement experiments’ findings and scale

Use the new learning from the experiments to adjust and fine-tune attribution and MMM, and to validate results and demonstrate value to stakeholders.

Commenting on the report:

Says Laurence Green, Director of Effectiveness, IPA:

"This report aims to help steer marketers through the effectiveness thicket, using the core techniques of marketing mix modelling, experiments and attribution. What they will find is that the solution lies more in establishing a decisive effectiveness culture than in chasing the perfect evaluation technique. The IPA’s Effectiveness Leadership Group is to be congratulated on this latest contribution to best practice."

Says Simeon Duckworth, Founder of the Melt Collective:

"There is no silver bullet to measure marketing effectiveness. Each approach has its role and its limitations. For most marketers, the solution is to focus more on the process and culture of active learning rather than on specific techniques. Our report recommends a set of practical tips, starting with a formal Learning Agenda, to kick-start that culture."

Says Nico Neumann, Assistant Professor at Melbourne Business School:

"While the technical aspects of measuring marketing effectiveness often generate the biggest fears, it's usually the mindset and culture of an organisation that determine whether analytics projects succeed. The report highlights valuable tips and lessons for every marketer, regardless of their analytics expertise."

Download your copy of 'Making effectiveness work'
Last updated 09 October 2024