Where to next for behavioural insights?

Behavioural insights have risen rapidly up the agenda of governments in Australia and around the world. There are now just over 200 nudge units globally.

Now that “nudge” units have been integrated into the machinery of government, behavioural economists are setting their sights on new frontiers. The buzz at Behavioural Exchange 2018, held in Sydney on 25 and 26 June, was all about where behavioural insights was headed next.

Top of the list were tackling more complex social problems, harnessing new technologies and machine learning, increasing interdisciplinary collaboration, and scaling up successful interventions.

Can algorithms be accountable?

Artificial intelligence and algorithms have potential to assist governments in addressing complex problems. For example, the UK’s Behavioural Insights Team has used tech-based approaches to analyse social worker case notes and outcomes to understand factors that indicate a need for intervention. Combined with discussions with experienced social workers, this is informing training for new social workers.

Australia’s own Data61 at CSIRO is exploring the use of integrated datasets, machine learning and the potential for personalisation, while Stats NZ has an Integrated Data Infrastructure that allows data linkage across agencies – one that many an Australian researcher would love to access.

If you’re wondering about the ethical dilemmas these new approaches give rise to, Bill Simpson-Young, Director of Engineering and Design at CSIRO, set out five handy principles for accountable algorithms: Responsibility (i.e. a human is responsible), Explainability, Auditability, Accuracy and Fairness.

When an audience member asked about government’s responsibility to share the results of integrated data analysis – such as Stats NZ’s work identifying childhood factors linked to poorer outcomes in adulthood – Michael Sanders of the UK’s Behavioural Insights Team raised the need to ensure that analyses do not reinforce negative expectations and create self-fulfilling prophecies (for example, by telling people they fit the criteria for poor life outcomes).

Behavioural insights and design thinking – best friends or barely able to relate?

Not to neglect the other big buzzword in government these days, the conference also explored synergies between co-design and behavioural insights.

Nina Terrey of Thinkplace set out the different foundations of design thinking and behavioural economics:

  • worldviews (social constructionism vs logical positivism)
  • approaches to problem solving (abductive vs inductive and deductive)
  • processes (participatory and dialectic vs expert collaboration)
  • approach to systems (system disruption and collaborative generation of solutions vs identifying ways to make existing systems work more efficiently and effectively).

Sometimes these differences can give rise to tensions. But the disciplines have been able to forge partnerships because both enable deeper learning and designing of solutions to complex policy problems.

Dilip Soman, Professor at the University of Toronto’s Rotman School of Management, suggested they are two sides of the same coin – both begin with empathy.

Who are the stakeholders?

Both behavioural insights and design thinking focus on understanding stakeholders’ perspectives (albeit in different ways), so it’s unsurprising that consultation was on the agenda. Martin Parkinson, Secretary of the Department of the Prime Minister and Cabinet, kicked off the conference by calling on the public service to make better use of evidence and to learn from failure. He suggested that almost all problems in policy arise because we haven’t thought about all of the stakeholders – not only the end users, but also the decision makers and practitioners.

Cass Sunstein, Professor at Harvard University and one of the key authors in behavioural insights, similarly identified public consultation as important to informing government decision-making, not an exercise in legitimation.

Where was evaluation?

There were plenty of references to randomised controlled trials (RCTs) and debate about what constitutes evidence. John A. List, Professor in Economics at the University of Chicago, proposed that three to four well-powered, independent RCT replications are required before scaling up an intervention. David Gruen, Deputy Secretary, Economic, at the Department of the Prime Minister and Cabinet, agreed on the importance of evidence but noted the need for timely action, observing that “the truth is only one special interest group in policymaking and not particularly well funded.”
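
To give a sense of what “well-powered” means in practice, here is a minimal sketch of a sample-size calculation for a simple two-arm trial, using Python’s statsmodels. The effect size, power and significance level are illustrative assumptions of our own, not figures cited at the conference.

```python
# Minimal sketch: how many participants per arm a two-arm RCT needs
# to detect a small effect. All parameter values below are assumptions
# chosen for illustration, not figures from the conference.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(
    effect_size=0.2,        # assumed small standardised effect (Cohen's d)
    power=0.8,              # conventional 80% power
    alpha=0.05,             # conventional 5% significance level
    alternative="two-sided",
)
print(f"Participants needed per arm: {n_per_arm:.0f}")  # roughly 394 per arm
```

Running a trial like this three or four times independently, as List suggests, multiplies that recruitment effort accordingly – one reason rigorous scaling is harder than it sounds.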

But there was little reference to the broader discipline of evaluation – and its range of approaches to answering different questions in different contexts. For the relatively straightforward kinds of interventions and questions that behavioural insights trials have engaged with to date, the focus on RCTs has been appropriate. But will RCTs have the answers as behavioural insights moves to tackle more complex problems in complex, dynamic systems?

Our experience in evaluation suggests a broader repertoire of approaches, and engagement with system dynamics, will be needed. Evaluation has long acknowledged, and frequently included, economic evaluation in its repertoire. Will behavioural insights start to draw more on evaluation expertise?

Here’s hoping a deeper relationship between evaluation and behavioural insights practitioners will be one of the new frontiers.
