DECIPHer short course: Developing and evaluating complex public health interventions

I registered for DECIPHer’s five-day short course on developing and evaluating complex public health interventions hoping for an opportunity to learn, reflect, and develop ideas and strategies for better decision-making relating to my work on complex interventions. I was also hoping for a more detailed understanding of some of the key frameworks for the design and evaluation of different types of complex health intervention, and the challenges, limitations and benefits of conducting these. The course certainly met my hopes – but let me share my experience of those five days with you.

Interactions and interplay

A central aim of the course was developing an understanding of complex systems and interventions, and how to ‘break them down’ by looking at their different levels and their interactions. We looked at a variety of different frameworks, including the RE-AIM (Reach, Efficacy, Adoption, Implementation and Maintenance) and MRC frameworks for evaluating interventions. Perhaps most central to the course, though, was the socio-ecological framework, which guides the research undertaken by DECIPHer and many other public health research centres.

Rather than only focussing on the individual’s knowledge, attitude and immediate network, this framework recognises the interactive and dynamic interplay of the ‘intrapersonal’, ‘interpersonal’, and ‘organisational’ levels, as well as the wider environment and policy context. It was even suggested during discussion that this framework might benefit from including yet another, ‘international’, level. This could, delegates suggested, take into account the global interplays that mean public health decision-making within one country may influence that of another.


Image of the socio-ecological framework

A graphical representation of the socio-ecological framework, which course delegates should not have any trouble remembering in their sleep.

An international example of using the socio-ecological framework to develop and guide interventions was presented by Associate Professor Andrea de Silva-Sanigorski from the University of Melbourne. Andrea took us on a journey through ‘Romp and Chomp’, her intensive and complex research project aimed at reducing overweight and obesity in young children in Australia.

Theorising systems and interventions

Underlying the discussion of different frameworks was another key message from the course: the importance of developing theoretical models regarding public health problems, interventions and their implementation.

We looked at how fully theorising a complex public health issue can allow us to be attuned to the unforeseen effects of locality, history and feedback that can emerge in a complex system. By being aware of these effects, I saw how components can be designed to target multiple levels of a complex issue in a simple and practical manner.

We also discussed how theorising interventions can help to negotiate the perceived need for interventions to be standardised and implemented identically. This conflicts with the view that interventions should focus on ‘locality’ effects and be flexible according to settings’ different needs, strengths and weaknesses. We discussed how, by developing theories of the processes involved, interventions could be adapted to meet local conditions and context by standardising the ‘functions’ of interventions rather than having overly prescriptive components.

Importantly, it was made clear that theorising does not necessarily mean using a formal theory, and when constructing a ‘logic model’, which sets out how it is believed a planned intervention is going to lead to change, we may draw on a range of formal and informal theories and concepts. We applied this new knowledge by developing our own logic models, and I found myself thinking about what the logic model of a new feasibility study that I am involved in might look like.

Taking a phased approach

Case studies of research projects were used to illustrate how different research designs and methods can be used to evaluate interventions and policies in a phased approach, answering different questions at each phase. Dr. Jeremy Segrott, for example, explained how a feasibility trial was necessary for the Kids, Adults Together Programme, in order to consider issues such as whether the data collection measures were feasible and acceptable for study participants.

Dr. Suzanne Audrey drew on the example of ASSIST, A Stop Smoking In Schools Trial, to illustrate how carrying out a process evaluation could allow researchers to obtain an in-depth knowledge of change mechanisms. Suzanne also addressed some of the practical issues involved in carrying out a process evaluation, and I took away from this session a new understanding of the importance of carefully planning the timing of measures designed to gain insight on research participants’ views, and of involving a range of different people.

Picture of army teamwork exercise

The course highlighted the importance of teamwork and collaboration in carrying out good research.

We were also introduced to the realist perspective on trials, and different ways of understanding “what works, for whom, under what circumstances, and why” – which might itself change as the intervention changes. Acknowledging that RCTs are not always possible or appropriate, we were presented with some alternative approaches, including natural experiments and the use of data linkage. Dr. Sarah Rodgers discussed the work of the Centre for Improvement of Population Health through e-Records Research (CIPHER), a newly funded MRC research centre anonymously linking individual-level health and environmental data. As someone with an interest in active travel, this made me think about how current environmental changes in Bristol could be investigated with regard to health outcomes.

Translating research into practice

All of the speakers emphasised the importance of involving key stakeholders in the research at the early stages. We learned that the ‘Romp and Chomp’ study began with a consultation with over 50 stakeholders on the project action plan, and we saw in other sessions how this type of knowledge exchange can help overcome some of the challenges of translating research into practice. These include researchers’ and politicians’ different timelines, priorities, and access to and use of the research evidence.

Dr. Sarah Rodgers presenting on CIPHER, the Centre for Improvement of Population Health through E-Records Research


Dr. Simon Murphy presented an example of how to facilitate this knowledge exchange: PHIRN, the Public Health Improvement Research Network, a Wales-specific policy, practice and academic network for transdisciplinary action research. Using case studies of PHIRN’s work in aiding successful evaluations of national schemes, including DECIPHer’s evaluation of the Free Breakfast Initiative in Wales, the Strengthening Families Programme and Wales’s National Exercise Referral Scheme, we were given some ideas for supporting engagement with policy and practice, and for making research ‘count’. This part of the day highlighted the need for genuine collaboration between researchers and policy-makers in order to have an impact on public health, and made me think about the ways in which policy-makers currently are, and could be, involved in some of my projects.

Five as the magic number?

Five informative, well-organised and fun days later, I am back on the train to Bristol with plenty to think about. It’s been a great course, both for thinking about complex interventions and how their challenges can be overcome, and for sharing knowledge and experience with other members of the DECIPHer teams in Cardiff and Swansea, and with individuals from a range of research and practice backgrounds.

By reflecting upon the different frameworks covered, it seems as though five – the number of levels of change in the socio-ecological framework, of dimensions in the RE-AIM framework, and of main phases presented in the original MRC guidance – is an important number in evaluating complex interventions.

Therefore, here are my five take-home messages for you:

  1. Spend a good amount of time on designing and developing the intervention, and on feasibility work, before attempting a larger trial.
  2. Combine methods in order to find out whether the intervention has had an effect and why this might be the case.
  3. Collaborate on research with colleagues, practitioners, policy makers, and members of the group you are trying to target.
  4. In order to design and evaluate complex health interventions, familiarise yourself with different frameworks and theoretical approaches and apply them in your research.
  5. Last but not least, enjoy your research, collaborate with others and try to make your research count by disseminating it appropriately and widely.

It was great to return to Cardiff a week later for the UKCRC Public Health Research Centres of Excellence Conference and see how the socio-ecological framework, complexity theory, and various guidelines are being used in interventions to improve public health.


About the author: Heide Busse is a Research Assistant at DECIPHer, based at the University of Bristol.

DECIPHer runs annual short courses on developing and evaluating complex public health interventions. For full information on the 2014 courses, and to book, please see the short courses webpage.

Image of socio-ecological framework: source – ‘A Holistic Approach to Environmental Public Health’, September 29, 2011, Bryandamis, National Environmental Health Promotion Network blog.

Image of short course: source – Owen Richards
