A couple of weeks ago the W3 team attended a seminar on systems thinking convened by the Centre of Excellence in Intervention and Prevention Science (CEIPS).
The keynote was presented by Diane Finegood, a Canadian scientist, research funder and systems thinker. She drew out the difference between complicated and complex: complicated means simply ‘there’s an awful lot of stuff in the mix’, whereas complex means ‘there’s a lot of stuff but it’s organised, although possibly in ways we don’t fully understand right now’.
(Sidebar: see this recent article for an explanation of how views of social problems as complex vs complicated underpin the different approaches used for research and evaluation.)
I love attending CEIPS seminars because I always come away feeling ‘okay, we’re on the right track’ and also ‘whew, we’re not the only ones finding this stuff tricky’. Trying something new always involves a certain amount of discomfort — sometimes quite a lot of it — so the encouragement I get from attending the CEIPS events is important.
It also helps to clarify how we’re using systems concepts differently from others in the field. In a nutshell, we’re using modified causal loop diagrams to draw out the ‘system logic’ of prevention programs that engage with communities understood as complex adaptive systems.
Rather than trying to understand a complex problem, we’re trying to better articulate how peer-based HIV and hepatitis C programs operate as solutions in a complex environment.
This builds on existing ideas within the tradition of realist evaluation. In particular, we liked these observations from a key document in that tradition:
- interventions consist of a chain of steps/processes with negotiation and feedback at each stage
- the chain is often not linear
- interventions are embedded in social systems
- interventions are prone to modification
- interventions are open systems and change through learning
(Source: Realist Synthesis: An Introduction, by Ray Pawson, Trisha Greenhalgh, Gill Harvey and Kieran Walshe, 2004, ESRC Research Methods Working Papers, p. 5.)
Another difference is that we’re engaging with existing programs, rather than identifying new possibilities for intervention. And we’re starting from the assumption that many practitioners in our partner programs are already tacit ‘systems thinkers’, rather than introducing it as a new approach.
This means we’re treating systems thinking as a language and set of tools for representing and updating systems-like mental models of HIV prevention, rather than as a new way of thinking that will reveal hitherto-unimagined solutions to the HIV epidemic.
Part one of the project, our focus in 2014, has been developing a methodology to elicit, diagram and revise those mental models in preliminary draft form. We recently completed our fourth and final draft map, and we’ve moved on to ‘analysing’ the maps, which has meant having a further conversation about what principles should guide that process.
One key principle emerged from the workshops, where participants understood the maps but wanted to know ‘what can I do with them?’ There was agreement that the full system logic map was probably not something you’d ever show to your funders… but that the draft map might reveal strategic considerations — warnings about loops that could work against you, or perhaps what Holland calls ‘leverage points’ where a small intervention could yield large effects.
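To make the loop-spotting idea concrete: a causal loop diagram can be treated as a signed directed graph, and its feedback loops found and classified mechanically. The sketch below is purely illustrative; the node names and link polarities are invented for this example, not taken from the actual W3 maps.

```python
# A toy causal loop diagram as a signed directed graph. Node names and
# polarities are hypothetical illustrations, not the real W3 system logic.
edges = [
    ("peer outreach", "community trust", +1),
    ("community trust", "testing uptake", +1),
    ("testing uptake", "program funding", +1),
    ("program funding", "peer outreach", +1),
    ("testing uptake", "clinic waiting times", +1),
    ("clinic waiting times", "testing uptake", -1),
]

def find_loops(edges):
    """Enumerate simple cycles by depth-first search and classify each:
    a positive product of link polarities means a reinforcing loop,
    a negative product means a balancing loop."""
    adj = {}
    for src, dst, polarity in edges:
        adj.setdefault(src, []).append((dst, polarity))
    loops = []

    def walk(path, sign):
        for nxt, polarity in adj.get(path[-1], []):
            if nxt == path[0]:
                kind = "reinforcing" if sign * polarity > 0 else "balancing"
                loops.append((tuple(path), kind))
            elif nxt not in path and nxt > path[0]:
                # Only visit nodes 'greater' than the starting node, so each
                # cycle is reported exactly once (rooted at its smallest node).
                walk(path + [nxt], sign * polarity)

    for start in sorted(adj):
        walk([start], +1)
    return loops

for nodes, kind in find_loops(edges):
    print(kind, "loop:", " -> ".join(nodes))
```

In this toy map the outreach-trust-testing-funding cycle comes out as reinforcing (a virtuous circle worth protecting), while the testing-waiting-times cycle comes out as balancing (a loop that could quietly work against a program, and so a candidate strategic consideration).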
In addition, in phase two of the project (2015) we aim to develop and test draft quality indicators by asking ‘what needs to work as expected to achieve our long-term goals?’ (a question adapted from theory-based evaluation). These indicators could contribute to improved effectiveness in programs and to ongoing refinement of the system maps.
Our ‘indicators’ are not the quantitative metrics or targets you might see in an ISO 9000 quality standard. Instead, we see them as flexible themes or ‘hooks’ for collecting knowledge in a diverse range of formats, and from diverse perspectives and sources, via organisational learning and evaluation. Together they inform our confidence that the program understands and engages closely enough with its context to be effective despite the uncertainties involved.
‘Inform’ is a bit of a weasel word; in fact, the knowledge gathered around an indicator could decrease a program manager’s confidence that things are on the right track. That’s pretty valuable information for her, though, if it comes in time to try something different.
One of the key findings of our mapping exercise has been that many sources of knowledge used in state and national HIV and hepatitis C policy-making operate on very slow timescales; if we only relied on those sources, timely quality improvement would not be possible.
So to summarise our project in relation to systems thinking, we’re not doing anything terribly new, but we are seeking to articulate existing traditions in a new-ish configuration:
- drawing on existing ideas about systems to extend the tradition of realist evaluation;
- using systems thinking tools to develop program theories about highly flexible programs engaging with communities understood as complex adaptive systems;
- analysing the ‘system logic’ of interventions to identify strategic considerations for program leaders and quality indicators to guide organisational learning and evaluation.
On Wednesday 26 November from 2-3PM we’re holding an ARCSHS seminar, ‘Not just individuals and information’, on our first-year findings. We’ll talk through all these issues and more, with an opportunity to ask questions and to give us feedback, from your own experience, on the approach we’ve taken so far and the challenges we might face in the year and a half to come. We’d love to see you there.