Policy process advocacy – is the desirable feasible?

A few weeks ago, the co-occurrence of a ‘Twitter discussion’ (initiated by @LukeCraven) and some reading I was doing on policy analysis tools prompted me to start thinking about what Hill (2013) refers to as process advocacy.  Process advocacy is concerned with improving the nature of policy making.  It differs from policy advocacy in that it involves advocating generally for a ‘better’ policy process rather than for the substance or content of a particular policy.

I haven’t spent ages tracing every nuance of the history of process advocacy, but it strikes me that it has been going on for a very long time.  The ‘policy analysis’ movement, which started in post-war USA, drew on the belief that social science (then positivist) and economic analysis techniques could make policy better.  Turnpenny et al (2015) highlight that researchers and policy practitioners have developed a range of tools to help with the formulation of policy, and a lot has been written about individual tools and how they should be used (I’ve thought hard about how I can subtly introduce my favourite piece of jargon from this reading – but I am just going to say it – this focus on tools and analysis is referred to as ‘analycentric’ 😀 ).  For the most part, tools support their users to understand the situation (problem) and to consider the cost, effectiveness and impact of possible interventions. Turnpenny et al (2015, p.21) introduce a typology of policy formulation tools which gives examples of tools that are helpful in different policy formulation tasks – problem characterisation/evaluation, specification of objectives, and options assessment/policy design.  The interesting thing about Turnpenny et al’s typology is that none of the example tools are what I would characterise as ‘systems tools’.  (This is odd, because ‘systems analysis’ and ‘policy analysis’ were born in the same post-war fever for rational analysis, and every so often I have spotted that systems thinkers and policy process advocates have definitely ‘rubbed shoulders’ and drawn on similar intellectual traditions – but maybe that should be the topic of another blog.)

I have used policy analysis tools as an example here, but there are other ‘live’ forms of process advocacy.  For example, the interest in evidence-based policy leads to the idea that policy practitioners should consult, or even conduct, systematic reviews of the literature to inform the way they proceed.  Similarly, the interest in participatory democracy has led people to advocate for the use of participatory techniques, such as citizens’ juries and participatory budgeting.

Systems practitioners, including myself, advocate for ‘systems approaches’ or ‘complexity-friendly’ approaches to be used in the policy process.  We do that because our experience (which may include doing or reading formal research) leads us to claim that they are helpful for understanding, and considering how to act in, situations characterised by interdependent variables, multiple perspectives and ethical/political conflict.  A lot has also been written about these tools and how they could be used – either in books focussing on a single approach or in books compiling a number of approaches.  The tools aren’t only written about from the perspective of their use for policy (they can also be used for strategy development in organisations), but many of the case studies given analyse situations where government/public attention and action is expected.

In advocating for the use of ‘systems approaches’ in the policy process, we ‘compete’ with those who advocate for the use of econometric analysis techniques, forecasting, risk assessment, cost-benefit analysis and so on.  We also ‘compete’ with those advocating tools to support evidence-based policy making, and with those advocating participatory techniques.  In other words, we add to the ‘noise’ of people claiming they have a ‘better’ way of doing policy.

Turnpenny et al (2015) comment that little is known about how policy analysis tools are actually used in practice.  This is what resonated when I saw Luke’s tweets – we also know very little about how systems tools are actually used within the policy process.

Political scientists, such as Paul Cairney (blog: https://paulcairney.wordpress.com/), have argued that if we would like ‘science’ to be considered in policy (as is the case with the evidence-based policy movement), then it is important to pay attention to what political science understands about how policy is made.  I think the same is true when we advocate for approaches/tools to be used as part of a ‘better’ policy process – we have to consider the real-life work and context of those involved, and the other normative expectations that shape their work.

As I have recently been reviewing the literature on the work of policy practitioners, I have come across some interesting insights into the use (or non-use) of approaches and tools…

In Canada, there has been a series of surveys investigating policy capacity at different levels of the multi-level governance system.  The surveys include a question asking respondents how frequently they use a range of policy analysis tools, ranging from ‘soft’ techniques such as brainstorming to ‘hard’ techniques such as quantitative modelling.  The studies consistently find that practitioners use informal and simple techniques, such as brainstorming or checklists, much more frequently than formal, complex ones (Howlett 2009a; Howlett 2009b; Howlett and Newman 2010; Bernier and Howlett 2012; Craft and Daku 2016).  This informality is also identified in a qualitative study carried out in Australia – Gleeson (2009) notes that practitioners describe undertaking analyses but that there “was little evidence of the formal application of policy analytic techniques described in the literature” (p.142).

A different study, of public servants in Quebec, Canada, identified that 58% of respondents had never heard of systematic reviews and that just 19% had consulted one in the previous twelve months (Bédard 2015).

The same sort of picture is found in relation to participatory techniques.  Cooper and Smith (2012) interviewed participation practitioners working in consultancy roles in Britain and Germany.  They identify a wide range of participatory techniques that the practitioners draw on.  However, they comment that the textbook forms of these techniques aren’t really used – principles and tools are often adapted and blended with others during a piece of work.

So, policy practitioners do not do what the approaches/tools-oriented literature advocates they should do.  No authors seem to go on to question the tools/approaches themselves and whether what they have said is ‘desirable’ is actually ‘feasible’.  The question of why does arise, however: why is it that policy practitioners don’t use these desirable tools that could help them contribute more effectively to a better policy process?  There are some comments in the literature about issues of education and training – in short, about whether – in the eyes of the researcher – policy practitioners are ‘qualified’ to do what they do.

But I think it is important to take the context into account.  Policy practitioners work in an environment where they need to respond to the changing needs and expectations of political or managerial leadership (which are often, in turn, influenced by public/media opinion), so their workload is constantly being re-prioritised (Baehler and Bryson 2008; 2009).  Policy practitioners spend a considerable amount of their time firefighting (Wellstead et al 2009; Howlett and Newman 2010), meaning they don’t necessarily have the time to do work that requires in-depth technical focus and coordination.

Furthermore, even when policy practitioners do manage to create the space to draw on their systems thinking knowledge, they have to account for their work using different narratives.  A study of a natural resource management project in Australia identified that the public servants’ aspirations for a way of working informed by constructivist, soft systems principles were ‘subverted’ by the project management and evaluation practices associated with dominant public administration practice (Boxelaar et al 2006).

It’s only a small set of insights, but these research studies do resonate with my own experience as a policy practitioner.  As someone who has had formal training in systems approaches, I rarely identified a context where I could design whole pieces of work that explicitly introduced them to others and used them.  They require a context where people have space to develop a new language, to think differently and to challenge the context within which they work – these sorts of spaces are a luxury I was never able to create.  In the volatile world of policy work, the ‘tortoise’ approaches unfortunately get little space.

However, it is crucial to point out that I did ‘use’ systems thinking every single working day – my knowledge of systems concepts, ideas and approaches gave me a language and a wide range of heuristics that I used as part of my sense-making and in informal interaction with others.  For me, systems thinking was an integral part of what Maybin (2013) identifies as the ‘understanding and thinking’ practices engaged in by civil servants. It wasn’t, however, part of what Maybin (2013) refers to as ‘legitimating and justifying’ practices – the more public-facing part of policy.

So, now I have got to the point where I am pondering these questions…

…why is it that systems tools/approaches get so little airing in mainstream texts on policy analysis tools?  Do policy studies researchers and systems researchers ‘rub shoulders’ or engage in debate often enough? If not, why not?

…what do systems practitioners mean when they advocate for systems thinking to be ‘used’ in the policy process?  Is it about the use of approaches/tools? Or is it about the cognitive processes and attitudes of those involved in policy?  Or is it about the interaction of both?

…if systems practitioners do think that the approaches/tools should be used, how can they be adapted for the ‘real-world’ of policy work?  How can they not just be desirable but feasible in a fast-paced, volatile context?

…how do those engaged in growing the systems competence of individuals do so in a way that recognises the ‘real-world’ of policy work?  Is it right to advocate the use of a single approach and train people in its use when novice systems practitioners are in a context which is not conducive to using that approach?  Doing only this sets people up to either (a) feel like a failure, (b) get disappointed with systems approaches, or (c) both.  It seems to me that it is more important to support novice systems practitioners to draw on and blend different systems ideas and approaches and use them ‘internally’ as part of sense-making, than to advocate the ‘pure’ use of any single approach.

…I am sure there will be more…

References

Baehler, K. and Bryson, J. (2008), Stress, Minister: government policy advisors and work stress. International Journal of Public Sector Management, 21(3), pp.257–270.

Baehler, K. and Bryson, J. (2009), Behind the Beehive buzz: Sources of occupational stress for New Zealand policy officials. Kōtuitui: New Zealand Journal of Social Sciences Online, 4(1), pp.5–23.

Bédard, P.-O. (2015), The Mobilization of Scientific Evidence by Public Policy Analysts. SAGE Open, 5(3). Available at: http://sgo.sagepub.com/content/5/3/2158244015604193.abstract.

Bernier, L. and Howlett, M. (2012), The Policy Analytical Capacity of the Government of Quebec: Results from survey of officials. Canadian Political Science Review, 6(2–3), pp.281–285.

Boxelaar, L., Paine, M. and Beilin, R. (2006), Community engagement and public administration: Of silos, overlays and technologies of government. Australian Journal of Public Administration, 65(1), pp.113–126.

Cooper, E. and Smith, G. (2012), Organizing Deliberation: The perspectives of professional practitioners in Britain and Germany. Journal of Public Deliberation, 8(1). Available at: http://www.publicdeliberation.net/jpd/vol8/iss1/art3 [Accessed November 5, 2016].

Craft, J. and Daku, M. (2016), A Comparative Assessment of Elite Policy Recruits in Canada. Journal of Comparative Policy Analysis: Research and Practice, pp.1–20.

Gleeson, D. (2009), Developing policy leadership: a strategic approach to strengthening policy capacity in the health bureaucracy. PhD Thesis. Australia: La Trobe University.

Hill, M. (2013), The Public Policy Process, 6th edition. Harlow: Pearson Education Limited.

Howlett, M. (2009a), A profile of B.C. Provincial Policy Analysts: Troubleshooters or Planners. Canadian Political Science Review, 3(3), pp.50–68.

Howlett, M. (2009b), Policy Advice in Multi-Level Governance Systems: Sub-National Policy Analysts and Analysis. International Review of Public Administration, 13(3), pp.1–16.

Howlett, M. and Newman, J. (2010), Policy analysis and policy work in federal systems: Policy advice and its contribution to evidence-based policy-making in multi-level governance systems. Policy and Society, 29(2), pp.123–136.

Maybin, J. (2013), Knowledge and Knowing in Policy Work: a case study of civil servants in England’s Department of Health. PhD Thesis. Edinburgh: University of Edinburgh. Available at: http://kingsfundlibrary.co.uk/publications/maybin_phd_thesis_2013.pdf [Accessed February 12, 2016].

Turnpenny, J., Jordan, A.J., Benson, D. and Rayner, T. (2015), Chapter 1: The tools of policy formulation: an introduction. In A. J. Jordan and J. R. Turnpenny, eds. The Tools of Policy Formulation: Actors, Capacities, Venues and Effects. Cheltenham, UK: Edward Elgar Publishing, pp. 3–30.

Wellstead, A.M., Stedman, R.C. and Lindquist, E.A. (2009), The nature of regional policy work in Canada’s federal public service. Canadian Political Science Review, 3(1), pp.34–56.

3 thoughts on “Policy process advocacy – is the desirable feasible?”

  1. Hi Helen, really great to see you blogging again! And on this very relevant topic on top, which we recently discussed on and off Twitter. Thanks for sharing your deeper thoughts, inspired by so much relevant literature, and for going even further in that turn. What you say here resonates completely! I’ve just been at some conference and meetings about evaluation and EIPM, and what you describe here can definitely be experienced there. It seems to me that there is a dualism between methods and their uses – and systems and complexity-aware approaches still lose out, despite the rhetoric that more such methods and approaches are needed. So the (really interesting!) survey findings you reference here are really pertinent: how come these approaches are advocated for (and exist), but are not really used so much in practice? This was also the question in my presentation at the evaluation conference. The little ‘vox pop’ impromptu poll I conducted during my presentation asked the evaluation practitioners there whether they had tried to introduce systems approaches in their practice (a show of hands: more than half of 35 participants); of these, most answered yes when asked whether they had encountered difficulty/resistance in doing so; and nearly everyone agreed that further research needed to be done on this, and wanted to be involved… I conclude from that that there is definitely a demand and a need! The points you are raising here seem to point to a kind of theoretical framework that might underpin such inquiries… You are giving me very useful food for thought here, and I need to follow up further on some of the references you mention.

    On the point of adaptation: I saw two recent examples of this during the conference:

    1) the approach described in the book: Bamberger, M. et al (2016) Dealing with Complexity in Development Evaluation: A practical approach.
    And

    2) UN Women (2018) Inclusive Systemic Evaluation for Gender equality, environments and marginalized voices (ISE4GEMs): a new approach for the SDG era (authors/eds: Anne Stephens, Ellen D. Lewis, and Shravanti Reddy).

    Both approaches are adaptations intended for practitioners. The second one is based on systems approaches of familiar flavours – notably boundary analysis (Part 7) – and is deeply steeped in critical systems approaches. (A wonderful example, actually.) The problem is that when translated into such a practical guide and tool, it inevitably ends up as a rather complicated process. This is counteracted through some practical tools (roadmaps, process guides, visualizations), but there is no turning away from the fact that the full approach is very complicated, and has a lot in it! For a practitioner, it is hard to envisage that someone would faithfully implement a whole process like that in its entirety. I think it’s inevitable for practitioners to ‘cut corners’ and take a selective approach, picking and choosing from it and customizing it to their own circumstances and situations. At least it offers some entry points for practitioners into what is effectively a ‘systemic inquiry’ process, without their having to take in a whole lot of heavy theory – but then, it’s also difficult to leave theory out of it altogether. It seems to be a difficult balancing act…

    The first approach is an example of what happens when a ‘reductionist’ approach wins out, even where the attempt is to ‘deal with complexity’ (in development evaluation). Whilst the motivation is laudable, the result (to my mind) is quite reductionist in style and approach. It doesn’t strike me as systemic at all, nor complexity-sensitive, but rather as an attempt to tackle complexity with approaches more suited to ‘complicatedness’. The step-by-step process suggests first using ‘complexity theory’ to understand the situation as a whole, then identifying specific policies/elements and evaluating those with normal methods… and later reassembling them to get back to the big picture again… (well, the metaphors that came to my mind ranged between Humpty-Dumpty and the Cartesian Duck…). But it exposes the tension between a (claimed) recognition of complexity and still tackling the situations with lots and lots of (at least, mixed!) methods. The reasons provided for why this is the case were that it was necessary given organisational ‘path dependencies’ – i.e. the approach needed to be familiar and comfortable for users in order to have a chance of being adopted… (That strikes me as an interesting path of inquiry at least – perhaps to look into these path dependencies a bit more when it comes to promoting the uptake of systems approaches?)

    The rest of the conference also showed a continued dominance of what you so aptly call the ‘analycentric’. It’s rampant! Methods, methods, methods, everywhere, as usual. Although at least it is also a little comforting that there is growing (hopefully) recognition of the role of the evaluator (practitioner), the act of evaluating, what that involves, and ethical ways of doing so when engaging in such a practice. There are signs of reflexivity! But practice lags behind…

    Sorry for the long and meandering response, Helen… I think I need to start blogging myself, to bring some more coherence to my thoughts. Your blog continues to be a main inspiration!

  2. Hi Helen, Barbara

    Yes, those of us aiming to build capabilities to act systemically need to think about the real-world conditions that people encounter, rather than imagined perfect settings for practice.

    The fields of participatory research, participatory development and action research have had similar concerns. In 15 years of community development practice, I’ve never found the ideal scenario for a participatory research project to exist. What I’ve tried to do is improve things by being participatory ‘a little bit and a little bit more’ (to coin a phrase from Mama Panya – a character in one of the books we have been reading to the girls at the moment).

    In systems, it’s a question of how we can bring more ‘systemicity’ into our existing practice contexts. If we think of practice as a verb phrase, then this makes a lot of sense, because we are talking about the qualities of doing, which can be multiple and are always situated.

    At the Open University we’ve been playing around with a conceptual framework that distinguishes ‘systemic sensibilities’, ‘systems literacy’ and ‘systems thinking in practice capability’. We’re still unfolding this framework to see how it might be useful in guiding our Masters-level teaching and our work on the Systems Thinking Practitioner apprenticeship standard. In doing so, I’ve started looking at it in the light of another meta-discipline – one where novice practitioners often get scared (and scarred) by early experiences, such that they shy away from using its elegant insights in practice (sound familiar?) – mathematics.

    In the field of maths there is the notion of ‘number sense’. Number sense is described as:

    “a student’s ability to work with numbers flexibly and fluidly….giving meaning to numbers – that is, knowing about how they relate to each other and their relative magnitudes….the effect of mathematical operations on numbers, such as whether multiplication of a given number by another would make the number bigger or smaller. Having a sense of number is vital for the understanding of numerical aspects of the world.” (TESS-India, nd)

    It is seen as a holistic construct that arose out of a desire to balance a skills-based approach with one that connected with real-world situations and was “concerned with the development of a wide range of understandings, skills and attitudes about number that extend beyond those generally associated with numeracy and encompass everyday uses” (Dunphy 2006).

    I’m fascinated by how number sense seems to be framed around having a positive attitude, or ‘friendliness’, towards numbers. I love the idea of getting the girls to become friends with numbers and then setting them loose on ever more complex situations. It highlights the emotional disposition that we have towards systems thinking – it only does the ‘high priests’ good if most people are terrified of getting it wrong. Highly complicated systems methods, like complicated maths operations (and religious rituals), can be of use, but probably only for a small subset of the population of potential users. For most of us, most of the time, a good ‘what, why, how’ or a sketchy system dynamics diagram, along with an inquiring attitude, would make a significant difference to our ability to go on systemically with whatever it is that we’re already doing – whether that’s evaluation of complex development programmes or dealing with the landlord breathing down your neck.

    The material invites some interesting angles for deepening my own understanding of how to build capacity for systems thinking in practice. For example, in terms of this framework, I wonder if ‘number sense’ can be related to a combination of systemic sensibility and systems literacy, whilst systems thinking in practice capability is like being able to use maths in the day-to-day situations that you encounter (e.g. at the shops, when confronted by a Wonga ad, or working out betting odds) – when your number sense is performed in a situated particular… I know how to calculate how much interest I will have to pay, and can decide whether to take a £300 loan.

    Indeed, I find the underlying conceptual framework of systems thinking to be a kind of parallel to the conceptual framework of mathematics, in that both are liberating but highly structured frames that help us engage with the complexity of situations. We can admire and marvel at the beauty of mathematical rules and underlying principles, but for most people, most of the time – whether down at the shops or in the CERN lab – it comes down to how the frameworks can support effective practice in the open messiness of our living together – ‘a little bit and a little bit more’!

    References
    Dunphy, E. (2006), An exploration of young children’s number sense on entry to primary school in Ireland. EdD Thesis. The Open University.
    TESS-India (nd), Using number games: developing number sense. TESS-India and The Open University. Available at: http://www.open.edu/openlearncreate/pluginfile.php/134923/mod_resource/content/3/EM01_AIE_Final.pdf

    Other source
    Chamberlin, M. and Chamberlin, R. (2005), Mama Panya’s Pancakes: A Village Tale from Kenya. Barefoot Books.

  3. Thank you, Barbara and Rupesh, for your comments here. They give me much additional food for thought.

    Today, I had reason to re-visit Checkland’s 30-year retrospective and was reminded of his discussion of what it is to ‘use’ SSM. It gave me some additional insights that I think are quite helpful to what I describe above.

    He distinguishes two forms of SSM ‘use’ – prescriptive and internalised, or mode 1 and mode 2:
    – Mode 1 use is methodology-driven: SSM is treated as an external recipe and can be used in quite a procedural way.
    – Mode 2 use is situation-driven: SSM is more of an internalised model, and its use is much more iterative and interactive.

    When I talked about how I use my knowledge of systems approaches, I would say it approximates mode 2 use – the approaches are internalised and I draw on them in interaction with the situation. I suspect, therefore, that my ‘use’ of systems would be hard to research (other than through first-person inquiry).

    (It’s interesting, too, that Checkland cites the OU’s work in teaching SSM as part of the experiences informing this distinction.)

    Reference
    Checkland, P. (1999), Systems Thinking, Systems Practice: Includes a 30-year retrospective. Chichester, UK: Wiley.
