The report summarises findings from the research informed by literature in design studies and organisation studies. It uses a format inspired by graphic novels in order to open up the work of interpretation about the role of design approaches in policy making and government.
Some excerpts from the report are below. If you want to (re)use them, please note this work is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License. Please use this citation: Kimbell, L. and Macdonald, H. (2015). Applying Design Approaches to Policy Making: Discovering Policy Lab. University of Brighton.
People-centred design, creativity and co-design are increasingly visible in the UK and internationally. 18 months after it was set up, Policy Lab based in the Cabinet Office has demonstrated in its projects with government departments that design approaches bring valuable insight, creativity and collaboration support to the early stages of policy making.
Join us for a discussion which looks more broadly at the role of design in government and asks whether policy makers should be policy designers.
Date: Thursday 8 October 1800-1930 followed by drinks/networking
Mat Hunter, director of strategy at Central Research Laboratory, former chief design officer at the Design Council
Dr Derek Miller, The Policy Lab (Oslo)/UN Institute for Disarmament Research
Steve Gooding, RAC Foundation, former head of policy profession, Department for Transport
Paul Maltby, Director, Cabinet Office
Who should come?
Civil servants especially policy makers
Design specialists inside government
Policy innovators and think tanks
Academics from the design, public administration and innovation communities
People working in international innovation labs
People supporting government to innovate and use design
How to attend
Email lucykimbell at gmail to get more information (for security reasons the sign up is not public)
Copies of my report on design for policy making, resulting from my one-year AHRC research fellowship with Policy Lab/University of Brighton, will also be available. A PDF version will also be posted here soon.
What is the difference that Policy Lab’s approach makes to policy making?
1 What the Lab approach is/does
Lab’s approach problematises policy making – it’s not just exploring new tools, techniques and new data. Policy Lab connects/reassembles/tweens actualities and potentialities, problems and solutions, thinking and doing, inside and outside.
The key characteristics of this approach are that it is based in:
Abductive discovery, through which insights, guesses, framings and concepts emerge eg ethnographic research, co-design, prototyping in the fuzzy front end of policy making.
Collective inquiry – through which problems and solutions co-evolve, which is participatory, and through which constituents of an issue are identified and recognised, and solutions are tested eg prototyping.
Recombining experiences, resources and policies – the constituents of an issue – into new (temporary) configurations.
2 What the Lab approach results in – its impact, for which we can seek proxy measures
Project level – Relating to the policy area
New insights, guesses, framings
Plausible concepts for artifact-experience bundles
Prototyped proofs of concept – “proto policies”
An issue team/public engaged in a collective inquiry engaging with a more ordered problem
Capabilities within the policy profession and wider ecosystem
Reordered relations between actors in an issue (inside and outside an issue)
Reordered relations between actors and evidence
Ability to set up and participate effectively in collective inquiries and early-stage abductive exploration
Awareness of the interdependencies between experiences, resources and policies
Part 2 of some retrospective sensemaking of my research fellowship within the Policy Lab team in the Cabinet Office.
Phase 1 Infrastructuring: January – early May 2015
With my newly gained, temporary insider status and confidence – enabled by the security pass which allowed me into some government buildings without an escort, and by my emerging understanding of the civil service’s policy making environment – the first few months of 2015 gave me deep access to new developments in Policy Lab’s world. As well as continuing to deliver many one-hour or longer taster workshops to departmental policy officials, Policy Lab took shape through its demonstrator projects lasting several months and through ongoing discussions about its future in the context of a countdown to the general election.
One of Lab’s five demonstrator projects, with the Department for Work and Pensions (DWP) and Department of Health (DH) on health outcomes, kicked off in late January after two months of what Pelle Ehn calls “infrastructuring” – the briefings, proposals, meetings, emails, commitments and contracting that construct a project. In previous projects I had been more of an observer. In this project I took a more active role at the beginning, for example helping Lab’s project lead Cat Drew and the rest of the team design, facilitate and make sense of the policy “sprint” workshop. Unlike the earlier projects in which Lab and the government department subcontracted chunks of the project to specialists in ethnographic research and design, in the health outcomes project Policy Lab directly brought together and mediated between experts. They worked in close collaboration with one another and with staff from the two departments including policy makers, analysts and some of their advisers and other stakeholders. The two-and-a-half-day sprint staged the project from the outset as a collective inquiry by articulating and iterating a goal, defining research questions and approaches, and building a shared, although provisional, understanding of the issue.
Other demonstrator projects with HM Revenue and Customs (HMRC) on National Insurance numbers and young people, and with the Department for Education (DfE) on childcare, moved forward with combinations of ethnographically-informed research and analysis, design and prototyping. I participated in workshops in which Policy Lab and the wider project-specific teams shared research insights and supported collaborative design by civil servants and other stakeholders. I also participated in review meetings and sometimes helped edit or produce documents at key points in a project lifecycle. In one project I took a direct role as the lead for Policy Lab, on a consultancy basis. This project was for the small team serving the civil service’s Heads of Policy Profession Board, with the goal of exploring and developing proposals for assessing and accrediting the capabilities of people working in the policy profession. I’ll discuss the ethical, political, and methodological implications of doing this alongside my fellowship in a separate post.
Projects that were more or less completed such as with the Home Office on digital policing, and the Ministry of Justice (MOJ) on family mediation, were still part of Policy Lab’s world, surfacing in team discussions about next steps and demonstrating Lab’s impact. The challenge for each was how to take forward what the Policy Lab project had produced but without ownership – as the policy areas lived in departments, not in the Cabinet Office – and without much resource in terms of time or money, nor yet much visible commitment from senior leaders in the civil service or ministers.
I became more aware of the importance of formal governance structures and processes in this civil service world. Crafting Policy Lab’s demonstrator projects involves setting up “boards” chaired by Paul Maltby, the director of the Government Innovation Group in which Policy Lab is embedded. These involve the Policy Lab lead and the policy officials leading the policy area but, crucially, also senior civil servants from the departments involved. Punctuating the project journey, these boards invite senior people to review Policy Lab’s work, including the research insights and emerging concepts, to decide how to move forward, and to make commitments to one another, often across departmental boundaries.
In early February – in a very short space of time – the Open Policy Making team and Policy Lab organised 19 events for over 500 people with the title Open Policy 2015. Many of these were practical workshops and taster sessions for civil servants to try out tools and techniques including user research, behavioural insights, agile approaches such as hackdays, and working with stakeholders. While these approaches are not new to some policy teams, these were opportunities for participants to hear an experienced civil servant or external speaker share experiences of using a particular approach and then try aspects of it out. I attended some events such as Policy Lab’s prototyping workshop and also organised one for Policy Lab, which was discursive rather than practical, on ethnography in policy making.
As it got closer to the first anniversary of Policy Lab being set up – initially for one year in April 2014 – discussions among the team and with senior stakeholders focused on making sense of what Lab had been doing, and demonstrating its impact. This went in parallel with constructing future projects with departments and articulating options for senior civil servants to consider about its purpose, resourcing, and expected outcomes. Various ways of framing Policy Lab were discussed, with recurring themes of experimentation, engagement and evidence and what it takes to make a project “land”.
My research at this time was still guided by Bent Flyvbjerg’s Making Social Science Matter, as well as by Jesper Christiansen’s PhD thesis entitled The Irrealities of Public Innovation, based on his research/work at MindLab. Researcher Ben Williamson’s blog, the anthrodesign mailing list and the Twitter hashtag #psilabs were also useful. I continued taking lots of notes and photos, doing some interviews, but decided against using video to gather data.
The UK general election date of May 7 marked an end point to this phase of the research. Owing to the relationship between policy makers and ministers, as well as to the particular uncertainty around who might win that election, the months leading up to the election had a particular intensity and urgency. Civil servants talked about the pressure of getting things done before “Purdah”, the name given to the period of time after Parliament is dissolved and before a new government is formed, when the civil service is not supposed to favour any political party. Although the civil servants in Policy Lab did not work directly with ministers at that point, this urgency to get things done shaped the working culture and expectations about the timeframes within which some of its projects with departments had to produce results.
As the civil service entered Purdah, it seemed ironic that parts of the civil service advocating and practising open government decided not to have any online digital engagement during this time – even though some government departments did. For example the OPM team and Policy Lab were advised not to tweet or blog. With an inside/outside role, I changed some of my own online behaviours during this time too.
This phase of my research was still about building, connecting and expanding rather than making sense. Writing up a couple of blog posts for the OPM blog (the links are above) and doing a couple of keynotes and talks to early career researchers forced me to try to locate and digest the research to date. I found it very hard. In my application for the fellowship I had said I would co-design an evaluation framework for Policy Lab and made various efforts to do so, working closely with the team of Andrea Siodmok, Beatrice Andrews, Hannah Rutter, Cat Drew and Cabinet Office intern (and doctor) Lisa Graham. But I was still in the mess of being-in-the-work, trying to understand what Policy Lab was doing in its various emergent forms in a context of massive uncertainty and ambiguity. It began to get clearer – to me at least – that Policy Lab and its publics might benefit from an account of what it was doing – the difference it made to policy making – which needed to precede any framework.
 Ehn P (2008) Participation in design things. In: PDC ’08: Proceedings of the tenth conference on participatory design, Bloomington, Indiana, 30 September–4 October 2008. New York: ACM Press, pp. 92–101.
The aim of this section is to help clarify what kinds of experiments are going on in the work of Policy Lab. This section draws on the work of philosopher Charles Sanders Peirce who developed the term abduction. First it will describe what abduction is, and how abduction relates to the other kinds of inference, deduction and induction, which are well-established in policy development. It will then discuss Policy Lab’s work through the lens of abduction and show how policy experimentation via abductive reasoning intersects with the other logics shaping policy making.
Deduction and induction
The term abduction is associated with the philosopher CS Peirce, who explored it over some decades. Like that of other Pragmatists such as Dewey, Peirce’s approach is oriented toward our experiences of what happens in practice, rather than toward proposing an idealised analysis. To explain why Peirce’s work is a useful contribution to understanding Policy Lab’s approach requires a brief detour into the two other logics through which reasoning is usually understood to proceed.
Deduction is the process of taking a principle (a rule) and then inferring a result in a particular case. For example:
Rule: People living in the Midlands are friendly.
Case: These people are from the Midlands.
Result: These people are friendly.
Deductions offer reliability if the initial statements are true, but Peirce argued they do not generate anything new.
Induction starts with surveying data (the case) and generalising across many observations (the result) to identify a pattern (the rule). For example:
Case: These people are from the Midlands.
Result: These people are friendly.
Rule: People from the Midlands are friendly.
Inductions indicate the probability of patterns in the data. They suggest that something is likely the case. Depending on the research methods used, they offer some kind of validity. As with deduction, Peirce argued, inductive reasoning does not generate new concepts or knowledge.
In the context of experimental research, knowledge building typically proceeds by developing a hypothesis from the stock of existing knowledge via deduction, and then testing by induction whether it holds in particular cases.
The logics of policy making
Much of the evidence used to inform policy making comes from mixed methods based on deductive or inductive reasoning in various combinations. Neither is right or wrong – they do different things and offer different kinds of validity to different audiences, allowing policy officials and ministers to reach decisions. But in the culture of policy making, deductive logic has the allure of offering definitive evidence. For one civil servant, “Trials are the gold standard for policy making” because they are able to prove whether something is true or not, providing the sound evidence that decision makers want about whether to go ahead with a policy.
Deductive reasoning underpins work in the natural and physical sciences and also shapes research in the social sciences. For example, the methodological approach “Test, Learn, Adapt” advocated by the Behavioural Insights Team (BIT) is grounded in deductive logic. BIT helps civil servants design and run trials that demonstrate whether a policy intervention will achieve its intended outcomes, informed by existing knowledge about human behaviour. Randomised controlled trials (RCTs) are one way to test systematically, in a particular case, whether the hypothesis underpinning an intervention is true or not. In policy terms, RCTs can prove whether a proposed intervention will lead to the desired change in a particular case. They generate statistically valid data about changes to variables associated with the outcome that result from the intervention.
Inductive reasoning is also very familiar within the policy environment. It underpins much of the research in the social sciences and the humanities. Inductive research does not have to use qualitative data but it is strongly associated with it. Researchers specialising in research methods working within this logic make efforts to show to what extent their findings have validity. They make careful claims about whether they can show links between cause and effect and discuss the extent to which their findings are generalisable to other contexts.
But where do hypotheses and new ideas come from in the first place? What happens in the context of massive uncertainty, when there is very little data, or much of it is in disagreement? What if you have a desired outcome that you want to achieve but are not sure of the constituent elements that might help you achieve it or how they relate? How do researchers get to the point that they are able to isolate an outcome variable which could be tested through a trial?
Recognising a gap in the philosophy of science, Peirce developed the idea of abduction as a logic of discovery within scientific inquiry, in contrast to the logic of justification associated with deductive reasoning. He argued that philosophers of science had paid insufficient attention to where ideas come from. Informed by ancient Greek thought, he developed the concept of abduction to explain how new concepts and hypotheses are created.
Abduction takes a result and a rule, and then jumps to making an inference that links the two. For example:
Rule: People from the Midlands are friendly.
Result: These people are friendly.
Case: These people are from the Midlands.
In abduction, we link things together in new ways. We can’t say whether the inference is true or not, as is the case with deduction. Nor can we say it has strong validity because of the observations we made, as with induction. But with abductive reasoning, what we do get is a new insight or concept that we can explore further with the other two logics.
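For readers who think in code, the three schemas can be parodied in a few lines. The sketch below is purely illustrative and appears nowhere in Policy Lab’s work; the functions deduce, induce and abduce are hypothetical names I am using to show how the same three elements – rule, case and result – are combined in different orders, using the Midlands example as toy data.

```python
# Purely illustrative sketch of Peirce's three inference patterns,
# encoding "People from the Midlands are friendly" as toy data.
# deduce/induce/abduce are hypothetical names, not an established API.

RULE = {"from_midlands": "friendly"}  # Rule linking a case to a result

def deduce(rule, case):
    """Deduction: rule + case -> result (reliable, if the premises hold)."""
    return rule.get(case)

def induce(cases, results):
    """Induction: many (case, result) observations -> a candidate rule."""
    # Generalise: if every observed Midlander was friendly, propose the rule.
    observed = [r for c, r in zip(cases, results) if c == "from_midlands"]
    if observed and all(r == "friendly" for r in observed):
        return {"from_midlands": "friendly"}
    return {}

def abduce(rule, result):
    """Abduction: rule + result -> a plausible (but unproven) case."""
    # Guess a case that would explain the result under the rule.
    for case, expected in rule.items():
        if expected == result:
            return case  # a plausible explanation, not a truth claim
    return None

print(deduce(RULE, "from_midlands"))                       # friendly
print(induce(["from_midlands"] * 3, ["friendly"] * 3))     # candidate rule
print(abduce(RULE, "friendly"))                            # from_midlands
```

Note that abduce simply guesses a case that would explain the result; like abduction itself, it offers plausibility rather than proof, which the other two patterns can then test.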
Hypotheses are not out there waiting to be discovered. Instead, Peirce argued, they are the outputs of a process of sensemaking. As we make observations through our own experience of the world, we compare these to the existing stock of knowledge. We may find something surprising that we can’t account for, resulting in a tentative guess – an embryonic hypothesis. Social researcher Jo Reichertz explains, “Something unintelligible is discovered in the data and, on the basis of the mental design of a new rule, the rule is discovered or invented and, simultaneously, it becomes clear what the case is.” In contrast to deduction, which offers reliability, abduction offers possibility by generating something new, which can then be explored further through induction and deduction.
With the concept of abduction, Peirce was able to reconnect creativity in science with the well-established idea that scientific reasoning can prove things. Other researchers such as Karl Popper separated the logic of discovery from the logic of justification. They focused on how science can make more reliable truth claims, paying less attention to how novel concepts are generated. In Peirce’s own words, “Abduction is the process of forming an explanatory hypothesis. It is the only logical operation that introduces new ideas, for induction does nothing but determine a value, and deduction merely evolves the necessary consequences of a pure hypothesis.” Table 1 shows the differences between the three logics, based on Peirce’s work. In his view there is an order: abduction precedes deduction and induction.
Table 1: Peirce’s ordering of the logics of scientific inquiry, developed from Hansen 2008.
Producing plausible, provisional results
Abduction reasons from effects to causes with incomplete data. It constructs plausible guesses and insights, shaped by our existing stocks of knowledge and in response to effects gathered through observations or experiences. For management researcher Hans Hansen, “We take disparate elements and place them into relationships that are meaningful for us. Abduction generates hypothesis (sic) in the absence of any existing construct to interpret observation.” Abduction shows something may be, but does not prove it, whereas deduction shows something is true in a particular case. Abductive inferences are plausible but are not justified by the structure of the argument. But they are plausible enough to move a project forward.
Abduction results in a new order that takes surprising observations and offers a way to make sense of them – for now – which is still productive. However for social researcher Jo Reichertz,
“The search for order is never definitively complete and is always undertaken provisionally. So long as the new order is helpful in the completion of a task it is allowed to remain in force: if its value is limited, distinctions must be made; if it shows itself to be useless, it is abandoned. In this sense, abductively discovered orders are neither (preferred) constructions nor (valid) reconstructions, but usable (re-) constructions.”
At first glance there is a relationship here with the work of Karl Popper. He argued that science proceeds by hypotheses being challenged or upheld by subsequent research, a process he called falsification. But abductive inferences are never as firm as hypotheses in the first place. They are provisional, plausible constructs that are usable – they move a process of inquiry along – but do not offer a truth claim.
Creating the conditions for abduction
When developing his theory of abduction, Peirce discussed the conditions that gave rise to it, which can be seen as broader principles for enabling the generation of new ideas. Rather than arguing that coming up with new insights is simply the result of chance, he identified strategies, or enabling conditions, for making it more systematic.
Peirce developed his thoughts on abduction as a result of a personal experience. This was when a Tiffany watch was stolen from him after he left it behind by accident on a ship to New York. Initially he had no idea who was responsible. In his account of the stolen watch, Peirce asked the captain of the ship to line up all the crew for him to talk to. At first he found himself unable to work out who might have taken his watch. Reflecting on what happened, Peirce described how this experience prompted him to reconsider how knowledge is generated.
The first condition conducive to developing an abductive inference is genuine doubt, uncertainty or great pressure to act. The Tiffany watch was a gift, and a valuable one. But the fear motivating Peirce was not fear of its loss, but of professional disgrace at being unable, at first, to work out who might be guilty.
The second condition conducive to abduction Peirce identified was to let his mind wander with no specific goal – what he called “musement”. Instead of trying to use deductive logic to work out who had stolen his watch, Peirce gave in to a state that was not controlled by his conscious mind. As a result, he concluded that his consciously working mind, which usually relied on logical rules, was outmanoeuvred.
The third condition to make abduction more likely was to decide to act, even if the direction seemed arbitrary. As he tried to work out who had stolen the watch by walking up and down the lined-up crew of the ship, Peirce concluded he must fasten on someone even though it would be an almost random choice. The guess that Peirce made turned out later to be true and eventually he got the watch back. Management researcher Hans Hansen summarises, “At the point of being surprised by a surprising fact, if we can make a guess, any guess, we can make progress.”
Recent interest in abduction
Researchers and practitioners in several domains are using abduction to help them distinguish between different kinds of research activity and practical experimentation. In business, Roger Martin argued that managers need to use abductive as well as deductive and inductive reasoning as tools to achieve competitive advantage. In design, researchers have used the theory of abduction to explain how designers come up with new concepts. In social research, especially in fields such as nursing, researchers have turned to abduction to better understand how themes, codes and categories emerge during research. In artificial intelligence and data science, there is a longstanding interest in abduction and induction and how they relate to one another in algorithmic machine learning.
Abduction in policy making
Drawing these arguments together, the concept of abduction helps those involved in policy experimentation distinguish between the logic of discovery and the logic of justification. What Peirce’s ideas do is highlight the often invisible work that goes on during what we might call the “fuzzy front end” of policy making.
This discussion highlights that the different kinds of reasoning produce different results at different phases of the policy making cycle. They are not directly comparable and, further, if Peirce is right, there are interdependencies between them. Peirce’s view is that there is a sequence which starts with abduction. The exploratory insights and guesses produced through abductive reasoning – with limited data, but nonetheless plausible – can then be developed further through deductive and inductive reasoning. Abduction produces the provisional insights and guesses linking things together in new ways, which become hypotheses that can be tested through experimentation and other research based in deductive and inductive logics. Deductive research can answer whether a policy intervention works or not; inductive research helps explain why it does; but abductive reasoning enables the discovery of insights and guesses when there is not yet theory or evidence, only a desired policy outcome.
This highlights the mostly invisible work that policy makers do, where they are required to rapidly gather and assess evidence and come up with options for ministers. Looking at this through the lens of Peirce’s work, we might pay more attention to this early, hypothesis-free, exploratory phase. Trials test policy interventions in particular kinds of case when there is a hypothesis to test. In contrast, design and prototyping support policy inventiveness by making plausible links between elements of an issue to achieve an intended outcome. If RCTs are a robust way of establishing whether a policy is working, the abductive cycle of generating and exploring insights and guesses is the best way of developing and iterating plausible early-stage policy ideas.
This is where Policy Lab’s approach comes in. It is rooted in the practical experimentation of design going through cycles of rapid insight generation, idea generation and exploration via prototyping. Working within this tradition, Policy Lab foregrounds the abductive early stage of policy making – where “early stage” includes revisiting persistent, complex policy issues. Alongside data science, and agile approaches also used in government, Policy Lab reveals and supports the often neglected abductive work that policy makers do, but opens it up to a wider group of participants and new sources of evidence and inspiration which the next section will illustrate.
References (not complete or checked)
Scholars make a distinction between his earlier work using the term and his later work, which is what I draw on here. See Roozenburg, N. (1993). On the pattern of reasoning in innovative design. Design Studies, 14(1): 4-18.
 The speaker was a participant in a workshop organised by the What Works network held at the Institute for Government in June 2015.
Haynes, Laura, Service, Owain, Goldacre, Ben and Torgerson, David. (2012). Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials. London: Cabinet Office.
Reichertz, Jo. (2010). Abduction: The Logic of Discovery of Grounded Theory. Forum: Qualitative Social Research, 11(1), Art. 13. Italics in original.
Reichertz, Jo. (2010). Abduction: The Logic of Discovery of Grounded Theory. Forum: Qualitative Social Research, 11(1), Art. 13.
 CS Peirce cited in Hansen, Hans. (2008). Abduction. In Barry, David and Hansen, Hans (eds). The Sage Handbook of New Approaches in Management and Organization. p.456
 Table developed from Table 3.5.1 in Hansen, Hans. (2008). Abduction. In Barry, David and Hansen, Hans (eds). The Sage Handbook of New Approaches in Management and Organization. p.457.
 Hansen, Hans. (2008). Abduction. In Barry, David and Hansen, Hans (eds). The Sage Handbook of New Approaches in Management and Organization. p.457
This is the underpinning of lean startup and agile software development.
 Martin, Roger. (2009). The Design of Business: Why Design Thinking is the Next Competitive Advantage. Harvard Business Press.
 See for example, Roozenburg op cit; Kolko, John. (2010). Abductive Thinking and Sensemaking: The Drivers of Design Synthesis. Design Issues, 26(1); and Dorst, K. (2015). Frame Innovation: Create New Thinking By Design. MIT Press.
 See for example Tavory, Iddo and Timmermans, Stefan (2014). Abductive Analysis: Theorising Qualitative Research. Chicago University Press.
 See for example Flach, Peter and Kakas, Antonis (eds). (2000). Abduction and Induction. Essays on Their Relation and Integration. Springer.
Fuzzy front end is a term describing the early stage of product development, introduced in Khurana, Anil and Rosenthal, S. (1997). Integrating the Fuzzy Front End of New Product Development. Sloan Management Review, Winter. pp. 103-120.
This is a draft section from the report I’m writing for Policy Lab in which I’ve been embedded for 10 months. The primary audience will be policy makers and others involved in public sector experimentation. The final report will be published in late August/early September.
The workshop: Experiencing policy making as a collective inquiry
In a windowless conference room in the basement of the Department for Business, Innovation and Skills, about 20 civil servants are sitting on chairs arranged in rows facing a raised stage. On tables to one side are piles of brightly coloured materials such as pens, Play-Doh, straws, cardboard boxes and pipe cleaners. Andrea Siodmok, head of Policy Lab, and I arrive slightly late from running a workshop elsewhere. She checks with our Policy Lab colleagues who are already present whether the PowerPoint slides she planned to use are ready on the computer to be projected. They are. This is a two-hour event on prototyping in government organised by Policy Lab as part of Open Policy 2015, a week-long series of practical workshops and talks aimed at policy makers.
Andrea does not go on to the raised stage but instead stays on the same level as the seated participants. She moves in front of the people and starts talking. She apologises for being slightly late, says the workshop will be mostly practical and invites people to ask what they want to find out today about prototyping in government. As she talks and listens she hands out some of the coloured Play-Doh to participants and starts molding some in her own hands. Pretty soon all the people in the room are molding Play-Doh in their hands. The mood is open, relaxed and expectant.
She leads a discussion on prototyping to which people in the room are contributing unselfconsciously and in an open manner. She instructs people to make a particular shape with the Play-Doh – a duck – and people do and then are willing to hold up and share what they’ve done with others. She then says “One of you is going to shout out what you are going to make next” and we hear someone say “monkey”. Andrea says “Ok, 15 seconds” and everyone quickly makes a monkey shape and again shares them. Then someone calls out “Fox”. One man holds up his shape: “It’s part fox, part monkey and I’m very proud of him” and we laugh.
About ten minutes in, it feels as if the workshop has not yet started but participants in the room continue to be attentive. Any skepticism they might feel is not evident in their behaviour. Andrea shares observations on prototyping drawing on the work of UK manufacturer Dyson as well as her early career as a product designer. She says “The purpose of prototyping is to come up with something tangible we can test, share and improve” and talks about how Policy Lab is working with departments to explore how to use this approach at the early stages of policy making, exploring what policy ideas might look like at the point of delivery and experience. She then instructs, “One last prototype – in 15 seconds. Make a prototype of how to integrate health care and social care.” People laugh at the shift in register from the childlike to the very serious policy challenge, but carry out the request. Andrea then asks people to share their prototypes. “Mine’s a car crash – the incentives are misaligned,” says one woman. Another says, “I’m really trying to join them up.”
Andrea invites people to share reflections on what they think prototyping is. “It’s quick and easy – you can see the result,” says one woman. Another says, “There are lots of levels of abstraction”. Someone else points out that participants’ experiences were shaped by their skill in handling Play-Doh. Another person reflects, “What it did was force us to do something – often in policy making we sit around trying to come up with the perfect idea.”
Participants in the workshop continue to be very engaged. Andrea moves to the stage and starts showing the PowerPoint slides which are projected on to a large screen. One of the slides includes a screengrab of the text of a recent speech by the Home Secretary in which she refers to prototyping a new service for people reporting crime, one of Policy Lab’s projects, possibly the first time a minister has used the term prototyping in public. Andrea shows photographs, many from local government projects, of exploring new policy or public service concepts at a very early stage by mocking up what the experience would be like for end users or citizens.
About an hour in to the workshop, Andrea sets a challenge for participants. This is to work together to create a prototype of a new kind of GP surgery based on the principle of “the patient will see you now” rather than the current model of “the doctor will see you now”. Over the next hour participants work together in small teams of about five, producing ideas they share in a two-minute pitch to the rest of the group which is also an opportunity for feedback. To facilitate this, some of us arrange chairs around some large tables so that people can work together around a flat surface.
Most of the participants, all civil servants, are strangers to one another. They come from a range of departments and policy areas but they succeed in quickly self-organising into teams and collaborating to generate and prioritise ideas. They help themselves to the materials and get on with making small models of alternative GP surgeries, responding to the items made available such as cardboard boxes, pens, feathers and pipecleaners. As I move from group to group, I hear them discuss things like patients’ needs, frustrations they themselves have with visiting GP surgeries and concepts from other service contexts.
Everyone seems engrossed in the flow of exploring what a new primary healthcare experience might be and what infrastructure and resources would be required to deliver it. When they come to share their models, some of them use role play to show the new patient experience but they also refer to some of the resources and infrastructure associated with delivering it. One group communicates their concept through a woman performing as if in a TV advert. After each team presents we all have an opportunity to give feedback on the ideas they have presented or query details.
The workshop ends with a brief discussion led by Andrea. She emphasizes that prototyping is an early stage, exploratory and collaborative activity that can be done very early on when concepts are very malleable, as well as later on when concepts are more defined. It’s striking how she continues not to offer a definitive set of proposals as to what prototyping might mean in the context of policy making. Instead, the workshop has involved a practical exploration of the question in relation to a policy challenge. To conclude, she asks, “How many of you are creative?” Nearly everyone puts a hand up. The workshop is over. The Policy Lab team gather up the materials, put the room back to its arrangement of tidy rows of chairs facing the front, and leave.
There are many ways of discussing what was going on in this workshop. In what follows I will draw on the work of Pragmatist philosopher John Dewey to illuminate what made this workshop hang together: how it was that playing around with Play-Doh and pipecleaners became an appropriate and productive way of exploring the issue of prototyping in government and what it offers to conventional ways of thinking about policy making.
Making sense of Policy Lab: Policy making as inquiring
John Dewey made an important contribution to philosophy by focusing on how knowledge is developed in practice, rather than formulating concepts or generating facts without connecting with what goes on in the world. Dewey was one of several Pragmatist philosophers working in the early 20th century who challenged the well-established mind-body dualism in which thinking was maintained as separate from the world. Dewey’s thought was shaped by the idea that humans exist through interacting with their surroundings.
Much of Dewey’s work was in the realm of the theory of knowledge, although he preferred to use the term inquiry. For Dewey and other Pragmatists, what mattered was knowledge being put into use in the world to achieve human ends, rather than more abstract discussions of logic and truth. The point of inquiry is to provide a basis for action. Our knowing is a result of our interacting with our surroundings. Dewey’s work has been extremely influential, including shaping student-centred learning in education and professional development as well as action research and participatory community development.
Dewey’s definition of inquiry can be summarized as the controlled or directed transformation of an indeterminate situation into one that is determinate enough to hang together. Dewey says the process of inquiry involves these steps:
recognition that there is a problem, which could have a solution
working out what the constituents of the problem are
making observations about the problem
allowing a possible relevant solution to present itself; through so doing, more aspects of the problem become clear
exploring the meanings of a possible solution to see if they are relevant to the problem at hand
finding facts to see if they link up with other facts to produce a coherent whole of problem and solution
further developing a new ordering of the facts which suggests a modified idea (or hypothesis), which results in new observations, which results in a new ordering; and
continuing this cycle until a new order is judged to be complete. In the course of this “the ideas that represent possible solutions are tested or ‘proved’.”
So for Dewey, in inquiry the problem and a solution emerge together through practical interventions into and observations of the world. Instead of this being a linear process in which first a problem is defined and then solutions are found, in inquiry, the problems and solutions co-evolve together. Dewey’s philosophical argument has since been demonstrated through academic research into how designers approach their work, sometimes called “design thinking”. Researchers have shown that during designing, the problem and the solution emerge together.
However, Dewey was writing about how knowledge is created in science. To explore how this is relevant to policy making, which, like design, is a practical endeavour that involves many stakeholders, it’s worth going into a little more detail about how ideas are generated and how they are used.
“Because inquiry is a progressive determination of a problem and its possible solution, ideas differ in grade according to the stage of inquiry reached. At first, save in highly familiar matters, they are vague. They occur at first simply as suggestions; suggestions just spring up, flash upon us, occur to us. They may then become stimuli to direct an overt activity but they have as yet no logical status. Every idea originates as a suggestion, but not every suggestion is an idea. The suggestion becomes an idea when it is examined with reference to its functional fitness; its capacity as a means of resolving the given situation.”
So for Dewey, ideas are not just abstractions; they act in and on the world. And in acting in the world, ideas help re-organize the current understanding of a problem. Through generating ideas and exploring them in relation to the problem, a better understanding emerges of its nature as well as its possible solution.
Dewey’s argument highlights the provisionality of ideas. What matters is how ideas are put into operation to check their fitness.
“Ideas are operational in that they instigate and direct further operations of observation; they are proposals and plans for acting upon existing conditions to bring new facts to light and to organize all the selected facts into a coherent whole.”
To summarise, the key ideas from Dewey’s work relevant to this discussion about Policy Lab and its way of working are as follows.
An inquiry is a process of creating knowledge which is always purposeful, rather than concerned with generating abstractions that are not connected to practical situations.
The process of inquiry does not start with a problem and then move to solutions. Rather, the problem and the solution co-evolve together.
The constituents of an issue are not known in advance; they are discovered through inquiry.
Inquiring into a problem by generating ideas helps clarify the nature of the problem.
Ideas are always provisional and can be put to work.
Exploring ideas helps organize the understanding of the problem and its solution.
Dewey’s work offers a way of thinking about what Andrea was doing in the workshop that helps explain (a) Policy Lab’s experiential approach and (b) what prototyping is.
Staging a collective inquiry. Rather than talking from a position in which she held the knowledge, Andrea constructed the workshop as a collective inquiry. While Andrea is a Senior Civil Servant and has extensive experience of leading strategic design projects, she did not invoke her position in the hierarchy or her expertise to make an argument from authority. The one exception came when she showed on screen the text of a recent speech by the Home Secretary using the word “prototyping”. Otherwise, from the outset of the workshop, she involved participants in exploring together what prototyping in government might be. Instead of telling them what she knew – a kind of one-way knowledge transfer – she enabled people to explore, through discussion and practical, embodied interaction, the question of what prototyping in government is or could be. Instead of problem-solving for or with participants, she involved them in inventive “problem making” in the sense of collectively working out what the constituents of the problem might be. Through so doing, she created space for participants to share their lack of knowledge along with their knowledge and ideas. Through the practical activities accompanying the discussion, she invited them to progressively develop their own understanding of the question.
A grounding in practice. Andrea’s design and facilitation of the workshop oriented participants towards action, rather than abstract discussion. For example she set them a challenge in which they could practically explore what prototyping might mean in the context of policy making. Choosing the topic of healthcare enabled people to draw on their own lived experience: since almost everyone in the room had direct experience of the implications of the policy question, the workshop’s inquiry could proceed quickly, as everyone was a ‘user’ of a primary health care service. Further, when setting the challenge, Andrea asked people to make a model of the new GP surgery service and try to communicate the experience it would offer to a patient. Rather than designing an ideal form such as a healthcare system, positioning themselves as outside of it, participants were asked to describe someone’s future experience of primary healthcare, and to communicate what it would be like inside of it.
Exploring the problem by iteratively exploring ideas and making observations towards coherence. Participants were empowered and supported to explore the question of prototyping in policy making by trying out the techniques Andrea introduced. Though unexpected in a government context, the Play-Doh exercise was extremely simple and accessible to everyone present. Through several rounds of making animal shapes, followed by a policy challenge of health and social care, the activity normalized the idea that it was possible and even worthwhile to give tangible form to your ideas. Then, through collectively making and sharing simple physical models of new kinds of GP services, participants developed a deeper understanding of the nature of healthcare service delivery and experience. By spending an hour making the models, they went through cycles of understanding the problem, making observations (based in some cases on their own experiences), generating ideas, exploring the ideas in relation to the problem, further developing the ideas, making new observations and building towards a (temporary) coherence between the problem and the solution.
Participatory leadership. The way the workshop was delivered resulted in handing over much of the responsibility for its success to participants. Throughout the workshop, participants were constructed as having to find their own answers to the question at the heart of the workshop about prototyping in government, rather than being persuaded (or not) by Andrea’s expert position.
All the way through, participants were invited to share what they wanted to learn, what they believed and what they knew. Andrea offered many opportunities for them to shape the activities, for example, calling out what animal to make next with Play-Doh, deciding which challenge they should work on together, or inviting them to move to another group if they wanted to.
Andrea’s use of humour and her self-deprecating stance further emphasized her rejection of holding an expert position. This opened up the question for participants about their degree of participation in the workshop.
Experiencing policy making. The design and facilitation of the workshop invited people to manipulate and organize material things – the Play-Doh, the cardboard boxes, the feathers. Andrea and the rest of us in the Policy Lab team intervened to reorganize the room to enable people to work together at tables, rather than being bounded by the conventional format of the room as we found it with rows of chairs facing a stage. Rather than operating in the domain of manipulating symbols, with which civil servants are very comfortable, the workshop foregrounded the material, the spatial and experiential. This decentred the civil servants from their own expertise. It also invited people to reflect on their own experiences as civil servants – the things they take for granted in the material and spatial organisation of their work and how it enables or inhibits how they approach problem finding and exploration in their day to day routines.
The next section will go on to explore what this approach offers to policy makers.
Note – the references are not yet checked or complete
In sociological research exploring the public understanding of science, Mike Michael and others have developed the concept of “inventive problem making” to describe bringing together distinct perspectives on an issue in material and discursive form. See Michael, M. (2012). “‘What Are We Busy Doing?’ Engaging the Idiot.” Science, Technology & Human Values 37(5): 528–554.
This blog is a public-facing vehicle for findings from my one-year, three-day a week AHRC research fellowship based in Policy Lab, a small team based within the Cabinet Office working with government departments. What I’ve discovered is that it is incredibly hard to make sense of this research while in the middle of it – at least in ways I want to share openly via a blog.
Now, as I write up some of my findings in various formats for sharing inside Policy Lab and within the policy profession, as well as with others outside government with an interest in design and policy experimentation, I am ready to do some blogging. But it’s the retrospective sense-making kind – a sort of short-form research lite – rather than near real-time reflections from the field which I had wanted to do. This is the first in a series of posts to chart what I now see as the main phases of the research.
Phase 0 – Discovery – September to December 2014
At the point of starting the fellowship I had very little knowledge of policy making or how government works. Whitehall and Westminster were parts of London I rarely set foot in. I was more likely to see them on the news than go there. I already knew some of the Policy Lab team – Andrea Siodmok, who leads it, for example, has been involved with strategic design and policy issues for over a decade through her work at the Design Council and the DOTT Cornwall project and our paths had crossed several times. I had briefly met Beatrice Andrews of Policy Lab and Maria Nyberg of the Open Policy Making team (OPM) through my research on Mapping Social Design for the AHRC.
When I started, Policy Lab had been set up in April 2014 and was now midway through its first demonstrator project with the Home Office on crime reporting. It existed in the form of a very small resource – some headcount (less than 3 people full-time), some budget, some desks to sit at – and what seemed like a constant stream of activities – workshops, discussions, meetings, writing up workshops, planning and drawing. Much of the discussion was about what Policy Lab could or should be and activities to bring it into being by trying things out. With blog posts on the Open Policy blog and a Twitter feed it was already visible to the world outside government, while inside it was just as much in formation in relation to Cabinet Office and departmental priorities. The election in May 2015 was already on people’s minds.
Much of my participant observation consisted of turning up in the morning, and following the team around to their various workshops and meetings. In particular I sat in on several “Lab Lights”, short taster sessions during which Policy Lab would work with policy makers from a government department enabling them to try out using design methods on their challenges. Mostly I listened and asked some questions as I tried to understand what was going on. I wrote lots of notes (by hand) and tentatively took some photos (see the note on ethics below). I did some interviews which I sometimes recorded by audio. I did a few literature searches and nosed around the web.
The practicalities of getting inside the building where Policy Lab is based hampered my research. Policy Lab is part of the Government Innovation Group (GIG) team in the Cabinet Office, based in the HM Treasury building at 1 Horseguards Road. As a visitor to the building your host has to come down and escort you through the barriers and stay with you throughout your visit, which does not suit day-long visits, several days a week, where you might want to make tea, go for lunch, pop out of the building, or go to the toilet.
Then in mid-December, my security clearance came through and I finally got given a security pass giving me access to many government buildings. Around the same time the team switched to using a calendar system that meant I could see their diaries (even as I write this I am wondering how open to be about locations and technology providers, and erring on the side of caution). This meant I could see what was planned each day – I didn’t have to ask when I arrived what was happening and where and if I could come along. I was also given access to the digital drive where Policy Lab’s documents were stored which I could now roam around, much as I could now roam around the building unescorted. We talked about whether I should get a cabinetoffice.gov.uk email address too to have access to the email discussions where much of their work happened. But to get this I’d have to get a secure laptop and use it and I was not keen.
Working out what research ethics meant in this context too was a challenge. The Policy Lab team and the slightly larger OPM team, who we sat next to and worked closely with, knew who I was and what I was doing. At meetings with others I made sure that I introduced myself as a researcher and explained that I was taking notes and might be using them. I always emphasised that I would not attribute anything to anyone named without asking them, which seemed to satisfy them, although even if I was not naming them or making them recognisable I was still researching them. Sometimes I took photos of workshops as did other members of the Policy Lab team and we said we might use these images publicly for Policy Lab – but where did Policy Lab end and my research begin? For my few formal interviews I brought a consent form for the participant to sign. But informed consent was tricky to work out for a long-term engagement when I was working consistently with some people, meeting many others each week, and had access to privileged information.
The blurred lines between research and my own strategic design practice emerged early on too. For example I helped facilitate one of the eight concurrent ideas days organised by Policy Lab for Northern Futures. For this I was paid as a facilitator on the same basis as the other facilitators. But unlike most of them, as a researcher I was involved in helping shape and make sense of the ideas days and what learning they offered to Policy Lab and the Cabinet Office. My growing access to this world had limits, of course. One of the people we worked closely with did not give me access to the summary of that event, a report that was going to be seen by a minister. A line emerged – things that were for ministers were not things I could see.
A brief trip to Washington DC for a data science and humanities workshop, to New York to visit GovLab, the Parsons DESIS Lab and Public Policy Lab, and another short trip with some of the Policy Lab and OPM team to MindLab in Denmark, gave me access to other practitioner and practice-academic hybrids also experimenting with design and policy. By the Christmas break I had a sense of what Policy Lab was trying to do, I knew something of the culture of the civil service and the emerging “policy profession” and the OPM agenda. I was enjoying being part of this all-women (at that point) team. And I was excited by the idea that academic research could help shape organisational practice in the field.
My guides at this point were Dan Neyland’s Organisational Ethnography, Bent Flyvbjerg’s Making Social Science Matter and Geertz’s The Interpretation of Cultures. Re-reading these while visiting a central government department on a regular basis helped me acknowledge the essential ambiguity inherent in participant observation. Just as Policy Lab was performing itself into being as a resource for the policy profession, so I too was performing myself into being as a practice-oriented researcher.
A sketch that pulls together bits of various literatures to try to make sense of why evaluating things like Policy Lab is hard (in the conventional terms of evaluation favoured by civil servants). It draws heavily on Roger Martin’s work in The Design of Business (2009) to connect the different logics required for innovation – abduction, deduction and induction – and combines this with other research in sociology and design.
Originally published on the Cabinet Office Open Policy blog on 27 March, this is my write up of an event I chaired during Open Policy 2015.
Increasingly on the agenda of policymakers is a need to understand the needs, capacities and perspectives of citizens, service users, beneficiaries and front line staff so that policies are fit for purpose and deliverable and public services are better designed. Ethnography is seen as one way to achieve this. Based on a methodology originally associated with anthropology, ethnographic approaches are now found within product development, innovation, strategy, marketing, and research and development in a wide range of organisations from Intel and Amazon to start-ups to central and local government. In the UK, the Government Digital Service (GDS) and government departments, as well as others such as Nesta and the King’s Fund have promoted ethnographically-informed approaches to doing user research.
But the value of ethnography is not simply that it’s a method for understanding people in the context of their own lives, although it does offer that. The real potential for ethnography in policymaking is to help reframe government’s understanding of its purposes and how the world in which it exists and which it shapes is changing. This insight emerged from a panel discussion organised by Policy Lab in the grand surroundings of the Churchill Room during Open Policy 2015, during which three people with different perspectives reflected on the opportunities and barriers for ethnography in government. An audience of over 50 people, the majority of whom were civil servants, gained a valuable overview from leading practitioners applying ethnographic approaches to contemporary organisational and social issues.
The first speaker, Dr Simon Roberts of Stripe Partners, set the scene, informed by his extensive knowledge of applied ethnography from consulting work and gained by twice co-chairing the international Ethnographic Praxis in Industry conference. He summarised what makes ethnographic research distinctive and illuminated the ongoing challenge of understanding its value and impact.
Next up was Lisa Rudnick from Interpeace, previously at the UN Institute for Disarmament Research, where she co-led the development of the Evidence-based Program Design tool and the Security Needs Assessment Protocol with Derek Miller, now of the Boston-based consultancy, The Policy Lab. Sharing perspectives from ethnographic research conducted with Ruth Edmonds in post-conflict contexts such as Nepal, she highlighted the importance of not just using the parts of ethnography that generate local descriptions, but of engaging ethnographic analysis of those descriptions as the basis from which to design, or implement, policy for shaping local action on the ground.
The third speaker, Rupert Gill, is a policymaker within the Department for Work and Pensions, currently using ethnography as well as data science approaches on a joint project with the Department of Health and Policy Lab. He shared some of the challenges civil servants face when trying to make use of ethnography in a policymaking culture which values particular kinds of argument and evidence.
What ethnographic research is
Today’s ethnographic research in organisations exists on a spectrum from hypothesis-free, exploratory research over several months into topics such as “ageing and mobility”, to targeted requirements gathering over a few days to inform the design of a service.
As a kind of qualitative research, ethnography investigates worldviews, socio-cultural structures and the practices that shape behaviours. It’s not just finding out what people think, listening to what they say or watching what they do.
Ethnographic research makes a commitment to being there with people in their worlds – which these days includes people’s digital lives. Its emblematic method since anthropologist Malinowski went to the Trobriand Islands a century ago is participant observation.
What ethnographic research produces
Ethnography is not just descriptive fieldwork. It’s a theory-building endeavour that makes use of research from across the social sciences. While the data might include stories about people’s lives, in their own language and categories, or observations about what people do, what is just as important is the interpretive analysis of that data. Or as Lisa Rudnick put it, “It’s not the story that matters for policymaking. The value is in what makes the story make sense.”
As a result, the output of ethnography is informed by people’s stories, and generates insights derived from people’s day to day experiences, but is better understood as an analysis of a social world within which people exist and have relationships with others including organisations, governments and places.
The opportunity for using ethnography for government
The value of ethnographic research is how it creates (re)framings of a social world and helps an organisation understand what it exists for. Reflecting on the impact on technology firms such as Intel, which have made extensive use of ethnography over nearly two decades, Simon Roberts argued, “Ethnography has created a space and a possibility for organisations to reshape their understandings of the world and their understandings of how they have those understandings.”
The opportunity is for government to address the complexity of society by understanding people better in the context of their lives, and then changing the focus of policy responses, especially when things are changing. Rupert Gill said there was an appetite for this within the civil service. “We hope to get insights we wouldn’t get elsewhere and use them to create interventions we wouldn’t otherwise have thought of.”
Barriers and challenges
The culture within which policy making takes place is dominated by the need to produce evidence that is statistically valid, and not “policy by anecdote”. The small sample sizes associated with ethnographic research may not be seen as valid in this context. Ministers who have to give an account to parliament about their policies feel more confident about analysis from large data sets. But there is a contradiction here, in that ministers also get first hand access to, and are influenced by, stories from their constituents – a kind of field data with very small sample sizes.
What’s needed is to combine quantitative data with other approaches, recognising what each brings. “It’s real depth that we need and we can’t get this from numbers,” argued Rupert Gill.
Within the UN and peacebuilding contexts in which she works, Lisa Rudnick shared how the approach she co-developed (with Derek Miller) makes managers accountable for the data they use (or don’t) to shape their decisions. Like the policy tests being used in the civil service, this involves asking managers considering a proposal if they have the right kind of information for the question at hand, enough information and whether it’s reliable. Posing such questions makes any commitment to action rest on research findings, not on data points or methods.
Policy Lab, GDS and government departments continue to explore ethnographic approaches in practical projects in policy making. If you are a civil servant, look out for guidance on the Open Policy Making Toolkit, Civil Service Learning short courses and Policy Lab workshops to try using the approach yourself.
This post discusses what agile approaches can bring to policymaking. It is based on my recent participant observation in a “policy sprint” organized by the Policy Lab in which I am embedded as an academic researcher. The sprint took the form of a two-day workshop held in Whitehall, followed the next day by a half-day stakeholder workshop, during the early phase of a larger, ongoing project involving Policy Lab, the Department for Work and Pensions (DWP) and the Department of Health (DH).
The focus of the project was better supporting people in relation to their health and employment outcomes. The activities brought together a co-located, cross-disciplinary and cross-government group of people, including policymakers and specialists skilled in ethnography, data science and design research, to explore a policy issue and work out collectively what research and design activities were needed to move the project on.
Inspired by and drawing on the “sprints” used in agile software development, this event brought a new way of working to policymaking. A recent blog post by Lisa Ollerhead of the Cabinet Office Open Policy Making team, drawing on her experience of GovCamp and adapting the principles of the agile manifesto, discussed what agile approaches might bring to policymaking. She concluded that one significant challenge is establishing what the deliverable or output of a sprint is.
As part of its remit to try out new tools and techniques for policymaking, Policy Lab decided to use an agile-inspired approach within one of its demonstrator projects to see how these principles could be adapted to the challenge of developing policies, rather than software. A blog post published right after the event by Cat Drew of Policy Lab, who led the sprint, summarised what happened in the January workshop and how it worked.
The purpose of this post is to analyse what the sprint did for the project. The data on which it draws are notes and photographs from my participant observation, emails and documents, as well as interviews and responses to a survey distributed to participants after the event.
Background – agile, lean, design thinking for policymaking
The concepts and activities used by Policy Lab to shape its policy sprint are not new. Within government, many of the principles and activities associated with agile software development inform the work of the UK Government Digital Service (GDS). For example, the GDS approach to designing government services that are “digital by default” is shared here, including a discussion of using sprints. Some of GDS’s terminology and approaches are also found in government departments, especially those that have in-house digital teams. The space that Policy Lab is exploring is bringing this approach to the early stage of making policy, not building digital services.
Internationally there are other examples of bringing design and digital approaches to policymaking, which share lineages with agile software development. The Helsinki Design Lab, part of the Finnish innovation agency Sitra, which was active between 2008 and 2011, organised “studios”: typically week-long collective inquiries into an issue such as ageing in Finland, the outputs of which included a roadmap of strategic improvements (see Steinberg 2014). These approaches are also closely related to design-led innovation. For example, innovation consultancy IDEO organises “deep dives” into issues, such as the one on the shopping cart filmed by ABC television in 1999, which involve multi-disciplinary teams collectively inquiring into a problem area, generating ideas and creating rough prototypes of potential solutions.
Related principles such as looking at the bigger picture and systems thinking are also found in other UK policy initiatives like the Total Place programme run by the Leadership Centre for Local Government. The Troubled Families programme takes a “whole person” perspective on families and their interactions with public services.
There are also overlaps with lean start-up, popularized by books by Steve Blank and Bob Dorf (2012) and Eric Ries (2011). Key concepts here include rapid experimental cycles in which entrepreneurs articulate their hypotheses, build software in chunks known as “minimum viable products”, test them with customers and then, depending on the results, “pivot” by switching to a different product or business model if necessary.
Within agile software development, a sprint is usually understood as a time-limited set of activities which results in delivering working software. Used more loosely in the context of policymaking, this policy sprint was an intense burst of activity focussed on the early stage of a project – called “discovery” in software development – which will be followed by other short, intense bursts of co-design and prototyping over the months to come.
Principles in the policy sprint
In the preparation, design and facilitation of its policy sprint, Policy Lab recombined principles borrowed from agile, design and lean start-up as follows:
Recognition that a project’s context cannot be fully understood or defined. Instead, the focus is on enabling a team to form and to work together to achieve something meaningful in a short time frame. Instead of trying to do a definitive analysis, the emphasis is on generating provisional accounts of an issue, and on an orientation towards repeated cycles of collective learning, idea generation, prototyping and synthesising.
Staging a collective enquiry into the issue. The sprint event recognised that there is already considerable data and analysis within the two departments involved and beyond. The design and facilitation of the workshop enabled a range of participants from different backgrounds to collectively identify “what we know” and “what we don’t know”, synthesising current knowledge and identifying gaps relevant to the project at hand – albeit without being fully comprehensive – in order to move the project’s goals forward.
An orientation towards cycles of practical work. The development of new products or software should be based on trying things out with real customers (lean start-up) or via cycles of iteration based on defining user needs, delivering software and testing it with users (agile). In lean start-up, these partial, temporary placeholders for solutions are termed minimum viable products: the minimum set of functionality that is of value to end users. Agile software developers use the terminology of alphas and betas, which suggests evolutionary development. The policy sprint was positioned as the start of a project structured with fast learning cycles.
A holistic approach to understanding the experiences of the target population and those around them. Lean start-up aims to identify customers’ needs and test hypotheses by engaging them directly. Agile starts by identifying user needs and builds software solutions for them. However, the context of policymaking is systemic: there is no single customer, nor should the focus be only on service users. Instead, a holistic agile approach for policymaking acknowledges not only citizens and service users, but also family members, other members of the public such as neighbours, front-line staff and managers in delivery agencies, volunteers, ministers and other actors in the wider political and policy environment. For example, in the case of this project, the sprint brought into focus the role of GPs, HR personnel and line managers, as well as family members, friends and colleagues, in shaping the experience of people becoming ill.
Less emphasis on producing definitive documentation, and more emphasis on co-articulating just-in-time knowledge. The emblematic artefact associated with design thinking and agile software is the post-it note. Along with whiteboards, flipcharts and other such low-fi organisational technologies, post-it notes emphasise the provisionality of ideas and knowledge because they are easy to write on, stick up, move around, edit or remove. Further, the collective activity of writing things down on post-its and sharing and clustering them on a wall makes collective knowledge and ideas visible to everyone – at least to those who are in the room and able to see. Collections of post-it notes stuck on to flipchart paper, which are then photographed and shared, offer repeated snapshots of the current status of a project. In contrast to the highly polished, formal reports produced by government, such arrangements of small fragments of ideas and knowledge highlight policy making as a work-in-progress.
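To make the iteration principle above concrete, the cycle of defining user needs, delivering a minimum viable increment and testing it can be sketched as a toy loop. This is a purely illustrative sketch: the function and the example “needs” are invented for this post and stand in for far messier real-world work.

```python
# A deliberately simple, illustrative sketch of the agile cycle described
# above: define user needs, deliver a minimum viable increment, test, and
# repeat. Nothing here models a real policy or software project.

def run_cycles(user_needs, max_cycles=10):
    """Deliver one minimum viable increment per cycle until needs are met."""
    product = []                           # starts empty; grows alpha -> beta
    for cycle in range(1, max_cycles + 1):
        gaps = [need for need in user_needs if need not in product]
        if not gaps:                       # "testing" shows all needs are met
            break
        product.append(gaps[0])            # deliver the smallest useful chunk
    return product

needs = ["capture data", "show summary", "export report"]
print(run_cycles(needs))   # each need delivered in turn, one per cycle
```

The point of the sketch is the shape of the loop, not the content: each pass delivers something small and usable, then checks against user needs before deciding what to do next, rather than attempting a definitive solution up front.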
Blank, S. and Dorf, B. 2012. The Startup Owner’s Manual: The Step-by-Step Guide for Building a Great Company, Volume 1.
Steinberg, M. 2014. Strategic Design and the Art of Public Sector Innovation. In Bason, C. (ed). Design for Policy.
Ries, E. 2011. The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses.