Sophie Park, Clinical Senior Lecturer / Senior Academic General Practitioner, UCL
We all have transformational moments in our careers, developing new ways of understanding the world around us. I have always enjoyed the Guardian April Fool spoof as a way of sharing with students how any theoretical lens shapes what is visible (or not) to us as we experience the world. In research, the lens we use for sense-making is crucial to the production of coherent, good-quality research (Dowling and Brown, 2010). There are some important questions to consider when planning and producing high-quality research:
- What is a problem? The process of problematising a particular process or situation as worthy of critical examination is itself shaped by the theoretical lens (or ‘goggles’) used. The lens we use to view the world influences how we treat particular events as ‘normal’, ‘abnormal’, ‘acceptable’ or ‘unacceptable’. Calling something a researchable problem involves ‘making the familiar strange’ and suggests that new knowledge about this ‘problem’ might be useful.
- What research questions are useful or interesting? Once a broad problem has been identified, a theoretical lens will shape how a specific set of questions is constructed: for example, which elements of the problem are treated as ‘interesting’ or ‘relevant’ to examine.
- What methods are chosen to examine the problem? The ‘goggles’ used by a researcher determine which methodological processes are treated as legitimate, useful and acceptable. Certain ways of interacting with or manoeuvring data are acceptable through one pair of goggles and not another: for example, how the process of interpretation is understood, or how data are understood to be produced.
- What analysis is used? Ways of ‘playing’ or ‘working’ with data are closely linked to the methods used. In many cases, certain ‘goggles’ determine what is considered an acceptable or relevant way to analyse data, and hence which analytical approach is chosen.
- What counts as results? What is valued as legitimate knowledge to produce from ‘research’ varies enormously and is again closely linked to the ‘goggles’ used and how these privilege particular knowledge as valuable (or not).
- What is the purpose of the research? The ‘impact’ or ‘implementation’ of findings will vary enormously depending upon the goggles used and the related purpose of the research. Some theoretical lenses, for example, focus on furthering theoretical or critical understanding, while others seek to produce operational findings explicitly relevant to a specific group or setting.
There are many approaches now used to synthesize evidence. The principles above are just as relevant to working with secondary data as with primary data. In our BEME review, for example, we used an ‘aggregative’ approach (Gough et al., 2012) to examine the effectiveness of placements as ‘an intervention’, but used a different approach, meta-ethnography (Noblit and Hare, 1988), to examine the underpinning concepts and theories (Park et al., 2015). Each approach required us to use a different pair of ‘goggles’ and, as a result, directed our findings.
The Evidence Synthesis Working Group is a new collaboration of researchers funded by the NIHR School of Primary Care Research. One of the workstreams aims to examine elements of primary care work organisation. The nature of the problems required us to think about which type of ‘goggles’ we would need, and realist ‘goggles’ were considered the most appropriate for a number of the review topics and questions.
So, this winter, I set off to Oxford to attend the Oxford Realist Review and Realist Evaluation Course. Here I share some insights from attending this course, outlining some of the transformational shifts made possible for me through this new set of ‘goggles’. What is attractive about this approach is that it legitimizes interpretation, while making the production of knowledge (or ‘findings’) explicit to the reader.
- What is a problem? With realist ‘goggles’, something can be problematised as a ‘complex intervention’. This provides a way to examine the complexities of the relationship between a process and the context in which that process occurs. This lens, therefore, treats as interesting or curious the ‘causal mechanisms’ by which particular events result in particular outcomes.
- This approach enables the researcher to ask a number of questions aiming to produce knowledge about the nature and impacts of ‘causality’. For example, rather than asking whether something can work, questions focus on ‘Why does it work?’, ‘How?’, ‘For whom?’, ‘To what extent?’ and ‘In what circumstances?’. No one project is likely to answer all of these, but will focus on the particular questions relevant to the aims of the researcher(s).
- The method or process used is evolving, but has some fundamental principles in order to make the steps of interpretation explicit. First, the researcher(s) aim to articulate their starting point or initial ‘hunch’. They write down their initial causal explanations for how a complex intervention might be working (how outcomes have been caused). This is called the initial programme theory and will aim to include at least one ‘context’, ‘mechanism’ and ‘outcome’ configuration, based upon the researcher(s)’ initial knowledge (maybe from reading or experience) of the project area.
- The analysis is iterative, meaning that it evolves in response to the research questions and the emerging data. The researcher(s) will begin to collect sources of data to refine, refute or confirm the initial programme theory about a possible causal explanation. They will then start a process of detective work to build up a set of ‘data’. The researcher(s) will decide whether each piece of data falls within a ‘context’, ‘mechanism’ or ‘outcome’ category. They will then start to plot the relationships between these (the configurations), gradually building up an overall picture from these multiple pieces of the jigsaw.
- The results or knowledge produced are contingent upon what data are available, and how the researcher(s) begin to categorise these data into particular parts of a programme theory. A project might, therefore, be helpful in identifying both existing and absent data contributing to a causal explanation: one important outcome of realist analysis can be to inform recommendations about future research where existing knowledge is lacking. Insights (made visible to the researcher) become possible through the process of bringing together or ‘configuring’ data in relation to other data about the same intervention. Thus, relationships refuting or confirming particular causal explanations can begin to be developed.
- The purpose of research through these goggles is to produce insights about contingent causal explanations for particular processes or interventions: in other words, explanations of when desired outcomes are likely to occur, in which contexts and for whom. It enables the process or intervention to be considered not as a discrete, isolated event, but as contingent upon the context in which it takes place. This approach is, therefore, becoming increasingly popular among practitioners (for example, in healthcare settings) where the subtleties of when, where and for whom an intervention might produce a desirable outcome (or not) are extremely relevant. The process can only say as much as the available data allow, and so may also be useful in identifying particular areas which require further research.
We all categorise our experiences all the time. Sometimes we will be explicitly aware of a theoretical lens or ‘goggles’ shaping the way in which we engage with that process; at other times not. As with any new process, the initial learning can be challenging: for example, trying to determine what ‘counts’ as a ‘context’, ‘mechanism’ or ‘outcome’.
As with many research approaches, there is often no one correct answer, and it is through continued discussion and dialogue between the data, research questions and critical colleagues that clarity begins to emerge about possible categories and configurations. The process is inevitably constrained by available resources, framing the feasibility, scope and focus of any particular realist project.
Ray Pawson’s ‘The Science of Evaluation: A Realist Manifesto’ offers an interesting read about different aspects of the realist approach. The RAMESES website (www.ramesesproject.org) also offers some useful resources and networking opportunities for understanding the particular ways in which the approach has been used to date. Now the challenge is to put theory into practice!
Acknowledgements: Many thanks to Geoff Wong and Carl Heneghan for their helpful comments on a blog draft.
Conflict of Interest: SP and GW receive funding from the NIHR SPCR Evidence Synthesis Working Group. GW also coordinates a Realist Review module at the University of Oxford.
Biography:
Sophie Park is a GP and Director of UG Medical Education (Community and Primary Care) and Head of Teaching, Primary Care and Population Health, UCL Medical School. Sophie is Chair of the SAPC Education Research Group. She is a collaborator with the NIHR-funded ESWG, including roles as PI and Co-I for work-stream 4 reviews and Co-Chair of the Capacity Building Group.
Bibliography
DOWLING, P., AND BROWN, A. 2010. Doing Research / Reading Research: Re-interrogating Education, Oxon, Routledge.
GOUGH, D., OLIVER, S., AND THOMAS, J. 2012. An Introduction to Systematic Reviews, London, SAGE.
NOBLIT, G. W., AND HARE, R.D. 1988. Meta-ethnography: synthesizing qualitative studies, Newbury Park, Sage.
PARK, S., KHAN, N., HAMPSHIRE, M., KNOX, R., MALPASS, A., THOMAS, J., ANAGNOSTELIS, B., NEWMAN, M., BOWER, P., ROSENTHAL, J., MURRAY, E., ILIFFE, S., HENEGHAN, C., BAND, A., AND GEORGIEVA, Z. 2015. A BEME systematic review of UK undergraduate medical education in the general practice setting: BEME Guide No. 32. Medical Teacher, 37, 611-630.