This is a story of how open-mindedness transformed a study, opening up a new area of research. It’s about the different demands of a field that isn’t the one you trained in. And it’s about how we can ensure we’re safe and effective in helping others.
Imagine you are a medical student. You are likely to practise communication skills with simulated patients or actors before consulting with real patients; you will be handed a pig gut to stitch before being allowed to carry out an operation under supervision; you will carry out a rectal examination on a mannequin before getting your hands on the real thing. This is simulation-based education (SBE). SBE is a means of allowing healthcare students and postgraduate trainees deliberate hands-on practice of clinical skills and behaviours prior to, and alongside, entry into clinical environments. The aim is to develop safe clinicians by creating alternative situations and settings in which to learn skills and behaviours.
Simulation is required in healthcare education for a number of reasons. The traditional method of training our healthcare professionals – unstructured clinical experience – was shown to be educationally ineffective. Much healthcare education and training has therefore shifted to a competency- or outcomes-based model of teaching and learning, where objectives and outcomes, assessment and feedback, practice and supervision became the norm. Concurrently, reduced availability of patients for teaching and learning healthcare, due to changes in healthcare delivery as well as increased emphasis on protecting patients from unnecessary harm, has placed limits on the nature of patient contact (particularly for relatively inexperienced learners). Last but not least, in many countries, including the UK and the rest of Europe, hours of training have now been strictly controlled by working time legislation, leading to increased interest in alternative ways of learning. SBE addresses all of these issues – decreasing reliance on training on real patients, allowing for instant feedback for correction of errors and for directing learning, optimising use of valuable clinical time, enhancing the transfer of theoretical knowledge into the clinical context, and ensuring learners are competent before exposure to real patients.
SBE has a long history in psychology teaching. The classic studies of Milgram and Zimbardo involved simulated environments. Undergraduate psychology teaching uses simulation in the form of case studies, role playing and interviewing. Yet the relatively limited amount of research into this topic in psychology indicates that SBE is either underused or under-researched. For example, a 2007 review by Flanagan and colleagues of 458 articles on simulation in healthcare found 95 per cent used simulation in medical populations but extremely limited use in allied health groups, which encompassed clinical psychology.
The vast field of SBE research in medical education shows a positive relationship between SBE and learning outcomes, including the development of technical (e.g. inserting a cannula) and non-technical skills (e.g. communication, team working), learner confidence and, critically, patient outcomes and patient care practices (see, for example, a 2004 review led by William McGaghie). However, while it is absolutely crucial to know what works in SBE, limiting the focus of research to outcome and effectiveness studies means understanding of SBE remains limited. We need to extend the range of approaches to researching this field – if simulated training is based on limited models of learning, it risks inadequately preparing medical and other healthcare trainees for practice. With this in mind, my story focuses on my involvement in SBE.
A cold call
One day, I received a call from a surgical consultant based in Inverness, Ken Walker. Ken and colleagues had designed the ‘Highland Surgical Boot Camp’, an intensive, four-day simulation experience using experiential learning and hands-on practice to learn new skills and knowledge in a safe environment, based on the principles of a military boot camp. Highland Surgical Boot Camp was the first of its kind in the UK, aimed at new surgical trainees, and designed to accelerate learners’ transition from the Foundation Programme (the first two years of generic training after medical school) into the surgical training pathway. The educational content included simulation-rich training in non-technical, communication and operative surgical skills, including a simulated ward round, letter-writing sessions and role-play of difficult consultations. ‘Memorable case’ narrative sessions were designed to recreate the coffee-room discussions of the apprenticeship model. Formal social events were incorporated into the programme, and informal socialisation among learners was encouraged.
Ken and his colleagues worked hard to establish this innovative programme, and after a couple of successful Boot Camps, had time to catch their breath and start thinking about evaluating Boot Camp. Ken had been given my name as an ‘expert’ in all things medical education research, a description that is flattering, if untrue. He asked if I would be interested in planning this evaluation and applying for a research grant to do so.
We wrote a funding bid that focused on individual learning processes. Specifically, and drawing on the work of Tony Artino and his colleagues, our project proposed a theory-driven approach, self-regulation learning-microanalytic assessment and training (SRL-MAT), to assess how participants generated and used feedback about their learning to optimise their strategic pursuit of personal goals during Boot Camp activities. We were successful in a competitive funding process, and obtained a grant to support this work from the Clinical Skills Managed Educational Network.
And so the study commenced. Our first task was obtaining ethical approval for activities including: pre-camp telephone interviews with the participants (to assess motivations for signing up, and paying for Boot Camp, as well as very specific questions to do with self-regulated learning), questionnaire completion, observation and questioning during Boot Camp tasks and social sessions. The Faculty were keen that I observe all sessions, to generate new ideas for research and to give them feedback on the educational approaches they had designed and implemented. They talked freely and enthusiastically about why Boot Camp was important and innovative, and how it fitted with wider surgical training. The participants, who I had spoken to in advance on the telephone, talked to me and between themselves over coffees, during breaks, etc. I also attended various social events, although I politely declined the offer of tubing on a local river. In winter. In Scotland. It was a fascinating, fun few days.
On analysing the data we identified individual variation in SRL ability but no clear relationship between SRL ability and self-efficacy. We found that poorer learners used outcome measures to judge their performance, whereas good learners used process measures (in keeping with previous studies). Weak and good learners were identified very quickly by Faculty, in line with the SRL-MAT data. This was fascinating stuff. Yet at the same time, something was at the back of my mind – that there was more to Boot Camp than met the eye.
I started to question what we should be evaluating. It was dawning on me that my assumptions about surgical education, my relative lack of expertise in SBE and my excitement about having an opportunity to use self-regulation theory and measurement, had led me to look at Boot Camp solely in terms of a means of individual, cognitive and acquisitive learning. Of course, this is an essential perspective on SBE research, but is it the only one?
I examined the interview data afresh, and reread the notes I had taken throughout the Boot Camp. I was also editing some chapters sent in by colleagues for inclusion in my book Researching Medical Education. These chapters referred to various theories used in exploring and understanding the influence of people and context on learning, according to Vygotsky’s notion that learning is a socially constructed process.
The penny dropped: although designed to accelerate individual learning and skills acquisition, Boot Camp was inherently a social activity, bringing together groups of trainees/residents (the learners) and Faculty, in a residential situation away from the everyday clinical environment. By recognising this explicitly, we could start to understand how the relationships between Faculty, participants and activities during Boot Camp influenced learning, including the nature and influence of what Frederic Hafferty and Ronald Franks have called the ‘hidden curriculum’. We needed to grasp the cultural context, the wider sociocultural, institutional and historical settings in which Highland Surgical Boot Camp was situated.
After communicating this new approach to our funder, who generously continued to support the work, and ensuring necessary ethical amendments, we ran a parallel study. Our stance was that learning at Boot Camp was participative rather than merely acquisitive, that environment, rules, tools and social relations are important to learning and knowing, and there are different valid perspectives on reality. We used the theoretical resources of Bourdieu and Engeström nested within an overarching framework of complexity theory to help us tease out some of the key elements via ethnographic observation and interviews. These theories helped us make sense of a lot of data, and provided contrasting perspectives on Boot Camp.
We found very powerful messages of ‘welcome to our world’ (the world of surgery) sent by a number of formal and informal activities. And we identified why trainees signed up to Boot Camp – not just as a means of gaining skills and knowledge but also ‘insider information’ on how best to progress in surgical training and assessment. Participants gained cultural capital in the form of learning what knowledge, skills and values were needed to succeed in the surgical training system. They also acquired social capital in terms of extending their networks of influence and support. We teased out the complexity, and influence, of the surgical training context on the nature and success of Boot Camp. We were able to explain how Boot Camp nested within a myriad of systems (including Royal Colleges, the NHS, the General Medical Council) and to survive and thrive had to adapt in response to the multiple voices of the various organisations as well as those of surgical trainees. Then there was a context of many competing demands on trainees’ and residents’ time and money, where trainees tended to be very strategic in what training they attended.
Rather than assuming that simulation occurs in a rarefied, predictable atmosphere, where there are no confounding variables or unpredictability, this was the first empirical study of a surgical Boot Camp to make explicit the social and cultural factors that were likely to influence learning in its broadest sense. The ethnographic data highlighted the explicit and hidden curricula of Boot Camp, of enculturation and socialisation into surgical training, and a way of participants gaining social capital in relation to both progressing in training and seeing what life was like for consultant surgeons.
The paper from this study was published open access in early August 2016, and won the inaugural Copenhagen Academy for Medical Education and Simulation (CAMES) prize for innovation in simulation research later that month. It is already stimulating discussion and thinking in the field and, I hope, will lead to SBE architects acknowledging and addressing the social and cultural aspects of learning when planning similar enterprises across healthcare education.
Back to psychology
As someone with a broad training in psychology, including clinical psychology, and having worked in this subspecialty for many years, I cannot help but muse on the differences between SBE in medical education and applied psychology. Things do not seem to me to have moved on in psychology in the decade or so since Flanagan et al.’s paper: a quick literature search identified that there remain few studies of simulation education in this area and calls for more of it – for example from Pieter Nel in 2010 – seem to have been largely ignored.
Yet, like medical education and training, psychology is now positioned within an outcomes-based pedagogic model where trainees are required to achieve relevant standards of proficiency (overarching competencies). So why, in the British Psychological Society’s Standards for Doctoral Programmes in Clinical Psychology, is simulation mentioned only once, in relation to supervisor or programme staff observation and assessment of clinical skills in simulated situations (e.g. role plays involving service users, colleagues or actors)? Surely it is just as important for clinical psychology trainees as it is for surgical trainees to be assessed as baseline competent before working with patients? This is not just about courses teaching empirically supported therapies but also about ensuring trainees are competent in a range of technical (therapeutic) and non-technical skills before being allowed to work first under supervision, and then independently.
My impression is that the barriers to simulation that were present in medical education 10–20 years ago are probably pertinent to psychology now. Inevitably there remains a lack of evidence for the effectiveness of SBE in applied psychology, but we could draw on the multitude of work on simulation in clinical communication skills training in medicine to move forward, particularly work of this nature carried out in psychiatric settings (see Cleland et al., 2009). Any innovations must of course be accompanied by robust evaluation using different levels of assessment, from learner reactions through Kirkpatrick’s hierarchy, to patient outcomes. The parameters for effective SBE are certainly transferable from medicine to clinical psychology (e.g. repetitive practice, feedback, clinical variation, increasing difficulty: see Issenberg et al., 2005).
The next barrier is a lack of expert faculty, and facilities. Yet there are centres that seem to have embraced this approach to learning within psychology and thus could extend their remit to training the trainers and operationalising SBE. As with Boot Camp, perhaps the best place for simulation in psychology is early in training, prior to exposure to real patients. Simulation is about skills learning: surely it is better to have trainees practise with simulated patients until they can demonstrate basic professional competencies?
Change is never easy, but the irony is that many people developing and evaluating SBE in medical education are psychologists. Embracing SBE and other contemporary pedagogic models will open up new areas of educational practice and research in psychology, and allow us to gather evidence on which to base educational decisions.
- Jennifer Cleland is Professor (John Simpson Chair) of Medical Education Research at the University of Aberdeen, a Chartered Psychologist and Associate Fellow of the BPS
Aggarwal, R., Mytton, O.T., Derbrew, M., et al. (2010). Training and simulation for patient safety. Quality and Safety in Health Care, 19, i34–i43.
Artino, A.R., Brydges, R. & Gruppen, L.D. (2015). Self-regulated learning in medical education: Theoretical perspectives and research methods. In J. Cleland & S.J. Durning (Eds.) Researching medical education (pp.155–166). London: Wiley-Blackwell.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191–215.
Bourdieu, P. (1986). The forms of capital. In J.G. Richardson (Ed.) Handbook of theory and research for the sociology of education (pp.241–258). New York: Greenwood Press.
Cleland, J.A., Abe, K. & Rethans, J.J. (2009). The use of simulated patients in medical education: AMEE Guide No 42. Medical Teacher, 31, 477–486.
Cleland, J. & Durning, S.J. (Eds.) (2015). Researching medical education. London: Wiley-Blackwell.
Cleland, J.A., Walker, K., Gale, M. & Nicol, L.J. (2016). Simulation-based education: Understanding the complexity of a surgical training ‘Boot Camp’. Medical Education, 50, 829–841.
Engeström, Y. (2001). Expansive learning at work: Towards an activity theoretical reconceptualization. Journal of Education and Work, 14, 133–156.
Flanagan, B., Clavisi, O. & Nestel, D. (2007). Efficacy and effectiveness of simulation based training for learning and assessment in health care. In Clinical Skills and Simulation. Victorian Government Health Information, Melbourne.
Hafferty, F.W. & Franks, R. (1994). The hidden curriculum, ethics teaching, and the structure of medical education. Academic Medicine, 69, 861–871.
Issenberg, S.B., McGaghie, W.C., Petrusa, E.R. et al. (2005). Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher, 27, 10–28.
Kirkpatrick, D.L. & Kirkpatrick, J.D. (1994). Evaluating training programs. Oakland, CA: Berrett-Koehler.
McGaghie, W.C., Issenberg, S., Barsuk, J.H. & Wayne, D.B. (2014). A critical review of simulation-based mastery learning with translational outcomes. Medical Education, 48, 375–385.
Nardi, B. (1996). Studying context: A comparison of activity theory, situated action models, and distributed cognition. In B.A. Nardi (Ed.) Context and consciousness: Activity theory and human–computer interaction (pp.69–102). Cambridge, MA: MIT Press.
Nel, P.W. (2010). The use of an advanced simulation training facility to enhance clinical psychology trainees’ learning experiences. Psychology Learning and Teaching, 9, 65–72.
Vygotsky, L.S. (1978). Mind in society. Cambridge, MA: Harvard University Press.