Health workers need tech training – for themselves and their patients
There are huge risks involved if current and future healthcare professionals have to take up new forms of practice such as video consultations without sufficient training
The use of computing technologies in healthcare practice is becoming increasingly established in the UK and across the globe. For some time, NHS mental health workers have been able to prescribe access to self-help websites such as Fear Fighter, the NHS website now recommends a range of public apps, and treatment sessions are routinely conducted through video calls with clients, as described in this blog on providing speech and language therapies over the internet.
These technologies can benefit people who might otherwise struggle to access health services, such as those living in remote areas. They might also be a preferred option for people with social anxiety or physical disabilities, for whom attending a co-located treatment session can be difficult. They were also helpful during the pandemic, when physical access to healthcare services was limited. Barriers to using technology continue to decrease: more and more people own a smartphone or computer, and the capabilities of public networks are improving even in remote areas. It will not be long before many older people with dementia possess substantial computing skills developed over a lifetime.
Opportunities for technology-mediated treatment are developing rapidly, and we must be aware of the risks if healthcare professionals are asked to engage in new forms of practice without sufficient training or support.
For example, during the pandemic, many mental health professionals have been required to deliver treatment sessions via video calls, often for the first time. Some have observed suicide attempts without the immediate availability of the emergency response teams found in a hospital setting. Others have observed substance misuse during treatment sessions. Such situations are unavoidable and unpredictable, given the substantial distress that many people live with.
Optimistically, video-based treatment sessions might allow more people to be treated. However, even if protocols are developed to enable the best possible safety response, staff participating in video consultations might be regularly exposed to distressing situations, with less control than in face-to-face practice. Repeated exposure might contribute to secondary traumatisation, already a known danger for healthcare staff and a substantial risk to longevity of practice. More routinely, established clinical procedures for matching medication to health problems may not translate well to apps containing complex health-related content. A busy GP will often lack the personal knowledge to recommend an appropriate app for a specific health condition.
Technology seems unlikely to disappear from treatment processes, especially when health services are structurally underfunded when it comes to mental health treatment. As such, current and future healthcare workers need to be prepared to work with technologies, through practices that are as effective and safe as possible for both patients and staff. We need a substantial national effort to make sure that our current and future healthcare professionals are thoroughly prepared for using technology in their work. As providers of both pre-registration training and continuous professional development (CPD), universities will have a critical role to play in this.
Some of this work is underway already. My own department is innovating through virtual telehealth placements, in which students from a range of professional groups are supported through the process of setting up and delivering an online consultation with volunteers acting as simulated patients, giving those students knowledge and experience to guide their future practice.
For CPD, the design and content of training courses will need to take into account variations in technological literacy and attitudes among current staff. In a focus group study, we found some healthcare professionals expressed doubt about whether people experiencing mental health problems could make use of a new technology. Whether accurate or inaccurate, attitudes about technology will shape how it is used in practice, and university-based education and training can be an important mechanism for enabling attitudinal change that is grounded in high-quality research evidence.
Universities should approach curriculum evolution by developing and maintaining a deep understanding of how technology is being used in health services. Clinical academics could enable this by acting as a bridge between health systems and teaching. It could include the collection of clinical case studies on a substantial scale, in which a broad range of working practitioners describe their use of technology with specific patients, building on the long-standing use of clinical case studies in healthcare education. As well as informing curriculum development, a database of rich and informative case studies could be used as a teaching aid. It might even be shared between educational providers to maximise scope and minimise delivery cost.
To become embedded, knowledge developed and sustained by universities about technology-mediated practice needs to feed into curriculum review work. At the highest level, professional bodies should regularly assess their standards for university-provided education and training to ensure that these contain sufficient content on technology-mediated practice. Universities should review course structures and module catalogues to identify required changes and to select specific touchpoints where change can be delivered.
Given that technology-mediated practice will bring forms of emotional labour different from those of face-to-face practice, supporting the personal well-being of healthcare professionals, and hence enabling longevity of practice, should be an important focus of curriculum development. This might emphasise content on reflective practice, to enable people to process and learn from unexpected or challenging treatment experiences.
Where change is enabled by innovations in university teaching, we might complete the cycle of knowledge production by commissioning longitudinal research studies that evaluate the impact of innovations, and which draw on university-based competence in health research. This could include large-scale quantitative studies to collect evidence on attitudinal change, along with qualitative studies to document the mechanisms through which change occurs. In turn, research-led investigations could feed back into and inform pedagogy around technology.
If we get university-based training for current and future healthcare staff right, over time our health service will develop ever greater competence in knowing when technology-mediated practice might benefit patients – and when it might not. These efforts might produce substantial patient benefit, enhancing the health and well-being of the nation as well as opening up crucial services to those who are currently excluded.
Stefan Rennick-Egglestone is senior research fellow in the School of Health Sciences at the University of Nottingham. This article presents his personal views, and he would like to thank Ada Hui for fact-checking it before publication.
If you found this interesting and want advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.