Artificial intelligence ‘has potential’ to relieve pressures on NHS workforce, study finds

Artificial intelligence ‘has the potential to relieve pressures on the NHS and its workforce,’ but ‘frontline healthcare staff will need bespoke and specialised support before they will confidently use it,’ a study has found. 

A new report, Understanding healthcare workers’ confidence in AI, published today by Health Education England and NHS AI Lab, recommends the development and deployment of ‘educational pathways and materials’ for all healthcare professionals ‘to equip the workforce to confidently evaluate, adopt and use AI’. 

HEE said the NHS ‘aims to be a world leader in the use of emerging technologies like AI’ to help ‘address the backlog in elective procedures and make a difference in helping to detect and manage conditions, such as cancer and cardiovascular disease, earlier.’ 

However, HEE said ‘the vast majority of clinicians are unfamiliar with AI technologies’ and ‘there is a risk that, without appropriate training and support, patients will not equally share in the benefits offered by AI.’ 

Brhmie Balaram, Head of AI Research and Ethics at the NHS AI Lab, warned that ‘we must also be mindful that AI could exacerbate cognitive biases when clinicians are making decisions about diagnosis or treatment.’ 

The report, informed in part by interviews with workers in primary care settings, calls for ‘specialised support’ for healthcare workers to ‘use AI safely and effectively as part of clinical reasoning and decision-making,’ with ‘clinicians supported through training and education to manage potential conflicts between their own intuition or views about a patient’s condition and the information or recommendations provided by an AI system.’ 

A potential hazard is also that ‘a clinician may accept an AI recommendation uncritically, potentially due to time pressure or under-confidence in the clinical task,’ HEE warned. 

Dr Hatim Abdulhussein, National Clinical Lead for AI and Digital Medical Workforce at HEE, said: ‘Understanding clinician confidence in AI is a vital step on the road to the introduction of technological systems that can benefit the delivery of healthcare in the future. 

‘Clinicians need to be assured that they can rely on these systems to perform to levels expected to make safe, ethical and effective clinical decisions in the best interests of their patients.’ 

Ms Balaram said: ‘AI has the potential to relieve pressures on the NHS and its workforce; yet, we must also be mindful that AI could exacerbate cognitive biases when clinicians are making decisions about diagnosis or treatment. It is imperative that the health and care workforce are adequately supported to safely and effectively use these technologies through training and education. 

‘However, the onus isn’t only on clinicians to upskill; it’s important the NHS can reassure the workforce that these systems can be trusted by ensuring we have a culture that supports staff to adopt innovative technologies, as well as appropriate regulation in place.’ 

The report argues that how AI is governed and rolled out in healthcare settings can affect the trustworthiness of these technologies and confidence in their use, including factors such as ‘clear nationally driven regulation and standards.’ 

A second report from this research, to be published later this year, will outline suggested pathways for related education and training. 

The report and partnership with HEE is part of the NHS AI Lab’s AI Ethics Initiative, which was introduced to support research and practical interventions that can strengthen the ethical adoption of AI in health and care. 

In 2020, scientists used artificial intelligence to discover a new antibiotic capable of killing treatment-resistant bacteria. 
