
AI researchers develop world-leading pain detection technology

University of Southern Queensland researchers advance a new artificial intelligence approach to pain assessment.

More often than not, doctors and nurses have to rely on patient self-reporting when making important pain management decisions, but a new artificial intelligence approach is now aiming to change that traditional practice.

Researchers from the University of Southern Queensland have developed what they claim is one of the world’s most accurate pain detection models.

The model automatically detects pain levels from facial expressions, and the researchers are now working to build it into an app for use as a real-time pain assessment tool.

Project leader Professor Jeffrey Soar said it would take a lot of the guesswork out of pain assessment and lead to improved health outcomes.

“Pain is relatively difficult for clinicians to identify and manage using a patient’s self-report, but it’s even more difficult when the patient has limited capacity to describe their pain,” Professor Soar said.

“This tool will allow clinicians like doctors or nurses to objectively evaluate the severity of pain before making effective pain management and treatment decisions.

“It will especially benefit those who are unable to describe their pain, like toddlers, people with dementia and patients in postoperative care or intensive care units.”

The research team, consisting of Dr Ghazal Bargshady, Professor Ravinesh Deo and Dr Xujuan Zhou, created a new machine learning system that can extract, classify and process key information from facial expressions captured on video frames.

In testing, the system achieved 92.44 per cent accuracy in detecting pain intensity, better than any previously published result worldwide.

“Facial recognition and pain detection technologies have evolved significantly over the past 15 years, but still suffer from many challenges like poor image quality and a lack of suitable datasets for algorithm design and testing,” Professor Deo said.

“We were able to overcome these challenges by adopting advanced deep learning neural networks in our system, which open up far greater opportunities to identify relatively complex features and patterns in clinical datasets, including human facial images.

“This has improved the efficiency and effectiveness of the model so that it can detect pain at four distinct levels. It’s a considerable improvement compared to existing state-of-the-art models, which can only detect whether the patient is in pain or not.”
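For readers curious what a four-level facial pain classifier might look like in code, the sketch below is purely illustrative: it uses an off-the-shelf PyTorch backbone and assumed pain labels (0–3), since the article does not detail the published architecture or training data.

```python
# Illustrative sketch only -- NOT the published USQ model.
# Assumes face crops from video frames are available as 224x224 RGB tensors
# and that pain intensity is labelled at four discrete levels (0-3).
import torch
import torch.nn as nn
from torchvision import models

NUM_PAIN_LEVELS = 4  # e.g. none / mild / moderate / severe (assumed labels)

class PainLevelClassifier(nn.Module):
    """Fine-tunes a pretrained CNN backbone to score pain at four levels."""
    def __init__(self):
        super().__init__()
        # A pretrained ResNet-18 stands in for the feature extractor;
        # the actual study may use a different deep network.
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, NUM_PAIN_LEVELS)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 3, 224, 224) face crops taken from video
        return self.backbone(frames)  # logits over the four pain levels

if __name__ == "__main__":
    model = PainLevelClassifier().eval()
    dummy_frame = torch.randn(1, 3, 224, 224)  # placeholder for a real face crop
    with torch.no_grad():
        level = model(dummy_frame).argmax(dim=1).item()
    print(f"Predicted pain level: {level} of {NUM_PAIN_LEVELS - 1}")
```

In practice such a classifier would be trained on labelled facial video datasets and run frame by frame, which is how a model of this kind could feed a real-time assessment app on a camera-equipped device.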

Professor Soar said the next step was to bring the findings into clinical practice.

“Working with our partners, we are looking to incorporate the AI model into an app that can be used on any device which has a built-in camera, such as a mobile phone, tablet or laptop,” he said.

“This will make it easier for anyone treating someone in their care to remotely access key information instantly and diagnose and treat patients faster and more accurately than before.

“We hope it will be ready for clinical use in the not-so-distant future.”

As well as funding from the Australian Research Council (Linkage Project), project support was received from a University of Southern Queensland International PhD Fees Scholarship (awarded to lead author Ghazal Bargshady, a University of Southern Queensland PhD student) and from healthcare ICT company Nexus eCare.

The research was published in the high-ranking Q1 journal Applied Soft Computing.

Pictured: Project leader, University of Southern Queensland Professor Jeffrey Soar, standing with a diagram.