13 December, 2023 08:11 AM IST | Sydney | IANS
Australian researchers have developed a portable, non-invasive system that can decode silent thoughts and turn them into text using artificial intelligence (AI).
The technology could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis.
It could also enable seamless communication between humans and machines, such as the operation of a bionic arm or robot.
Researchers from the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS), who developed the system, have described it as a "world first".
In the study, participants silently read passages of text while wearing a cap that recorded electrical brain activity through their scalp using an electroencephalogram (EEG).
An AI model called DeWave, developed by the researchers, segments the EEG signal into distinct units that capture specific characteristics and patterns of brain activity.
DeWave translates EEG signals into words and sentences by learning from large quantities of EEG data.
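To make that pipeline concrete, the following is a minimal, hypothetical sketch (in PyTorch) of the general approach the article describes: an EEG recording is sliced into windows, each window is mapped to a discrete unit from a learned codebook (a form of vector quantization), and the resulting code sequence would then be decoded into words by a language model. The module names, dimensions, and quantization scheme below are illustrative assumptions, not DeWave's published architecture.

import torch
import torch.nn as nn

class EEGDiscretizer(nn.Module):
    """Encodes fixed-length EEG windows and maps them to discrete codebook entries."""
    def __init__(self, n_channels=64, window_len=128, n_codes=512, dim=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(start_dim=2),                   # (B, T, C, W) -> (B, T, C*W)
            nn.Linear(n_channels * window_len, dim),
            nn.GELU(),
            nn.Linear(dim, dim),
        )
        self.codebook = nn.Embedding(n_codes, dim)     # learned discrete units

    def forward(self, eeg_windows):
        # eeg_windows: (batch, n_windows, n_channels, window_len)
        z = self.encoder(eeg_windows)                  # (B, T, dim)
        cb = self.codebook.weight.unsqueeze(0).expand(z.size(0), -1, -1)
        dists = torch.cdist(z, cb)                     # distance to every codebook entry
        return dists.argmin(dim=-1)                    # (B, T) discrete unit IDs

# Toy usage: 2 recordings, 10 windows each, 64 EEG channels, 128 samples per window.
eeg = torch.randn(2, 10, 64, 128)
codes = EEGDiscretizer()(eeg)
print(codes.shape)  # torch.Size([2, 10]) -- one discrete unit per EEG window
# In a full system, these code sequences would be decoded into sentences by a
# language model trained on paired EEG/text data.

The values printed here are random, since the model is untrained; the sketch only illustrates how continuous brain signals can be reduced to a sequence of discrete tokens that a language model can work with.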
"This research represents a pioneering effort in translating raw EEG waves directly into language, marking a significant breakthrough in the field," said Professor C.T. Lin, Director of the GrapheneX-UTS HAI Centre.
"It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding. The integration with large language models is also opening new frontiers in neuroscience and AI," he said.
Previous technology to translate brain signals to language has either required surgery to implant electrodes in the brain, such as Elon Musk's Neuralink, or scanning in an MRI machine, which is large, expensive, and difficult to use in daily life.
These methods also struggle to transform brain signals into word-level segments without additional aids such as eye-tracking, which restricts the practical application of these systems.
The new technology can be used with or without eye-tracking. The research was carried out with 29 participants, which means it is likely to be more robust and adaptable than previous decoding technology tested on only one or two individuals, because EEG waves differ between individuals.
The use of EEG signals received through a cap, rather than from electrodes implanted in the brain, means that the signal is noisier.
The translation accuracy score is currently around 40 per cent, which the team hopes to improve to 90 per cent. The study was selected as the spotlight paper at the NeurIPS conference, a top-tier annual meeting that showcases world-leading research on artificial intelligence and machine learning, held in New Orleans, US.