Advances in AI and robotics are bringing the sense of touch in prosthetics and robots a step closer to human speed. A recent study conducted at Uppsala University and Karolinska Institutet points to the possibility of prosthetic hands and robots perceiving touch much as a human hand does. The findings also shed light on the technology's potential for restoring lost function in patients recovering from a stroke.
The study, published in the journal Science, represents a significant advance in the field of AI. "Our system can identify what kind of object it encounters in roughly the same time as a blindfolded person. It can tell whether the object in question is, say, a tennis ball or an apple just by feeling it," says Zhibin Zhang, docent at the Department of Electrical Engineering at Uppsala University.
The project is the result of close collaboration between Zhibin Zhang, his colleague Libo Chen, and fellow researchers at the Signals and Systems Division at Uppsala University, who contributed their expertise in data processing and machine learning. A group of researchers from the Department of Neurobiology, Care Sciences and Society, Division of Neurogeriatrics at Karolinska Institutet also played a vital role in the study.
The artificial tactile system developed for the project was inspired by neuroscience: it replicates how the human nervous system responds to touch, using electrical pulses to process dynamic tactile information in much the same way. "With this technology, a prosthetic hand would feel like part of the wearer's body," explains Zhang.
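The article does not spell out the encoding scheme, but the general principle of converting analogue pressure into trains of electrical pulses can be illustrated with a minimal sketch. The leaky integrate-and-fire model below is one common way such neuron-like encoders are modelled; the parameter values (`tau`, `gain`, `threshold`) and the synthetic pressure trace are illustrative assumptions, not the study's actual implementation.

```python
import numpy as np

def encode_pressure_to_spikes(pressure, dt=1e-3, tau=0.02,
                              gain=100.0, threshold=1.0):
    """Encode an analogue pressure trace into a binary spike train
    using a leaky integrate-and-fire neuron (illustrative only)."""
    v = 0.0                                   # membrane-like state variable
    spikes = np.zeros_like(pressure, dtype=bool)
    for i, p in enumerate(pressure):
        v += dt * (-v / tau + gain * p)       # leak toward rest, integrate input
        if v >= threshold:                    # threshold crossing emits a pulse
            spikes[i] = True
            v = 0.0                           # reset after each pulse
    return spikes

# Example: a brief synthetic press produces a burst of pulses whose
# rate tracks the applied pressure.
t = np.arange(0, 0.5, 1e-3)
pressure = np.clip(np.sin(2 * np.pi * 2 * t), 0, None)
spikes = encode_pressure_to_spikes(pressure)
print(f"{spikes.sum()} pulses in {t[-1]:.1f} s")
```

Stronger pressure drives the state variable to threshold more often, so the pulse rate follows the stimulus over time, which is the dynamic, nervous-system-like behaviour the researchers describe.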
The artificial system comprises three core components: an electronic skin (e-skin) equipped with sensors that register pressure on touch; a set of artificial neurons that convert the analogue touch signals into electrical pulses; and a processor that analyses these pulses to identify objects. In theory, the system can learn to recognise an unlimited number of objects; in their tests, the researchers used 22 different objects for grasping and 16 different surfaces to test the sense of touch.
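The article does not describe how the third component maps pulse patterns to object identities. As a rough sketch of one plausible approach, the example below assumes per-sensor spike counts serve as feature vectors and uses a simple nearest-centroid decision; the sensor count, object names, and data are all synthetic stand-ins, not the study's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: an e-skin patch with 16 pressure sensors, where each
# object produces a characteristic pattern of per-sensor spike counts.
N_SENSORS, N_TRAIN = 16, 40
OBJECTS = ["tennis ball", "apple", "mug"]

def simulate_spike_counts(prototype, n):
    """Draw noisy spike-count vectors around an object's touch 'signature'."""
    return rng.poisson(prototype, size=(n, N_SENSORS))

# One firing-rate prototype per object (synthetic stand-ins).
prototypes = {name: rng.uniform(2, 20, N_SENSORS) for name in OBJECTS}

# "Train": store the mean spike-count vector (centroid) per object.
centroids = {name: simulate_spike_counts(p, N_TRAIN).mean(axis=0)
             for name, p in prototypes.items()}

def classify(counts):
    """Assign a touch sample to the object with the nearest centroid."""
    return min(centroids,
               key=lambda name: np.linalg.norm(counts - centroids[name]))

# Probe: a fresh touch of an "apple" should land on the apple centroid.
probe = simulate_spike_counts(prototypes["apple"], 1)[0]
print("identified as:", classify(probe))
```

Because such a classifier only needs a stored signature per object, adding a new object means recording its touch pattern, which is consistent with the claim that the number of learnable objects is in principle unlimited.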
The researchers are also developing the system further. According to Assistant Professor Libo Chen, who led the study, the aim is to add sensitivity to pain and heat, and to enable the system to distinguish between the materials it touches, such as wood and metal.
The technology is expected to make interactions between humans and robots or prosthetic hands safer and more natural, with handling objects becoming as simple and dexterous as using a human hand.
"Our aim is to provide a complete artificial skin for a robot." Chen shares, "The skin contains millions of receptors. While current e-skin technology cannot deliver as many receptors as a human skin, our technology raises the bar, making it a real possibility.
Beyond improving human interaction with robots, the researchers say the findings could also benefit medicine: the system could monitor movement disorders in patients with Parkinson's or Alzheimer's disease, or aid in restoring lost function in stroke patients.
Disclaimer: The above article was written with the assistance of AI. The original sources can be found on ScienceDaily.