A robotic arm and a human hand engaged in a handshake, with the robotic system displaying soft and hard materials.

Credit: Sriramana Sankar/Johns Hopkins University.

Feeling is believing: Bionic hand 'knows' what it's touching, grasps like a human

Pioneering prosthetic hand carefully conforms and adjusts its grasp to avoid damaging or mishandling whatever it holds

Media contact: Roberto Molar Candanosa, rmolarc1@jh.edu, office: 443-997-0258, cell: 443-938-1944

Johns Hopkins University engineers have developed a pioneering prosthetic hand that can grip plush toys, water bottles, and other everyday objects like a human, carefully conforming and adjusting its grasp to avoid damaging or mishandling whatever it holds.

The system's hybrid design is a first for robotic hands, which have typically been too rigid or too soft to replicate a human's touch when handling objects of varying textures and materials. The innovation offers a promising solution for people with hand loss and could improve how robotic arms interact with their environment.

Details about the device appear today in Science Advances.

"The goal from the beginning has been to create a prosthetic hand that we model based on the human hand's physical and sensing capabilities—a more natural prosthetic that functions and feels like a lost limb," said Sriramana Sankar, a Johns Hopkins PhD student in biomedical engineering who led the work. "We want to give people with upper-limb loss the ability to safely and freely interact with their environment, to feel and hold their loved ones without concern of hurting them."

The device, developed by the same Neuroengineering and Biomedical Instrumentations Lab that in 2018 created the world's first electronic "skin" with a humanlike sense of pain, features a multifinger system with rubberlike polymers and a rigid 3D-printed internal skeleton. Its three layers of tactile sensors, inspired by the layers of human skin, allow it to grasp and distinguish objects of various shapes and surface textures, rather than just detect touch. Each of its soft air-filled finger joints can be controlled with the forearm's muscles, and machine learning algorithms focus the signals from the artificial touch receptors to create a realistic sense of touch, Sankar said.
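To make that layered-sensing idea concrete, here is a minimal sketch in Python of how readings from three stacked tactile layers might be pooled into simple features and used to tell soft objects from hard ones. Everything in it, from the function names to the toy data and the nearest-centroid classifier, is an illustrative assumption, not the team's actual pipeline.

# A minimal sketch (not the authors' code) of classifying object
# compliance from three stacked tactile layers. All names, data, and
# the nearest-centroid classifier are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def layered_features(frame: np.ndarray) -> np.ndarray:
    """frame: (3, n_taxels) array, one row per sensor layer.
    Returns per-layer mean and variance as a simple 6-D feature vector."""
    return np.concatenate([frame.mean(axis=1), frame.var(axis=1)])

def fake_frame(kind: str) -> np.ndarray:
    """Fake data: 'soft' objects load the outer layer most, while
    'hard' objects drive pressure into the deeper layers."""
    base = rng.normal(0.0, 0.05, size=(3, 16))
    if kind == "soft":
        base[0] += 0.8                               # outer layer dominates
    else:
        base += np.array([[0.3], [0.6], [0.9]])      # deeper layers engaged
    return base

labels = ["soft", "hard"]
train = {k: np.stack([layered_features(fake_frame(k)) for _ in range(50)])
         for k in labels}
centroids = {k: v.mean(axis=0) for k, v in train.items()}

def classify(frame: np.ndarray) -> str:
    f = layered_features(frame)
    return min(labels, key=lambda k: np.linalg.norm(f - centroids[k]))

print(classify(fake_frame("soft")))   # -> soft
print(classify(fake_frame("hard")))   # -> hard

The point of the toy setup is the layering itself: a plush toy loads mostly the outer layer, while a metal bottle drives pressure into the deeper ones, and that depth profile is what a classifier can learn to separate.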

"The sensory information from its fingers is translated into the language of nerves to provide naturalistic sensory feedback through electrical nerve stimulation," Sankar said.

Key Takeaways
  • Johns Hopkins biomedical engineers created a prosthetic hand that can grip objects with near-human precision.

  • Unlike traditional robotic hands, this bioinspired hand combines rigid and soft materials, along with three layers of tactile sensors, to mimic real touch.

  • Controlled by forearm muscles and machine learning, the hand successfully manipulated 15 everyday objects.

In the lab, the hand identified and manipulated 15 everyday objects, from delicate stuffed toys, dish sponges, and cardboard boxes to sturdier items such as pineapples and metal water bottles. It outperformed the alternative designs it was tested against, handling objects with 99.69% accuracy and adjusting its grip as needed to prevent mishaps. In the most striking demonstration, it used just three fingers to nimbly pick up a thin, fragile plastic cup filled with water without denting it.

"We're combining the strengths of both rigid and soft robotics to mimic the human hand," Sankar said. "The human hand isn't completely rigid or purely soft—it's a hybrid system, with bones, soft joints, and tissue working together. That's what we want our prosthetic hand to achieve. This is new territory for robotics and prosthetics, which haven't fully embraced this hybrid technology before. It's being able to give a firm handshake or pick up a soft object without fear of crushing it."

To help amputees regain the ability to feel objects while grasping, prostheses will need three key components: sensors to detect the environment, a system to translate that data into nerve-like signals, and a way to stimulate nerves so the person can feel the sensation, said Nitish Thakor, a Johns Hopkins biomedical engineering professor who directed the work.
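Those three components suggest a simple data flow. The sketch below, again a loose illustration rather than the system described in the paper, reads a simulated fingertip pressure, rate-codes it into a nerve-like spike train, and counts the stimulation pulses that would be delivered in each time window; the sensor model, firing rates, and Poisson coding are all assumptions.

# A minimal sketch, under loose assumptions, of the three-component
# pipeline described above: sense pressure, translate it into a
# nerve-like spike train, and emit stimulation pulses. The rate-coding
# scheme and every function name here are illustrative, not the paper's.
import numpy as np

def read_pressure(t: float) -> float:
    """Stand-in for a fingertip sensor: pressure ramps up during a grasp."""
    return min(1.0, 0.2 * t)

def rate_code(pressure: float, max_rate_hz: float = 300.0) -> float:
    """Translate pressure (0..1) into a firing rate, mimicking a touch
    receptor whose spike rate grows with indentation."""
    return max_rate_hz * pressure

def spike_train(rate_hz: float, window_s: float, dt: float = 1e-3) -> np.ndarray:
    """Poisson spike train for one time window at the given rate."""
    n = int(window_s / dt)
    return (np.random.random(n) < rate_hz * dt).astype(int)

# Run the pipeline over a grasp in 100 ms windows; each spike would
# correspond to one electrical stimulation pulse delivered to the nerve.
for step in range(0, 50, 10):
    t = step * 0.1
    spikes = spike_train(rate_code(read_pressure(t)), window_s=0.1)
    print(f"t={t:.1f}s pressure={read_pressure(t):.2f} pulses={spikes.sum()}")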

"The goal from the beginning has been to create a prosthetic hand that we model based on the human hand's physical and sensing capabilities—a more natural prosthetic that functions and feels like a lost limb."
Sriramana Sankar
PhD student, Biomedial engineering

The bioinspired technology allows the hand to function this way, using muscle signals from the forearm, like most hand prostheses. These signals bridge the brain and nerves, allowing the hand to flex, release, or react based on its sense of touch. The result is a robotic hand that intuitively "knows" what it's touching, much like the nervous system does, Thakor said.

"If you're holding a cup of coffee, how do you know you're about to drop it? Your palm and fingertips send signals to your brain that the cup is slipping," Thakor said. "Our system is neurally inspired—it models the hand's touch receptors to produce nervelike messages so the prosthetics' 'brain,' or its computer, understands if something is hot or cold, soft or hard, or slipping from the grip."

While the research is an early breakthrough for hybrid robotic technology that could transform both prosthetics and robotics, more work is needed to refine the system, Thakor said. Future improvements could include stronger grip forces, additional sensors, and industrial-grade materials.

"This hybrid dexterity isn't just essential for next-generation prostheses," Thakor said. "It's what the robotic hands of the future need because they won't just be handling large, heavy objects. They'll need to work with delicate materials such as glass, fabric, or soft toys. That's why a hybrid robot, designed like the human hand, is so valuable—it combines soft and rigid structures, just like our skin, tissue, and bones."

Other authors include Wen-Yu Cheng of Florida Atlantic University; Jinghua Zhang, Ariel Slepyan, Mark M. Iskarous, Rebecca J. Greene, Rene DeBrabander, and Junjun Chen of Johns Hopkins; and Arnav Gupta of the University of Illinois Chicago.

This research was funded by the grant "Neuromorphic Feedback: A Strategy to Enhance Prosthesis Embodiment and Performance" from the Department of Defense through the Orthotics and Prosthetics Outcomes Research Program (W81XWH2010842) and the National Science Foundation.