In the opening scene of the Oscar-winning movie The Hurt Locker, a remote-controlled military robot (a diminutive tank with a camera and pincer arm attached) ambles over a rubble-strewn Baghdad street toward a roadside bomb. The bot pulls a tiny wagon filled with explosives—to detonate the bomb—but stops short of its destination as the wheels literally come off the cart. When technology fails, a soldier dons a blast suit to enter the "kill zone" and finish the job. But just moments after the technician places the explosives, a man with a cellphone remotely detonates the bomb to disastrous effect.
Although a work of fiction, the scenario is culled from real-life combat encounters.
If only that bot could have navigated the terrain better, or been better equipped to handle the situation, the soldier would never have had to come to the rescue.
To this end, the Johns Hopkins Applied Physics Laboratory wants to create a better, smarter, and more adept robot that can operate in the thorniest of situations, and in the most humanlike fashion possible.
Meet Robo Sally, a fearless, can-do machine. Don't let her resemblance to Johnny Five from Short Circuit fool you; she is one tough collection of gears and wires.
Robo Sally is the nickname for the Lab's Bimanual Dexterous Robotic Platform program, which focuses on mimicking human capabilities under the control of a remote operator. The ultimate goal of APL's approach is to give the operator the ability to perform necessary actions, manipulations, and tasks at a safe distance in harmful or dangerous scenarios such as explosive ordnance disposal or the investigation of a chemical or biological threat.
Many injuries and fatalities can occur even when the personnel in harm's way are highly skilled. And repetitive or mundane tasks can become dangerous or harmful when a human operator—such as a maintenance person at a nuclear reactor or a sentry at a checkpoint—lets his guard down.
APL began working in 2007 on what it calls "human capabilities projection": removing the human from the hazardous environment while still completing the necessary task. A key to the approach was to leverage the upper limb and dexterous hand technology from the Lab's revolutionary Modular Prosthetic Limb program.
The early version of the robot, referred to as Segway Sally, used a Segway mobility platform that could be controlled through a USB interface to an onboard computer. The team mounted a humanlike torso atop the platform to provide a logical structure for the prosthetic arms and for two anthropomorphically spaced cameras attached to a pan-tilt neck. These camera eyes provide a video feed that streams into an immersive display for the operator.
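To make that architecture concrete, here is a minimal sketch of one teleoperation cycle: operator input flows out to the mobility base and the pan-tilt neck, and stereo video flows back toward the immersive display. Every class and method name below is an illustrative stand-in, not APL's actual interface.

```python
# Illustrative teleoperation loop; all names are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class DriveCommand:
    forward: float  # meters/second, positive = forward
    turn: float     # radians/second, positive = counterclockwise

class SegwayBase:
    """Stand-in for the USB-controlled mobility platform."""
    def send(self, cmd: DriveCommand) -> None:
        # A real implementation would write the command over the USB link.
        print(f"drive: forward={cmd.forward:.2f} m/s, turn={cmd.turn:.2f} rad/s")

class StereoHead:
    """Stand-in for the two anthropomorphically spaced cameras on the pan-tilt neck."""
    def point(self, pan: float, tilt: float) -> None:
        print(f"neck: pan={pan:.2f} rad, tilt={tilt:.2f} rad")

    def frames(self):
        # A real implementation would capture left/right video frames here.
        return b"left-frame", b"right-frame"

def teleop_step(base: SegwayBase, head: StereoHead, operator_input: dict) -> None:
    """One cycle: forward operator input to the robot, video back to the operator."""
    head.point(operator_input["pan"], operator_input["tilt"])
    base.send(DriveCommand(operator_input["forward"], operator_input["turn"]))
    left, right = head.frames()  # streamed to the operator's immersive display

teleop_step(SegwayBase(), StereoHead(),
            {"pan": 0.1, "tilt": -0.05, "forward": 0.5, "turn": 0.0})
```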
At first, the robot could perform only the most basic functions, such as picking up an object, and had to be controlled every step (or tire tread) of the way. She's come a long way.
The robot's hands now feature haptic feedback, thanks to a series of sensors and actuators that allow the robot to "feel" and relay the sensation back to the operator, says Matthew Johannes, project manager for the Lab's Modular Prosthetic Limb system and co-principal investigator on the Human Capabilities Projection IR&D (internal research and development) program. APL has since developed a tactile feedback device that sends these signals back to the user during telemanipulation tasks, such as picking up an item and knowing how much force to apply to grasp it.
"We wanted Robo Sally to be tactile, for the robot to physically interact and sense the environment," says Johannes. "To replicate the human hand is a tall order, but all the hooks are in place to control the fingers and apply the necessary force to perform sensitive motions. We are, for example, working on fine-control precision like grasping a grain of rice or small ball bearing."
The latest version of Robo Sally can be controlled through a variety of modalities, from conventional and advanced "joysticks" to human motion mapping. Depending on conditions, the operator can control the robot from as far as a half mile away, and the robot can travel at speeds of up to 20 miles an hour.
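As a rough illustration of the motion-mapping modality, the sketch below clamps an operator's tracked joint angles into a robot arm's range before forwarding them as commands. The joint names and limits are invented for the example, not the actual arm specification.

```python
# Hedged sketch of "human motion mapping": tracked operator joint angles are
# clamped to the robot arm's limits and forwarded. Names/limits are illustrative.
import math

JOINT_LIMITS = {  # radians: (min, max) for each arm joint
    "shoulder_pitch": (-math.pi / 2, math.pi / 2),
    "elbow": (0.0, 2.4),
    "wrist_roll": (-math.pi, math.pi),
}

def map_operator_pose(tracked_angles: dict) -> dict:
    """Clamp the operator's tracked joint angles into the robot's range."""
    return {
        joint: min(max(tracked_angles.get(joint, 0.0), lo), hi)
        for joint, (lo, hi) in JOINT_LIMITS.items()
    }

# The operator bends an elbow past the robot's range; the command is clamped.
print(map_operator_pose({"shoulder_pitch": 0.3, "elbow": 2.9, "wrist_roll": -0.1}))
```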
For legs, Robo Sally's torso can be attached to Segway wheels, a military-fielded robotic mobility platform, or a four-wheeled Synbotics platform.
The APL team now wants to reduce the operator's cognitive load by equipping Robo Sally with recently developed autonomous and semiautonomous manipulation and mobility techniques. This means the robot will be enhanced with obstacle detection so that it knows to move around objects (a rock, a wall) in its path. Looking through the robot's computer-eye view of the world, an operator could simply tell the robot to pick up an object, grab a tool, or move something.
"We want the robot to have a supervised autonomy so that the user watches and only has to control the high-level functions while the robot takes care of the basics," Johannes says. "Moving from point A to point B requires a high level of user input, so it would be useful to remove that burden from the operator."
Johannes says this research is leading to an understanding of the capabilities needed for fielded systems, some of which may be available within five years. Logical clients for the project would be the Department of Defense and law enforcement agencies. The work has also helped establish APL's role in the Navy-funded Advanced Explosive Ordnance Disposal Robotic System program, whose goal is to develop the next generation of open architecture–based EOD robotic systems.
As for uses, Robo Sally can take on bomb disposal, chemical leaks, security checkpoints, or whatever else is needed. She might also one day become Nurse Sally.
"We can foresee the day the platform might address a wounded soldier and take the place of a battlefield medic, tending to a gunshot wound or shrapnel injury," he says. "Right now, that level of capabilities might seem far off, but the end goal is to develop a system that can replicate human functions in the harshest environments."