It took a couple of tries. At one point the robotic helicopter accelerated upward, hit the top of the safety net, then crashed abruptly to the floor.
But later during the robotics demonstration at Johns Hopkins University's Krieger Hall, the small quadcopter successfully hovered over its target landing pad and settled safely.
"If you never crash a robot, you're probably not pushing the envelope hard enough," said Louis Whitcomb, a Johns Hopkins professor of mechanical engineering whose students demonstrated their independent robotics projects at labs across the Homewood campus on Wednesday afternoon.
In addition to the quadcopter that found its own target, there was another, controlled remotely by human motions via virtual-reality headgear and a myoelectric armband. A single arm movement sent the helicopter into a quick mid-air flip.
Other demos by students in Whitcomb's graduate-level Robot Systems Programming course included two small self-driving cars that could independently travel to pre-set destinations, avoiding obstacles like humans and trash cans along the way. Then there were two "turtlebots"—basic personal robots—one programmed to map out and reach a target, the other programmed to pick up and deliver an object by verbal command, such as "Fetch me the water bottle."
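The article doesn't say how the cars found their way, but the course is built around ROS, the open-source Robot Operating System that appears in the project list below. In ROS, driving to a pre-set destination typically means handing a goal to the navigation stack, which plans a path and steers around obstacles on its own. A minimal sketch of that pattern, assuming a standard move_base setup and purely hypothetical map coordinates:

```python
#!/usr/bin/env python
# Minimal sketch: send one navigation goal to the ROS navigation stack.
# Assumes a robot already running move_base with a map; the destination
# coordinates below are hypothetical, not from the article.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def go_to(x, y):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # no rotation requested

    client.send_goal(goal)
    client.wait_for_result()  # blocks until the robot arrives or gives up
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('go_to_destination')
    go_to(2.0, 1.5)  # a pre-set destination, in map coordinates
```

The obstacle avoidance on display, dodging humans and trash cans, would live inside the navigation stack's local planner rather than in client code like this.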
For another project, an aquatic robot navigated the depths of a giant water tank—designed for testing oceanographic underwater robots and instruments—that many people probably don't realize exists in the basement of Krieger Hall.
Whitcomb's students spent the latter half of the semester working in teams to assemble and program their robots. They followed some fairly broad directives: The robots had to be able to perform two tasks, to operate autonomously, and to be equipped with at least two sensors.
"It's an open-ended design problem," says Whitcomb, who chairs the Department of Mechanical Engineering at Hopkins. "They're not just memorizing knowledge—they define their own robotic system they want to construct with hardware and software."
Whitcomb created this course three years ago, gearing it to grad students and advanced undergrads. He says he wanted to offer a robotics experience that's "not scripted, where everyone does the same project."
The student teams choose the type of robot they want to work with, either using a model already available in the labs or ordering a new one. Next they add sensors, actuators, or human-interface devices to their robot systems. With the self-driving cars, for example, the teams used basic remote-control cars as their foundation, then added a computer, odometer, Lidar scanner (something like an optical radar), GPS, cameras, and a magnetic compass. Finally, the students write their own software, and adapt existing packages, to enable the robots to navigate and perform tasks.
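The article stops at the hardware list, but in a typical ROS project the glue between those sensors and the motors is a small node that subscribes to sensor topics and publishes velocity commands. A hedged sketch of that subscribe-compute-publish loop, assuming the conventional /scan and /cmd_vel topic names and an invented half-meter safety threshold:

```python
#!/usr/bin/env python
# Sketch of the sense-then-act loop described above: read the lidar,
# turn away if something is too close, otherwise drive forward.
# Topic names are ROS conventions; the threshold is a made-up example.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

STOP_DISTANCE = 0.5  # meters

class SimpleAvoider(object):
    def __init__(self):
        self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/scan', LaserScan, self.on_scan)

    def on_scan(self, scan):
        cmd = Twist()
        # Keep only valid returns, then check the nearest obstacle.
        ranges = [r for r in scan.ranges
                  if scan.range_min < r < scan.range_max]
        if ranges and min(ranges) < STOP_DISTANCE:
            cmd.angular.z = 0.5   # obstacle ahead: turn in place
        else:
            cmd.linear.x = 0.3    # path is clear: drive forward
        self.pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('simple_avoider')
    SimpleAvoider()
    rospy.spin()
```

A full course project would layer mapping, localization, and planning on top of a reactive loop like this; the sketch shows only the basic pattern.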
Whitcomb says he's seen students from his course continue with more advanced graduate studies in robotics, or go to work directly in the rapidly expanding robotics industry—in diverse areas such as medicine, oceanography, homeland security, and manufacturing.
A full list of yesterday's demo projects:
- Preliminary control and navigation ROS package for the OpenROV 2.8 underwater robot vehicle, by Shahriar Sefati and Laughlin Barker
- Quadrotor autonomy, by Rodolfo Finocchi and Azwad Sabik
- Virtual reality control of drone quadrotor, by Zach Sabin and Dave Morra
- Control of autonomous model car, by Ryan Howarth and Rachel Hegeman
- Autonomous motion self-driving car, by Greg Langer, Stefan Reichenstein, and Ted Staley
- Turtlebot following a human and picking up a ball, by Sipu Ruan and Zhe Kang
- Turtlebot autonomous map building and object delivery, by Hao Yuan and Bo Lei