EVAN CARAVELLI/Arizona Daily Wildcat
Electrical and computer engineering graduate student Shafik Amin demonstrates his robot's ability to levitate with the aid of a helium-filled blimp. The robot attached to the bottom uses a tiny camera to help it maneuver autonomously in its environment.
By Kris Cabulong
Arizona Daily Wildcat
Wednesday, December 1, 2004
Today's robots work tirelessly in car factories, defuse bombs and explore hostile environments on other worlds - but they're all about as blind as bats.
Though scientists have been working on the robotic eye for decades, Charles Higgins, assistant professor in the UA Department of Electrical and Computer Engineering, is breaking new ground with an innovative approach to robotic visual navigation: neuromorphic engineering.
"Neuromorphic engineering covers a wide spectrum of disciplines," Higgins said.
Higgins' research roughly combines his two chief specialties - electrical engineering and neurobiology. In addition to his faculty position with the ECE Department, Higgins is also an assistant professor at the Arizona Research Laboratory for Neurobiology.
With this background, Higgins and his team are working on building robots inspired by biology. The electronic brains in his robots are modeled after theories on how biological brains work, and Higgins hopes his research will both test those theories and develop a better design for robotic visual processing.
But Higgins said he has serious doubts his research will lead to the kind of robot uprisings Hollywood cinema warns of.
"I think robots are nowhere near being able to take over the world," he said. "There's no way a robot can take a human in a fight," he said.
Still, when Higgins chose an aerial model for his visually navigating robots, he didn't take any chances.
"We went with blimps because they're non-lethal," he said. "Helicopters can be a little dangerous to have flying around."
The parallel processors on the blimp's computer are modeled after a fly's brain because of its simple structure, Higgins said. A fly's sesame-seed-sized brain has about 1 million neurons, while a human brain's neuron count approaches 100 billion. Higgins said that while it might take 20 years to work out how a fly processes visual information, it would take at least 100 years to understand how a human brain does the same.
However, this fly-based design, with its motion-sensing cameras and parallel processors, is still doing some things that have never been done before.
"The blimp, which is autonomous, can avoid obstacles, walls and follow objects," Higgins said. "It measures distances, how far it's traveling and at what speed, and this is done purely with vision navigation."
Shafik Amin, a graduate student in electrical and computer engineering, has worked with Higgins on the visual navigation project for two and a half years.
"(The aerial model) is innovative in the way vision sensing works," Amin said. "It's not a conventional video camera, there's no picture taking or video pixels."
Modeled after insect vision, the cameras detect the motion of the surroundings relative to the robot's own movement, and the processors use that motion signal to steer the robot through its environment.
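The article doesn't spell out the circuitry, but the textbook model of insect motion detection that this line of work builds on is the Hassenstein-Reichardt correlator: each tiny detector multiplies one photoreceptor's delayed signal by its neighbor's direct signal, and subtracting the mirror-image pairing gives a signal whose sign indicates the direction of motion. The Python sketch below is purely illustrative - the function names, test signals and time constant are assumptions for demonstration, not details of Higgins' hardware.

    import numpy as np

    def reichardt_emd(left, right, dt=0.001, tau=0.02):
        """Hassenstein-Reichardt elementary motion detector (illustrative sketch).

        left, right : brightness over time at two neighboring photoreceptors.
        Returns a signed motion signal: positive when a pattern moves from the
        left receptor toward the right one.
        """
        alpha = dt / (tau + dt)

        def low_pass(x):
            # A first-order low-pass filter serves as the delay element.
            y = np.zeros_like(x)
            for i in range(1, len(x)):
                y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
            return y

        # Correlate each receptor's delayed signal with its neighbor's direct
        # signal, then subtract the two mirror-image half-detectors.
        return low_pass(left) * right - low_pass(right) * left

    # Example: a brightness edge reaches the left receptor 20 ms before the right.
    t = np.arange(0, 0.5, 0.001)
    left = (t > 0.10).astype(float)
    right = (t > 0.12).astype(float)
    print(reichardt_emd(left, right).mean())  # positive: motion from left to right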
Higgins' team is also working with a ground-based robotic model named "Gimli" that moves around on four wheels and looks about with a head that moves independently of its body.
Because Gimli has only one camera, or "eye," it uses what Higgins called "motion parallax" to navigate.
"You know how when you close one eye and move your head left to right you can kind of tell how far away things are? That's how Gimli measures how far away an obstacle is," Higgins said.
Leslie Ortiz, a graduate student in electrical engineering, said Gimli can currently track small moving objects, turning its head to follow them.
Ortiz said Gimli has two vision chips that operate as the robot's eyes, or visual processors. These Very Large Scale Integration (VLSI) chips have the advantage of being cheaper and smaller than more conventional chips because they operate through parallel processing, Higgins said.
Multiple VLSI processors working in parallel are better suited to certain tasks, including visual recognition and sensation, than a single, more powerful serial processor, Higgins said.
The parallel processing model is how the human brain, with its roughly 100 billion neurons, processes information. Remarkably, each of those neurons operates at a speed of only a few hundred hertz, Higgins said.
"Not even the very first computers operated that slowly, but the human brain can do countless things even the fastest computers today barely hold a candle to," he said.
Like seeing things.
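To put those two figures together: roughly 100 billion neurons each updating a few hundred times per second works out, as a back-of-the-envelope estimate, to tens of trillions of elementary operations every second - not because any single element is fast, but because all of them run at once.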
Higgins' research team was recently filmed by a Discovery Channel film crew, which Ortiz said was very exciting for her. She had only been with the team for four months.
"When the semester started, I never thought something like this would happen to me," Ortiz said. "I just feel really lucky to be part of this lab, and to be learning a lot about neuromorphic engineering, something I really enjoy."
Amin called his experience with this kind of graduate research "invigorating."
Higgins said about 30 prominent researchers worldwide are working on similar neurobiological robotic research.
"It's really interesting in that it's different from what's been done in traditional robotic vision (research)," Amin said. "This is a completely new approach."