Scientists from Florida Atlantic University’s (FAU) Machine Perception and Cognitive Robotics Laboratory (MPCR) have combined 3D printing, deep learning, artificial intelligence and robotics to bring to life Astro, an intelligent robotic dog that can see and hear.
The Frankenstein-esque project actually uses an existing robotic configuration, Boston Dynamics' quadruped robot, and adds a 3D printed dog head that makes Astro look like a Doberman pinscher—a breed known for its intelligence and trainability. The head is not only for appearance's sake, however, as it contains a powerful computerized brain that gives Astro its ability to learn and operate.
As the research team points out, Astro is designed to learn like a dog. In other words, it can be trained. Instead of using treats and clickers, however, the robodog relies on a deep neural network that enables it to learn from experience.
Astro weighs about 100 pounds and is loaded with high-tech equipment, including sensors, radar imaging, cameras and a directional microphone. Currently, the robodog is just starting its training and can respond to commands like "sit," "stand" and "lie down." The FAU scientists say that in the future Astro will also be able to understand hand signals, detect different colors, understand multiple languages, distinguish human (and canine!) faces and even coordinate with drones.
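To give a flavor of what "learning commands with a neural network" means in practice, here is a minimal, purely illustrative sketch (not FAU's actual code, whose details are not public): a single-layer softmax classifier trained by gradient descent on synthetic stand-ins for audio features, mapping them to the three commands named above. A real system would use deep networks over genuine microphone features, but the training loop follows the same pattern.

```python
# Illustrative sketch only: a tiny softmax classifier standing in for the
# deep neural network that maps audio features to spoken commands.
# The feature vectors here are synthetic; real input would come from
# Astro's directional microphone.
import numpy as np

rng = np.random.default_rng(0)
COMMANDS = ["sit", "stand", "lie down"]

# Synthetic "audio features": each command forms a distinct cluster
# in a 16-dimensional feature space.
centers = rng.normal(size=(3, 16))
X = np.vstack([c + 0.1 * rng.normal(size=(50, 16)) for c in centers])
y = np.repeat(np.arange(3), 50)

# One linear layer + softmax, trained with cross-entropy gradient descent.
W = np.zeros((16, 3))
b = np.zeros(3)
for _ in range(200):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)          # softmax probabilities
    grad = (p - np.eye(3)[y]) / len(X)         # cross-entropy gradient
    W -= 1.0 * X.T @ grad
    b -= 1.0 * grad.sum(axis=0)

# After training, the model labels each feature vector with a command.
pred = np.argmax(X @ W + b, axis=1)
accuracy = (pred == y).mean()
```

Training "from experience" simply means repeating this kind of update loop on new labeled examples, which is why such a system can keep acquiring commands after deployment.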
Not unlike the Doberman pinscher, which has been used in military and policing contexts, Astro can be programmed to be an information scout and to assist police and military personnel in detecting guns, explosives and gun residue, among other things. The robodog can also be programmed for other uses, such as serving as a guide dog for the visually impaired or providing medical diagnostic monitoring. Astro is also being trained to function as a first responder for search and rescue missions.
In short, Astro is a multi-purpose quadruped robot whose capabilities can be enhanced or customized by adding various sensors and instrumentation. As a first responder, for instance, Astro will be equipped with over a dozen sensors for environmental detection, including optical, sound, gas and radar.
All the information picked up by the sensors is ultimately processed by the robodog using a set of Nvidia Jetson TX2 embedded computing modules with a combined four teraflops of GPU computing power, or roughly four trillion operations per second. These processors act as Astro's brain, generating autonomous behavioral decisions and enabling the pup to learn.
“Our Machine Perception and Cognitive Robotics laboratory team was sought out by Drone Data’s Astro Robotics group because of their extensive expertise in cognitive neuroscience, which includes behavioral, neurophysiological and embedded computational approaches to studying the brain,” commented Ata Sarajedini, PhD, Dean of FAU’s Charles E. Schmidt College of Science. “Astro is inspired by the human brain and he has come to life through machine learning and artificial intelligence, which is proving to be an invaluable resource in helping to solve some of the world’s most complex problems.”
As one can imagine, a lot of work and expertise went into bringing Astro to life. In fact, the highly interdisciplinary team that created the 3D printed robodog included neuroscientists, IT experts, artists, biologists, psychologists and students at all levels, from high school to graduate school.
The Astro project was led by Elan Barenholtz, PhD, associate professor in FAU’s Department of Psychology, co-director of FAU’s MPCR laboratory and a member of FAU’s Brain Institute (I-BRAIN); William Hahn, PhD, assistant professor in FAU’s Department of Mathematical Sciences and co-director of FAU’s MPCR laboratory; and Pedram Nimreezi, director of intelligent software in FAU’s MPCR laboratory, CTO for RedGage and an expert in martial arts.