Threading the Needle with GRASP’s Latest Quadrotor Trick
The steady stream of viral quadrotor videos shot by GRASP researchers tends to have one thing in common: their robotic stars are bathed in the infrared light of motion-tracking cameras. Covering the entire flying space from multiple angles, the suite of cameras tells the robots exactly where they are at all times. With weight at a premium, offloading localization this way means the quads don't need to carry their own cameras and trajectory-calculating computers to navigate. These lighter, nimbler robots can pull off some pretty aggressive moves.
PERCH, the Penn Engineering Research and Collaboration Hub that occupies the third floor of the recently opened Pennovation Center, features a bigger flying space and a new set of infrared cameras, but there was no need to turn them on this time.
Giuseppe Loianno, a research scientist in the GRASP lab, and Vijay Kumar, professor and Nemirovsky Family Dean of Penn Engineering, just debuted their latest creation: a quadrotor that can pull off those aggressive maneuvers using only on-board sensors and computation.
Working with Gary McGrath and Chris Brunner from Qualcomm, they’ve outfitted a quad with a forward-facing camera and the kind of processor and inertial measurement unit normally found in a smartphone. Using only data pulled in from those sensors, the robot can figure out how and when to bank around slalom poles and through angled windows not much wider than itself.
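The article doesn't describe the estimation pipeline itself, but a core ingredient of flying on smartphone-grade hardware is fusing the inertial measurement unit's fast-but-drifting gyroscope with its noisy-but-drift-free accelerometer. The sketch below is a hypothetical, minimal complementary filter for a single pitch angle (the function name, parameters, and the blending constant `alpha` are illustrative assumptions, not details from the Penn/Qualcomm system, which relies on far more sophisticated visual-inertial estimation):

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Fuse gyro and accelerometer readings into a pitch estimate.

    gyro_rates: pitch angular rates in rad/s, one per timestep
    accel_samples: (ax, az) accelerometer readings in m/s^2, one per timestep
    dt: timestep in seconds; alpha: blending factor (illustrative values)
    """
    pitch = 0.0
    estimates = []
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        # Integrate the gyro for a smooth short-term estimate...
        gyro_pitch = pitch + rate * dt
        # ...and recover a drift-free long-term reference from the
        # direction of gravity in the accelerometer reading.
        accel_pitch = math.atan2(ax, az)
        # Blend the two: trust the gyro at high frequency and the
        # accelerometer at low frequency.
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates
```

Fed a stationary tilt, the estimate converges toward the true angle even with zero gyro input, which is the property that keeps an onboard-only attitude estimate from drifting away over a flight.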
Loianno and Kumar are currently pre-loading the robots with the relative locations of objects in the flying space, but say that real-time identification of unknown obstacles is just around the corner. That's the kind of ability that will take flying robots out of labs and into the unpredictable real world.