Pigeons and Robots

For this project, UCLA’s Comparative Cognition Lab led by Dr. Aaron Blaisdell has partnered with roboticist and filmmaker Andrew McGregor, an expert on using robotics in service of interspecies communication.

Andrew’s expertise in this area stems from his background leading the Oomvelt research project, which develops wearable technology that helps African giant pouched rats (trained and deployed by APOPO) communicate to a human when they find a landmine or a survivor in a collapsed building.

The goal of the project is to ascertain whether pigeons can teleoperate a robot through a labyrinth from a first-person perspective and, if so, whether they can then transfer that mental map to guide the robot from a bird’s-eye view.

How it Works

The Pac-Man-esque image above is what the pigeons currently use to navigate an electronic maze by pecking on the screen. We will beam the video from the robot’s camera into the operant conditioning chamber (the box where the robot and pigeon are together; you can see a video screen in it) and overlay it with something similar to the Pac-Man screen. The pigeon will then peck on parts of the overlay to move the robot around in the real world. If that works, the pigeon will next navigate the robot through the same maze from a bird’s-eye view!
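The peck-to-motion mapping described above could be sketched roughly as follows. This is a hypothetical illustration, not the lab's actual software: the function name, screen width, and dead-zone threshold are all assumptions made for the example.

```python
def peck_to_command(x, width=800, dead_zone=0.15):
    """Map a peck at horizontal pixel position x on the overlay to a
    drive command for the robot.

    Pecks near the horizontal center of the screen drive the robot
    forward; pecks to either side turn it. dead_zone is the fraction
    of the half-width around the center treated as 'forward'.
    (All values here are illustrative assumptions.)
    """
    # Horizontal offset from screen center, normalized to [-1, 1]
    offset = (x - width / 2) / (width / 2)
    if abs(offset) <= dead_zone:
        return "forward"
    return "right" if offset > 0 else "left"

# Example pecks on an 800-pixel-wide overlay
print(peck_to_command(400))  # center of screen -> forward
print(peck_to_command(100))  # far left -> left
print(peck_to_command(700))  # far right -> right
```

A real system would also need to stream these commands to the robot's motor controller and refresh the overlay from the live camera feed, but the core idea is the same: each peck location is translated into a discrete steering action.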

The Story

In December 2016, Netflix released a TV series following in the footsteps of MythBusters. The new series was called “White Rabbit Project”. Episode 6 focused on “Crazy WW2 weapons”. Of the six “crazy” ideas of novel weaponry to defeat the Axis powers, one was the brainchild of notable behavioral psychologist B.F. Skinner. He referred to it as Project Pigeon, and with it he tried to train pigeons to guide a missile dropped from an airplane to contact and destroy enemy ships in the Pacific theater. His project was a success in that pigeons learned to steer a simulated missile to enemy ships viewed through a small window. They did so by pecking at the image of the ship in front of them, and the pecks controlled steering flaps on the missile so that it stayed on course directly to the ship.

Luckily for the pigeons, they were never deployed on these suicide missions. By the time the project was ready, other teams had finally figured out how to implement self-navigation instrumentation. Many pigeon lives were, ahem, saved.

Back to the Netflix show. The producers wanted to do a conceptual reenactment of Skinner’s Project Pigeon. They called on the Blaisdell lab to help. We trained three pigeons to peck at a red-and-white concentric circle target on the touchscreens in our lab. It took only a couple of weeks to train these birds to reliably peck at the target whenever it appeared on the screen. We varied the location and size of the target so that they would generalize their pecking behavior during filming.

On the day of filming, we drove the pigeons to the studio in downtown LA where the reenactment was to be recorded. The studio consisted of a huge room, like a giant empty warehouse, with a real target hung on the wall at the far end. At the other end was a drone with a camera mounted under it. The feed from the camera was relayed to a touchscreen in a clear operant box set up behind a partition. The pigeon pilots were ready! They were placed, one at a time, in the pilot seat (the operant box), where they came face to face with the display showing the camera feed from the drone. On the display, they oriented toward the tiny image of the target at the far end of the studio. Each pigeon immediately began to peck at the target, thereby navigating the drone all the way to the other end of the studio, where it finally contacted the target!

Although all three pigeons showed decent rates of pecking at the image of the target, Darwin was by far the star of the show – literally. She was the best pilot of all, and an image of her pecking at the target, with host Kari Byron watching in excitement, was used as promotional material for the show.