Like R2-D2, Only Smarter
Three thousand miles away from the 2010 Apple Worldwide Developers Conference in San Francisco in June, Computer Science Professor Bruce Maxwell and his team, along with a robot, waited in anticipation. With Bogumil Giertler ’12 at the conference on the West Coast to demonstrate his robot interface, Maxwell was in the Colby Museum of Art, about to see his idea come to life. Controlled by Giertler using an iPad, the robot would roam the museum, allowing viewers in California to virtually browse the art in the Maine galleries.
Three years ago, Maxwell, chair of the Computer Science Department, and Professor William Smart of Washington University in St. Louis had an idea. They wanted to develop robot avatars that would explore places that people could not otherwise get to. After receiving a grant from the National Science Foundation, the team got to work.
Maxwell had planned to work with Giertler on an Apple application that would allow viewers to remotely control the robot with their iPhones and to see what the robot was seeing. But when the iPad was unveiled in January, Giertler was intrigued, to say the least. “It’s exactly what we needed for our project,” he said. He spent that month at home in Warsaw, Poland, developing the interface in a programming language called Objective-C, and, when he returned in the spring, he pitched the idea to Maxwell. By the end of the semester, with Giertler’s programming, the robot project was well on its way to accomplishing what Maxwell and Smart had envisioned.
In mid-May Giertler heard from Apple that he had received a highly competitive student scholarship to the Worldwide Developers Conference for his work. “[There were] Apple engineers, as well as the vice president for worldwide developer relations and all the professors and students from various top universities from all over the globe,” Giertler said. Initially he was offered just the scholarship, but when organizers learned more about the project, they asked him to do a demonstration.
Back at Colby the robot was in the museum waiting to be activated by Giertler’s iPad in San Francisco. Maxwell and research assistant Martha Witick ’12 had spent the entire day mapping the museum and preparing for Giertler’s moment. Every necessary gadget had been checked and double-checked, but there was still no word from Giertler. Finally, at 9:35 p.m., the phone rang, piercing the restless silence. It was Giertler, and he was about to present.
In the museum Maxwell and his student researchers waited, and, as they sat there staring at the robot, it came to life. It began to glide around the polished wooden floors, controlled entirely by viewers at the conference in San Francisco, allowing them to marvel at Louise Bourgeois’s bronze sculpture, Robert Mangold’s vibrant acrylic paintings, and David Salle’s unique pen-and-ink sketches, and to catch the occasional glimpse of the team sitting among the art.
With the help of students, Maxwell has enabled people to wander through the Colby museum and view art through his robot’s eyes. Ultimately Maxwell wants to see this move online. “You will be able to tell the robot to visit a particular place in the museum,” explained Maxwell. “Your request will be put into a queue, and then the robot will move from painting to painting.”
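The queued-viewing model Maxwell describes is a simple first-come, first-served request queue. The sketch below, in Python, illustrates the idea only; the class and method names are hypothetical and do not come from the actual project, and the navigation step is a stand-in for the robot driving to a gallery location.

```python
from collections import deque

class MuseumRobotQueue:
    """Illustrative sketch of a queued-viewing model: online viewers
    request gallery locations, and the robot serves the requests in
    arrival order. All names here are hypothetical."""

    def __init__(self):
        self.requests = deque()

    def request_visit(self, viewer, location):
        # Each online viewer's request joins the back of the queue.
        self.requests.append((viewer, location))

    def run(self):
        # The robot works through the queue, moving from painting
        # to painting; appending to `visited` stands in for actual
        # navigation and camera streaming.
        visited = []
        while self.requests:
            viewer, location = self.requests.popleft()
            visited.append(location)
        return visited

robot = MuseumRobotQueue()
robot.request_visit("viewer_a", "Bourgeois sculpture")
robot.request_visit("viewer_b", "Mangold paintings")
print(robot.run())  # locations served in the order requested
```

A queue keeps the behavior fair and predictable when many remote viewers share one physical robot: no request is skipped, and each viewer can see how far back in line they are.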
Because of Giertler’s programming, what was once a local robotics project has quickly transformed into a global interface—a worldwide virtual art gallery—that could change the way that we view and experience the world around us.