A lot more goes on in the museums of the world at night, after everyone has vacated the premises and the guards have locked up, than one might imagine. The situation may not be as dramatic as what Ben Stiller shows us in the movie "Night at the Museum," but it still warrants serious investigation. This is what Tate Britain has done with its After Dark project, supported by the inaugural IK Prize.
Tate Britain holds one of the largest art collections in the world. In August 2014, it organized After Dark, a project in which visitors could experience the thrill of a prohibited voyage without once stepping into the museum. For 25 hours, more than 100,000 viewers across the globe watched live video streamed over the Internet from four robots let loose in the darkness of the museum. Additionally, 500 people each took control of a robot for approximately 12 minutes, guiding it as they pleased and seeing what it was witnessing.
RAL Space engineered the robots, which are based on the tiny single-board computer, the Raspberry Pi or RBPi. Working alongside the UK Space Agency or UKSA, RAL Space is one of the world's leading centers for the research and development of space exploration technologies.
RAL Space worked in close collaboration with Tate Britain, and the team behind After Dark combined the latest software with bespoke RBPi hardware. They designed and engineered the robots, creating a world-first, one-of-a-kind experience that attracted audiences from all over. The Workers, a digital product design studio, designed the web interface for After Dark.
For the late-night explorations within the museum, people from all over the world get to guide the four robots by taking control of any one of them. RAL Space designed the robots to select a new operator every few minutes. As long as the event is live, anyone can request control of a robot from the project website. The robots know you are waiting, and as soon as a slot frees up, one will try to take you on a ride. Even while you wait, you can watch video of the event streamed live on the project website and on Livestream.com.
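The rotation described above, where waiting viewers queue up and each receives control for a fixed slot, can be sketched as a simple queue. This is a hypothetical illustration, not the actual After Dark software; the class name, the `request_control` and `tick` methods, and the slot length are all assumptions for the sketch.

```python
from collections import deque
import time

# Hypothetical sketch of the operator rotation described above: viewers
# who request control wait in line, and each gets the robot for a fixed
# slot before control passes to the next person in the queue.

SLOT_SECONDS = 12 * 60  # roughly 12 minutes per operator, per the article

class OperatorQueue:
    def __init__(self, slot_seconds=SLOT_SECONDS):
        self.slot_seconds = slot_seconds
        self.waiting = deque()   # viewers who have requested control
        self.current = None      # viewer currently driving the robot
        self.slot_started = 0.0

    def request_control(self, viewer_id):
        """A viewer on the website asks to drive the robot."""
        self.waiting.append(viewer_id)

    def tick(self, now=None):
        """Hand control to the next viewer once the current slot expires."""
        now = time.monotonic() if now is None else now
        slot_over = (self.current is None
                     or now - self.slot_started >= self.slot_seconds)
        if slot_over and self.waiting:
            self.current = self.waiting.popleft()
            self.slot_started = now
        return self.current
```

Calling `tick()` periodically on the server would be enough to keep control rotating while the event is live.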
You can use the on-screen buttons in the web-based control interface, or the arrow keys on your keyboard, to control the robot. You can make it move forward or turn, and even make it look up or down. The robot senses obstacles around it and feeds this information back to you, so even though it is nearly dark, you, the navigator, can operate it easily.
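A control interface like the one described above boils down to translating key or button events into a small set of drive commands. The mapping below is a hypothetical sketch: the key names follow browser conventions, and the command names are assumptions, not the actual After Dark protocol.

```python
# Hypothetical mapping from keyboard / on-screen button input to the
# drive commands the article mentions: move forward, turn, look up,
# look down. Key and command names are illustrative assumptions.

KEY_COMMANDS = {
    "ArrowUp": "forward",
    "ArrowLeft": "turn_left",
    "ArrowRight": "turn_right",
    "PageUp": "look_up",
    "PageDown": "look_down",
}

def command_for_key(key):
    """Translate a key event into a robot command, or None if unmapped."""
    return KEY_COMMANDS.get(key)
```

On-screen buttons would simply emit the same command strings directly, so the robot sees one uniform command stream regardless of input method.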
If you take the robot too close to an object, it stops moving and informs you through the web-based control interface. You still retain control: you can turn the robot on the spot and move it forward again, continuing the journey, provided the path ahead is clear.
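That safety behaviour, blocking forward motion near an obstacle while still allowing turns on the spot, can be sketched as a simple command filter. This is an assumed illustration: the distance threshold, command names, and message text are not from the actual robots.

```python
# Hypothetical sketch of the obstacle behaviour described above: when a
# proximity sensor reports something too close ahead, forward motion is
# refused and the operator is notified, but turning on the spot remains
# allowed so the journey can continue once the path is clear.

STOP_DISTANCE_CM = 30  # assumed safety threshold, in centimeters

def filter_command(command, distance_ahead_cm):
    """Return (command_to_execute, message_for_operator).

    Forward motion is blocked near an obstacle; all other commands
    (turning, looking up or down) pass through unchanged.
    """
    if command == "forward" and distance_ahead_cm < STOP_DISTANCE_CM:
        return None, "Obstacle ahead: forward blocked, try turning"
    return command, None
```

Running every incoming command through a filter like this keeps the operator in control while preventing collisions in the dark galleries.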