Recently, a small group from Google visited Cogswell College to demonstrate Project Tango and teach a class on it. What is Project Tango, exactly? It is a brand-new mobile device that can navigate the world much as humans do: an Android device and platform with spatial perception, achieved through advanced computer vision, image processing, and specialized vision sensors.
One may think, “That’s all fine and cool, but what does this thing actually do? And what practical applications can be derived from such a technology?” Put simply, Tango continuously scans the surrounding environment as the user walks around with the hand-held device, then uses the data it collects to construct a 3D model of that environment. Motion tracking gives real-time information about its movement through 3D space; depth sensors provide depth perception, allowing interactions between the real world and the virtual world Tango has built or is building; and it can intelligently learn new areas. If there are errors in the 3D model it has created, Tango can use visual cues around the space to re-render and fix the trouble areas. The potential applications include, but aren’t limited to, new virtual and augmented reality games and applications, rapid environment generation, and more. The potential is limited only by imagination.
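The core idea behind that mapping process — fusing depth points captured at different moments into one world model using the device’s tracked pose — can be sketched in a few lines. This is an illustrative simplification, not the actual Tango API: the function names are hypothetical, and the pose is reduced to a 2D heading plus position, whereas Tango works in full 3D.

```python
import math

def pose_to_world(points, yaw, position):
    """Transform depth points from the device frame into the world frame
    using the device's tracked pose (simplified here to a 2D yaw + offset)."""
    c, s = math.cos(yaw), math.sin(yaw)
    world = []
    for x, y in points:
        # Rotate by the device heading, then translate by its position.
        wx = c * x - s * y + position[0]
        wy = s * x + c * y + position[1]
        world.append((wx, wy))
    return world

def accumulate_cloud(frames):
    """Merge depth frames captured at different poses into one point cloud,
    the way Tango fuses successive scans as the user walks around."""
    cloud = []
    for points, yaw, position in frames:
        cloud.extend(pose_to_world(points, yaw, position))
    return cloud
```

Because every frame is re-expressed in a shared world frame before merging, points scanned from different spots line up into a single consistent model; correcting an erroneous pose (as Tango does with visual cues) simply re-places that frame’s points.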
The Cogswell class, which is currently learning to use Unity, was tasked with testing the devices, running an application, and providing feedback as they went along. Google called it a Code Lab event, the purpose of which was to iron out any bugs in the system before this year’s Google I/O conference, roughly two months away. Students debugged Google’s sample instructions, tested and ran the code, and followed the development environment instructions – all under the direction of watchful and curious Google engineers. Google had previously tested these devices at NVIDIA, but was impressed to see that Cogswell students produced better results and provided higher-quality feedback.
For the demo, each student was provided with a Tango tablet, USB cables, a PIN code to unlock the device, and a URL with further instructions. The students booted up their devices and launched a 3D mapping app that tracked their movements. As they moved around the classroom, they got feedback through interactive mapping, point clouds, and more. The students then returned to their workstations to continue the demo, working through a series of steps to get an app running in Unity, import it onto their devices, and run it as they moved around. Some students continued with the instructions provided, while others conversed with the Google engineers.
The success rate of the tests and demos was 100%, much better than in other tests Google had run before. Whenever students got stuck, they would first work together on a solution, turning to the Google engineers only as a last resort. Working together led all the students to success. As the class progressed, the Code Lab event gradually turned into a seminar where students brainstormed potential applications for the technology. Ideas included a music application for better microphone location and tracking within a room, and an app capable of creating 3D sound environments.
All in all, it was a successful demo and everyone walked away with new knowledge – and an amazing experience!
Based on notes from John Duhring’s experience
with additional content and editing by Juan Rubio