The thing about building IKEA furniture is that sometimes you just get stuck. You can’t seem to orient the bookcase parts so it looks like the figure in the directions. Out of habit you pick up your smartphone, beacon of all knowledge, and unlock it. And then you just stare at the home screen for a second because there’s nothing your smartphone can do to help you. Google is trying to change that.
Project Tango is a collaboration between researchers at multiple companies and institutions to produce a smartphone that gives “mobile devices a human-scale understanding of space and motion.” It’s led by the Advanced Technology and Projects group at Google, the one Motorola division that Google didn’t sell to Lenovo. Basically, Project Tango’s prototype is a 5-inch phone loaded with sensors and processing power to interpret the environment, and the objects in it, in three dimensions.
That could allow the phone to, say, dynamically deliver audio directions and detailed information to a vision-impaired person no matter where she goes and whether or not she has been there before. The phone can assess its surroundings and figure out exactly where it is in human space. And it could scan the two pieces of a bookcase that you have left and then use an IKEA database to show you a customized animation of how they fit together.
The possibilities seem limitless. Video games that utilize the physical space around you! Body-position monitoring to check how well you’re working the room at networking events! But if the potential of the device isn’t immediately clear to you, that’s OK, too. The Project Tango website explains,
While we may believe we know where this technology will take us, history suggests that we should be humble in our predictions. We are excited to see the effort take shape with each step forward.
You don’t need Google Translate to know what that means: No one knows what Project Tango can achieve—or what negative consequences lurk unforeseen. This is real research, and only 200 developers are going to get to try the first generation of prototypes. (Google will choose the lucky 200 by March 14 based on applications.) The future of this technology is totally up in the air, which feels kind of uncomfortable, but also awesome.