With pretty much all the glory going to Google Glass right now, people are more aware of wearable technology than ever before. That’s not to say, however, that this is a one-horse race. Epson has been playing around with concepts for quite some time now and is ready to announce some new stuff at Google I/O 2013.
I had a chance to sit down with members of the Epson and APX teams today to see how the new concept works and what lies ahead for developers. Everything will be introduced at the developer conference, and it should be of particular interest to anyone developing with or for YouTube.
Before going further, I should point out that this is not a consumer-minded concept (yet) and that it’s still going through some design changes. Physically, the glasses are still big and bulky, and the test versions run an older version of Android (2.2). With that in mind, the team I spoke with today is very optimistic about the eventual form factor and believes we’re not that far from the sleek and sexy stuff that consumers might expect.
In a nutshell, the concept that Epson and APX will introduce at Google I/O revolves around touchless gestures for selecting and viewing YouTube videos. Imagine seeing a carousel-like roll of YouTube videos to choose from, right in front of you. You can look left and right to scroll through, and a long, press-like focus automatically starts the video. Look up and the clip stops; tilt your head left to rewind, right to advance. It’s pretty intuitive stuff and works well.
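To get a feel for how head gestures like these might map to playback commands, here is a minimal sketch in Python. Everything in it is a guess for illustration: the function name, the thresholds, and the command strings are my own assumptions, not Epson's actual API — the real system reads the glasses' 9-axis sensor and presumably does far more filtering and smoothing.

```python
# Hypothetical sketch: translating head orientation into the playback
# gestures described above. Thresholds and names are illustrative
# assumptions, not Epson's actual implementation.

def gesture_from_orientation(yaw, pitch, roll, dwell_s=0.0):
    """Map head orientation (in degrees) to a playback command.

    yaw     - left/right head turn (negative = left)
    pitch   - up/down head tilt (positive = up)
    roll    - sideways head tilt (negative = left)
    dwell_s - seconds the gaze has rested on the focused video
    """
    if pitch > 20:
        return "stop"            # look up to stop the clip
    if roll < -15:
        return "rewind"          # tilt left to rewind
    if roll > 15:
        return "fast_forward"    # tilt right to advance
    if yaw < -20:
        return "scroll_left"     # look left to scroll the carousel
    if yaw > 20:
        return "scroll_right"    # look right to scroll
    if dwell_s >= 2.0:
        return "play"            # long, press-like focus starts the video
    return "idle"
```

A real implementation would debounce these readings over time rather than firing on a single sample, but the mapping itself is this simple in spirit.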
What does it look like? Instead of the picture-in-picture stuff that Google Glass provides, this is like having a big television display sitting right in front of you. I think the official description calls for an 80-inch TV around 5 meters away. It’s big and easy to see yet doesn’t clutter or obscure your view.
Sure, the demo unit I tried on was a little heavy and wonky, but the big-picture scenario is fun to imagine. Immediately I thought of how cool it would be to watch YouTube videos or other clips without interrupting others in the room or making things too bright. Then I thought of seeing Google Maps overlaid with POIs, or other apps. However, that’s getting ahead of things and drifting toward a consumer focus; for now, this is aimed more at enterprise and business use.
There were two devices shown off today, one with a front-facing camera and microphone and the other without. Given that the YouTube stuff did not use either, the model I wore was without. Both, however, featured a 9-axis sensor for looking around in all directions.
I also saw a demonstration of how one might see a pump assembled and disassembled in 360 degrees, moving around the unit in real space. Imagine a workforce being able to see what is in boxes on trucks, where staff are in real time, or what is on the shelf. Imagine a fleet of drivers knowing where everyone else is, or a doctor using augmented reality to see where instruments are inside a patient.
My understanding is that developers will be able to get their hands on SDKs and kits later this summer. Yes, you can purchase the BT-100 for around $699 through Epson today (only $400 through Amazon), but I sense much better stuff is coming. Given that the team was ready to show press and other developers the proof-of-concept stuff, it feels like we’re nearing that retail-ready rollout. Along those lines, specific Android devices host the content and provide the apps; in this case, YouTube was delivered from a Samsung Nexus 10. Once things go live, developers will get a clear idea as to which products provide the needed connections.
I asked how developers will be able to build and distribute apps, and it appears that nothing concrete has been established. The partners are weighing the most viable options, including Google Play and direct distribution. In the spirit of Android, I was told that Epson and its partners want to keep things very open.