Like many people, I was very impressed by a video over the weekend of the Word Lens real-time translation app for iPhone. It struck with a viral bang, and within a few days racked up over 2 million YouTube views. What particularly made me smile was digging backwards through the Twitter stream of a key Word Lens developer whom I follow, John DeWeese, and finding this pearl of a tweet (right) from several months ago, as he was banging out the app in my old stomping grounds of the San Francisco Bay Area. That’s a hacker mentality for you :)
But one thought I had while watching the video was: why do I need to hold the little device in front of me to get the benefit of its computational resources and display? I’ve seen the studies and predictions that “everything’s going mobile,” but I believe that takes the device itself, the form factor of a little handheld box of magic, too literally.