This is CHI 2010 week: the Association for Computing Machinery's Conference on Human Factors in Computing Systems is in Atlanta. Top researchers in human-computer interaction (HCI) are together April 10-15 for presentations, panels, exhibits, and discussions. Partly because of our intense interest in using new levels of computational power to develop great new Natural User Interfaces (NUI), Microsoft Research is well represented at CHI 2010, as pointed out in an MSR note on the conference:
This year, 38 technical papers submitted by Microsoft Research were accepted by the conference, representing 10 percent of the papers accepted. Three of the Microsoft Research papers, covering vastly different topics, won Best Paper awards, and seven others received Best Paper nominations.
Let’s take a quick look at three MSR projects being presented at the conference.
One of the Best Paper awards went to the “Skinput” project, one of the most popular demonstrations at last month’s MSR TechFest in Redmond (see the WIRED coverage of that, Skinput Turns Your Arm into a Touch-Screen). Skinput, an example of a Human-Body User Interface, is described in the paper Skinput: Appropriating the Body as an Input Surface by Chris Harrison of Carnegie Mellon University and Desney Tan and Dan Morris of Microsoft Research Redmond. In the photo at right, a sensing armband and a pico projector enable interactive elements to be rendered onto a person’s skin.
Skinput uses bio-acoustic technology, enabling the location and vibration of finger taps on hands and arms to be analyzed and used as signals. The paper outlines the technology, including its accuracy and limitations revealed in a 20-participant study. As we users demand more computational capability when we’re mobile, it’s going to be worthwhile to realize that, in the paper’s words, “The body is portable and always available, and fingers are a natural input device.”
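To make the bio-acoustic idea concrete, here's a toy sketch of tap-location classification: synthesize a damped-oscillation "tap" for each body location, extract a couple of crude features, and match a new tap to the nearest template. Everything here — the signal model, the features, the resonance numbers — is invented for illustration; the real Skinput system uses an armband of bio-acoustic sensors and a trained classifier, as the paper describes.

```python
import math

# Toy illustration of the Skinput idea: different body locations resonate
# differently, so a tap's vibration signature can be matched against
# per-location templates. Resonance values and features are made up.

RATE = 4096.0  # samples per second (arbitrary for this sketch)

def synth_tap(freq_hz, decay, n=256):
    """A damped sine stands in for the vibration a finger tap produces."""
    return [math.exp(-decay * i / RATE) * math.sin(2 * math.pi * freq_hz * i / RATE)
            for i in range(n)]

def features(sig):
    """Two crude features: average energy and zero-crossing count."""
    energy = sum(x * x for x in sig) / len(sig)
    crossings = sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0)
    return (energy, crossings)

# Pretend each location has a characteristic resonance (made-up numbers).
TEMPLATES = {
    "wrist":   features(synth_tap(200.0, 30.0)),
    "forearm": features(synth_tap(120.0, 20.0)),
    "palm":    features(synth_tap(350.0, 50.0)),
}

def classify(sig):
    """Nearest-template match in feature space."""
    f = features(sig)
    return min(TEMPLATES,
               key=lambda loc: sum((a - b) ** 2 for a, b in zip(f, TEMPLATES[loc])))

# A tap near the forearm's resonance should match "forearm".
print(classify(synth_tap(125.0, 22.0)))  # prints "forearm"
```

The hard parts the paper actually tackles — sensing real vibrations through skin and bone, and staying accurate across users and arm postures — are exactly what this sketch glosses over.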
Apple’s iPad launch got a lot of coverage recently, yet one thing I noticed missing from it is any means of stylus input, which is common on Microsoft Tablet PCs as a more precise and versatile alternative to finger swipes alone. As MIT’s Technology Review pointed out yesterday, “Touch screen interfaces may be trendy in gadget design, but that doesn’t mean they do everything elegantly. The finger is simply too blunt for many tasks.”
A neat MSR project tying together hardware, software, stylus, and human touch is outlined in the CHI paper “Manual Deskterity: An Exploration of Simultaneous Pen + Touch Direct Input,” with eight co-authors led by Ken Hinckley of Microsoft Research Redmond. The digital-drafting-table prototype makes innovative use of Surface technology, with multitouch and gesture-recognition as Ken describes:
We explore a division of labor between pen and touch that flows from natural human skill and differentiation of roles of the hands. We also explore the simultaneous use of pen and touch to support novel compound gestures.
We advocate a division of labor between pen and touch: the pen writes, touch manipulates, and the combination of pen+touch yields new tools. This articulates how our system interprets unimodal pen, unimodal touch, and multimodal pen + touch inputs, respectively. We contribute novel pen + touch gestures that leverage the strengths of both pen and touch; utilizing both modalities to complement one another also enables us to largely sidestep the weaknesses inherent in each modality as well.
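The division of labor Ken describes can be read as a small dispatch over input modality. Here's a minimal sketch of that idea; the function and tool names are my own invention for illustration, not the project's actual code or API:

```python
# Sketch of the "pen writes, touch manipulates, pen + touch yields new
# tools" routing idea from Manual Deskterity. Tool names are invented.

def route(pen_down: bool, touch_down: bool) -> str:
    """Pick a tool from the current pen/touch contact state."""
    if pen_down and touch_down:
        return "compound gesture"   # both modalities combine into new tools
    if pen_down:
        return "ink"                # the pen writes
    if touch_down:
        return "manipulate"         # touch pans, zooms, selects
    return "idle"

print(route(True, False))   # prints "ink"
print(route(False, True))   # prints "manipulate"
print(route(True, True))    # prints "compound gesture"
```

The interesting design work in the paper is in that last branch: which compound pen+touch gestures feel natural, which is what the prototype explores on Surface hardware.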
Click the video above to see Manual Deskterity in action, and then watch this year as new Tablet PCs and hybrid devices roll out to compete with the iPad, running Windows 7 with its support for multitouch gestures and pen input.
Embodied Social Proxy
Finally, there’s a CHI paper about MSR’s Embodied Social Proxy (ESP) telepresence project, which I described in an earlier post. I mention it because the ESP project came up in conversation last week with a Microsoft colleague whom I encounter practically daily, but almost exclusively virtually – even though we work in the same city.
A little back-story: In June 2008, I met a smart young guy named Mark Drapeau when he was at the National Defense University, on an AAAS post-doctoral research fellowship. We were participating in a Strategic Communications Exercise at NDU’s Center for Technology and National Security Policy. Dr. Drapeau was hired in 2009 by Microsoft, as Director of Innovative Social Engagement in our U.S. Public Sector outfit. We’re both based in Washington DC, although my actual office is in a Microsoft building in Reston, Virginia, while he’s in a different Microsoft building in Chevy Chase, Maryland.
As colleagues, we’ve worked on several little projects together – but we rarely see each other in person. Although I first knew him in person, I now see far more of the virtual Mark, through his widely-followed persona on Twitter (@cheeky_geeky and also @Microsoft_Mark) or his blog.
Friday we both turned up at the same event: the kickoff of the DC Codeathon for Citability, a neat project I blogged about last week. We chatted about the fact that we rarely see each other face-to-face, yet we tend to collaborate on projects as a virtual team, mostly by happenstance.
I mentioned to him the MSR ESP project, and he was fascinated by the possibilities, with some new ideas for its use within a large distributed enterprise like Microsoft. Watch this video of ESP – the action begins around the 3-minute mark, and I think you’ll see the potential in these new means of virtual collaboration through telepresence.
Gregory Huang writing about ESP today on the Xconomy blog says, “The Microsoft project builds on years of social science and communications research. It also seems decidedly old-school and low-tech. That is part of what makes it interesting, as a complement to social-media efforts…”
In essence, I could deploy an ESP rig, preferably on a pair of robotic wheeled tracks, in our offices to enable just about any Microsoft researcher from around the globe to participate as a virtual embed in Microsoft Institute meetings and projects. Neat idea.