Playing with virtual data in spatial reality

For the past few months, when I’ve had visitors to Microsoft Research on the Redmond campus, one of the things I’ve enjoyed demonstrating is the technology behind Kinect, the new controller-free gaming and immersive entertainment system for Xbox 360 that Microsoft is releasing for the holiday market in a month or so. In particular, I’ve enjoyed having Andy Wilson of MSR talk with visitors about some of the future implications in non-gaming scenarios, including general information work, and about how immersive augmented reality (AR) could transform the way we work with information and virtual objects, and the way we all share and use knowledge among ourselves.

We’re further along in this area than I thought we’d be five years ago, and I suspect we’ll be similarly surprised by 2015.

In particular, there is great interest (both in and out of the government circles I travel in) in the “device-less” or environmental potential of new AR technologies. Not everyone will have a fancy smartphone on them at all times, or want to stare at a wall monitor while wearing glasses or holding a cellphone in front of them just to access other planes of information. The really exciting premise of these new approaches is the fully immersive aspect of “spatial AR,” and the promise of controlling a live 3D environment of realtime data.

From the paper cited below: a unified 3D mesh from three depth cameras. Two people in the space are tracked separately, and can interact with and control data on the tabletop, the wall, or in between.
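The fusion that caption describes is worth unpacking: each depth camera is calibrated once against a shared room coordinate system, and from then on every depth pixel can be back-projected and transformed into that common frame, so three separate cameras yield one seamless 3D model of the space. Here is a minimal sketch of that step in Python/NumPy; the function names, and the assumption that each camera’s intrinsics and room-frame pose come from an offline calibration, are my own illustration, not code from the paper.

    import numpy as np

    def depth_to_world(depth_m, K, T_world_from_cam):
        # Back-project one camera's depth image (in meters) into world-space points.
        # K is the 3x3 intrinsics matrix; T_world_from_cam is the 4x4 pose of the
        # camera in the shared room frame (both from an offline calibration step).
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        x = (u - K[0, 2]) * z / K[0, 0]   # invert the pinhole projection
        y = (v - K[1, 2]) * z / K[1, 1]
        pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
        pts = pts[pts[:, 2] > 0]          # drop pixels with no depth reading
        return (T_world_from_cam @ pts.T).T[:, :3]

    def unified_point_cloud(frames):
        # frames: list of (depth_m, K, T_world_from_cam) tuples, one per camera.
        # Stacking the transformed points gives one room-wide cloud that can be
        # meshed or used directly for tracking people and virtual objects.
        return np.vstack([depth_to_world(d, K, T) for d, K, T in frames])

With three cameras, unified_point_cloud just concatenates three transformed clouds; the heavy lifting is in the one-time calibration that produces each camera’s room-frame pose.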

This week Andy is giving that same kind of demo at a conference in New York City, and has just released a new video showing some of the same stuff we’ve been playing with in the lab and describing the implications. His research presentation is titled “Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces.”

The video below shows Andy and colleague Hrvoje Benko as they describe and play with their “work,” a project called LightSpace, which ace blogger Todd Bishop says takes us “one step closer to the Star Trek Holodeck.”

This video version, by the way, appears to end at about the five-minute mark, but in reality there’s a bonus “technical appendix” at the 6:15 mark, wherein Andy shows some behind-the-scenes detail of how the systems work. Think of it as a hidden bonus track 🙂
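That appendix is also where you can see how the system decides whether a hand is touching a surface or moving through the space between them. In data terms, the on/above/between distinction can be as simple as testing each tracked 3D point against volumes defined around the projected surfaces; the sketch below illustrates the idea, with zone boxes and names that are purely illustrative rather than taken from LightSpace.

    import numpy as np

    # Illustrative only: thin axis-aligned boxes hugging each projected surface,
    # expressed in the same shared room frame as the unified point cloud above.
    SURFACE_ZONES = {
        "tabletop": (np.array([0.0, 0.0, 0.72]), np.array([1.2, 0.8, 0.76])),
        "wall":     (np.array([0.0, 1.96, 0.0]), np.array([2.0, 2.00, 2.0])),
    }

    def classify_point(p):
        # Label a tracked hand/body point: inside a surface box counts as a
        # touch on that surface; anything else is the "between" space, e.g.
        # a hand carrying a virtual object from the table to the wall.
        for name, (lo, hi) in SURFACE_ZONES.items():
            if np.all(p >= lo) and np.all(p <= hi):
                return "on:" + name
        return "between"

So a point at np.array([0.6, 0.4, 0.74]) classifies as “on:tabletop,” while the same hand lifted a foot higher falls into “between,” where the system can treat it as carrying a virtual object.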

The conference they’ve been presenting at is UIST 2010, the Association for Computing Machinery’s (ACM’s) Symposium on User Interface Software and Technology, with presentations and demos on topics ranging from traditional graphical and web user interfaces, to VR and AR software, to natural user interfaces (NUIs). Twitter has been full of #UIST2010 tweets all week, so check that hashtag for other activity there. Several other collaborative Microsoft Research presentations are on the UIST program this year as well.
