MSR explores wearable touch and gesture

Microsoft Research Redmond researchers Hrvoje Benko and Scott Saponas have been investigating touch interaction in computing devices since the mid-’00s. Now, two sharply different yet related projects demonstrate novel approaches to touch and gestures. Wearable Multitouch Interaction gives users the ability to make an entire wall a touch surface, while PocketTouch enables users to interact with a smartphone through the fabric of a pocket or purse. Both projects will be unveiled during UIST 2011, the Association for Computing Machinery’s 24th Symposium on User Interface Software and Technology, being held Oct. 16-19 in Santa Barbara, Calif.

Make Every Surface a Touch Screen

Wearable Multitouch Interaction turns any surface in the user’s environment into a touch interface. A paper co-authored by Benko, Andy Wilson, and Chris Harrison, a Ph.D. student at Carnegie Mellon University and a former Microsoft Research intern, describes a wearable system that enables graphical, interactive, multitouch input on arbitrary, everyday surfaces.

“We wanted to capitalize on the tremendous surface area the real world provides,” explains Benko, of the Natural Interaction Research group. “The surface area of one hand alone exceeds that of typical smart phones. Tables are an order of magnitude larger than a tablet computer. If we could appropriate these ad hoc surfaces in an on-demand way, we could deliver all of the benefits of mobility while expanding the user’s interactive capability.”

The Wearable Multitouch Interaction prototype is a novel, wearable combination of a laser-based pico projector and a depth-sensing camera. The camera is an advanced, custom prototype provided by PrimeSense. Once the camera and projector are calibrated to each other, the user can don the system and begin using it.
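The paper's calibration procedure isn't detailed here, but the core idea of registering a projector to a camera can be sketched as mapping points the camera sees into projector pixel coordinates through a planar homography. The following is an illustrative simplification with a made-up calibration matrix, not the prototype's actual pipeline:

```python
# Illustrative only: map a camera-space point onto projector pixels
# using a 3x3 planar homography H (assumed to come from calibration).
def project_point(H, x, y):
    """Apply homography H (3x3 nested list) to point (x, y)."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w   # divide out the projective scale

# A made-up calibration matrix: scale camera coordinates by 2 and
# shift them by (10, 20) into the projector's pixel grid.
H = [[2.0, 0.0, 10.0],
     [0.0, 2.0, 20.0],
     [0.0, 0.0,  1.0]]

px, py = project_point(H, 100.0, 50.0)
print(px, py)  # -> 210.0 120.0
```

In a real system the matrix would be recovered from correspondences gathered during calibration; once it is known, anything the depth camera tracks (a fingertip on a wall, say) can be overdrawn in place by the projector.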

Continue reading about Wearable Multitouch at Microsoft Research...

PocketTouch: Through-Fabric Input Sensing

PocketTouch: Through-Fabric Capacitive Touch Input, written by Saponas, Harrison, and Benko, describes a prototype consisting of a custom multitouch capacitive sensor mounted on the back of a smartphone. The sensor enables eyes-free multitouch input on the device through fabric, giving users a rich set of gesture interactions, ranging from simple touch strokes to full alphanumeric text entry, without having to remove the device from a pocket or bag.
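The paper's recognizer is not reproduced here, but the flavor of eyes-free stroke input can be sketched as classifying a swipe's direction from the sequence of touch points a capacitive grid reports. This is a simplified illustration, not the authors' algorithm:

```python
# Illustrative only: classify a stroke as a left/right/up/down swipe
# from a list of (x, y) touch samples, as a capacitive grid might report.
def classify_swipe(points):
    """Return 'left', 'right', 'up', or 'down' for a stroke."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    # Pick the dominant axis of motion, then the sign along it.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# A stroke moving mostly rightward across the sensor grid.
stroke = [(0, 5), (3, 6), (7, 5), (12, 4)]
print(classify_swipe(stroke))  # -> right
```

Directional swipes like these are a natural fit for through-fabric use because they need no visual target; richer input such as text entry would require recognizing full stroke shapes rather than just endpoints.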

Benko also stresses that both Wearable Multitouch Interaction and PocketTouch are evolutionary steps of a larger effort by Microsoft Research to investigate the unconventional use of touch in devices to extend Microsoft’s vision of ubiquitous computing.

Continue reading this article at Microsoft Research...

An interview with Asta Roseway at Microsoft Research

I recently visited Microsoft Research (MSR) to meet some of the researchers and designers who are doing amazing work with wearable technology. One of the designers I met was Senior Research Designer Asta Roseway. She recently collaborated with User Experience Designer Sheridan Martin Small (Xbox) on a project called The Printing Dress, which won Best Concept and Best in Show at ISWC 2011 in San Francisco last month.

Here's a look at their creation, how they made it, and what Asta's thoughts are about the future of wearable technology.

The Printing Dress

You are probably familiar with the old saying “You are what you eat,” but how about “You are what you tweet”? What if this concept were incorporated into garments of the future?

The "Printing Dress" is an artistic piece that explores the notion of wearable text and its potential impact on the future of fashion, as well as our social identity. Built almost entirely of paper, the dress enables the wearer to enter "thoughts" on to its fabric and wear them as public art. While constructed from materials of the past, the dress looks towards the future with a message indicating that we are entering into a new realm of social accountability, where you literally wear what you tweet.

The dress is powered by four LilyPad Arduinos, a laptop, and a short-throw projector, and uses a Processing sketch to display and animate the text.

Interview Participants

Asta Roseway - Senior Research Designer, Microsoft Research
Sheridan Martin Small - User Experience Designer, P10 Incubations/Xbox
Tom Blank - Hardware Engineering Manager, Microsoft Research
Desney Tan - Senior Researcher, Microsoft Research

Special thanks to Artefact, Microsoft Research, Xbox, and Issara Willenskomer at Dos Rios.

Also featured on Engadget, Cnet, PSFK, talk2myshirt, Ecouterre, Microsoft News Center.

Always-available natural user interfaces

At Microsoft Research today, I met with Desney Tan, who walked me through a few mind-blowing demos and prototypes he has recently developed. One of his prototypes demonstrates on-body muscle-computer input that can be integrated into garments. Listening to Tan articulately describe a bloom of possibilities, and how he sees interfaces evolving, was incredibly inspiring. Just imagine the potential.

Read Tan's publication for more info.