In this second installment on what's going on in Mersive's R&D shop, we are exploring gestural interfaces and the role they can play in media-centered collaboration. Gesture has been an important topic in the computer science community for a long time and, once the movie "Minority Report" popularized it as a valid and intuitive interface, it steadily made its way into the mainstream. Several of our customers who use Solstice to bring in dozens of live sources from desktops, iPads, and live video cameras have asked for it. Their need makes sense: when a group of users wants to make decisions around that much visual data, it is important that they are able to control how media is arranged. In building Solstice, one of the metaphors we've drawn upon is multiple users placing physical paper onto a table, where anyone can grab and move media from one place to another to facilitate the discussion. The Solstice clients are already designed around this intuitive notion.

Lately, we've been looking at gesture control and how that metaphor can be taken even further. By simply standing in front of a shared display, users should be able to grab and drag media sources in and out of view, page through "stacks" of media on the display with a wave of the hand, and more. Some of our customers are very interested in this type of natural interface and what it might mean for knowledge transfer, group collaboration, and deeper understanding of complex data sets generated from a variety of sources. This week, we delivered a first system to several partners who will be using gesture control to do just that. Take a look at this short video: using my hand, I am able to bring media onto the screen, rearrange sources, and even page through a medical dataset of MRI images and video. It's easy, natural, and intuitive, so the interface stays largely transparent.

It's interesting that we could pull this off with a commodity game sensor, an approach in stark contrast to systems that rely on very complex, room-sized tracking setups to provide the same capability. Is this the interface of the future? Probably not yet, but it does bring our customers one step closer to Minority-Report-like control in their everyday conference room.
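For the technically curious, here is a rough idea of how a wave-of-the-hand paging gesture can be recognized from a commodity sensor's hand-tracking stream. This is only an illustrative sketch, not the Solstice implementation: it assumes the sensor SDK delivers timestamped hand positions in meters, and the thresholds are placeholder values you would tune against real data.

```python
from collections import deque


class SwipeDetector:
    """Recognizes a horizontal hand swipe from tracked hand positions.

    Assumes samples arrive as (timestamp_seconds, x_meters) pairs from any
    commodity depth sensor's hand tracker; the sensor API itself is not
    modeled here, and all thresholds are illustrative.
    """

    def __init__(self, window_s=0.5, min_distance_m=0.25, min_speed_mps=0.6):
        self.window_s = window_s              # history window to consider
        self.min_distance_m = min_distance_m  # total horizontal travel required
        self.min_speed_mps = min_speed_mps    # average speed required
        self.samples = deque()                # (t, x) pairs inside the window

    def update(self, t, x):
        """Feed one sample; return 'left', 'right', or None."""
        self.samples.append((t, x))
        # Drop samples that have aged out of the window.
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()
        if len(self.samples) < 2:
            return None
        t0, x0 = self.samples[0]
        dt, dx = t - t0, x - x0
        if dt > 0 and abs(dx) >= self.min_distance_m and abs(dx) / dt >= self.min_speed_mps:
            self.samples.clear()  # reset so one swipe only fires once
            return "right" if dx > 0 else "left"
        return None


# Example: feed samples from a hand tracker (hypothetical values at ~10 Hz).
detector = SwipeDetector()
for t, x in [(0.00, 0.10), (0.10, 0.22), (0.20, 0.38)]:
    gesture = detector.update(t, x)
    if gesture:
        print(f"page {gesture}")  # e.g., advance or rewind the media stack
```

The same windowed distance-and-speed test extends naturally to vertical swipes or a "grab" pose, which is one reason a single commodity sensor can cover the interactions shown in the video.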

Solstice Gesture Control Demo from Mersive on Vimeo.

About Christopher Jaynes

Jaynes received his doctoral degree from the University of Massachusetts, Amherst, where he worked on camera calibration and aerial image interpretation technologies now in use by the federal government. Jaynes received his BS degree with honors from the School of Computer Science at the University of Utah. In 2004, he founded Mersive and today serves as the company's Chief Technology Officer. Prior to Mersive, Jaynes founded the Metaverse Lab at the University of Kentucky, recognized as one of the leading laboratories for computer vision and interactive media, with research spanning video surveillance, human-computer interaction, and display technologies.
