Apple has done more than any other company to promote touch as a way of interacting with devices, but Microsoft showed on Tuesday that it could, quite literally, go above and beyond the screen in future displays.
In a keynote speech at the Society for Information Display's SID 2010 annual conference in Seattle, Steve Bathiche, research director in Microsoft's Applied Sciences group, showed not only the usual futuristic concept video of screens appearing on every surface, but also previewed working demonstrations of Microsoft's new display technology for the first time.
“Displays are becoming much more immersive, reaching out beyond their pane of glass,” he said.
The aim was to “break the fourth wall”, he said, borrowing the theatre term for dissolving the invisible boundary between an audience and the action on stage.
Microsoft has been attempting this in a number of ways, including Project Natal (the motion-sensing system coming to the Xbox 360 later this year) and its Surface computer, a table-top touchscreen with cameras beneath it that recognise objects placed on top.
Mr Bathiche showed a video of how partners were using Surface in imaginative ways: playing a kind of checkers; stacking bricks on the screen and having them recognised in an architect's design; placing a book on it so its cover could be recognised and more information revealed on the display; and placing a leaf on the surface and having it classified.
There were also rougher videos of work under way in the labs: 3D cameras and haptic displays that let researchers reach into virtual objects and feel them, and cloth simulations that felt like real cloth and could be torn apart.
Collaboration was demonstrated with two pairs of hands, one present in the room and the other in another country, working on the same 3D-displayed keyboard. Mr Bathiche also showed facial recognition technology and hardware that lets a single display show different images to different people looking at it. This used wedge optics, essentially a flat lens, to steer the light. The same approach can send separate views to the left and right eyes for an autostereoscopic effect.
“In the future, the user would be immersed in the story the display is trying to tell,” he said.
“Displays will turn around and control where light goes to give each viewer their own unique perspective and image. Displays are becoming bi-directional relays of light.”
These kinds of displays could make a person on the other side of the globe seem as if they were just across the room.
“Touch is just scratching the surface of what natural interactions will look like,” he concluded.
“Working together across the industry, we’ll be able to create immersive experiences that truly break the fourth wall of human-machine interfaces.”
It was a visionary speech in every sense, and tremendous integration across the industry will be needed to achieve those aims. But the technology seems close enough to reach out and touch.