The way we interact with our devices has changed in many ways over the last two decades, but one constant continues to define the user experience. While eye-tracking, voice control and other interaction methods have gradually become more commonplace, the core of the experience remains rooted in touch.
But while touch input itself has improved greatly over the years, the feedback we get from touching a screen has advanced far less on the devices we own. Some devices offer haptic feedback, but the basic vibrations they send to our fingertips are still fairly rudimentary.
This is why Microsoft wants to take touch to a whole new level, expanding the field of computer haptics to "fully engage the sense of touch in user-device interaction."
Hong Tan, senior researcher and manager of the Human Computer Interaction Group at Microsoft Research Asia, points out that despite the advancements in touch and haptics over the years, almost everything we do still relies on our eyes to interpret our actions and the device's responses to our inputs. "With sight alone, most people are perfectly fine interacting with computing devices today," she says. "But how much more efficiently, how much more enjoyably, can we interact with computers? How much more accessible can we make them?"
Tan's team is exploring ways to redefine touch experiences, providing more 'relevant' feedback than a simple vibration, and even changing how the user perceives the 'texture' of the glass on a smartphone or tablet.
For example, Microsoft removed the standard glass front from a Nokia Lumia 520 and added piezoelectric actuators under the bezel. "Now, when you're typing," Tan explained, "the glass literally bends instantaneously - very small bending, but that's enough to tell your finger that it feels like a key-click."
Her team did the same with a Surface Touch Cover keyboard, giving it the same sensation of actually pushing a button, rather than simply tapping a flat surface. She says this kind of haptic feedback lets users type faster, and with fewer errors, than on devices that offer no such sensations.
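Microsoft hasn't published the drive signals used in these prototypes, but as a rough illustration of the principle, a piezo 'key-click' is typically produced by firing a very short, decaying sine burst into the actuator. Here's a minimal Python sketch of such a pulse; every parameter is a placeholder, not a figure from Microsoft's hardware:

```python
import numpy as np

def keyclick_waveform(freq_hz=250.0, duration_s=0.008,
                      sample_rate=48_000, decay=500.0):
    """Build a brief, exponentially decaying sine burst - the kind of
    signal commonly fed to a piezo actuator so that a flat surface
    'feels' like a clicking key. All values are illustrative guesses."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

# On real hardware, these samples would be scaled and passed to the
# actuator's driver amplifier; that step is entirely vendor-specific.
pulse = keyclick_waveform()
```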
In a further demo, Tan showed a Lumia 920 integrated into a testing rig to showcase some much more sophisticated touch experiences, based on the same underlying technologies. The screen displays a checkerboard: the black squares feel 'sticky' as you swipe a finger across them, whereas the white squares feel much smoother.
In this example, an additional glass layer hooked up to piezoelectric actuators is placed on top of the device. The glass is vibrated at a very high frequency; as you run your finger across it, that vibration can trap a tiny layer of air between the surface and your skin, increasing or reducing the friction your skin encounters as it moves, and so changing how you perceive the texture of the glass itself.
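To make the checkerboard idea concrete, here's a toy Python sketch of how a touch position might be mapped to vibration amplitude: full amplitude floats the fingertip on the air film (smooth), while zero amplitude lets ordinary friction return (sticky). The square size and amplitude values are invented for illustration, and aren't taken from Microsoft's rig:

```python
def checkerboard_amplitude(x_px, y_px, square_px=100, max_amp=1.0):
    """Map a touch coordinate to a vibration amplitude for a
    checkerboard texture. White squares get full amplitude (low
    friction, smooth); black squares get none (high friction,
    'sticky'). The 100 px square size is an arbitrary choice."""
    col, row = x_px // square_px, y_px // square_px
    is_white = (col + row) % 2 == 0
    return max_amp if is_white else 0.0

# A touch at (250, 120) lands on a black square, so friction stays high:
print(checkerboard_amplitude(250, 120))  # 0.0
```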
A fourth demo showcases electrostatic haptics, which take advantage of the fluids in your fingertips. When your finger touches the screen, a conducting layer above the glass can localise an electrostatic charge in a particular area of the screen, attracting the opposing charge of the fluids in your finger. This creates a force that gently pulls the finger downwards, generating friction and again changing how the screen feels as you touch or swipe across it.
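That pull is often approximated with the textbook parallel-plate capacitor model, which makes the key relationship easy to see: the force, and with it the perceived friction, scales with the square of the drive voltage. The sketch below uses that generic model with made-up dimensions; it's a first-order approximation, not Microsoft's implementation:

```python
EPS0 = 8.854e-12  # vacuum permittivity, in farads per metre

def electrostatic_pull(voltage_v, area_m2=1e-4, gap_m=10e-6, eps_r=3.0):
    """Parallel-plate estimate of the force pulling the fingertip
    towards the charged layer: F = eps0 * eps_r * A * V^2 / (2 * d^2).
    Contact area, effective gap and permittivity are all guesses."""
    return EPS0 * eps_r * area_m2 * voltage_v ** 2 / (2 * gap_m ** 2)

# Doubling the voltage quadruples the pull, and so the friction the
# finger feels as it slides:
print(f"{electrostatic_pull(100):.3f} N")  # ~0.133 N with these numbers
```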
"The thing that's really, really cool," says Tan, of all the work that Microsoft is doing in this field, "is to take a smooth piece of glass but make it feel different - it's almost magic."
Watch the full video about the work of Hong Tan and her team at Microsoft Research below.
At this stage, you may well be thinking this all sounds absolutely awesome - or, perhaps, that it's all a bit lame. After all, how will any of this make a real-world difference when you actually come to use your device?
Microsoft explains: "Imagine experiencing a clicking sensation when pressing an on-screen button, sensing the weight of folders when dragging and dropping, and perhaps even feeling the texture of a sweater for sale online." These are just examples, but they give a broad sense of how such a simple idea could translate into a transformative user experience - with more research, and plenty of imagination.
These technologies also have enormous potential to make computing a far more engaging and empowering experience for those with poor eyesight, or none at all. Indeed, Tan has been working closely with Microsoft's Accessibility teams to better understand how blind people interact with computing devices, and with objects and spaces in the wider world.
Exactly when we'll see these emerging technologies become broadly available remains unclear, for now - but we know that Microsoft developed advanced prototypes for new touch experiences under its McLaren project. The '3D Touch' system implemented on that now-suspended Windows Phone handset was different to the haptics-based approach shown here, but it nonetheless indicates that Microsoft is seriously working to bring new touch experiences to market.
When they finally arrive, they could well make interacting with our devices a whole lot more engaging and immersive. We can't wait.
Source: Microsoft Research | images via Microsoft, except where indicated