Last week I attended the Wired Live 2018 conference. There was a series of fantastically inspiring speakers, whom I’ve written about here, but during the coffee breaks we got hands-on with the latest cutting-edge tech demos and products from around the globe. Here are four I’ve picked out as having particular relevance and inspiration value for the future of consumer products.
Creating truly unique experiences through physiological-characterising technology
Customisation and personalisation are trends across all corners of our consumer business, from drink dispense to skin care. Nura headphones run a one-minute characterisation of each user: they play audio content and analyse the otoacoustic emissions the ear produces in response to sound. There is more variation in the way each of us hears than in the way we all speak, so there is a lot to gain from characterising an individual’s physiological response to audio and then tailoring the way music is played to them, creating a listening experience better than anything your ears have heard before. We know that smell and taste are different for everyone, so I look forward to a similar experience involving scanning my nose and tongue at some point in the future!
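As a rough illustration of the personalisation step, here is a minimal Python sketch of turning a measured hearing profile into per-band compensation gains. The band names, levels and the simple inverse-gain rule are all hypothetical assumptions for illustration; this is not Nura’s actual algorithm.

```python
REFERENCE_DB = 0.0  # assumed target sensitivity for every band

def compensation_gains(measured_db):
    """Boost bands the listener hears weakly, cut bands they hear strongly.

    measured_db maps a frequency-band name to the listener's measured
    sensitivity in dB relative to the reference."""
    return {band: REFERENCE_DB - level for band, level in measured_db.items()}

# Hypothetical measurement from a characterisation run
profile = {"low": -3.0, "mid": 1.5, "high": -6.0}
print(compensation_gains(profile))  # {'low': 3.0, 'mid': -1.5, 'high': 6.0}
```

The idea is simply to invert the measured deviation from the reference, so playback is boosted where the listener is least sensitive.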
How would it feel to be given a 6th sense, and a 7th, an 8th…?
Cyborg Nest have a product that hints at the future of augmented senses. It’s a small badge worn against the skin which vibrates when you align with north. After a couple of days the vibration becomes second nature, and the user builds an intrinsic new mapping of their directional alignment relative to north. It’s the psychology–device interaction which is exciting – like the study showing that if users wear glasses that invert their vision upside down, after a week it becomes second nature to move their hand up in their visual field when they want it to move down. The point is that the body and brain adapt to adjusted or augmented senses in a remarkably short time, and one can only imagine how our brains will take to and adapt to new senses and capabilities provided by technology: zoom vision, 360-degree field of view, enhanced smell, multispectral vision and so on. From what I’ve seen and read so far, it seems our brains are quite open to upgrades.
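The badge’s core behaviour can be sketched in a few lines. This assumes the device exposes a compass heading in degrees; the tolerance value is an invented parameter for illustration, not Cyborg Nest’s actual threshold.

```python
NORTH_TOLERANCE_DEG = 10.0  # hypothetical: how close to north counts as 'aligned'

def angular_distance(a_deg, b_deg):
    """Smallest absolute difference between two compass bearings, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def should_vibrate(heading_deg):
    """True when the wearer faces north within the tolerance."""
    return angular_distance(heading_deg, 0.0) <= NORTH_TOLERANCE_DEG
```

The wrap-around in `angular_distance` matters: a heading of 355° is only 5° from north, so the badge should still buzz.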
Human-eye resolution VR is as close as we have got to teleportation
I’ve particularly enjoyed being a guinea pig for the VR projects we are developing. Sometimes latency is key, sometimes a combined physical-and-virtual experience is key, sometimes it’s the physiological response to VR that we’re interested in. Usually resolution doesn’t matter too much: riding a low-res Minecraft-esque roller coaster is still immersive and fun; riding mountain bikes in VR on a real bike at Wired a couple of years ago was terrifying but not super high res; exploring a Tron-like virtual world where I could see my arms and legs and walk around a large room was amazing, and again the resolution wasn’t outstanding but didn’t detract. This year at Wired, though, trying out Varjo’s human-eye resolution VR was quite astonishing. The world was static – no roller coasters or cycling down ski jumps this time. I was in a long-occupied artist’s studio with paint smears everywhere and sculptures, tools and easels all around me, and within a couple of seconds my brain accepted that this was where I was – I had been teleported into this room. I could physically walk around in it and put my head right up to objects, and it was still eye resolution even close up. It involves some clever real-time eye tracking to help render a reality that your brain doesn’t question. They’ve raised £45m across Series A and B funding as of October, so keep a high-resolution eye out for them and their ‘Bionic Display’ technology. Our technology developers have an exciting future once this level of immersion is combined with all the other clever things they’ve been doing with VR systems!
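One common way eye tracking supports this kind of display is foveated rendering: render full detail only where the eye is currently looking, and progressively less towards the periphery, where human acuity drops off sharply. The thresholds and rates below are illustrative guesses at that general technique, not Varjo’s actual parameters or pipeline.

```python
def shading_rate(eccentricity_deg):
    """Relative rendering detail (1.0 = full resolution) as a function of
    angular distance from the tracked gaze direction, in degrees."""
    if eccentricity_deg < 5.0:     # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 20.0:  # parafoveal: half detail
        return 0.5
    else:                          # periphery: coarse detail
        return 0.125
```

Because the full-resolution region is only a few degrees wide, most of the frame can be rendered cheaply without the viewer noticing, which is what makes eye-resolution displays computationally feasible.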
How can you grab a virtual object and feel it, without wearing a peripheral on your hands?
Having worked on projects to reduce the cognitive load on users when they interact with devices, and on using haptics to improve the accessibility of industrial products, it was great to get hands-on with Ultrahaptics’ ultrasonic haptic technology. Wearing an AR headset, you can see and interact with a wheel of virtual paint pots, which you spin and then grab-select to apply a new colour to a virtual sports car at the centre of your vision. As you grab the car, rotate it and spin the wheel of paint pots with your hand in mid-air, an array of 256 ultrasonic transducers focuses ultrasound into points of constructive interference, creating the sensation of something physical on your fingertips as you interact with the virtual objects.
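The focusing itself is a classic phased-array calculation: each transducer’s drive signal is phase-shifted so that all 256 waves arrive in phase at a chosen focal point, producing a pressure peak you can feel. A minimal sketch of that calculation, assuming a 40 kHz carrier (typical for airborne ultrasonic haptics) and ignoring amplitude shading; this is not Ultrahaptics’ actual firmware.

```python
import math

SPEED_OF_SOUND = 343.0            # m/s in air at room temperature
CARRIER_HZ = 40_000.0             # assumed 40 kHz ultrasonic carrier
WAVELENGTH = SPEED_OF_SOUND / CARRIER_HZ

def phase_offset(transducer_pos, focal_point):
    """Phase (radians) to add to one transducer's drive signal so its wave
    arrives at the focal point in phase with every other transducer's."""
    d = math.dist(transducer_pos, focal_point)      # path length in metres
    return (2.0 * math.pi * d / WAVELENGTH) % (2.0 * math.pi)
```

Two transducers equidistant from the focal point get the same offset, so their waves interfere constructively there; sweeping the focal point over the fingertips in real time is what creates the moving tactile sensation.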
BOM costs of around $1 per transducer currently ground it in automotive, industrial and professional products, but the cost will inevitably come down over time, and it’s exciting to imagine the rich new interaction experiences that could come to consumer devices – whether that involves an AR component or simply interacting with devices at range and receiving tactile feedback. The creator of the Minority Report UIs was also a speaker at the event this year; he now runs a company developing systems that enable a close-to-Minority-Report experience, but Ultrahaptics would take that concept to the next level.