
Oct 2, 2013

Elliptic Labs Launches Android SDK For Its Ultrasound-Powered Mid-Air Gesture Tech — Phones With ‘Touchless’ UI Landing In 2H 2014

Elliptic Labs, a startup founded back in 2006 which uses ultrasound technology to enable touchless, gesture-based interfaces, has finally pushed its tech into smartphones. It’s been demoing this at the CEATEC conference in Japan this week (a demo of Elliptic’s tech running on a tablet can also be seen in this TC video, from May) but today it’s announcing the launch of its first SDK for Android smartphones.
Elliptic’s technology is able to work with any ARM-based smartphone, confirmed CTO Haakon Bryhni in an interview with TechCrunch. “That is completely new to us, that we’re able to make the technology available on a low-powered platform,” he said. “A major part of our technology development for the past half year has been to optimise our algorithms for smartphone use.”
Gesture-based user interfaces which turn mid-air hand movements into UI commands have pushed their way into console-based gaming, thanks to Microsoft’s Kinect peripheral, and also mainstream computing via the likes of the Leap Motion device and webcam-based alternatives. Mobiles haven’t been entirely untouched by ‘touchless’ interfaces — Samsung added limited mid-air gesture support to the Galaxy S4 earlier this year, for instance (and back in 2009 the now defunct Sony Ericsson tried its hand at motion-sensitive mobile gaming) — but most current-gen smartphones don’t have the ability to respond to mid-air swiping.
That’s set to change in 2014, as Elliptic Labs is currently working with several Android OEMs that are building devices that will include support for a gesture-based interface. Bryhni would not confirm the exact companies but said he expects several gesture-supporting mobile devices to hit the market in the second half of next year.
“We are currently working very closely with three OEMs, in advanced prototyping stages with the objective of getting our technology into handsets — one tablet and two smartphone manufacturers,” he told TechCrunch. “We are also talking to some laptop manufacturers. But it is the smartphone and tablet vendors that are the most aggressive.”
As well as increased numbers of mobile devices packing gesture support next year, the technology is going to get more powerful thanks to the capabilities of ultrasound, according to Bryhni. He argues that the Galaxy S4’s gesture support is more limited, since it’s powered by an infrared sensor which requires the user to be relatively close for it to function.
By contrast, Elliptic’s embedded ultrasound tech (which basically consists of microphones and a transducer, plus the software) can support gestures within a 180-degree sphere: in front of and around the edges of a phone, and at distances that could be customised by the user. That allows a range of “natural gestures” to be used to control the UI, interact with apps or play games.
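The underlying sensing principle is echolocation: the transducer emits an ultrasonic pulse and the microphones time the returning echo, which gives the distance to the reflecting hand. Here is a minimal sketch of that ranging arithmetic, assuming a textbook speed of sound in air; it is illustrative only, not Elliptic’s implementation:

```java
// Illustrative only: the basic pulse-echo ranging behind ultrasonic sensing.
// Assumes the speed of sound in air at room temperature (~343 m/s).
public class UltrasonicRange {

    private static final double SPEED_OF_SOUND_M_PER_S = 343.0;

    // The pulse travels to the hand and back, so one-way distance is v * t / 2.
    static double distanceMetres(double echoDelaySeconds) {
        return SPEED_OF_SOUND_M_PER_S * echoDelaySeconds / 2.0;
    }

    public static void main(String[] args) {
        // An echo arriving ~1.75 ms after the pulse puts the hand ~30 cm away.
        System.out.printf("Hand at %.3f m%n", distanceMetres(0.00175));
    }
}
```

In practice, comparing echo timing across several microphones is what lets a system localise the hand around the device, rather than just measure its range.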

ULTRASOUND’S BACKERS & CHALLENGERS

According to Bryhni, ultrasound also compares favourably with camera-based gesture technologies like Leap Motion, which require the user to perform their hand movements within a relatively narrow “cone” where the camera can see them. “If you put cameras onto the screen — let’s say integrated into the bezels — then you need to hold your hand at 90 degrees so it’s super inconvenient,” he said, discussing the drawbacks of using camera-based systems to enable gestures on mobile devices. “The benefit with our technology is it works with the sensor placed flat and invisible, hidden within the bezel of the screen.
“Using ultrasound enables a very natural way of gesturing. And also the big benefit that we can work on a smartphone and a tablet, and we’re not dependent on any high powered lights or cameras.”
“Ultrasound uses a fraction of power in comparison to optical 3D technologies.  Even in low light or in the dark, you can use the same natural hand movements you use every day,” added Elliptic Labs CEO Laila Danielsen in a statement. “With our software SDK we are giving smartphone manufacturers a way to easily and cost effectively include consumer-friendly touchless gesturing into their phones.”
Elliptic is not the only company looking at using ultrasound to extend user interfaces in new ways. Chipmaker Qualcomm acquired digital ultrasound company EPOS last November, perhaps with a view to pushing ultrasound tech into styluses, which would allow a nearby mobile device to detect the position of the pen and pick up notes being made on a paper notepad, for instance. Qualcomm is also evidently interested in how ultrasound can be used to support gesture interfaces on mobile devices.
In terms of competing with Qualcomm, Bryhni argues that the EPOS pen-tracking technology Qualcomm acquired is different from what Elliptic Labs has been focused on. “We’ve been dedicated to gesture recognition for eight years. We’ve seen this coming,” he said, adding: “We have the time and expertise in the market.” He also points out that Elliptic offers device makers who build their own processors (as Samsung and Apple do, for instance) an alternative to having to buy Qualcomm chips across the board. “Our customers are quite interested in having an independent chipset for gesture-recognition technology,” he added. “The vendors tend to like that flexibility.”
Another area of flexibility is that Elliptic has made its technology available within an off-board DSP, the Wolfson 5110, which allows an OEM to create a device that supports gesture controls even when the phone’s main processor is sleeping (i.e. so that a gesture interface does not compromise other power-efficiency technologies which help to improve battery longevity on a mobile device). “A trend in modern smartphones and tablets is you offload some of the heavy signal processing to a dedicated DSP,” he said. “We have done that at this point… with a very high powered and super small DSP.”

GESTURE INJECTION FOR EXISTING APPS

As well as today’s Android SDK, which lets developers and OEMs build new software that takes full advantage of Elliptic Labs’ ultrasound tech, the company is offering the ability to ‘retro-fit’ the tech to existing applications. It’s calling this ability to map mid-air gestures onto existing apps “gesture injection”.
“For example if you wave left to right you create an arrow left event. If you swipe from the top of the screen and down you generate a close application event, for instance. And if you detect a gesture coming in from the right into the screen we for instance engage a menu, so in this way a legacy game such as Fruit Ninja… [can be gesture-mapped],” said Bryhni.
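In code terms, gesture injection boils down to translating each recognised mid-air gesture into a standard input event a legacy app already handles. The sketch below is hypothetical — the GestureType enum and onGesture() callback are invented here, not Elliptic Labs’ actual SDK API — but it shows the mapping Bryhni describes using Android’s real Instrumentation and KeyEvent classes:

```java
import android.app.Instrumentation;
import android.view.KeyEvent;

// Hypothetical sketch of "gesture injection": mid-air gestures from an
// ultrasound recogniser are mapped onto ordinary Android key events, so an
// unmodified app reacts as if the user had pressed a key. The GestureType
// enum and onGesture() callback are invented for illustration.
public class GestureInjector {

    enum GestureType { WAVE_LEFT_TO_RIGHT, SWIPE_TOP_TO_BOTTOM, EDGE_IN_FROM_RIGHT }

    private final Instrumentation instrumentation = new Instrumentation();

    // Would be called by the gesture recogniser each time it detects a gesture.
    public void onGesture(GestureType gesture) {
        switch (gesture) {
            case WAVE_LEFT_TO_RIGHT:
                inject(KeyEvent.KEYCODE_DPAD_LEFT); // "arrow left" event
                break;
            case SWIPE_TOP_TO_BOTTOM:
                inject(KeyEvent.KEYCODE_BACK);      // "close application" event
                break;
            case EDGE_IN_FROM_RIGHT:
                inject(KeyEvent.KEYCODE_MENU);      // engage a menu
                break;
        }
    }

    private void inject(final int keyCode) {
        // Key injection must happen off the main thread.
        new Thread(new Runnable() {
            public void run() {
                instrumentation.sendKeyDownUpSync(keyCode);
            }
        }).start();
    }
}
```

Note that without the system-level INJECT_EVENTS permission, Instrumentation can only inject events into the calling app’s own windows, which is one reason this kind of gesture mapping would normally ship inside the OEM’s firmware rather than as a third-party app.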
“It’s much more fun slashing fruits in the air than swiping on the screen,” he added.
Fruit Ninja mid-air swipes certainly sound fun, but that’s just one application. Does the mobile space generally need gesture-based interfaces? As noted above, OEMs have dabbled here already with relatively uninspiring results, and smartphone touchscreens continue to engage their users with evolving on-screen gestures. So off-screen gestures are likely going to need some killer apps to get users fired up: something that makes mid-air finger wiggling as cool as pinch-to-zoom seemed when it first appeared. But what are those gesture-powered apps going to be?

KILLER APPS: FROM COMMS TO CARS & WATCHES

Bryhni sees two main use-cases for gesture-based interfaces on mobiles: firstly, controlling the UI, so things like changing apps, engaging menus, browsing up and down, and selecting images; and secondly, custom applications, such as games or mapping apps, or switching between productivity apps. He also sees potential for the tech to allow our devices to pick up on some of the unspoken communication that’s conveyed by things like hand gestures and body language.
“If you watch people communicate a good fraction of their communication is actually gestures… So gestures are actually quite an important part of expressing yourself and we think computers should detect this and include it in the general user interface,” he said. “It’s been a major change in smartphones when the touch panel was invented… and we believe that new user interfaces that can make it more natural to interact with your device actually has the potential to… strongly influence the market.”
But perhaps the biggest pull for the technology, in the mobile space at least, is the need for Android OEMs to add something different to their devices so they can stand out from each other and the rest of the industry. “I would say the ones that really need this are the OEMs,” Bryhni added. “They have a very strong need to differentiate themselves.”
Asian mobile makers are likely to continue to be at the fore of smartphone-based gesture interfaces, according to Bryhni. “We are a European and American company but the Asians are quite aggressive when it comes to introducing new technology,” he said, noting that Elliptic, which has offices in Norway and Silicon Valley, will be opening an office in the region soon, to support Asian OEMs.
“It should be our turf,” he added, discussing how innovation is shaking out in the smartphone space, with Asia leading the charge when it comes to pushing new technologies into devices. “They are more willing to try. [The U.S. and Europe] can’t afford to let Asians completely rule this business.”
Moving beyond mobile, Bryhni said he sees potential for ultrasound-powered gestures to elbow their way into cars — as a hands-free way to command in-car entertainment or navigation systems, for instance. “Car applications is a use-case that we are pursuing. We are working with automotive manufacturers to put ultrasonic touchless gesturing into cars. The automotive use case is highly relevant because it works in the dark and in changing light situations (such as when you are driving with the sun coming into a window or at night),” he said.
Asked specifically about smartwatches, which have obvious screen-size constraints and could therefore benefit from a gesture-based interface that doesn’t require the user’s fingers to block on-screen content, he said it would certainly be possible to mount the tech in a wrist-based mobile device.
“The process is feasible to make this happen and that is something we are envisioning but we are not actively working with anyone at present,” he told TechCrunch. “It’s an opportunity that could work because nobody has looked into it before because of power consumption. With our technology, it becomes feasible to command a new smart-watch and make it touchless. It could be a distinctive new feature that could differentiate one vendor from another.”
Courtesy: TechCrunch

