April 19, 2012 at 12:14 PM EDT
At AT&T Labs, universal translators and wearable keys
AT&T may spend a lot of time talking about mobile data but it still has some cool stuff in the works around voice and speech. The company showed off some of its newest stuff at an AT&T Labs event in New York Thursday, and some of the most interesting work is being done on voice-related technologies.
AT&T announced that in June it would open up its Watson speech recognition APIs, enabling developers to build their own speech-enabled apps and services. AT&T has licensed the technology in the past to companies such as Vlingo, but this represents an opportunity for all kinds of developers to add speech to their apps. It also poses a challenge to Nuance, which has been lining up developers of its own.
The APIs will focus initially on web search, local business search, Q&A, voice mail to text, SMS, U-verse's electronic programming guide, and a general-purpose dictation API. Other APIs for gaming and social networking will be available in the future.
At the event, AT&T demonstrated how its translation technology allows simultaneous translated conversations between two people speaking Spanish and English on different devices. Users speak into a VoIP app in one language, and the AT&T technology simultaneously transcribes the speech and then translates it into the other language. AT&T already offers a Translator mobile app that works with six languages, but it requires both users to speak into a single device. This holds the promise of people carrying on translated conversations anywhere in the world, similar to how TDD works for the speech- or hearing-impaired. The translation API is not yet available but will be released to developers at a later date.
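The pipeline described above, speech in one language transcribed and then translated into another, can be sketched roughly as follows. Since the translation API had not yet been released, every function here is a hypothetical stand-in rather than a real AT&T endpoint:

```python
# Hypothetical sketch of the transcribe-then-translate pipeline described
# above. None of these functions are real AT&T APIs; they stand in for the
# speech-recognition and translation steps in the demo.

def transcribe(audio_frames, language):
    """Stand-in for a speech-recognition call: audio -> text."""
    # A real implementation would stream audio to a recognizer service.
    return " ".join(audio_frames)  # placeholder: frames are already words here

def translate(text, source, target):
    """Stand-in for a translation call: text in one language -> another."""
    # Tiny placeholder phrasebook for the article's Spanish/English example.
    phrasebook = {("es", "en"): {"hola": "hello", "adiós": "goodbye"}}
    table = phrasebook.get((source, target), {})
    return " ".join(table.get(word, word) for word in text.split())

def relay(audio_frames, source="es", target="en"):
    """One leg of the two-way call: caller's speech -> listener's language."""
    text = transcribe(audio_frames, source)
    return translate(text, source, target)

print(relay(["hola"]))  # one speaker's "hola" reaches the other as "hello"
```

In the two-device setup AT&T showed, each phone would run this relay in one direction, so both parties hear (or read) the conversation in their own language.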
One of the other cool demonstrations at the AT&T Labs event involved bio-acoustic data transfer. The technology uses sensors in a phone to recognize and transmit signals through a body using bone conduction. When touching a door handle, a user could put their finger on a transducer on their phone and push out a digital key that gets transmitted through their body and is received by a sensor in the door, which unlocks when it recognizes the key. The door would only unlock when it receives the unique signal that’s created by that person’s skeletal structure transmitting the original key. That could be an interesting alternative to other access methods that rely on biometrics or NFC.
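The matching logic described above, where the door unlocks only on the unique signal produced by one person's body transmitting the key, can be sketched like this. The "body transform" below is a stand-in hash, since the real system reads an analog bone-conduction signal, and all names are invented:

```python
import hashlib

# Hypothetical sketch of the door-unlock check described above. The door
# stores the signature it expects (the digital key as shaped by one person's
# skeletal transmission) and unlocks only on an exact match. body_transform
# is a stand-in for the analog signal shaping; sha256 just makes it
# deterministic and person-specific for the sketch.

def body_transform(key: bytes, skeleton_id: bytes) -> bytes:
    """Stand-in for how a person's skeletal structure shapes the key."""
    return hashlib.sha256(skeleton_id + key).digest()

class DoorLock:
    def __init__(self, expected_signature: bytes):
        self.expected_signature = expected_signature

    def receive(self, signal: bytes) -> bool:
        """Unlock only when the received signal matches the enrolled one."""
        return signal == self.expected_signature

# Enrollment: record the key as transmitted through Alice's body.
key = b"digital-key-1234"
alice = b"alice-skeleton"
door = DoorLock(body_transform(key, alice))

print(door.receive(body_transform(key, alice)))            # right key, right body
print(door.receive(body_transform(key, b"bob-skeleton")))  # same key, wrong body
```

The point of the sketch is that the credential is the combination of key and body: copying the digital key alone is not enough, because a different body produces a different received signal.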
But another intriguing scenario is transferring data between two users with a handshake. If both people have a hand on their phone's sensor and an app running, they can exchange a small amount of data through their touch. The two devices recognize each other when the people come in contact and negotiate the transfer. Only a small amount of data, enough for a business card or a small image, can be transferred in the time it takes to shake hands. But it could be an alternative to a Bump or NFC data transfer. Some companies are working on transferring data through galvanic skin response, but AT&T said that approach requires daily calibration and can be less consistent.