AT&T Labs Fellowship Award Winner
Course you most regretted not taking? None, yet. His summer project at AT&T Research, Designing Multimodal Interfaces for More Natural ..., centers on multimodal interfaces.
At AT&T Labs - Research, we apply our speech, language and media technologies to give people with disabilities more independence, privacy and autonomy.
The AT&T speech mashup is a web service that implements speech tasks for web applications, letting users of smartphones and other devices speak to, and hear responses from, those applications.
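As a rough illustration of how a client might talk to a speech web service of this kind, the sketch below assembles a hypothetical recognition request. The endpoint URL, parameter names (`grammar`, `resultFormat`), and content type are illustrative assumptions, not the actual speech mashup API.

```python
import urllib.parse

# Hypothetical sketch: build the URL and headers a mobile client might
# use to submit audio to a speech-recognition web service. All names
# here are assumptions for illustration, not the real mashup interface.
def build_speech_request(base_url, grammar, audio_format="audio/amr"):
    """Assemble the request URL and headers for one recognition call."""
    params = urllib.parse.urlencode({
        "grammar": grammar,        # which recognition grammar to use
        "resultFormat": "json",    # ask for a machine-readable result
    })
    headers = {"Content-Type": audio_format}
    return f"{base_url}?{params}", headers

# Example: a client preparing a request against a placeholder endpoint.
url, headers = build_speech_request(
    "https://speech.example/recognize",  # placeholder, not a real endpoint
    grammar="video_search",
)
```

The client would then POST the recorded audio to `url` with these headers and parse the JSON result; that transport step is omitted here since the endpoint is a placeholder.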
Research on interactive video indexing and retrieval, multimodal interfaces (text, image, speech, touch, visual gesture), content processing, machine learning, biometrics, data mining, and NLP.
Living rooms getting smarter with multimodal and multichannel signal processing
... that speech and multimodal interfaces could provide a more natural means of addressing such challenges.
AT&T Labs Research explores the technological possibilities in networking and communications to evaluate what is possible and help guide where the company should pursue development and deployment.
iMiracle: Multimodal Speech-Enabled Mobile Video Search
Bernard Renger. iMiracle: Multimodal Speech-Enabled Mobile Video Search. Spoken Query Voice Search Workshop (SQ2010), March 2010.