
MIT Technology Review

Artificial intelligence

Alexa needs a robot body to escape the confines of today’s AI

The man behind Amazon’s voice assistant says AI programs need to see and explore the world if they’re ever going to attain real understanding.
March 26, 2019

“Alexa, why aren’t you smarter?”

It’s a question that Rohit Prasad, head scientist of the Alexa artificial-intelligence group at Amazon, keeps asking himself. It’s also a conundrum that tells us how much progress we’ve really made in AI—and how much farther there is to go.

Prasad outlined the technology behind Alexa, as well as the intellectual limits of all AI assistants, Tuesday at EmTech Digital, MIT Technology Review’s AI conference.

Amazon’s peppy virtual helper has hardly been a flop. The company introduced Alexa in 2014 as the ever-patient, relentlessly cheerful female interface for its Echo smart speaker, a tabletop device that zeroes in on your voice from across a room and responds to spoken queries and commands.

Over 100 million Echo products have been sold since 2014, and the success of the product line prompted Google and Apple to rush out competitors. Virtual assistants are now available through hundreds of different devices, including TVs, cars, headphones, baby monitors, and even toilets.

Such popularity is a testament to how good software has become at responding to simple requests. Users have little patience for overly dumb virtual helpers. But spend much time with them and the technology’s shortcomings quickly reveal themselves. Alexa is easily confused by follow-on questions or a misplaced “umm,” and it cannot hold a proper conversation because it’s baffled by the ambiguity of language.

The reason Alexa gets tripped up, Prasad said, is that the words we use contain more power and meaning than we often realize. Every time you say something to another person, that person must use preexisting understanding of the world to construct the meaning of what you are saying. “Language is complicated and ambiguous by definition,” he said in an interview before the conference. “Reasoning and context have to come in.”

Alexa has some advantages over an analog human brain—like access to a vast encyclopedia of useful facts. By querying this knowledge base, Alexa can determine if you’re talking about a person, a place, or a product. This is more of a hack than a route to real intelligence, though. There are many situations where the meaning of a statement will still be ambiguous.

Even a simple-looking question like “What’s the temperature?” requires Alexa to do some reasoning. You could be asking what the weather is like outside, or maybe you want a reading from an internet-connected thermostat or oven.

Prasad explained that Alexa has ways to try to iron out such wrinkles—it knows your location and the time of day, and it can access every question you’ve ever asked, as well as queries from other people in the same city. If you ask it to play a particular song, for example, Alexa might guess that you’re after a cover version rather than the original, if enough people nearby are listening to that version.
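The disambiguation Prasad describes can be thought of as scoring candidate interpretations against context signals. The following toy sketch (not Amazon's actual system—the intents, features, and weights here are all hypothetical) illustrates the idea for the "What's the temperature?" example:

```python
# Toy illustration of context-based intent disambiguation.
# All intents, features, and weights are made up for illustration.

def rank_intents(context):
    """Return candidate readings of "What's the temperature?",
    sorted from most to least plausible given context signals."""
    scores = {
        # Weather is the default reading; slightly more likely
        # early in the morning, when people plan their day.
        "weather": 1.0 + (0.5 if context.get("hour", 12) < 9 else 0.0),
        # A thermostat reading is plausible only if one is registered.
        "thermostat": 1.5 if context.get("has_thermostat") else 0.1,
        # An oven reading makes sense only while the oven is on.
        "oven": 2.0 if context.get("oven_on") else 0.0,
    }
    return sorted(scores, key=scores.get, reverse=True)

# With a hot smart oven, the oven reading wins out.
print(rank_intents({"oven_on": True, "has_thermostat": True}))
# With no connected appliances, the assistant falls back to weather.
print(rank_intents({"hour": 8}))
```

Real systems learn such scores from data rather than hand-tuning them, but the structure—many plausible readings, ranked by contextual evidence—is the same.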

But this kind of contextual information takes Alexa only so far. To be decoded, some statements require a much deeper understanding of the world—what we refer to as “common sense.”

Some researchers are now working on ways to let computers build and maintain their own sources of common sense. A growing number of practitioners also believe that machines will not master language unless they experience the world.

This could mean that Alexa will one day live inside something resembling a robot with eyes, limbs, and a way of moving around. “The only way to make smart assistants really smart is to give it eyes and let it explore the world,” Prasad said. Amazon has already created versions of Alexa with a camera. Other companies are developing personal robots capable of responding to spoken queries. Amazon is rumored to be working on some kind of home robot as well.

Although Prasad wouldn’t comment specifically on that, his comments show how deeply Amazon is thinking about the AI behind its voice helper. Indeed, if AI assistants do assume a physical presence, it could create a virtuous feedback cycle. Bringing together different capabilities—speech, vision, and physical manipulation—should create AI programs with much better language skills. It might also make robots that are a lot smarter and more helpful.

The question, then, may be: “Alexa, how smart are you going to get?”
