Amazon reportedly making its own AI chips

Amazon Echo devices currently need a plugged-in power source to work

AI tasks are so computationally intensive that they often call for custom-designed chips in the devices themselves, and even custom-designed servers in the data centers where AI algorithms are trained, developed, and deployed from the cloud. Amazon reportedly has about 450 people with chip expertise working on the effort.

At present, most of Alexa's processing happens in the cloud; by handling part of that processing directly on the device with its own AI chips, Amazon could significantly reduce Alexa's response time.
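The latency argument above can be sketched in a few lines. This is a purely hypothetical illustration with made-up numbers, not Amazon measurements: a cloud-backed assistant pays a network round trip on top of inference time, while on-device inference does not.

```python
# Hypothetical model of assistant response time: cloud vs. on-device.
# All millisecond figures are illustrative assumptions, not real data.

def cloud_response_ms(network_rtt_ms: int, cloud_inference_ms: int) -> int:
    """Round trip to a data center plus server-side inference."""
    return network_rtt_ms + cloud_inference_ms

def on_device_response_ms(local_inference_ms: int) -> int:
    """Inference runs entirely on the device's AI chip; no network hop."""
    return local_inference_ms

cloud = cloud_response_ms(network_rtt_ms=80, cloud_inference_ms=40)
local = on_device_response_ms(local_inference_ms=60)
print(f"cloud: {cloud} ms, on-device: {local} ms")
```

Even if the device's chip is slower at inference than a data-center server (60 ms vs. 40 ms in this sketch), skipping the network round trip can still make the on-device path faster overall.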

For over a year now, Vizio TV owners have been able to control their SmartCast TVs with Google Home, and the company is now bringing another digital assistant to the table: Amazon Alexa. Apple, for its part, ditched its long-time GPU supplier last year to create the in-house graphics chips found in the iPhone 8 and iPhone X. Amazon's Cloud Cam and Echo currently need a plugged-in power source to operate. The reported chip effort is similar to the Tensor Processing Unit Google announced in 2016, the hardware behind the system that defeated Lee Sedol, one of the world's greatest Go players.

Blink's cameras are still available on Amazon, and the company was planning a video doorbell for release in 2018, similarly powered by standard batteries. Amazon acquired chip designer Annapurna Labs in 2015; its ARM-based Alpine chips are designed for routers and connected-home products.

"To shop with Alexa, customers must ask Alexa to order a product and then confirm the purchase with a 'yes' response to purchase via voice," an Amazon spokesperson told The Sun Online.

As first reported by The Information, Amazon plans to create proprietary AI chips and place them in the Amazon Echo, Echo Dot, and every other speaker in its lineup.