The Amazon Echo Show 15 not only hangs on your wall but can also learn to recognize your face. That's because it has a new piece of Amazon-designed silicon inside, dubbed the Amazon AZ2 Neural Engine.
Yes, Amazon custom-designs ARM chips. The AZ2 isn't even the first one (hence the 2), but it's a lot more capable than the AZ1, which powers some of the best Alexa speakers, and it offers something new for Amazon: edge computing.
If you're not sure what edge computing is, this chip and what it does actually make it easy to understand. All of the processing needed to learn and recognize your face is done with machine learning on the chip itself, and nothing has to be sent across the internet to make it happen.
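To make the distinction concrete, here's a minimal Python sketch of the two approaches. The model object, camera frame, and URL are hypothetical stand-ins, not Amazon's actual software; the only point is where the data goes.

```python
# A minimal sketch of the edge-computing idea, not Amazon's actual stack.
# The model object, frame, and URL are all hypothetical stand-ins.

def recognize_on_device(frame: bytes, model) -> str:
    # Edge approach: inference runs locally, so the camera frame
    # never leaves the device.
    return model.predict(frame)

def recognize_in_cloud(frame: bytes, api_url: str) -> str:
    # Cloud approach, for contrast: the raw image crosses the internet
    # to a remote server that does the work.
    import urllib.request
    request = urllib.request.Request(api_url, data=frame, method="POST")
    with urllib.request.urlopen(request) as response:
        return response.read().decode()
```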
I still think any computer learning to recognize human faces is pretty creepy, but doing it locally instead of through a remote server is pretty cool. Also, you have to opt in to this feature, so you can still buy Amazon's new Echo Show 15 even if you think it's creepy like I do. But enough about creepy stuff.
What the AZ2 can do, on paper anyway, is pretty impressive. Consider the last-gen AZ1, which was able to recognize your voice without Amazon needing to send that data through the cloud. The new model does that, of course, but it's also capable of performing 22 times as many operations per second.
The AZ2 Neural Engine can work 22 times faster than Amazon's last-generation processor.
This means it has plenty of local headroom to learn your face as well as your voice. In fact, Amazon says it can process speech and facial recognition simultaneously. A big reason for this is that it's a neural edge processor. Those sound like the kind of words tech companies like to throw around, but they do mean something: the "neural" part means the chip is built to run machine-learning algorithms, and the "edge" part means it can do that work without calling for backup from some server.
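As a rough illustration of what "simultaneously" means here, this Python sketch runs two independent inference tasks concurrently on one device. Both worker functions are made-up stand-ins for on-chip models, not anything Amazon has published.

```python
# Sketch: two independent local inference tasks running concurrently.
# Both worker functions are hypothetical stand-ins for on-chip models.
from concurrent.futures import ThreadPoolExecutor

def transcribe_speech(audio: bytes) -> str:
    # A local speech-recognition model would run here.
    return "alexa, show my calendar"

def identify_face(frame: bytes) -> str:
    # A local face-recognition model would run here.
    return "enrolled_user_1"

with ThreadPoolExecutor(max_workers=2) as pool:
    speech = pool.submit(transcribe_speech, b"<audio samples>")
    face = pool.submit(identify_face, b"<camera frame>")
    print(speech.result(), face.result())
```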
Because everything happens locally, there is almost zero latency: virtually no wait between asking for something and getting a response, since nothing has to make a round trip to a server. We haven't seen how well the chip actually performs, but based on its capabilities, it looks like the perfect fit for something like an Echo Show.
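For a sense of why skipping the network matters, here's a toy comparison. The sleep durations are invented placeholder figures, not measurements of the AZ2 or of any real cloud service.

```python
# Back-of-the-envelope latency comparison. Both delays are made-up,
# illustrative figures, not measured numbers.
import time

def local_inference():
    time.sleep(0.005)    # assume ~5 ms for on-chip inference

def cloud_round_trip():
    time.sleep(0.150)    # assume ~150 ms for network transit plus server time

for task in (local_inference, cloud_round_trip):
    start = time.perf_counter()
    task()
    print(f"{task.__name__}: {(time.perf_counter() - start) * 1000:.0f} ms")
```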
Edge computing is not only better for privacy, but it's faster, too.
Speaking of the Echo Show, the Echo Show 15 is the only device that will use the new AZ2 chip for now. We expect that to change as Amazon brings its Visual ID feature to other devices. Maybe even drones or robots.
Whether you love Amazon products or hate them, you can't help but be impressed by the new AZ2. It's easy to forget that Amazon is also part of Big Tech, but things like this remind us that some top-level engineers put in a lot of hours to build the Echo devices so many people love.