Amid a barrage of Amazon-branded tablets and Alexa-powered tech, Dave Limp, SVP of Amazon Devices and Services, announced the company’s digital assistant will soon tap into a purpose-built large language model (LLM) for almost every new Echo device.
Amazon set out to design the LLM based on five foundational capabilities. One of these is ensuring interactions are “conversational,” and the company claimed it “studied what it takes to make a great conversation. It’s not just words; it’s body language, it’s understanding who you’re addressing, it’s eye contact and gestures.” Still waiting on Amazon to add eyes and hand gestures to its Echo devices. Has anyone seen those recently?
Based on the demos at Amazon’s showcase, however, it’s got some work to do. When Limp asked Alexa to compose a quick message to invite friends over for BBQ, the assistant requested his friends’ attendance for “BBQ chicken and sides” — which is how we invite humans over for dinner, right? Alexa also outright ignored the Amazon SVP’s requests at points during the presentation, but I’ll put those issues down to the fraught nature of voice assistant demos in a live setting. We’ve pulled all of Amazon’s announcements together right here.
— Mat Smith
The biggest stories you might have missed
Freedom from touching your screen.
With the Apple Watch Series 9, Apple is introducing a new method of interaction: Double Tap. It’s also rolling out on-device Siri processing, which will let you ask the assistant for your health data and to log your daily stats. When both hands, or at least your watch hand, are occupied, Double Tap will obviously not be helpful. You’ll need to have at least your thumb and index finger available to pinch. But when Engadget’s Cherlynn Low is cleaning her apartment, holding a side plank, raising a single dumbbell or reading a book, it makes her life easier. Also, it’s worth noting that the Apple Watch Series 9 and Ultra 2 are the company’s first carbon-neutral products. Read on for our full verdict.
But the full damage of the attack remains unclear.
All MGM Resorts hotels and casinos are back up and running as normal, nine days after a cyberattack shut down systems across the company. The ALPHV ransomware group took credit for the attack shortly after systems went offline. The group claimed it used social engineering tactics, leveraging a bit of LinkedIn knowledge and a short phone call to access crucial systems across casinos. Worryingly, the attack started through identity management vendor Okta – and at least three other Okta clients have been hit by cyberattacks, according to a Reuters report.
It’s also bringing on-screen translations to Alexa calls on its smart displays.
Amazon announced two new accessibility features coming to its devices later this year. First is Eye Gaze on Alexa, which will let those with mobility or speech disabilities use their gaze to perform preset actions on the Fire Max 11 tablet. This is the first time Amazon has worked on gaze-based navigation of its devices, and it will use the camera on the Max 11 to track where a user is looking. The preset actions include smart home controls, media playback and making calls. Eye Gaze will be available on the Max 11 at no additional cost, although the company did not elaborate on exactly how it works.
Amazon is also adding a new Call Translation feature that will transcribe Alexa calls on Echo Show devices. It can convert them into over 10 languages, including English, French, Spanish and Portuguese. The feature will also launch later this year.