Amazon is both famous and notorious for making it extremely easy to buy things, even offering a “1-Click” button for exactly that purpose. It seems, however, that it may be just as simple to compromise one of Amazon’s most popular products. Its AI-powered personal assistant Alexa is found in countless smart speakers and smart home products, and all it takes is a well-crafted, innocent-looking link for a hacker to gain control of an Alexa device and the owner’s data associated with it.
Smart assistants have always carried an amount of privacy and security risk, since they almost always communicate with a remote server to do their magic. Even when voice recognition happens on-device, fetching information and controlling other appliances will almost always involve the Internet, and even more so when the user wants to install a new app or skill, which is where this Alexa vulnerability starts.
Check Point Research disclosed that some of Amazon’s Alexa-related subdomains were vulnerable to Cross-Origin Resource Sharing (CORS) misconfiguration and Cross-Site Scripting (XSS). In a nutshell, this means that hackers can extract important pieces of information, such as tokens and IDs, whenever Amazon’s subdomains communicate with each other to perform certain tasks.
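To get a feel for why a CORS misconfiguration leaks tokens, consider a server that reflects whatever origin the caller supplies while also allowing credentialed requests. The sketch below is a simplified, hypothetical model of that anti-pattern and of the browser’s corresponding check, not Amazon’s actual configuration:

```python
def cors_headers(request_origin: str) -> dict:
    """Misconfigured handler: it trusts and reflects every caller origin."""
    return {
        # Reflecting the caller's origin defeats the same-origin policy...
        "Access-Control-Allow-Origin": request_origin,
        # ...and with credentials allowed, the victim's cookies ride along,
        # so authenticated responses (tokens, IDs) become readable.
        "Access-Control-Allow-Credentials": "true",
    }

def browser_allows_read(headers: dict, page_origin: str) -> bool:
    """Simplified model of the browser's CORS check for a
    credentialed request made from a page on page_origin."""
    allowed = headers.get("Access-Control-Allow-Origin", "")
    creds = headers.get("Access-Control-Allow-Credentials") == "true"
    # For credentialed requests a wildcard is rejected;
    # an exact origin match is required.
    return creds and allowed != "*" and allowed == page_origin

# A page on an attacker-controlled origin (hypothetical) can now
# read the authenticated response body:
evil = "https://attacker.example"
print(browser_allows_read(cors_headers(evil), evil))  # → True
```

A correctly configured server would instead compare the incoming origin against a fixed allow-list of its own subdomains and refuse everything else.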
The researchers’ example involved a malicious link craftily disguised as an Alexa skill installer. All it takes is for the unwitting user to click on that link, and a series of exchanges between remote servers will yield data a hacker can use to inject code into Amazon’s Alexa skill store and gain access to the user’s account. From there, the intruder can install or remove Alexa skills and even obtain the victim’s personal information.
Unfortunately, the post doesn’t mention whether Amazon has already secured these weak points. With smart assistants and smart speakers becoming ever more ubiquitous, it’s critical that every step of the data-processing flow be as secure as possible.