Published May 23, 2021 in our Security Fraud News & Alerts newsletter.
Home assistants are now an everyday part of countless households, handling everything from weather reports to bill payments. Convenient? Definitely. But there’s a price some pay for that convenience when Alexa can be fooled by hackers into doing things you would never request yourself. The problem is that Amazon allows multiple Alexa skills to share the same invocation name. Depending on the phrase used, a user can trigger the wrong skill and end up activating a problematic or even malicious one.
What’s an “invocation name”? It’s the phrase you use to ask Alexa (or any assistant) to launch a skill or perform a task, such as “Hey Alexa, what’s the weather today?”
Researchers in the U.S. and Germany analyzed 90,194 skills across seven country-specific skill stores, including the U.S., Canada, the UK, and France. They found 9,948 skills sharing an invocation name with at least one other skill in the U.S. store alone; across all seven stores, only 36,055 skills had a unique invocation name. This issue, the researchers warn, can allow a bad actor to get their skill activated by impersonating a well-known business name. A squatting skill can also pass Amazon’s verification checks at submission time and later be updated with malicious functionality.
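To see why duplicate names are a problem, here is a minimal sketch (a toy model, not Amazon’s actual skill-routing logic; the registry class and skill IDs are invented for illustration) of what happens when two skills register the same invocation name:

```python
# Hypothetical sketch: how duplicate invocation names create ambiguity
# when a voice platform resolves a spoken phrase to a skill.
from collections import defaultdict


class SkillRegistry:
    """Toy registry in which multiple skills may share one invocation name."""

    def __init__(self):
        self._by_invocation = defaultdict(list)

    def register(self, invocation_name, skill_id):
        # Nothing here stops a second skill from claiming the same name.
        self._by_invocation[invocation_name.lower()].append(skill_id)

    def resolve(self, invocation_name):
        # With duplicates, the platform must pick one candidate; the user
        # has no way to tell which skill actually answered.
        return self._by_invocation[invocation_name.lower()]


registry = SkillRegistry()
registry.register("ride hailer", "legit-skill-001")
registry.register("ride hailer", "copycat-skill-999")  # squatting the name

print(registry.resolve("ride hailer"))
# Both skills match the same spoken phrase.
```

The user says one phrase and gets whichever skill the platform selects, which is exactly the ambiguity the researchers flagged.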
The study also found that Amazon’s permission model, which is meant to protect sensitive Alexa data, can be circumvented by a skill simply asking the user for the data directly. That information can include Amazon Pay data, phone numbers, and more. Access to such data is supposed to go through a permission application programming interface (API), but nothing prevents a criminal developer from requesting it from the user by voice instead. APIs allow two applications to communicate with each other to access data. The researchers identified 358 skills that requested information without going through the permission API, allowing bad actors to collect data they should not have access to.
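The contrast can be sketched as follows (a simplified illustration, not the real Alexa Skills Kit API; the function names, permission string, and sample data are invented): the intended path checks consent before releasing data, while the bypass skips the check entirely and just asks the user to say the information aloud.

```python
# Hypothetical sketch: permission-gated access vs. asking the user directly.

def fetch_via_permission_api(user, granted_permissions):
    """The intended path: data is released only if the user consented."""
    if "read::phone_number" not in granted_permissions:
        raise PermissionError("user has not granted phone number access")
    return user["phone_number"]


def malicious_skill_response():
    """The bypass: no API call, no consent prompt -- just ask by voice."""
    return "To finish setup, please tell me your phone number."


user = {"phone_number": "555-0100"}

# A skill using the permission API without consent is blocked:
try:
    fetch_via_permission_api(user, granted_permissions=set())
except PermissionError as err:
    print("blocked:", err)

# A malicious skill sidesteps the check entirely:
print(malicious_skill_response())
```

The platform’s safeguard only applies to the API path; nothing in the voice channel stops the second approach, which is why the researchers could find hundreds of skills collecting data this way.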
Why Amazon allows duplicate invocation names for Alexa is unknown. The researchers note, “While such applications ease users’ interaction with smart devices and bolster a number of additional services, they also raise security and privacy concerns due to the personal setting they operate in.” From Amazon’s perspective, allowing duplicates places less of a moderation burden on its platform, but it also creates risk for Amazon and confusion for Alexa users.
In the meantime, remember that your home assistant or other electronic assistant may be listening at any given time. Always review purchases made through Alexa to make sure they are indeed the ones you ordered and not fraudulent. Report anything questionable to Amazon and follow up with your financial institution, if needed.
Keep up to date: Sign up for our Fraud alerts and Updates newsletter
Want to schedule a conversation? Please email us at firstname.lastname@example.org