Skip to main content

Malicious phishing apps for Google Assistant could get user’s passwords [Video]

There are more apps for Google Assistant than you’ll ever use, but apparently it’s also not that difficult to slip malicious apps through the cracks. Apparently, a security research lab managed to develop apps for Google Assistant and Alexa designed for phishing for user’s private information.

ArsTechnica details an experiment from Germany’s Security Research Labs that developed eight different apps/actions for Google Assistant and Alexa. They all passed through Google and Amazon’s respective security screenings.

The apps themselves were designed to function as horoscope checkers — one was a random number generator — but secretly contained the ability to eavesdrop on users and phish for their passwords. Each app works a bit differently, but they all essentially worked the same way.

Basically, the user would say something along the lines of: “OK Google, ask My Lucky Horoscope to give me the horoscope for Taurus.” At that point, the apps would respond with that information and play a prerecorded version of the “end” sound that Google plays after a third-party app has been closed, giving the impression that the app had stopped running. After that point, the malicious apps would continue to record audio for over 30 seconds and send those recordings to a server.

In another example, shown in the video below, the app immediately gives the impression that it is not working by mimicking the voice used by Google Assistant. This would give the impression that nothing is running on the device, but the app would actually wait roughly a minute and then mimic the Assistant’s voice once again to phish for the user’s Google account password.

This second attack should be obvious for the majority of users, but the first would likely never be detected. The apps used in this phishing experiment have been removed from Google Assistant and Alexa by the researchers in the time since they were created, but still, it shows that Google needs to be a bit more careful about what can be attached to Assistant. ArsTechnica has more details on how these attacks work.

More on Google Assistant:

FTC: We use income earning auto affiliate links. More.

You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day. Be sure to check out our homepage for all the latest news, and follow 9to5Google on Twitter, Facebook, and LinkedIn to stay in the loop. Don’t know where to start? Check out our exclusive stories, reviews, how-tos, and subscribe to our YouTube channel

Check out 9to5Google on YouTube for more news:



Avatar for Ben Schoon Ben Schoon

Ben is a writer and video producer for 9to5Google.

Find him on Twitter @NexusBen. Send tips to or encrypted to

Ben Schoon's favorite gear