• Midnitte@beehaw.org · 5 months ago

    A charger can be manipulated to control voice assistants via inaudible voice commands…

    This seems like the scarier attack, to be honest…

    Though surely there's filtering that could be applied to close off that attack vector.
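
    In principle there is: the capture path could low-pass the input before the wake-word engine ever sees it. A minimal sketch with scipy (the cutoff and sample rate are placeholders, not any vendor's actual values); the hitch is that some of these attacks are demodulated into the audible band by the microphone hardware itself, so a digital filter only catches what survives into the recording:

    ```python
    # Sketch: low-pass a mic capture before the wake-word engine,
    # dropping anything above the speech band. Cutoff and sample
    # rate are illustrative placeholders.
    import numpy as np
    from scipy.signal import butter, sosfilt

    def strip_ultrasonics(audio: np.ndarray, fs: int = 48_000,
                          cutoff_hz: float = 8_000.0) -> np.ndarray:
        # 8th-order Butterworth low-pass; speech sits well below 8 kHz.
        sos = butter(8, cutoff_hz, btype="low", fs=fs, output="sos")
        return sosfilt(sos, audio)
    ```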

    • Skull giver@popplesburger.hilciferous.nl · 5 months ago

      Using ultrasonic frequencies to induce vibrations and carry sound humans can't hear into voice assistants was demonstrated a few years ago. With the right equipment (nothing you can't find on AliExpress), it isn't too difficult.
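
      The best-known demonstration ("DolphinAttack") amplitude-modulates a recorded command onto an ultrasonic carrier; the microphone's own nonlinear response demodulates the envelope back into the audible band. A rough sketch of the signal construction (carrier frequency and sample rate are illustrative, and you'd need a DAC and tweeter that can actually reproduce them):

      ```python
      # Sketch: put an audible command onto an inaudible carrier via
      # classic amplitude modulation. Parameters are illustrative.
      import numpy as np

      SAMPLE_RATE = 192_000   # a DAC fast enough for ultrasonics
      CARRIER_HZ = 25_000     # above human hearing, within mic range

      def modulate(command: np.ndarray, depth: float = 1.0) -> np.ndarray:
          """command: mono float samples in [-1, 1] at SAMPLE_RATE."""
          t = np.arange(len(command)) / SAMPLE_RATE
          carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
          # Carrier amplitude follows (1 + depth * signal); the mic's
          # nonlinearity later recovers that envelope (the command).
          am = (1.0 + depth * command) * carrier
          return am / np.max(np.abs(am))
      ```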

      With modern smart assistants, you'll also need to mimic the owner's voice, though AI voice cloning can manage that from a single conversation recorded at decent quality.

      In practice, though, assistants are fairly useless for this. Ask them to do anything dangerous, such as leaking contacts or sending files, and the phone will just show you Google results rather than actually do anything.

      You could trick the phone into opening a website with an exploit kit, but then your target needs to be vulnerable anyway, and there are other ways to achieve that (e.g. buying ads targeted at a profile so specific it only matches your target).

      The physical harm of a fire is probably worse than anything you should expect out of a voice assistant attack.

      • Midnitte@beehaw.org · 5 months ago

        Right, and Google uses those ultrasonic frequencies itself to pair Chromecasts - my point was that if they're already using that band (and aware of it), surely they have a way to detect (and filter) it.
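
        Presumably, yes: a carrier parked in the near-ultrasonic band shows up as a plain band-energy anomaly. A rough sketch of such a check (band edges and threshold are guesses, not anything Google ships):

        ```python
        # Sketch: flag captures with suspicious energy above the speech
        # band. Band edges and the threshold ratio are illustrative.
        import numpy as np

        def looks_injected(audio: np.ndarray, fs: int = 96_000,
                           band=(18_000.0, 30_000.0),
                           ratio: float = 0.1) -> bool:
            # Compare power in the near-ultrasonic band to total power;
            # normal speech recordings carry almost nothing up there.
            power = np.abs(np.fft.rfft(audio)) ** 2
            freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
            in_band = power[(freqs >= band[0]) & (freqs <= band[1])].sum()
            total = power.sum()
            return total > 0 and in_band / total > ratio
        ```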