Researchers in the UK claim to have translated the sound of laptop keystrokes into their corresponding letters with 95 percent accuracy in some cases.

That 95 percent figure was achieved with nothing but a nearby iPhone. Remote methods are just as dangerous: over Zoom, the accuracy of recorded keystrokes only dropped to 93 percent, while Skype calls were still 91.7 percent accurate.

In other words, this is a side-channel attack with considerable accuracy, minimal technical requirements, and a ubiquitous data exfiltration point: microphones, which are everywhere, from our laptops to our wrists to the very rooms we work in.

  • Pons_Aelius@kbin.social · 7 months ago

    Because of different placement on the keyboard and different finger pressure, each key press has a slightly different sound.

    The telling thing in this story is this:

    with 95 percent accuracy in some cases.

    For some people (those with a very consistent typing style on a known keyboard) they were right 95% of the time.

    In the real world this type of thing is basically useless, as you would need a decent sample of the person typing on a known keyboard for it to work.

    To go from keystroke sounds to actual letters, the eggheads recorded a person typing on a 16-inch 2021 MacBook Pro using a phone placed 17cm away and processed the sounds to get signatures of the keystrokes.

    So to do this you need physical access to the person (to place a microphone nearby), to know what type of device they are typing on, and for it to be a device whose sound profile you have already analysed.

    • ILikeBoobies@lemmy.ca · 7 months ago

      You don’t need physical access, just some malware that has access to the microphone

      We would hope researchers “discovering” this wouldn’t have a production-ready product as their proof of concept. So there is room for improvement, but military contractors would love to invest in this.

      • Pons_Aelius@kbin.social · 7 months ago

        You don’t need physical access, just some malware

        Which you still need to have previously installed…

        If the person has allowed malware to be installed, just install a keylogger (which gives you 100% accuracy every time) rather than jumping through more hoops with this.

    • agent_flounder@lemmy.world · 7 months ago

      The article says

      The researchers note that skilled users able to rely on touch typing are harder to detect accurately, with single-key recognition dropping from 64 to 40 percent at the higher speeds enabled by the technique.

      Hm. Sounds like “some cases” are hunt and peck typists or very slow touch typists.

      I don’t know if training for each victim’s typing is really needed. I get the impression they were identifying unique sounds and converting them to the correct letters. I only skimmed the paper and didn’t quite understand the description of the mechanisms. Something about deep learning and convolution or…? I think they also said they didn’t use a language model, so I could be wrong.
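For what it's worth, the kind of pipeline the paper describes (spectrograms of isolated keystrokes fed to a deep classifier) can be sketched roughly like this. The function name, window sizes, and shapes below are illustrative assumptions, not the authors' actual code:

```python
import numpy as np

def keystroke_spectrogram(samples: np.ndarray, win: int = 256, hop: int = 128) -> np.ndarray:
    """Short-time Fourier magnitudes of one isolated keystroke clip.

    `samples` is a 1-D mono audio clip; the window/hop sizes here are
    illustrative, not the parameters used in the paper.
    """
    frames = [samples[i:i + win] * np.hanning(win)
              for i in range(0, len(samples) - win + 1, hop)]
    # One row per time frame, one column per frequency bin.
    return np.abs(np.fft.rfft(np.stack(frames), axis=1))

# Repeated presses of the same key yield similar spectrograms; a classifier
# (the paper uses a deep network) then maps a new spectrogram to the
# likeliest key -- no language model required.
clip = np.random.randn(1024)  # stand-in for a recorded keystroke
spec = keystroke_spectrogram(clip)
print(spec.shape)  # (7, 129)
```

Whether per-victim training is needed then comes down to how much these spectral signatures vary across typists and keyboards.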

        • Pons_Aelius@kbin.social · 7 months ago (edited)

        The problem is that even with up to 95% accuracy, a 10-character password still has roughly a 40 percent chance (1 − 0.95^10 ≈ 0.4) of containing at least one wrong character.

        A password with one character wrong is just as useless as a random guess.

        Which character is wrong, and what should it be? You only have two or three more guesses before most systems lock the account.
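The arithmetic above is easy to check, assuming each keystroke is classified independently (a simplification; the paper doesn't necessarily model it this way):

```python
def p_full_capture(p: float, n: int) -> float:
    """Probability an n-character password is captured with every key correct,
    given per-keystroke classification accuracy p (independence assumed)."""
    return p ** n

# With 95% per-key accuracy and a 10-character password:
ok = p_full_capture(0.95, 10)
print(ok)      # ~0.599: about a 40% chance at least one character is wrong
print(1 - ok)  # ~0.401
```

So even a strong per-key classifier recovers a 10-character password cleanly only about 60% of the time, and the attacker doesn't know which character to distrust.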

        This is an interesting academic exercise but there are much better and easier ways to gain access to passwords and systems.

        The world is not a Bond movie.

        Deploying social engineering is much easier than this sort of attack.