Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.

  • AeroLemming@lemm.ee · 1 year ago

    How is that not extremely problematic? What stops someone from using Tor and a bunch of dummy accounts to send CSAM to someone else and get them arrested?

    • PM_Your_Nudes_Please@lemmy.world · 1 year ago

      And that’s pretty much where we are now. Bad actors are creating bot accounts on multiple instances to spam the larger (most popular) instances with CSAM.

    • ZodiacSF1969@sh.itjust.works · 1 year ago (edited)

      I think they have oversimplified the situation to the point that it is wrong.

      1. Arguably, Lemmy instance providers (depending on where they live) are protected in the same way Facebook and other content hosts are. So long as you are acting in good faith, you are protected against liability for illegal content your users upload. This does mean you need to remove illegal content as you become aware of it; you can’t just ignore what your users are doing.

      2. There have been cases where, although a user technically ‘possessed’ CSAM, it was shown that they did so unknowingly, e.g. via thumbnails or cached files. The police do investigate where the material came from. It’s not as simple as sending it to someone and having them convicted.

      • AeroLemming@lemm.ee · 1 year ago

        Oh okay, that’s good. So if you could show that you were trying to block it, you’d be safe.

        • ZodiacSF1969@sh.itjust.works · 1 year ago

          Yes, you’d just need to show that you actively moderate/apply content policies.

          This will vary by jurisdiction, but I believe most Western countries have similar laws.