I grew up during the dial-up era of the internet and remember how insane it was each time the technology improved: broadband, DSL, fiber, etc.

I wouldn’t expect pages to load instantly, but I have to imagine all the data farming is causing sites to be extremely bogged down.

  • shnurr@fludiblu.xyz · 10 months ago

    Well, if you just try loading a news website with and without an ad blocker, you will usually notice a huge difference. So yes.

    But also, technology has become much more complex compared to the beginning of the internet. So every piece of software is more bloated than it used to be, sometimes for a good reason, sometimes less so.

    • Khalmoon@lemm.ee (OP) · 10 months ago

      Thank you, I wasn’t sure if I was just getting impatient with websites and not appreciating how far we’ve come since DSL. It made sense in my head, but it always felt like a mildly dumb question.

  • Treczoks@lemmy.world · 10 months ago

    Let’s put it this way: a typical news page pulls a few megabytes of HTML, CSS, its own images, web framework scripts, advertising, etc., just to show about 500-1000 bytes of actual text.
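You can estimate that text-to-payload ratio yourself. A minimal sketch using only Python's standard library (the sample page in the usage note is invented for illustration; real pages would be fetched first):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects only the human-readable text of an HTML document."""

    def __init__(self):
        super().__init__()
        self.skipping = False  # True while inside <script> or <style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skipping = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skipping = False

    def handle_data(self, data):
        # Keep only non-empty visible text runs.
        if not self.skipping and data.strip():
            self.chunks.append(data.strip())


def text_ratio(html: str) -> float:
    """Fraction of the raw HTML payload that is actual readable text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks)) / len(html)
```

On a real news page (counting all the scripts, styles, and markup against the article body) this ratio often lands well under 1%, which is the point being made above. Note this only measures the HTML document itself; images, fonts, and ad payloads would make the real ratio even smaller.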

    • qupada@kbin.social · 10 months ago

      Worse still, a lot of “modern” designs don’t even bother including that trivial amount of content in the page, so if you’ve got a bad connection you get a page with some of the style and layout loaded, but nothing actually in it.

      I’m not really sure how we arrived at this point; lazy-loading seems to universally make things worse, yet it’s becoming more and more common.

      I’ve always vaguely assumed it’s just a symptom of people having never tested in anything but their “perfect” local development environment; no low-throughput or high-latency connections, no packet loss, no nothing. When you’re out here in the real world, on a marginal 4G connection - or frankly even just connecting to a server in another country - things get pretty grim.

      Somewhere along the way, it feels like someone just decided that pages often not loading at all was more acceptable than looking at a loading progress bar for even a second or two longer (but being largely guaranteed to have the whole page once you get there).

    • schzztl@lemmy.nz · 10 months ago

      You WILL watch this flashy, totally necessary popup video on 4G and you WILL like it!

  • zik@lemmy.world · 10 months ago

    Run uBlock Origin and turn off all the ads and trackers. Then you can see for yourself how much faster it is.

    The answer is… it depends on the page but in some cases a lot, in other cases not much.

    • IverCoder@lemm.ee · 10 months ago

      I have been using uBlock for an extremely long time now and sites seem to become faster over time even though I have never upgraded my laptop all these years.

      When I tried to disable uBlock for a day, my laptop forcefully shut down after 30 minutes.

      Software optimization improves performance over time, but that gain is outweighed by the tracking crap they put in there.

  • bob_wiley@lemmy.world · 10 months ago

    A lot of it is the frameworks. Early on, a simple site might be only one small file, or three if they wanted to separate things a bit. Now when you load a page it is fetching other code libraries, fonts, and other nonsense, in addition to all the ads and tracking that surround that. A very simple page can end up large and bloated.

    In addition to this, a lot of sites these days are driven by APIs. So when you load the page it may make dozens of additional API calls to get all the resources. Looking at a Lemmy post, it’s making 34 API calls. It looks like they’re making a separate call for each user avatar in the comments, for example.

    You can open up the Inspect panel in your browser and see this. Go to the network tab and reload the slow page you’re on. You’ll see all the stuff it’s actually fetching and how long it takes.
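As a rough offline approximation of what the Network tab counts, here is a hypothetical sketch that scans a page's HTML for the subresources a browser would have to fetch (scripts, images, iframes, stylesheets); the URLs in the usage example are invented:

```python
from html.parser import HTMLParser


class ResourceCounter(HTMLParser):
    """Collects the URLs a page would fetch as subresources."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Tags whose src triggers an extra HTTP request.
        if tag in ("script", "img", "iframe") and attrs.get("src"):
            self.resources.append(attrs["src"])
        # Stylesheets are pulled in via <link rel="stylesheet">.
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])


def count_requests(html: str) -> int:
    """Number of extra requests the document would trigger."""
    parser = ResourceCounter()
    parser.feed(html)
    return len(parser.resources)
```

This deliberately undercounts: it misses fonts loaded from CSS, XHR/fetch calls made by scripts, and anything injected after load, which is exactly the stuff the real Network tab will show piling up.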

    • Rikudou_Sage@lemmings.world · 10 months ago

      It looks like they’re making a separate call for each user avatar in the comments, for example.

      The browser does that. How else would you expect it to get the images except one by one? It has been that way for as long as images have been supported.

  • MikeT@lemm.ee · 10 months ago

    It is not entirely data farming; a lot of this is due to heavy assets like fonts, frameworks, images, videos, etc. A lot of that is downloaded as part of loading the site initially, and then the browser has to render/compute the site’s use of JavaScript, CSS, etc.

    Fonts and some JS assets are cached by browsers and CDNs to minimize redownloading, but that doesn’t change the fact that the average website today is much heavier than it was back in the ’90s.

    See how fast this site loads: https://text.npr.org/

    Or https://tildes.net/ compared to Reddit.

  • jjjalljs@ttrpg.network · 10 months ago

    Most of that stuff is async, so probably not a lot. Like, you load the page and it sends a request off to pendo, but the page doesn’t wait for that to finish before doing the next thing.
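A minimal sketch of that fire-and-forget pattern, using Python's asyncio as a stand-in for what the browser does (the tracker calls and the 0.2 s delay are invented; a real page wouldn't await the trackers at all):

```python
import asyncio


async def send_to_tracker(event: str) -> None:
    """Stand-in for an analytics request; the sleep simulates latency."""
    await asyncio.sleep(0.2)


async def load_page() -> str:
    # Fire the tracking requests without awaiting them...
    trackers = [asyncio.create_task(send_to_tracker(e))
                for e in ("page_view", "session_start")]
    # ...so the page "renders" immediately, before either finishes.
    rendered = "<h1>Page ready</h1>"
    # Both trackers complete concurrently in the background
    # (total wait is ~0.2 s, not 0.4 s).
    await asyncio.gather(*trackers)
    return rendered
```

The point is that slow analytics endpoints add background traffic, not (usually) time on the critical rendering path; a badly written synchronous tracker script is a different story.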

    There are a lot of ways to make pages perform badly, though, especially as they get more dynamic.

    At my job the home page was loading extremely slowly for a while until we realized a refactor had made the query backing it extremely stupid. It was accidentally doing a separate query for every user associated with every post you could see, and then not even using the results. Oops. Fixing that got us a huge performance increase, and it had nothing to do with data tracking.

    • TrenchcoatFullOfBats@belfry.rip · 10 months ago

      And install the NoScript extension and see exactly how many additional companies are getting your data when you just want to learn 15 fascinating facts about frogs.

  • PeachMan@lemmy.one · 10 months ago

    Yeah, try using any Microsoft product in the browser. You type in the URL and it goes through like 19 redirects before showing anything.

  • PeterPoopshit@lemmy.world · 10 months ago

    Probably not. I think the timeouts are set up differently or something. Back in the day, even if you had 1.2 kB/s dial-up internet, you could reliably load webpages and all their CSS if you were patient. Nowadays, if your internet speed dips by even a little bit, everything stops working, and being patient about it only results in error screens.