- cross-posted to:
- technology@lemmy.ml
The web is fucked and there’s nothing we can do about it. Kev Quirk looks back fondly at Web 1.0.
Pseudo-romanticizing of the old web. Yeah, I don’t like that we’re heading into a corporate, super-controlled web, but as of now it’s still vastly better than it used to be before search engines were a thing. I also only look back with a nostalgic eye at the era of gaming magazines because it was fun, but it’s so much better to be able to Google stuff now. I don’t miss dealing with disordered, wild-west-style web design.
Old website navigation was often bad and ugly. Everyone had a forum, but you never found what you were looking for. And web design unavoidably had to change to allow better mobile access. You could no longer serve up font-size-6 blue text on top of blue, as that would (correctly so) annoy people and make them stop visiting your page when there was a better site available.
Now, social media isn’t necessarily bad, we’re on one after all, but there are definitely harmful social media platforms that are just made for ragebait, like Twitter and Facebook.
It’s a fact, though, that you need more google-fu now to find what you’re looking for.
When you see an article dismiss problems with “but that was all part of the fun”, you know that usability isn’t high on the author’s agenda.
The fact that he seems to be praising GeoCities websites says it all.
Special shout-out has to go to my local newspaper websites, The Derby Telegraph and the Nottingham Post.
They are a virtually unreadable mess, due to the layers of advertising and other JavaScript interruptions.
With a Raspberry Pi and Pi-hole you won’t see a single ad on either site. I just tested it on an iPad.
deleted by creator
The article acknowledges this in the conclusion (emphasis mine):
I’m done. There you have it. That’s my opinion about how ____ed the web is. Look, we will never get the web of old back. Let’s be honest, it wasn’t perfect either. The web of today is more accessible, more dynamic and pretty much a cornerstone of our society.
Accessibility wasn’t the main topic discussed in the article. It was mostly pointing out that the current web is too centralised.
Accessibility wasn’t the main topic discussed in the article
That’s part of the problem. All these rants about the glory of Web 1.0 ignore the fact that Web 1.0 wasn’t usable for anybody with accessibility issues, and that the modern web is better for them. A tiny acknowledgement at the bottom of their rant shows that they value accessibility lower than all of their other concerns.
I don’t think accessibility is meant in terms of disabled people.
I understood it as accessible in terms of technical knowledge. Anyone can whip out their phone and access the internet… or at least use an app which needs internet.
Eternal September is another term for it.
Accessibility almost always refers to disabled people, especially in web development. I’ve never heard anyone in the industry refer to accessibility in any other way, without explicitly making that clear.
If they meant the reading you took from it, that’s even worse and my point is even more pertinent.
If they meant the reading you took from it, that’s even worse and my point is even more pertinent.
Why? The internet is a powerful tool and there are plenty of morons using it without knowing anything about it.
My original point was that the main idea of the article downplays the accessibility gains of the modern web. Your reading was that the author meant a different definition of accessibility and not a11y, which would mean the author didn’t just downplay it, they completely ignored it. The author is complaining that the modern web is awful while ignoring the huge gains for people who need these accessibility features, and how awful Web 1.0 was for them.
I think the author used both meanings at different times.
The first time, they mention interesting website designs at the cost of accessibility.
But the second time, they mean how low the technical barrier is to access the modern (and bland) web, and how it tries to cater to the lowest common denominator.
The article wasn’t really about Web 1.0 as much as it was about the time that Web 1.0 was around. The author could remove “Web 1.0” and replace it with “late 1990s to early 2000s Internet”.
That’s part of the problem.
No, that’s just the angle that the article wanted to take. Just because it ignores an aspect of something doesn’t mean that its position is moot.
Are you asking for every article ever to have a section discussing accessibility? I’d rather we let the author speak their mind, and focus on what they want to say.
Are you asking for every article ever to have a section discussing accessibility?
No. I’m asking that when they complain about how the modern web is “fucked” and web 1.0 was better, they don’t try to act like that is an absolute, since that’s an opinion that is not widely applicable.
No, that’s just the angle that the article wanted to take. Just because it ignores an aspect of something doesn’t mean that its position is moot.
Ignoring part of a topic makes your argument weaker.
they don’t try to act like that is an absolute
Again, to write an article means to cut out things that don’t matter to the core argument. You’re asking for the writer to complete a thesis.
Ignoring part of a topic makes your argument weaker.
And again, this is an opinion piece, not a well developed thesis. What you are asking for is both unreasonable and impractical when writing an opinion piece.
deleted by creator
Yeah, then sadly they missed the boat on web 3.0, which is decentralized, resilient, static, and doesn’t require blockchain.
Out of curiosity: I always thought text-only web pages, back when RSS was still a thing, would have been way more accessible than the blinking, ad-ridden pages you get nowadays.
You tell me that wasn’t a thing?
deleted by creator
deleted by creator
Living somewhere now where many of the local websites are terribly dated, and while the initial nostalgia factor was nice, the lack of functionality/accessibility is seriously a problem. Not to say you can’t make a functional/accessible site with old web standards, but some things changed for a reason.
“Fuck you for wanting a sterile web where everything is boring”
…said no one ever.
deleted by creator
“hacker” “news” is a big fan of anything that inflicts pain and misery on anyone who isn’t exactly like them (men working in high-paying, VC-funded tech startups that will inevitably go out of business or sell out to some giant and cash out a big fat check)
It was kind of inevitable that it would go this way. The business model that people expect the Internet to work in is directly at odds with how people want to use the Internet. Tech companies have been enjoying a 20 year long honeymoon period where they have effectively infinite money and no regard for sustainable profit.
I disagree with his definition of web3. Some devs are working on decentralizing the web; that’s the real web3. IPFS is blockchain-less. My new peer-to-peer search engine is blockchain-less. Yes, blockchain people are trying to put blockchains everywhere, but we mustn’t let them build their vision of web3. And that means you need to help the blockchain-less vision, you need to find projects to contribute to. Let’s make the web uncensorable and anonymous together.
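To make the blockchain-less point concrete: here’s a toy Python sketch of the content-addressing idea that IPFS-style systems build on. It is not IPFS’s actual CID format, just the basic principle that the address is derived from the content itself, so any peer can verify what it serves and no central host or chain has to be trusted.

```python
import hashlib

def content_address(data: bytes) -> str:
    # Toy content address: the SHA-256 hash of the bytes themselves.
    # Real IPFS CIDs wrap a multihash plus codec info, but the principle
    # is the same: the address is derived from the content.
    return hashlib.sha256(data).hexdigest()

store = {}  # stand-in for a swarm of peers

def publish(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data
    return addr

def fetch(addr: str) -> bytes:
    data = store[addr]
    # Anyone can verify integrity without trusting the peer that served it.
    assert content_address(data) == addr
    return data

addr = publish(b"hello, decentralised web")
print(addr, fetch(addr))
```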
we all miss the blink element.
\For the win\
deleted by creator
I see a NASA project from the 1960s, Google’s AI project, and a Russian movie?? ༼ つ ◕_◕ ༽つ
I’m not endorsing this, because it just doesn’t look great, but here is what they were referencing, but apparently unable to link: https://gemini.circumlunar.space/
deleted by creator
I have a few sites and a blog, all minimal, with simple semantic HTML and written in Markdown on the backend. Publishing them as Gemini capsules means rewriting them all and dual-stacking. Gemini made itself incompatible with anything already existing.
For blogs, recipes, and news, pure text is good, and we’re safe as long as no one messes up the TCP/IP stack. But what are we going to do with other online services? For example, domain registrars: even the most freedom- and Small-Web-respecting one now only offers the big web as a frontend.
deleted by creator
For text things Gemini is cool and I am using it.
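For anyone curious how small the protocol actually is: a Gemini request is roughly just the URL plus CRLF sent over TLS to port 1965, and the response is a one-line header followed by the body. Here’s a rough Python sketch, with certificate handling simplified (real clients use trust-on-first-use rather than skipping verification):

```python
import socket
import ssl

def gemini_fetch(url: str) -> str:
    # Expect a gemini:// URL, e.g. gemini://gemini.circumlunar.space/
    host = url.removeprefix("gemini://").split("/")[0]
    ctx = ssl.create_default_context()
    # Many capsules use self-signed certs, so CA checks are disabled here.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, 1965)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            # The entire Gemini request is the URL followed by CRLF.
            tls.sendall((url + "\r\n").encode("utf-8"))
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    # Response: one header line ("<status> <meta>\r\n"), then the body.
    header, _, body = data.partition(b"\r\n")
    return header.decode("utf-8") + "\n" + body.decode("utf-8", errors="replace")

print(gemini_fetch("gemini://gemini.circumlunar.space/"))
```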
I think it’s time for a new standard to replace HTML and JavaScript for the web, one that formalizes all the most-used functions and gives developers way less freedom to make a crappy website.
Alternatively train AI to recognize crappy websites and severely punish them in search.
Or use AI to reformat websites into something user friendly. Considering the coding skills of GPT-4, I don’t think that’s too far away.
Between GDPR prompts, auto-generated articles, banner ads, normal ads, filler content, and related articles, the web has become unusable.
Really it’s Google’s fault for not cracking down on these practices. And their competitors’ for not doing so either.
Search engines in general have become beyond useless. I barely find anything anymore and it’s not just the fault of bad web design. Even if the search results followed human friendly design, they don’t even contain anything related to my search.
My only hope is that retrieval-augmented LLMs can fix this mess. Basically they read all these crappy websites for you and extract the actually useful information.
Replacing HTML and JavaScript does nothing to stop people from creating bad websites. People would still post auto-generated content or ads, filler content, related articles, etc… And having LLMs summarize bad content will only give you a shortened version of bad content.
This is exactly what I want. A simple text-based protocol. Sure, throw in support for images, too. Provide basic layout options so you can do proper wrapping and scaling for different size screens. Nothing else.
The web is too bloated for the basic use cases at this point. With HTML5 and JavaScript and CSS you can do anything, and honestly, that’s great. It’s great that I can run an entire OS emulator in a web browser. It’s great that I can run games and paint apps and everything you can imagine. But why the ever-loving hell is the same platform used for all of that and plain-text news? Madness.
AI-powered reformatting and extraction is bound to come, which is probably one of the reasons Google is pushing for their web DRM bullshit.
well there’s gopher or gemini, but I was thinking it should have some more modern features.
Like if you took all the features of all the best modern websites and apps, and condensed them down into a fully integrated stack replacing everything from HTTP up. Only allowing elements that don’t get in the way of UX.
Ideally it should completely preserve privacy and anonymity, so perhaps bolt something on like I2P. Make it pretty much impossible to track people beyond them voluntarily giving their information or doxxing themselves.
But then also have 99% of the conveniences of the best of modern web/app design. Beyond those fixed functions, though, you’d have zero freedom as a web developer.
Fixing the content of websites is then an entirely separate problem. This is just to fix UI/UX.
Like if a zoomer designed gopher.
Together with more effective search engines, whose actual goal is to bring quality content to users, you could fix the web forever. I think if you handed control of page rankings over to users, you could fix search engines too. You have to create incentive structures that align the interests of the search provider with those of the users. Currently Google has little incentive to actually provide you good search results, since that doesn’t necessarily make them money.
Decentralization existed because anyone could make an equally shitty website without losing much in it. If you really miss that web 1.0 crap, you can go to i2p or freenet.
I agreed until the “fuck blockchain” comment in the article. How else would you solve the byzantine generals problem in computer science?
By using a standard implementation of cryptographic message signing?
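For concreteness, this is roughly what standard message signing looks like, shown here with Ed25519 via the Python cryptography package (my choice of library, any standard signing implementation works the same way). It proves who authored a message and that it wasn’t tampered with; it doesn’t by itself order messages:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Each participant holds a private key and publishes the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"transfer 5 coins to Bob"
signature = private_key.sign(message)

try:
    # Anyone with the public key can check that the message is authentic
    # and untampered; signing alone says nothing about whether the same
    # coins were also promised to someone else in another signed message.
    public_key.verify(signature, message)
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```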
what? how does that solve a double spend?