Ok boomer
I have 13 sites whitelisted to allow JS. The internet is fairly usable for me without JS.
Same. This is the way.
People in this thread who aren’t web devs: “web devs are just lazy”
Web devs: Alright buddy boy, you try making a web site these days with the required complexity with only HTML and CSS. 😆 All you’d get is static content and maybe some forms. Any kind of interactivity goes out the door.
Non web devs: “nah bruh this site is considered broken for the mere fact that it uses JavaScript at all”
Ehhhhh it kinda’ depends. Most things that are merely changing how something already present on the page is displayed? Probably don’t need JS. Doing something cool based on the submit or response of a form? Probably don’t need JS. Changing something dynamically based off of what the user is doing? Might not need JS!
Need to do some computation off of the response of said form and change a bunch of the page? You probably need JS. Need to support older browsers simply doing all of the previously described things? Probably need JS.
It really, really depends on what needs to happen and why. Most websites are still in the legacy support realm, at least conceptually, so JS sadly is required for many, many websites. Not that they use it in the most ideal way, but few situations are ideal in the first place.
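For instance, the form case mentioned above is one of the easiest to keep JS-free: a plain HTML form round-trips to the server on its own. A minimal sketch (the /search endpoint and field names are made up):

```html
<!-- The browser serializes the fields and the server responds with a new page.
     No JS needed for basic submit-and-respond interactivity. -->
<form action="/search" method="get">
  <label>Find a product: <input type="search" name="q" required></label>
  <button type="submit">Search</button>
</form>
```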
A lot of this is just non-tech savvy people failing to understand the limitations and history of the internet.
(this isn’t to defend the BS modern corporations pull, but just to explain the “how” of the often times shitty requirements the web devs are dealing with)
Stop, can only get so erect. Give me that, please, rather than the bullshit I have to wade through today just to find information. When is the store open? What’s the e-mail address/phone number? Like fuck if I want to “engage”
😆 F—ck, I hear you loud and clear on that one. But that’s a different problem altogether, organizing information.
People suck at that. I don’t think they ever even use their own site or have it tested on anyone before shipping. Sometimes it’s absolutely impossible to find information about something, like even what a product even is or does. So stupid.
You can say fuck on the internet
I also have the right to self-censor myself for effect. 👍👍
“nah bruh this site is considered broken for the mere fact that it uses JavaScript at all”
A little paraphrased, but that’s the gist.
Isn’t there an article just today that talks about CSS doing most of the heavy-lifting java is usually crutched to do?
I did webdev before the framework blight. It was manual php, it was ASP, it was soul-crushing. That’s the basis for my claim that javascript lamers are just lazy, and supply-chain splots waiting to manifest.
CSS doing most of the heavy-lifting java is usually crutched to do
JavaScript you mean? Some small subset of things that JavaScript was forced to handle before can be done in CSS, yes, but that only goes for styling and layout, not interactivity, obviously.
I did webdev before the framework blight. That’s the basis for my claim that javascript lamers are just lazy
There is some extremely heavy prejudice and unnecessary hate going on here, which is woefully misdirected. We’ll get to that. But the amount of time that has passed since you did web dev might put you at a disadvantage when making claims about web development these days. 👍
Anyway. Us JavaScript/TypeScript “lamers” are doing the best with what we’ve got. The web platform is very broken and fragmented because of its history. It’s not something regular web devs can do much about. We use the framework or library that suits us best for the task at hand and the resources we are given (time, basically). It’s not like any project will be your dream unicorn project where you get to decide the infrastructure from the start or get to invent a new library or a new browser to target that does things differently and doesn’t have to be backwards compatible with the web at large. Things don’t work this way.
Don’t you think we sigh all day because we have to monkey patch the web to make our sites behave in the way the acceptance criteria demand? You call that lazy, but we are working our knuckles to the bone to make things work reasonably well for as many people as we can, including accessibility for those with reduced function. It’s not an easy task.
… “Lazy.” I scoffed in offense, to be honest with you.
It’s like telling someone who made bread from scratch they’re lazy for not growing their own wheat, ffs.
Let’s see you do better. 👍👍👍👍👍👍
I would argue that a lot of this scripting can and should be done server side.
If you want to zoom into a graph plot, you want each wheel scroll tick to be sent to the server to generate a new image and a full page reload?
How would you even detect the mouse wheel scroll?
All interactivity goes out the door.
That would make the website feel ultra slow since a full page load would be needed every time. Something as simple as a slide out menu needs JavaScript and couldn’t really be done server side.
Even if you said to just send the parts of the page that changed, that dynamic content loading would still be JavaScript. Maybe an iframe could get you somewhere, but that’s a hacky workaround, and you couldn’t interact between different frames.
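The graph-zoom example above is a good illustration of where a little client-side JS is genuinely needed. A minimal sketch, assuming a hypothetical #plot element; only the wheel listener has to run in the browser, while the zoom math itself is plain logic:

```javascript
// Pure zoom math: each wheel tick scales the view by a constant factor,
// clamped so the user can't zoom out (or in) forever.
function nextZoom(current, deltaY, factor = 1.1) {
  const next = deltaY < 0 ? current * factor : current / factor;
  return Math.min(8, Math.max(0.25, next));
}

// DOM wiring: only runs in a browser. "#plot" is a made-up element id.
if (typeof document !== "undefined") {
  let zoom = 1;
  const plot = document.querySelector("#plot");
  plot?.addEventListener("wheel", (e) => {
    e.preventDefault();                      // keep the page itself from scrolling
    zoom = nextZoom(zoom, e.deltaY);
    plot.style.transform = `scale(${zoom})`; // redraw client-side, no round trip
  });
}
```

Doing the same server-side would mean a network round trip and a full re-render per wheel tick, which is exactly the latency problem described in this exchange.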
JS is just a janky hotfix.
As it was, HTML was all sites had. When these were called “ugly”, CSS was invented for style and presentation stuff. When the need for advanced interactivity arose (not doable at the Internet speeds of 20-30 years ago), someone just said “fuck it, do whatever you want” and added scripting to browsers.
The real solution came in the form of HTML5. You no longer needed, and I can’t stress this enough, Flash to play a video in-browser. For other things as well.
Well, HTML5 is over 15 years old by now. Maybe the time has come to bring new functionality into either HTML, CSS, or a new, third component of web sites (maybe even JS itself?).
Stuff like menus. There’s no need for them to be limited to the half-assed workaround known as CSS pseudoclasses, or for every website to have its own JS implementation.
Stuff like basic math. HTML has had forms since forever. Letting them do some more, like counting down, accessing an equivalent of the Date and Math classes, and tallying up a shopping cart on a webshop, seems like a better fix than a bunch of frameworks.
Just make a standardized “framework” built directly into the browser - it’d speed up development, lower complexity, reduce bloat and increase performance. And that’s just the stuff off the top of my head.
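For the menu case specifically, HTML already grew a built-in disclosure widget. A sketch using details/summary (class names are illustrative) that needs no JS and no pseudoclass hacks, since the browser handles the open/close state natively:

```html
<details class="menu">
  <summary>☰ Menu</summary>
  <nav>
    <a href="/">Home</a>
    <a href="/shop">Shop</a>
    <a href="/contact">Contact</a>
  </nav>
</details>
<style>
  /* Purely presentational: open/close is handled by the browser itself. */
  .menu nav { display: flex; flex-direction: column; gap: 0.25rem; }
</style>
```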
Something as simple as a slide out menu needs JavaScript and couldn’t really be done server side.
I’m not trying to tell anyone how to design their webpages. I’m also a bit old fashioned. But I stopped making animated gimmicks many years ago. When someone is viewing such things on a small screen, in landscape mode, it’s going to be a shit user experience at best. That’s just my 2 cents from personal experience.
I’m sure there are examples of where JS is necessary. It certainly has its place. I just feel like it’s overused. Now, if you’re at the mercy of someone else who demands x, y and z, then I guess you gotta do what you gotta do.
https://htmx.org/ solves the problem of full page loads. Yes, it’s a JavaScript library, but it’s a tiny JS library (14k over the wire) that is easily cached. And in most cases, it’s the only JavaScript you need. The vast majority of content can be rendered server side.
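For the curious, the pattern looks roughly like this; the /news/page/2 endpoint is hypothetical, and the server answers with an HTML fragment rather than JSON:

```html
<script src="/js/htmx.min.js"></script> <!-- self-hosted, no third-party CDN -->
<div id="articles">
  <!-- server-rendered articles here -->
</div>
<!-- On click, htmx fetches the fragment and appends it to #articles:
     no full page reload, no hand-written JS. -->
<button hx-get="/news/page/2" hx-target="#articles" hx-swap="beforeend">
  Load more
</button>
```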
So, your site still doesn’t work without JS but you get to not use all the convenience React brings to the table? Boy, what a deal! Maybe you should go talk to Trump about those tariffs. You seem to be at least as capable as Flintenuschi!
While fair, now you have to have JavaScript enabled on the page, which I think was the point. It was never about having only a little bit; it was that you had to have it enabled at all.
Skill issue - on the devs side.
A lot of pages even fail if you only disable 3rd-party scripts (my default setting on mobile).
I consider them broken, since the platform is to render a Document Object Model; scripting is secondary functionality and having no fallbacks is bad practice.
Imagine if that were a pdf/epub.
Personally, I love server-side rendering, I think it’s the best way to ensure your content works the way YOU built it. However, offloading the processing to the client saves money, and makes sense if you’re also planning on turning it into an Electron app.
I feel it’s better practice to use a DNS that blocks traffic for known telemetry and malware.
Personally, I used to blacklist all scripts and turn them on one at a time till I had the functionality I needed.
But they’re not PDFs/EPUBs, they’re live pages that support changing the DOM dynamically. I’m sorry, I’m not trying to be mean, but people not wanting scripting on their sites are a niche inside a niche. In terms of prioritising fixes, that’s a very small audience with a very small ROI, and serving them might require a huge rewrite. It’s just not financially feasible, for not much of a reason other than a puritan one.
All modern browsers have Javascript enabled by default. A good dev targets and tests for mainstream systems.
no fallbacks is bad practice.
This is how you know they’re extra lazy – there isn’t even a “please enable JavaScript because we suck and have no noscript version” notice.
It reminds me of flash when it first gained popularity.
“Please enable flash so you can see our unnecessary intro animation and flash-based interface” at, like, half of local restaurant websites
wild thing is that with modern css and local fonts (nerdfonts, etc), you can make a simple page with a modern grid and nested css without requiring a single third party library or js.
devs are just lazy.
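As a sketch of the claim above: a responsive card layout in nothing but modern HTML and CSS, using native nesting and grid, with no preprocessor and no framework (selectors and content are illustrative):

```html
<style>
  .page {
    display: grid;
    grid-template-columns: repeat(auto-fit, minmax(20rem, 1fr));
    gap: 1rem;

    /* Native CSS nesting, now supported in all current major browsers. */
    & article {
      padding: 1rem;
      border: 1px solid currentColor;

      & h2 { margin-block: 0 0.5rem; }
    }
  }
</style>
<div class="page">
  <article><h2>One</h2><p>No framework, no third-party CSS.</p></article>
  <article><h2>Two</h2><p>Just HTML and modern CSS.</p></article>
</div>
```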
Devs are lazy, but product people and designers also request stuff that even modern CSS cannot do.
devs are just lazy.
*cost-efficient. At this point it’s a race to the bottom.
and its not even the devs. its the higher ups forcing them to do shit that won’t work.
I use uBlock medium mode, and if I can’t get a website to work without having to enable JavaScript, then I just leave the website.
I generally do the same. In fact, on desktop, uBO is set to hard mode. Unfortunately, I do need to access these sites from time to time.
because modern webdevs cant do anything without react
I’m a webdev. I agree. I like react.
I disagree,I did fullstack for years without react, I used the much superior Vue.js
I like React, but Svelte really hits the spot. But no matter what framework you use, let’s all be glad that we’re not like those reality averse people complaining in this thread 🙏
True, lol
because
~~modern~~ young/unskilled webdevs cant do anything without react
Yes.
Many people won’t even know what we’re talking about; to them it’s like saying “the sheer amount of websites that are unusable without HTML”. But I use uBlock Origin in expert mode and block js by default; this allows me to click on slightly* fishy links without endangering my setup or immediately handing my data over to some 3rd party.
So I’m happy to see news websites that do not require js at all for a legible experience, and enraged that others even hide the fucking plain text of the article behind a script. Even looking at the source code does not reveal it. And I’m not talking about paywalls.
* real fishy links go into the Tor browser, if I really want to see what’s behind them.
Said it on a top-level comment as well, but I use “medium mode” on uBlock (weirdly not advertised, but easy enough to enable: https://github.com/gorhill/ublock/wiki/Blocking-mode:-medium-mode). I’ve found it to be a good middle ground between expert mode which is basically noscript, and rawdogging it.
If I encounter a site that I can’t visit unless I enable JS, then I leave.
If I wanted to write a site with JS-equivalent functionality and UX without using JS, what would my options be?
HTML and CSS can do quite a lot, and you can use PHP or cgi-bin for some scripting.
Of course, it’s not a perfect alternative. JavaScript is sometimes the only option; but a website like the one I was trying to use could easily have just been a static site.
The problem is that HTML and CSS are extremely convoluted and unintuitive. They are the reason we don’t have more web engines.
htmx or equivalent technologies. The idea is to render as much as possible server side, and then use JS for the things that can’t be rendered there or require interactivity. And at the very least, serve the JS from your server, don’t leak requests to random CDNs.
Htmx requires JS. At that point you already failed in the eyes of the purists. And CDNs exist for a reason. You can’t expect a website to guarantee perfect uptime and response times without the use of CDNs. And don’t get me started on how expensive it would be to host a globally requested website without a CDN. That’s a surefire way to get a million dollar bill from amazon!
WASM and cry because you can’t directly modify the DOM without JS.
You can’t use web assembly without JavaScript to initialize it.
You can’t modify the DOM.
But ~~some~~ most dynamicity can stay - sites can be built freely server-side, and even some “dynamic” functionality like menus can be made using CSS pseudoclasses.
Sure, you won’t have a Google Docs or Gmail webapp, but 90% of stuff doesn’t actually need one.
A basic website doesn’t require js.
A webshop, for example, does for the part around adding to cart and checkout - but it doesn’t for merely browsing.
For a web store, you probably only need JavaScript for payment processing. Insofar as I’ve seen, pretty much all of the widgets provided by the card processors outright require JavaScript (and most of them are also exceedingly janky, regardless of what they look like to the user).
You definitely don’t need Javascript just for a shopping cart, though. That can all be done server side.
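To make that concrete, here is a minimal sketch of a cart kept entirely server-side (in a session store, say); item names and prices are made up. The browser only submits plain forms and receives re-rendered HTML:

```javascript
// Add an item to the cart; the cart is a plain object living server-side.
function addToCart(cart, item, priceCents, qty = 1) {
  const prev = cart[item]?.qty ?? 0;
  return { ...cart, [item]: { priceCents, qty: prev + qty } };
}

// Total in cents, computed on the server, never in the browser.
function cartTotalCents(cart) {
  return Object.values(cart).reduce((sum, l) => sum + l.priceCents * l.qty, 0);
}

// Server-side rendering of the cart as an HTML fragment: no client JS involved.
function renderCart(cart) {
  const rows = Object.entries(cart)
    .map(([name, l]) => `<li>${name} × ${l.qty}</li>`)
    .join("");
  return `<ul>${rows}</ul><p>Total: $${(cartTotalCents(cart) / 100).toFixed(2)}</p>`;
}

let cart = {};
cart = addToCart(cart, "coffee", 450, 2);
cart = addToCart(cart, "mug", 1200);
```

Each “add to cart” button is then just a plain form POST, and the server responds with the re-rendered page.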
I mean, you could build a site in Next.js, ironically. Which is very counter-intuitive, because it literally is JS you are writing, but you can write it to not do dynamic things, so it effectively becomes a static, server-rendered site that, if JS is enabled, gets things like a loading bar and quick navigation transitions for free. If JS is disabled, it functions just like a standard static site.
I just use NoScript to do this. It’s annoying to visit websites that need JavaScript, but it’s handy with NoScript because I just turn on the JavaScript the website needs for functionality (this should also speed up load times).
Sometimes, if I’m using a browser without extension support (like GNOME Web), I just disable JavaScript on websites or frontends that don’t need it, like Invidious. If I’m facing issues, I just add any site that breaks without JS to my AdGuard filter list of sites to eradicate and send it to /dev/null.
As a web developer, I see JS as a quality improvement. No page reloads, nice smooth UI. Luckily, the PHP times have ended, but even in the PHP era disabling jQuery could cause problems.
We could generate static HTML pages; it just adds complexity.
Personally I use only client-side rendering, and I think that’s the best from a dev perspective. Easy setup, no magic, nice UI. And that results in a blank page when you disable JS.
If your motivation is to stop tracking:
- replace all foreign-domain sources with file URIs, e.g. load Google Fonts from a local cache.
- disable all foreign script files unless they’re legitimate, like JS packages from public CDNs, in which case load them from a local cache.
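Self-hosting fonts is the low-hanging fruit here. A sketch (file path and family name are illustrative):

```html
<style>
  /* Serve the font from your own origin instead of fonts.googleapis.com:
     one fewer third-party request for users to block. */
  @font-face {
    font-family: "Inter";
    src: url("/fonts/inter-var.woff2") format("woff2");
    font-weight: 100 900;
    font-display: swap;
  }
  body { font-family: "Inter", system-ui, sans-serif; }
</style>
```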
If your motivation is to see old HTML pages with minimal style, well, it’s impossible to do them reliably. If you are worried about closed-source JS, you shouldn’t be; it’s an isolated environment. If something is possible in JS and you want to limit its capability, contribute to browsers. That’s the clear path.
I can be convinced. What’s your motivation?
This community is full of older people who have never done modern development
Fuck yeah!
Bookmarked for future use. CSS has developed a lot since I started getting acquainted with it.
I didn’t read it completely, is browser coverage addressed in the article?
The only non-heated comment. I appreciate it. I will read it.
The only non-heated comment.
You mean people replying to you? I wouldn’t call those heated, rather derisive. Just like your own original comment. You come across as presumptuous and pretending to be more knowledgeable than you really are. People react.
If your motivation is to see old html pages, with minimal style
Huh? i just want to see a web page. Usually a news article, i.e. text with few styling elements. In other words, HTML.
For most use cases JS is not required.

well it’s impossible to do them reliably
Huh again? Why?
If you are worried about closed-source js.
Isn’t it always open, i.e. one can read the script the browser loads if one is so inclined? No, that’s not the point at all. JS increases the likelihood of data mining by orders of magnitude. And most addons that block JS also block 3rd-party requests in general.
Use as much js as you like (most third party stuff is not really up to the web dev anyhow), but the page must always fail gracefully for those who do not like it, or browse the web in some non-standard way. An empty page is not an option.
Please also read some of the other (top level) comments here.
You were completely fine with slow page reloads blinding you when the theme was dark. I’m speaking to those who appreciate modern tech.
But anyways, unfortunately javascript obfuscation is a common thing.
Obfuscation, OK.
Look, I’m willing to have a conversation with you, but you need to address my points first, that is if you want one too.
I can’t take it seriously because of the noise in your text, like “Huh?”. If you’d like to have a conversation, please be more open next time.
Source code is the code before some kind of transpilation. Obfuscated code is not source code.
I get it, you just need the content. But why would you reload the whole page when you’re just about to get the next news item on the page? Isn’t it better to just update that part?
Why is it “impossible to do them reliably” - without js presumably?
why would you reload the whole page when you’re just about to get the next news item on the page? Isn’t it better to just update that part?
Sounds like you’re thinking about web apps, when most people here think about web pages.
As a web dev, and primarily user, I like my phone having some juice left in it.
The largest battery hog on my phone is the browser. I can’t help wonder why.
I’d much rather wait a second or two rather than have my phone initialize some js framework 50 times per day.
Dynamic HTML can be done - and is - server-side. Of course, not using a framework is harder, and all the current ones are client-side.
Saying that making unbloated pages is impossible to do right just makes it seem like you’re ill-informed.
On that note: “closed-source” JS doesn’t really exist (at least client-side). All JS is source-available in-browser; some sites may obfuscate, but it isn’t a privacy concern.
The problem is that my phone does something it doesn’t have to.
Having my phone fetch potentially 50 MB (usually 5-15) for each new website is a battery hog. And on a slow connection - to quote your words, “great UX”.
The alternative is a few KB for the HTML, CSS and a small amount of tailor-made JS.
A few KB which load a hundred times faster and don’t waste exorbitant amounts of computing power - while in essence losing nothing over your alternative.
“Old pages with minimal style” is a non sequitur. Need I remind you, CSS is a thing. In fact, it may be more reliable than JS: since it isn’t Turing-complete, it’s much simpler for browser interpreters not to fuck it up. It’s also not nearly the vulnerability vector JS is.
And your message for me and people like me, wanting websites not to outsource their power-hogging frameworks to my poor phone?
Go build your own browser.
What a joke.
You can build some very light pages with JavaScript. JavaScript isn’t the issue, it is the large assets.
Who said making unbloated pages is impossible? Your comment would be more serious without the emotion.
Source code is the code that gets transformed into some target code. Obfuscated code is not source code.
A reminder: in the past, large pages downloaded all their stuff at once. In contrast, with dynamic imports the first load is much, much faster, and that matters most. Any changes in dynamic content just require the dynamic data to be downloaded. My phone lasts at least 2 days on one charge (average usage), but I charge it every night, so that’s not an issue.
Source code is the code devs write.
For compiled languages like C, only the compiled machine code is made available to the user.
JS is interpreted, meaning it doesn’t get compiled; instead, an interpreter runs the source code directly at runtime.
Obfuscated code, while not technically unaltered source code, is still source code. Key word being unaltered: it isn’t the original source by virtue of not being straight from the source (i.e. because it’s been altered).
However, obfuscated code is basically source code. The only things to obfuscate are variable and function names, and perhaps some order-of-operations optimizations. The core syntax and structure of the program have to remain “visible”, because otherwise the interpreter couldn’t run the code.
Analyzing obfuscated code is much closer to analyzing source code than to reverse-engineering compiled binaries.
It may not be human-readable, but other programs can analyze it (as they can even compiled code), and more importantly, they can alter it in a trivial manner, because it’s source code with the names essentially censored out. That makes evaluating the code only a bit harder than if it were truly “closed-source”.
That’s why website source code is basically ~~almost~~ source-available.
A reminder: in the past, large pages downloaded all their stuff at once. In contrast, with dynamic imports the first load is much, much faster, and that matters most. Any changes in dynamic content just require the dynamic data to be downloaded.
Unfortunately, you’re very mistaken.
In the past, pages needed to download any stuff they want to display to the user. Now, here’s the kicker: that hasn’t changed!
Pages today are loaded more dynamically and sensibly: first the basic stuff (text), then styles, then scripts, then media.
However, it’s not Angular, React, Bootstrap or any other framework doing the fetching. It’s the browser. Frameworks don’t change that. What they do, instead, is add additional megabytes of (mostly) bloat to download every day or week (depending on the cache timeout).
Any web page has had its HTML loaded first since the dawn of the Web. That’s the page itself. Even IE did that. At first, browsers loaded resources sequentially, but then they figured out it’s better UX to load CSS first, then the rest. Media probably takes precedence over frameworks as well (because that’s what the user actually sees).
Browsers are smart enough to cache images themselves. No framework can do it even if it wanted to because of sandboxing. It’s the browser’s job.
What frameworks do is make devs’ lives easier. At the cost of performance for the user.
That cost is multiple-fold. First, the framework has to load. That takes bandwidth, which may or may not be a steeply-priced commodity depending on your ISP contract. Loading also takes time, i.e. waiting, i.e. bad UX.
Other than that, the framework needs to run. That uses CPU cycles, which wastes power and lowers battery life. It’s also less efficient than the browser doing it natively, because it sits at a higher level of abstraction.
With phones being as trigger-happy as they are about killing “unused” apps, the frameworks in use by various websites need to spin up from being killed as often as every few minutes. A less extreme amount of “rebooting” the framework happens when low-powered PCs run out of RAM and a frameworked site is chosen by the browser to be “frozen”.
What a framework does is, basically, fill a hole in HTML and CSS - it adds functionality needed for a website which is otherwise unattainable. Stuff like cart, checkout, some complex display styles, etc.
All of this stuff is fully doable server-side. Mind you, login is so doable it didn’t even slip onto my little list. It’s just simpler for the programmer to do it all client-side (as opposed to making forms and HTML requests that much more often, together with the tiny UX addition of not needing to wait for the back-and-forth to finish).
Which itself isn’t really a problem. In fact, the “white flashes” are more common on framework sites than not.
When a browser loads any site, it loads HTML first. That’s “the site”. The rest is just icing on the cake. First comes CSS, then media and JS (those two are heavily browser-dependent as far as load priority goes).
Now comes the difference between “classic”, “js-enhanced” and “fully js-based” sites.
A classic site loads fast. First the HTML. The browser fetches the CSS soon enough, not even bothering to show the “raw HTML” for a few hundred milliseconds if the CSS loads fast enough. So the user doesn’t even see the “white flash” most of the time, since networks today are fast enough.
As the user moves through different pages of the site, the CSS is already cached: any HTML page wishing to use the same CSS won’t even need to wait for it to load again!
Then there’s the JS-enhanced site. It’s like the classic site, but with some fancy code to make it potentially infinitely more powerful. Stuff like responsive UIs and the ability to do the fancy math one would expect of a traditional desktop/native app. Having JS saves having to hand every little thing needing some computation off to the server when the browser can do it. That’s actually a privacy benefit, since a lot fewer things need to leave the user’s device. JS can even mend the page’s HTML, its internal structure and backbone, to suit its needs. That’s how powerful it is.
But, as they say, with great power comes great responsibility. Enter the frameworked-to-hell site. Initially, its HTML is pretty much empty. It’s less like buying a car and more like buying a house kit: when you “buy the car” (visit the site), it has to get built right in front of your eyes. Fun the first few times, but otherwise very impractical.
A frameworked site also loads slower by default: the browser gets the HTML first, then the CSS. Since there’s no media there yet, it goes for the JS. Hell, some sites even leave the CSS out of the empty shell of the page, so on first entry you really get blasted by the browser’s default (usually white, although nowadays theme-based) stylesheet. Only once the JS has loaded the framework can the foundation of the site (the HTML) start being built.
Once that’s been built, it has CSS, and you no longer see the white sea of nothing.
As you move through the pages of the site, each is built in-browser, on demand. Imagine the car turning into a funhouse where, whenever you enter a new room, a bell rings. An employee has to hear it and react quickly! They have to bring the Build-A-Room kit and deploy it, lest you leave before that happens!
Not only is that slow and asinine, it’s just plain inefficient. There’s no need for it in 99% of cases. It slows stuff down, wastes bandwidth, wastes data and wastes energy.
There’s another aspect of frameworked sites’ inefficiency I’d like to touch on.
It’s the fact that they’re less “dynamic” and more “quicksand”.
They change. A lot. Frameworks get updates, and using multiple ones isn’t even unheard of. Devs push updates left and right, which are expected to be visible and deployed faster than the D-Day landings.
Which in practice means that the max resource age is set very low: days, maybe even hours. So instead of the 15-MB-on-average framework being fetched once a week or month, it’s more like four to dozens of times per week. Multiply that by each site’s preferred framework and version, and add each site’s own custom code, which also takes up some (albeit usually less-than-framework) space.
That can easily cross into gigabytes a month. Gigabytes wasted.
Sure, in today’s 4K HDR multimedia days that’s a few minutes of video, but it isn’t 0 minutes of nothing.
My phone also reliably lasts a day without charging. It’s not about my battery being bad, but about power being wasted. Do you think it’s normal that, checking battery use, Chrome accounted for 64% according to the Android settings?
You bet I tried out Firefox the very same day. Googling for some optimizations led me down a privacy rabbit hole. Today I use Firefox, and battery use fell from 64% to 24%, a 40-point decrease! I still can’t believe it myself!
I admit, I tend to use my phone less and less, so my current 24% may not be the best metric, but even before, when I did, the average was somewhere between 25% and 30%.
There’s a middle-ground in all of this.
Where the Web is today is anything but.
The old days, while not as golden as they might seem to me, are also not as brown as you paint them.
even in the PHP era disabling jQuery could cause problems.
WTF. Do you think jQuery is what JavaScript used to be called or something? Pretty much everything you wrote is insane, and I say that specifically because I’ve been building webpages for 25 years. You’ve clearly never heard of progressive enhancement.