C#, .Net and Azure

Performance matters


Even though hardware has been improving constantly, the performance of software - especially websites - has regressed compared to where it was a decade ago.

Unless you have bleeding-edge hardware, your laptop and phone will struggle to render modern websites built with the latest and “greatest” frontend frameworks. Yet - content-wise - these websites aren’t displaying much more than they did 10 years ago.

I would even argue that most websites are displaying less (useful) content while at the same time trying to shove every possible ad, tracker and auto-playing video down your throat in a lazy attempt to monetize their content.

I’m glad I’m not the only one annoyed by this “status quo”.

Website vs. web application

Modern browsers are no longer just “dumb document viewers” and instead serve as sophisticated content delivery platforms.

It’s now possible to run an entire office suite in the browser without installing anything locally. I can even write a document with multiple people at once using Google Docs or Word Online. All of this is only possible because they are highly dynamic web apps whose focus goes way beyond just displaying content.

Meanwhile I would argue that pretty much every other type of website (especially news, weather, blog, video and media sites) should be just a website since their main purpose is to deliver content.

Yet according to the current insanity that is frontend development, everything must be a web app which leads us to..

Lag, lag everywhere

All performance benefits gained from faster networks, globally distributed CDNs and load-balanced servers are immediately offset by being forced to first download 5MB+ of JavaScript, one full-resolution (and uncompressed) PNG “hero” image and various advertisement SDKs - all of which then need to be interpreted and executed locally in the browser before the 10kB of actual content the website is supposed to show can eventually be loaded from the server in an asynchronous and “performant” manner.

State of the art web

If you run the package install command for the first time (npm, yarn or otherwise) on any moderately sized frontend application, you will end up with a 20min+ process that downloads 2000+ packages from 1000+ contributors. Any one of them could secretly be harvesting your users’ credit card information (or choose to do so in some minor update that no one bothers to audit anyway) and there’s no way of knowing.

With frontend development, the tooling gets exponentially more complex every year just to do the most basic stuff. In my opinion, the root cause is that it’s all built on JavaScript and npm: a language that was never meant to do more than a few simple DOM manipulations and doesn’t even have proper primitive types, and a package management system so bad it should never have gained traction. Yet here we are.

I can’t help but feel that it’s ridiculous that we build so much tooling on top of a shitty language just because “it’s the only language that will run in the browser”.

I’m glad we are now at least trying to create a proper standard (WebAssembly, with bytecode), but plenty of companies and individuals are happy to just pile more and more tooling on top of JavaScript (ES6, TypeScript, webpack, pnpm), hoping to drown out the smell of the underlying problems and make it somewhat usable.

I personally am happy that I am not responsible for maintaining a frontend website for multiple years (or at all).

With the way the tooling is right now, it is almost guaranteed to be a hopeless stream of firefighting and upgrading just to keep the website barely functional - to say nothing of the array of security vulnerabilities npm brings along.

I can’t help but feel that a lack of both education and care is responsible for where we are now: stuck in this crazy cycle of frontend development.

Case in point: Reddit

Here’s how reddit’s redesign worked out from a purely technical perspective (I’m not even going to tear into the UI changes they brought upon us):

Screenshot of new.reddit.com (“New”)

And here’s how it was before (hint: you can still use the old design by going to old.reddit.com):

Screenshot of old.reddit.com (“Old”)

Protip: If you disable all trackers and JavaScript on reddit (e.g. by using uBlock Origin and uMatrix) the old.reddit.com site becomes sort of usable again.

Screenshot of old.reddit.com without JavaScript (“No JS”)

So let’s break that down:

Site  | Requests | %    | Load  | %    | Finish  | %
----- | -------- | ---- | ----- | ---- | ------- | ----
No JS | 33       | 100% | 1.10s | 100% | 1.25s   | 100%
Old   | 45       | 136% | 1.62s | 147% | 1.83s   | 146%
New   | 130      | 394% | 4.98s | 453% | 10.09s* | 807%

* The new design technically never finished loading, thanks to constant tracking callbacks to the server despite me not even moving the mouse.

This is ridiculous.

Worse, frontend developers claim that these numbers are acceptable.

Their arguments include ridiculous statements such as “after the first load it will be much faster” (as if this somehow makes a 10MB+ initial request ok).

First of all, you won’t even get the chance for a second load if your users leave your website out of frustration before seeing any content. Secondly, the “faster” second load is still dog slow compared to server-side rendering (in reddit’s case, the second load of new.reddit.com comes in at 5MB+, which still manages to be 685% of the no-JS baseline).

And yet this behaviour is everywhere.

The web version of Netflix is close to unusable. If I use it on my laptop, it lags up my whole system so much that even typing in Notepad becomes laggy (and I have an 8th-gen i7). The same happens with twitch.tv.

I also haven’t used Amazon’s video streaming for over a year because once a video player is open (it doesn’t even have to be playing a video), every click on Amazon Video registers - at best - with a three-second delay.

If I visit a blog post, I’m usually greeted by an uncompressed “hero” image that covers my entire screen. If I’m lucky I also get to see the header of the post, and I definitely have to scroll down a full page height before I see any of the actual content.

If I happen to have JavaScript enabled as well, scrolling behaviour will be hijacked by a “smooth scrolling” effect that only makes my CPU fan spin up and doesn’t manage to be smooth at all, thanks to the choppy framerate it runs at.

Mobile version != substitute for a shitty frontend

Often, the sad solution ends up being “our frontend is too slow, let’s just ignore mobile end users and build apps for them”.

First of all, this doesn’t magically make your frontend faster for non-mobile users (every 100ms costs 1% of your users).

Secondly, if you are not able to maintain a fast frontend, how is maintaining a frontend and a mobile application at the same time going to do you any good? You now have two slow frontends, because the mobile app development world is almost as bad, and most clients won’t pay top dollar for either of them.

If it were up to me, all developers would be forced to browse the internet at no more than GPRS speed once per week, to realize just how bad their websites are.

JavaScript was meant as an optional enhancement of the web, not as the primary delivery mechanism. Yet it is being abused to do just that.


There is a saying:

If all you have is a hammer, ..

..everything looks like a nail.

Single page applications almost always needlessly increase the complexity of your website and hurt performance.

Eventually something will break in these giant JavaScript piles and no one will know why, so the unironic solution is to force a page reload to “fix it” - after being hellbent on never reloading the page in the first place.

If your website’s primary purpose is to deliver content, you can absolutely build a fast website using server-side rendering that needs no JavaScript, or uses it only for enhancements. Not only will it be faster for your users, it will also be more usable.
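To sketch what that means (no framework, hypothetical post data - the names here are made up for illustration): server-side rendering is nothing more exotic than building the finished HTML string on the server, so the browser receives content it can display immediately, with zero client-side JavaScript required:

```javascript
// Hypothetical post data; in practice this would come from your database.
const posts = [
  { title: 'Performance matters', body: 'Content arrives as ready-to-display HTML.' },
];

// Build the complete page on the server as a plain HTML string.
function renderPage(items) {
  const articles = items
    .map(p => `<article><h2>${escapeHtml(p.title)}</h2><p>${escapeHtml(p.body)}</p></article>`)
    .join('\n');
  return `<!doctype html>\n<html><body><main>${articles}</main></body></html>`;
}

// Escape user-supplied text so it can't inject markup.
function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}
```

Any server handler (Node’s built-in `http` module, or anything else) can return the result of `renderPage(posts)` as `text/html`; the content is visible on first paint, no 5MB bundle needed.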

Use the available tooling

If you absolutely must use the latest fancy JavaScript framework then please also use the available browser tooling to test your damn stuff (and also test in more than just Chrome).

Modern browsers have excellent developer tools built right in. You can debug and measure the performance of your websites directly in the browser.
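As a minimal sketch of what that measurement can look like (assuming a browser implementing the Navigation Timing API; `summarizeNavigation` is a made-up helper name), a few lines pasted into the devtools console give you hard numbers instead of gut feeling:

```javascript
// Summarize a PerformanceNavigationTiming-like entry into a few key metrics.
function summarizeNavigation(entry) {
  return {
    ttfb: entry.responseStart - entry.startTime,              // time to first byte (ms)
    domContentLoaded: entry.domContentLoadedEventEnd - entry.startTime,
    load: entry.loadEventEnd - entry.startTime,               // full page load (ms)
    transferKB: Math.round(entry.transferSize / 1024),        // bytes over the wire
  };
}

// In a browser console you would run:
//   summarizeNavigation(performance.getEntriesByType('navigation')[0]);
```

Run that on your own site before and after adding a framework, and the cost stops being hypothetical.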

You can even install all the browsers side by side for testing (sadly, few frontend developers seem to bother).

Browsers also have means to change the screen resolution (to simulate various devices) and even support throttling to simulate mobile connections and/or slow networks.

Less is more

Performance does matter and using less complex frameworks is an easy way to improve it.

Don’t blindly use the latest framework and follow the latest trends “just because it looks flashy”.

It seems that we - as developers - need to educate other developers more as to what tooling is available while at the same time stopping them from building needlessly complex solutions for trivial problems.

End users also need to complain more about bad software. Waiting more than a minute for a mailbox to load is not ok. Being force-fed a popup on first visit is not ok and neither is having to watch 3 advertisements before the actual content.

tagged as Rant, Performance