I was drawn to the Internet back in 1997, when I got a proper (ISDN!) connection at my first office position in a large European company. These were the days of MS FrontPage. The websites I built were a horrendous bunch of code, a mixture of bastardized HTML and some proprietary, prehistoric flavour of JavaScript by Microsoft.
The browser war was raging and Netscape was the clear leader. There was pretty much one screen standard: the 1024x768 resolution had replaced the legacy 800x600. It seemed huge! Screens were bulky analog monitors. Of course, we used <table> layouts and loads of 1px-square transparent GIF files as spacers to build interfaces conceived by print(!) designers.
There was no choice but to code like a pyromaniac bastard.
21 years later, it is 2018. We still build static websites (a few years ago it was almost deemed old-fashioned not to have a dynamic website), but also SPAs (Single Page Applications) and Progressive Web Apps. Virtual Reality is becoming mainstream. Fridges automatically order tomorrow's bottle of milk from an online dairy store. Standards have arisen for HTML, CSS and JavaScript. It has never been easier to build something and put it online.
Yet, turn off JavaScript and half of the Internet breaks. Because of that "cool JavaScript framework" that lets pretty much anyone build things fast, without even understanding the underlying architecture or the purpose of JS frameworks: SPAs, not presentational websites.
As is often the case, "fast" for the developer means "bad" for the end user.
As developers, for the sake of preserving universal access to the information we put online (the very reason we exist), we need to reclaim the Progressive Enhancement methodology. Here are just a few reasons why:
1. It is good for the user
- Disabled people, for whom static rendering and full-page reloads are typically still more accessible (not exclusively, but more so, and more easily).
- Search engine spiders (it is said the GoogleBot parses JavaScript, but how well exactly? Google does not recommend JS-only interfaces, so that's that).
2. It is good for the developer
- It's not hard: the html.js trick lets you write CSS that only applies in JavaScript-enabled contexts (see the sketch after this list). Using a JavaScript framework? Use the <noscript> tag, at the very least, so that everyone (including the GoogleBot) gets access to your content.
- It's not expensive, quite the contrary: you save time because your code is more maintainable and easier to debug. Thank you, Miss Separation of Concerns.
- You have no idea what devices your code will run on in two years. Build "future-proof" digital products, not sand castles crushed by the next wave.
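Here is a minimal sketch of that html.js pattern. The "no-js"/"js" class names and the .fancy-widget selector are just a common convention I use for illustration, not a standard:

```html
<!doctype html>
<!-- Minimal sketch of the html.js / no-js trick. The class names ("no-js", "js")
     and the .fancy-widget selector are illustrative, not mandated by any spec. -->
<html class="no-js">
  <head>
    <script>
      // Runs only when JavaScript is enabled: swap the class so CSS can target it.
      document.documentElement.className =
        document.documentElement.className.replace('no-js', 'js');
    </script>
    <style>
      /* Base content is always visible; the enhanced widget only appears with JS. */
      .no-js .fancy-widget { display: none; }
      .js    .fancy-widget { display: block; }
    </style>
  </head>
  <body>
    <p>Plain, crawlable content that works everywhere.</p>
    <div class="fancy-widget">Enhanced, JavaScript-driven version of the same content.</div>
    <noscript>
      <!-- Shown only when JavaScript is off: a plain fallback everyone can reach. -->
      <p><a href="/plain-version">View the basic version of this page.</a></p>
    </noscript>
  </body>
</html>
```

The inline script only runs when JavaScript is available, so visitors without it (including the GoogleBot, in doubt) keep the plain markup and the <noscript> fallback, while everyone else gets the enhanced version.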
3. This is what the Internet was built for.
4. It takes but a few minutes to grasp.
Here is a presentation I did for my bad-ass junior developers at BeCode. Have a browse.
Still not convinced? Head over to this Reddit thread.
I leave the final word to Tiffany Tse (Shopify) (source):
Considering how quickly things change, and how many new devices there are every year, it’s imperative that we continue to build websites and applications that can scale, change, and employ new features as they become available. To do this, and continue to make sure that the web is accessible for all, we need to ensure progressive enhancement is at the heart of everything we do.
Top comments (6)
Before I switched to a job that lets me develop web apps that don't work without JS (for a reason: our service is based on WebRTC), I worked on a mobile mail client that was supposed to run on virtually every device with any web browser - even including old versions of Obigo, NetFront (by Access), Opera Mini, the Nokia Browser and other obscure pieces of software now mostly extinct. I developed a markup style between HTML5 and WML that worked in all ~300 devices I ever tested: center, b, a, hr, p and table/tr/td (if not overused) work virtually everywhere (though they may not be as nicely rendered).
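For illustration, a lowest-common-denominator page built from only that element subset might look something like this (the page and its links are invented, not the original client's code):

```html
<!-- Hypothetical inbox page using only the near-universal subset:
     center, b, a, hr, p and a sparingly used table. -->
<center><b>Inbox</b></center>
<hr>
<table>
  <tr><td><a href="/msg/1">Re: meeting</a></td><td><b>new</b></td></tr>
  <tr><td><a href="/msg/2">Invoice #42</a></td><td>read</td></tr>
</table>
<hr>
<p><a href="/compose">Write a message</a> - <a href="/logout">Log out</a></p>
```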
My favorite exercise was opening the pages locally in lynx and/or w3m. If I could still use them without problems, then I was on the right track. The rest was an exercise in CSS minimalism and sparingly used JS.
Thank you for this excellent, because practical, feedback. Using lynx seems to me like a great testing choice from what I have read. Props for the extra WML mile. Accessibility for the blind takes yet more effort, I guess, but at least your approach gets you near the bull's eye.
With this approach, a11y is pretty simple, because lynx renders 99% of what a screen reader would convey and also supports keyboard shortcuts (one of my favorite WAI ARIA features). If you get all that right, the rest of good a11y is really simple.
Progressive enhancement is one of my soapboxes. I'm really excited about the increasing adoption and improvement of server-side rendering techniques for SPAs, though it frustrates me that people always couch it in terms of improved SEO. It can add so much more value than that! It works better for a11y, it usually speeds up your initial render, it can keep your site usable if your big fancy JS bundle breaks, and so much more. Progressive enhancement is better in so many ways.
Yeah, the SEO argument is a bit of a lazy one, I must admit; I use it because it wins over the audience more easily. And it is true the Googlebot can render JavaScript under certain conditions (and probably ubiquitously at some point in the future). But SEO is also a good indicator: should the content be indexed? If so, it's probably not an app, so you probably don't need a full-fledged JavaScript framework anyway. But if the content is private stuff, like mails, or to-dos, or bank statements? You're building an app. Go the whole JS way, baby. No indexing needed. Do make sure it is accessible for the blind, though.
Still, if you are unsure because you are fresh and young in the dev world, remember this: HTML is cheap, and future-proof.
"First html,
then css,
then javascript.
In that order
_____"
Love that quote!
Very good presentation.