Introduction
This year Luckybeard went through a design overhaul that saw an update in our branding as a company. Along with this change came an updated website and an opportunity to put forward what we know websites can be: sexy, performant, accessible and SEO-friendly. In today’s world, websites try hard to balance these various factors, but very few get it right. So how did we get it right? Lots of pedantic fine-tuning and creativity, a hallmark of Luckybeard devs.
The Performance Problem
Starting off, we had the design: a sleek and sexy Bauhaus-driven multi-pager with slick animations and stunning graphics. It looked good in Figma, but we now faced the problem of converting the pretty designs into working, responsive code. The first hurdle was performance. The site, especially our landing page, is media-rich and filled to the brim with sleek animations and fancy graphics. This posed a problem because all of this content also needed to be handled through a CMS. The result is sexy, but that is not worth very much if you have to wait for 30 MB of content to load before you can use anything. We broke the problem down step by step and now have an FCP (first contentful paint) of 1.4-1.6s on mobile and about 0.3-0.5s on desktop, with LCP (largest contentful paint) scores of 2.4-2.6s and 0.8-1s respectively. So, how did we do it?
Lazy Load the Non-essentials
This is a pretty obvious one, and one that is often talked about, but lazy loading makes a big difference. However, it is not enough to just implement lazy loading. There is a balance. Too much lazy loading and your site feels slow, clunky or jittery. We took the approach of considering what elements of our pages were most needed for the first view of the page and prioritised the loading of those elements. Beyond that, we either defer the loading of the other content until everything else on the site is done loading, or in some cases all the way until a viewer is about to scroll that element into view.
The win: The initial load time of the page was decreased because we needed to fetch fewer resources to show the viewer a page. An additional benefit of this is that a viewer who lands on our home page but never scrolls down doesn’t have to load any of the media or content that they will never see.
Our strategy: We chose to load essential CSS, text and layout elements first, followed by media that was visible on the first view of the page. Anything else was left until later.
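As a rough sketch of that prioritisation (the element positions, viewport threshold and data-src convention below are illustrative assumptions, not our exact implementation):

```javascript
// Sketch: split media elements into "load now" vs "defer" based on whether
// they start inside the first viewport. Positions are illustrative.
function partitionByViewport(elements, viewportHeight) {
  const eager = [];
  const deferred = [];
  for (const el of elements) {
    (el.top < viewportHeight ? eager : deferred).push(el);
  }
  return { eager, deferred };
}

// In the browser, deferred elements can be wired to an IntersectionObserver
// so they only fetch once the viewer is about to scroll them into view:
if (typeof IntersectionObserver !== 'undefined') {
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src; // swap in the real source
        observer.unobserve(entry.target);
      }
    }
  }, { rootMargin: '200px' }); // start fetching slightly before it is visible
  document.querySelectorAll('[data-src]').forEach((el) => observer.observe(el));
}
```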
Clever Cloudflare Caching
The next step we took was to integrate Cloudflare as the hosting front for our site. This allowed us to make use of Cloudflare's caching rules to ensure that people accessing our site didn’t always have to wait for our server to respond. In fact, in most cases a viewer will never hit our server because the page they are looking for will have been cached in a location near them by Cloudflare. This shortens response times and balances the load on our server by avoiding unneeded rebuilds of pages whose content does not change frequently. The trade-off is that changes made through the CMS need to trigger a rebuild, but the advantage is speed, lots of speed.
The win: CSS files, videos, images, JavaScript, other assets and even whole pages are all cached through Cloudflare so that they can be delivered to you in the blink of an eye.
Our strategy: Earmark the content that doesn’t change frequently and store that in Cloudflare’s cache to prevent viewers of our site from having to wait for the site to build just for them on every page load.
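The shape of this earmarking can be sketched with a Cache-Control helper; the path list and TTLs below are assumptions for illustration, and the commented-out hook shows where it would live in a SvelteKit app:

```javascript
// Illustrative sketch: long-lived pages get an s-maxage directive so that
// Cloudflare's edge cache can serve them without touching our server.
// The paths and TTLs below are placeholders, not our real configuration.
const LONG_LIVED_PAGES = ['/', '/about', '/work'];

function cacheHeaderFor(pathname) {
  return LONG_LIVED_PAGES.includes(pathname)
    ? 'public, max-age=0, s-maxage=86400' // edge may hold the page for a day
    : 'no-store';                         // frequently changing pages skip the cache
}

// Applied in a SvelteKit handle hook (hooks.server.js) it would look like:
// export async function handle({ event, resolve }) {
//   const response = await resolve(event);
//   response.headers.set('Cache-Control', cacheHeaderFor(event.url.pathname));
//   return response;
// }
```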
Proxied Media
Once we had Cloudflare in place, we could begin using it to improve our content delivery. For our site we used a headless CMS with a content delivery network to structure our content and pages. The drawback was that large pieces of media on our site came from a third-party origin, and some third-party fonts and scripts we needed had the same issue. This meant that we lost points in Lighthouse for not using first-party assets, but it also meant that we couldn’t cache those files for our site users. Our solution was to use SvelteKit's server endpoints to create proxies for our media assets. This allowed us to call those assets from our server and then return the result to the front-end. Cloudflare then saw that result and allowed us to cache it, so the next time someone requested that resource, they would get the cached version.
The win: We gained more control over the assets on our site and managed to make use of Cloudflare's caching, even for third-party assets.
Our strategy: We made use of SvelteKit’s server endpoints to forward requests to our content CDN and to other third parties. A request to any of those places would first go through our server and then on to wherever it was originally headed. This did mean that the first request for an asset took longer, but because we could then cache the result, every request after the first was heaps faster.
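A minimal sketch of such a proxy endpoint follows; the CDN origin and route shape are hypothetical placeholders, and in the real route file the handler would be an exported GET:

```javascript
// Hypothetical proxy endpoint, e.g. src/routes/media/[...path]/+server.js.
// The CDN origin below is a placeholder, not our real content CDN.
const CDN_ORIGIN = 'https://cdn.example-cms.com';

function upstreamUrl(path) {
  // Map /media/<path> on our own domain to the third-party asset URL
  return `${CDN_ORIGIN}/${path}`;
}

// In the route file this would be `export async function GET(...)`.
async function GET({ params, setHeaders }) {
  const upstream = await fetch(upstreamUrl(params.path));
  setHeaders({
    'Content-Type': upstream.headers.get('content-type') ?? 'application/octet-stream',
    // Mark the asset as long-lived so Cloudflare caches the proxied copy
    'Cache-Control': 'public, max-age=31536000, immutable',
  });
  return new Response(upstream.body);
}
```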
Critical CSS
The next problem to tackle was our CSS. We have a lot of styles on our site, but only a fraction of them are needed when you first load a page. These are what is known as critical CSS. This caused a problem at first because SvelteKit doesn’t handle critical CSS out of the box. After much experimentation, we wrote custom middleware for our SvelteKit node server that intercepts any request for an HTML page and inlines the critical CSS for that page. We then went further and made the middleware minify the HTML to decrease the overall file size. As with proxied media, this meant that straight requests to the server took longer, but thanks to our caching we could cache the result after the first request, so subsequent viewers got the nicely inlined and minified HTML without having to wait for our server.
The win: Much faster page load times with smaller overall file sizes being delivered to the viewer.
Our strategy: We focused on delivering the most needed assets to a viewer as quickly as possible while leaving the rest for after our page had finished loading. Using inline CSS, we give the viewer only the styles they need to see what is on their screen, while loading the rest after the site has loaded. We were then able to cache the HTML page with the inlined CSS so that this work does not have to be repeated on every request.
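The core of such middleware can be sketched as two small transforms. This is a simplified illustration: a production version would extract the critical rules with a dedicated tool (such as `critters`) and a proper HTML minifier rather than the naive string handling shown here:

```javascript
// Illustrative sketch of the middleware's two steps: inline a page's
// critical CSS into the <head>, then lightly minify the HTML.
function inlineCriticalCss(html, criticalCss) {
  // Insert the critical rules just before the head closes
  return html.replace('</head>', `<style>${criticalCss}</style></head>`);
}

function minifyHtml(html) {
  return html
    .replace(/>\s+</g, '><') // drop whitespace between tags
    .trim();
}

// In an Express-style wrapper around the SvelteKit node handler, a
// middleware would intercept text/html responses, run them through
// inlineCriticalCss + minifyHtml, and send the result downstream.
```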
Service Workers
So far, everything we had done was aimed at shortening the time it took to get the content of our site from the internet to the viewer. Being smart about what we loaded, when we loaded it and how often we recreated it meant that we could deliver just the right amount of content to a viewer in as little time as possible. However, there was still more we could do. Even with all that caching, viewers would still have to fetch files and assets from our cached servers every time they loaded the site. No matter how optimised we make the caching and the server, there is always going to be some wait time. This is when we turned to service workers. Using a custom service worker script that we wrote, we are able to catch specific assets and cache them on the viewer’s device. This means that once the main landing page video is loaded for that person, they never need to fetch it again unless we update it. Even then, it only needs to be fetched once more before it is cached for that viewer again.
The win: After the first load, viewers of our site are treated to an extremely fast and snappy experience. Our servers take less strain because images do not need to be served as often and viewers use less data when navigating our site.
Our strategy: By prioritising the files on our site that didn’t need to be fetched more than once, we were able to smartly cache safe files on a viewer’s device after they had downloaded the site on their first visit. Rarely changing files like fonts, logos, banners and many images no longer need to be requested on every page load.
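A cache-first service worker along these lines might look like the sketch below; the cache name and path prefixes are illustrative assumptions rather than our actual script:

```javascript
// Sketch of a cache-first service worker for rarely changing assets.
// Cache name and path prefixes below are placeholders.
const CACHE_NAME = 'site-static-v1';
const CACHE_FIRST_PREFIXES = ['/fonts/', '/logos/', '/media/'];

function shouldCacheFirst(pathname) {
  return CACHE_FIRST_PREFIXES.some((prefix) => pathname.startsWith(prefix));
}

// Browser-only wiring (only runs inside a service worker context):
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    const { pathname } = new URL(event.request.url);
    if (!shouldCacheFirst(pathname)) return; // let the network handle it
    event.respondWith(
      caches.open(CACHE_NAME).then(async (cache) => {
        const cached = await cache.match(event.request);
        if (cached) return cached; // served from the device, no network trip
        const fresh = await fetch(event.request);
        cache.put(event.request, fresh.clone()); // store it for next time
        return fresh;
      })
    );
  });
}
```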
Of course, there were many other little things we did to boost the performance of our site, like compressing images, making use of inline SVGs where possible and minimising the unneeded third-party libraries required to load the front-end. We managed to do all this while still delivering a sleek and unique experience.
Accessibility and Compliance in Practice
It is all well and good making sure that our site is delivered in the blink of an eye, but that is not helpful if the site is not accessible and usable by everyone who visits it. What we are delivering is as important, if not more important, than how quickly we deliver it. We took our site from the ground up and worked through the components with that in mind. Our goal was to make sure that our pretty designs could be meaningfully interpreted by screen readers and assistive technology. Using the handy Skilltide Accessibility Toolbar, we navigated through the site under various limiting conditions, such as using only our keyboard, or closing our eyes and trying to complete tasks using just a screen reader’s prompts.
Going through this process, we had to know the purpose of every single element and how to describe it to a visitor. We had to make sure that all of our images had alt text so that they could be described, that navigation elements were named, that the tab order of our site followed a correct and logical sequence and that foreground and background contrasts were sufficiently stark. In the end, with a few minute design tweaks, we were able to turn our site into something that any visitor could use. It is easy to ignore accessibility, but at Luckybeard we don’t like to cut corners, so we put a lot of work into creating a site that shows the world that good design, beautiful design and accessible design are not mutually exclusive.
With accessibility handled, we tackled something equally important: privacy. In this day and age it is hard to find a site where you aren’t bombarded with a million cookies and trackers. Yes, we all hate the “do you accept cookies” banner that you get on every site, but it does something amazing for us: it allows us to choose whether or not we want the website to include us in its analytics and reporting samples. Instead of having to manually block those tools yourself, you are given the choice right from the start, before you are ever tracked. Wanting to uphold the good performance and accessibility of our site, we used our own proprietary cookie blocking script, rather than a third-party script like Cookiebot, to allow visitors to accept and customise the cookies they want to allow on the site. Best of all, because we built the script ourselves, there isn’t an annoying pop-up the second you land on the site. Rather, it is triggered only once a visitor has begun navigating the site.
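The idea can be sketched like this (a simplified illustration, not our actual proprietary script): nothing is tracked until a category is explicitly allowed, and the prompt itself only appears once the visitor starts interacting.

```javascript
// Illustrative consent gate: analytics stay off until explicitly allowed.
// Category names and the #cookie-prompt element are placeholders.
const consent = { analytics: false, marketing: false };

function allow(category) {
  consent[category] = true;
}

function canLoad(category) {
  return consent[category] === true; // scripts check this before loading
}

// Browser wiring: surface the prompt on first scroll or click rather than
// the moment the page loads.
if (typeof window !== 'undefined') {
  const showPrompt = () => {
    document.querySelector('#cookie-prompt')?.removeAttribute('hidden');
  };
  window.addEventListener('scroll', showPrompt, { once: true });
  window.addEventListener('click', showPrompt, { once: true });
}
```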
Optimised for Search Engines
Finally, we tackled the SEO of our site. Everything we had done up until this point was aimed at improving how a visitor, a human, views our site. However, it is equally important to make sure that search engines like Google, as well as tools like Claude and ChatGPT, are able to find and understand our site. Thankfully, a lot of the work we had done for accessibility and performance had already gone a long way towards helping with that. However, there were still a couple of things we could do to make sure that search engines had everything they needed to crawl and process our site. The first port of call was to create a detailed schema tag so that search engines could read contextual metadata about the hierarchy and content of our pages. Each page was carefully audited and the correct meta tags and descriptors were added to its head.
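A schema tag of this kind is a JSON-LD block in the page head. As a hedged example (the organisation details and URLs below are placeholders, not our real markup):

```javascript
// Illustrative JSON-LD structured data for a page's <head>;
// names and URLs are placeholders.
const organisationSchema = {
  '@context': 'https://schema.org',
  '@type': 'Organization',
  name: 'Luckybeard',
  url: 'https://example.com',
  logo: 'https://example.com/logo.svg',
};

function schemaTag(data) {
  // Serialise the structured data into the tag that crawlers look for
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```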
Now, we live in the age of emerging AI, so we also leveraged a new standard: the llms.txt file. This is something like a robots.txt file, but instead of simply telling crawlers where to look and where not to look, it helps large language models (LLMs) like ChatGPT and Claude understand what our site is about and how to read its content.
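Under the proposed llms.txt convention, the file is plain markdown served from the site root, along these lines (the summary and page descriptions below are placeholders, not our actual file):

```
# Luckybeard

> Luckybeard is a design and development studio. This site presents our
> work, our team and how to get in touch.

## Pages

- [Work](https://example.com/work): Selected client projects and case studies
- [About](https://example.com/about): Who we are and how we work
- [Contact](https://example.com/contact): How to start a project with us
```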
Getting a site to this level is no quick or easy feat, but we are always up for a challenge. We don’t believe that good design, beautiful design, excludes good practices. We are passionate about what we do and we want to show to the world that there is a way, that it can be done.
