Every developer has most likely bumped into the concept of caching at some point in their career.
For some, it's a vital instrument in their everyday work that helps their code run as fast and cost-effectively as possible.
For others, it might only have been that one time they activated "that cache plugin" on a client's WordPress site.
If you belong to the first group, you probably won't need this article. Instead, you can enjoy a nice game of "Chrome dino". Open a Chrome browser and type: chrome://dino/
(send me your high score afterwards!).
For all of you not so well-travelled in the world of caching, I know you're dying to play a nice game of "Chrome dino" too, but patience you must have, my young Padawan - your time will also come.
First, let's look at what caching is and how it works.
What is cache?
Caching is the process of temporarily storing copies of data so it can be accessed more quickly.
It's a concept that has been used in computer science for decades, going all the way back to 1965, when British computer scientist Maurice Wilkes introduced memory caching.
But copying things doesn't automatically make them faster. Let's look at three examples.
In the first example, we'll see how you can reduce the server's workload by turning your dynamic content into static content.
Making dynamic content static
Back in the early days of the web, all websites were static - it was all pure, simple HTML.
Then, in the mid-90s, that all changed when multiple server-side languages (PHP, ASP, etc.) were released, along with a client-side scripting language called JavaScript.
Dynamic pages revolutionised the web and made websites more personalised and interactive. However, they also made the web a bit slower, since the user now had to wait for a server to render the content.
Having the server create the content every single time a user visited, even though nothing had changed, seemed a tad silly, so developers started implementing caching.
The caching worked by saving a copy of the server response, turning the dynamic content into static content.
That way, they could generate content upfront and serve it both quicker and cheaper to the user - they just had to remember to invalidate it when the content changed.
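The pattern can be sketched in a few lines of JavaScript. This is only a toy in-memory page cache, where the hypothetical renderPage() stands in for whatever expensive work your server normally does:

```javascript
// Toy page cache: render once, serve the static copy afterwards.
const pageCache = new Map();

function renderPage(slug) {
  // Stand-in for an expensive, dynamic server-side render.
  return `<h1>${slug}</h1>`;
}

function getPage(slug) {
  // Serve the cached static copy if we have one.
  if (pageCache.has(slug)) return pageCache.get(slug);
  const html = renderPage(slug);
  pageCache.set(slug, html);
  return html;
}

function invalidatePage(slug) {
  // Called whenever the underlying content actually changes.
  pageCache.delete(slug);
}
```

The hard part, as we'll get back to, isn't the caching itself - it's remembering to call invalidatePage() at the right moments.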
Let's see what this could look like in the real world.
Example: Soup of the day
Imagine a restaurant serving "soup of the day". Each day, a new, different soup is available.
Which soup is available that day may depend on which ingredients are in surplus in the kitchen and what the chef feels like making.
Since the restaurant doesn't know what the soup of the day will be, they can only write "Soup of the day - ask the waiter" on their menu.
(It's really difficult for me not to write a witty "ask the server" pun here).
That's fine for a small restaurant, or one where a waiter takes your order each time. Now imagine a busy street food restaurant serving hundreds of customers throughout the day.
Getting asked "What is today's soup?" a hundred times a day, every single day, would turn any man or woman into a Soup Nazi, yelling "NO SOUP FOR YOU!" the 249th time the question was asked.
The restaurant could instead "cache" this request by writing "Soup of the day: Chicken soup" on a blackboard above the counter.
The chef could then easily "invalidate" this cache once a day, when the soup changes, or when it sells out.
In the next example, we will see how we can get the soup... sorry, I mean content, closer to the user by using caching.
Moving the content closer to the user
The location of the server hosting a website can greatly impact the website's speed.
If the user is located far away from the server, the response will be slow compared to someone located close to it.
The closer the user is to the server, the quicker the response. This is also why there's big money in renting out servers as close to the stock exchanges as possible - for high-speed trading bots, every millisecond counts.
Luckily, we don't need our servers quite as close to our users as a trading bot needs to be to the trading floor, but they do need to be somewhat close to the users' geographic area.
Setting up multiple independent servers to cover multiple geographic areas is inefficient, costly, hard to scale, and a nightmare to manage. Luckily, there's a better solution - a CDN.
A CDN (Content Delivery Network) is a group of geographically distributed servers that work together to deliver your content to the user quickly.
It stores a copy of your data, including HTML pages, JavaScript files, stylesheets, images, and even videos.
CDN providers have a ton of locations available. Cloudflare, for instance, has data centres in around 275 cities across more than 100 countries - imagine if you had to set that up yourself!
Let's try to translate a CDN into a real-world example.
Example: The electrician and his van
Say you're an electrical contractor running a successful company with several employees. Since you're an electrical contractor and not a baker, you go to your customers instead of your customers coming to you.
The customers can be located far away from the workshop where you store all your tools, cables, and other supplies.
Having to go back and forth between the workshop and the customer each time you need a tool or cable would be quite cumbersome.
Instead, the electrician "caches" his most-used tools and supplies in his van. That way, the things he needs are close at hand, and he can quickly grab them.
Each day, when he gets back to the workshop, he can "revalidate his cache" by filling up the van with new supplies, charging his power tools, and preparing the van for the next job.
Image source: https://www.fieldpulse.com/blog/electrician-van-setup/
In the last example, we will see how we can cache our data in faster storage.
Moving the data to faster storage
Not all storage is built the same. You'll know what I'm talking about if you've ever switched from a hard drive to an SSD.
Hard drives, which store data on mechanical spinning disks, are great at storing large amounts of data, since they're incredibly cheap in terms of cost per GB.
However, mechanical parts just aren't as fast as something without any moving parts, like an SSD, which stores its data on flash memory. SSDs are much faster but (as you may already have guessed) also more expensive.
Over the years, many web hosts have started offering plans with SSD hosting to make their services even faster.
But do you know what's even faster and way more expensive than SSD storage? RAM storage.
That's right, you can store your files in RAM. Benchmarks have shown read speeds 6.3 times faster than an SSD - insane!
But no sane person would store websites in memory (RAM), would they? Indeed, they would.
Redis is an in-memory data store that you can use as a database - or as a cache in front of your regular database. The speed of Redis makes it ideal for caching database queries, complex computations, API calls, and session state.
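The usual way to use Redis as a cache is the "cache-aside" pattern: check the cache first, and only hit the database on a miss. Here is a minimal sketch of the idea, where a plain Map stands in for the Redis client and queryDatabase() is a hypothetical slow query:

```javascript
// Cache-aside sketch. A Map stands in for a Redis client here;
// with real Redis you would use GET/SET (typically with a TTL).
const cache = new Map();

// Hypothetical stand-in for a slow database query.
function queryDatabase(userId) {
  return { id: userId, name: `User ${userId}` };
}

function getUser(userId) {
  const key = `user:${userId}`;
  if (cache.has(key)) return cache.get(key); // cache hit: skip the database
  const user = queryDatabase(userId);        // cache miss: hit the database…
  cache.set(key, user);                      // …and store the result for next time
  return user;
}
```

The shape is the same with a real Redis client; the hard part is deciding when to delete keys so users never see stale data.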
Now, I know what you're thinking. "I'm just a simple developer trying to make my way in the universe. I'm not going to set up and maintain a separate database merely for caching."
I get that, boy, do I get that. As the late Phil Karlton once stated: "There are only two hard things in Computer Science: cache invalidation and naming things".
If only there were a way to reap the fruits of in-memory storage without the hassle of setup, maintenance, and, more importantly, cache invalidation.
You can see where this is going, can't you?
Yes, this is indeed shameless self-promotion. Don't worry, I'll be sure to whip myself three times afterwards and put on the cone of shame for the rest of the day.
Enterspeed is a way to cache your data in high-speed, in-memory storage without the hassle of maintenance and cache invalidation.
We offload your data, which you can combine from multiple data sources (as well as transform), into a Redis database. This essentially decouples your server and makes your dynamic content static.
If you want to know more about how this all works, head over to Enterspeed.com to read more.
Well, that's enough self-promotion (for now). Let's look at the real-world example for this.
Example: An RV with built-in car storage
After thinking about it for many years, you've finally decided that you're going to see Europe. You'll take a few months off work, rent an RV, and visit all the places you've dreamt about seeing.
An RV is perfect, since you don't have to plan far ahead and can stay at each place for as long as you like. One problem, though. While an RV is great for many things, it isn't exactly an easy or fast vehicle to operate - especially not on those tiny European streets.
You'll need something else for those small sightseeing tours. Therefore, you decide to add a car to your travel plans.
What solves this problem is an RV with built-in car storage. That way, you can use your RV to travel from location to location, and the car to go sightseeing within each location.
Image source: https://www.concorde.eu/modelle/liner
If you're not Scrooge McDuck, this could also simply be a bike or a scooter on the back of the RV - but how cool is a freaking built-in car garage!
Now, before moving on to the "how to" part of the article, let's look at some of the different types of web caching.
Web caching types
There are several types of caching. Some of the most used in web caching include:
- Client-side caching: A cache stored on the user's computer.
- Browser caching: A cache stored on the user's computer.
- Server-side caching: A cache stored on the server.
- CDN caching: A cache stored across multiple CDN servers.
- Reverse proxy caching: A cache stored on a reverse proxy server.
Now you're probably thinking: "Wait, aren't client-side caching and browser caching the same thing?"
Not exactly. They're similar in where they store their data (on the client), but they're not the same thing. To add to the confusion, client-side caching can be implemented using the browser's storage APIs.
However, what differentiates them isn't how they store data but rather what types of data they store.
Client-side caching is used to store responses from the server, e.g., API requests, which reduces the number of requests that need to be made to the server.
Browser caching, on the other hand, stores static resources like images, videos, fonts, stylesheets, HTML files, JavaScript files, etc.
Both client-side caching and browser caching are controlled by the user's browser. The developer can control the cache headers (for instance, when the cache should be invalidated), but it's ultimately up to the web browser to interpret and enforce them.
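The main knob the developer has here is the standard Cache-Control response header. For example:

```http
Cache-Control: public, max-age=3600
Cache-Control: private, no-cache
Cache-Control: no-store
```

The first line allows any cache (including shared ones like CDNs) to store the response for an hour; the second marks it as user-specific and requires revalidation before reuse; the third forbids caching entirely. These are hints, though - as noted above, it's the browser that interprets and enforces them.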
This brings us to server-side caching, which stores the cache on... you guessed it - the server. Here, the developer is in full control. There are several types of server-side caching, some of which are:
- Page caching: Stores the full HTML of a page, so the server doesn't have to generate it dynamically (as explained in the "Making dynamic content static" example).
- Database caching: Stores frequently used data from the database (as explained in the "Moving the data to faster storage" example).
- Object caching: Stores complex data structures or objects, which reduces the number of database queries needed (also explained in the "Moving the data to faster storage" example).

The next type of caching is CDN caching, which we covered in the "Moving the content closer to the user" example, so we won't dive too much into that.
Finally, we have reverse proxy caching. A reverse proxy cache, also known as a reverse HTTP cache, is also a type of server-side cache.
It sits in front of the server(s) and acts as a buffer between the client and the server. When a request arrives, the reverse proxy forwards it to the server and caches the response. Future identical requests can then be served from the cache instead of hitting the server.
A reverse proxy cache is therefore a great way to improve not only the performance of a website but also its scalability - a reverse proxy can even double as a load balancer.
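The core logic of a caching reverse proxy fits in a few lines. In this sketch, the hypothetical fetchFromOrigin() stands in for forwarding the request to the real server, and a counter shows how rarely the origin actually gets hit:

```javascript
// Minimal sketch of a reverse proxy cache, keyed by URL.
const proxyCache = new Map();
let originHits = 0;

// Hypothetical stand-in for forwarding the request to the origin server.
function fetchFromOrigin(url) {
  originHits++; // track how often the origin actually does work
  return `response for ${url}`;
}

function handleRequest(url) {
  if (!proxyCache.has(url)) {
    proxyCache.set(url, fetchFromOrigin(url)); // cache miss: forward to origin
  }
  return proxyCache.get(url); // cache hit: the origin never sees the request
}
```

Real reverse proxies like Varnish or nginx add TTLs, honour Cache-Control headers, and handle invalidation, but the idea is the same: absorb repeated requests before they reach the server.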
Before moving on to some of the ways you can implement caching on your website, we need to look at two more types of cache: private and public.
A private cache can only be accessed by a specific user or user group. It can be stored both client-side and server-side.
Some examples of things that can be stored in a private cache are:
- User preferences (preferred language, dark mode, etc.)
- Personal data (name, e-mail address, etc.)
- Private messages
A public cache, on the other hand, is accessible to all users and is often stored on the server. Some examples of things that are stored in a public cache are:
- Commonly accessed resources (images, videos, product catalogues, etc.)
- Public content (articles, blog posts, etc.)
Now that that's all settled, it's time to move on to how you can use caching on your website with some easy wins.
Using cache on your website
Designing and setting up a caching strategy can be difficult. There are several ways to tackle caching, and it all depends on what your setup looks like and what your needs are.
Installing a caching plugin
If you're using a CMS like WordPress, one of the quickest ways to start utilising caching is by installing a cache plugin.
One of the most beloved WordPress caching plugins is WP Rocket. They've managed to take something complex and make it extremely easy, yet still powerful and configurable. They've also made it easy to integrate with a CDN in just a few clicks.
Setting up a CDN
Implementing a CDN is an extremely easy way to add caching to your website.
If your site is built on the Jamstack principles, you're probably already using one via a provider like Netlify, Vercel, Cloudflare Pages, etc.
If not, it's as simple as setting up an account with a CDN provider and updating your nameservers. Cloudflare offers a generous free plan and is really simple to set up. You can read more about the setup here.
Implementing ISR (Incremental Static Regeneration)
It's no secret that I'm a big fan of Next.js and its many fantastic features. One of the features I adore is Incremental Static Regeneration.
When rendering a page, the choice is usually between SSR (Server-Side Rendering) and SSG (Static Site Generation). Due to the risk of poor SEO and slow initial load, CSR (Client-Side Rendering) isn't often used on "regular" websites (non-app websites).
SSR is great since it makes your pages dynamic, but it can be slow, since the user must wait for the server each time.
SSG is great since it's super-fast, but it isn't easy to update, since you have to do a deployment each time you want to change something.
Well, ISR completely changed the game. Like a peanut butter and jelly sandwich, it took two great things and combined them into something even better: the power of SSR with the power of SSG.
ISR caches each page individually. When a user visits the page, Next.js checks whether new content is available. If there is, it regenerates the page in the background and, once done, invalidates the cache so the next visitor sees the updated page.
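In a Next.js page (pages router), enabling ISR is roughly a matter of returning a revalidate interval from the page's data function. The soup data below is hard-coded for illustration; a real page would fetch it from a CMS or API, and the function would be exported from a file in pages/:

```javascript
// Sketch of ISR in Next.js (pages router). In a real project, this
// function would be exported from a page file in pages/.
async function getStaticProps() {
  const soupOfTheDay = 'Chicken soup'; // pretend this came from a CMS

  return {
    props: { soupOfTheDay },
    // Serve the cached page, but regenerate it in the background at
    // most once every 60 seconds when new requests come in.
    revalidate: 60,
  };
}
```

That single revalidate field is what turns a plain SSG page into an ISR page.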
You can read more about Next.js ISR here.
Caching API responses
If you're fetching data client-side, e.g., showing your visitors data from the Star Wars API, you should cache those responses.
If the user has already made a request to see all the starships in Star Wars, there's no reason to make an identical request when they want to see them again. Instead, the response should be cached.
You can implement this yourself, for instance with a state management solution, or something as simple as the State hook in React.
You can also use data-fetching packages with built-in cache management, for instance TanStack Query, SWR, etc.
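A hand-rolled version of what those packages do can be sketched like this. The hypothetical fetchJson() stands in for a real fetch() call, and storing the promise (rather than the result) means concurrent callers share a single request:

```javascript
// Simple response cache for client-side requests.
const responseCache = new Map();
let requestsMade = 0;

// Hypothetical fetcher; in the browser this would wrap fetch().
async function fetchJson(url) {
  requestsMade++; // track how many real requests go out
  return { url, data: ['X-wing', 'Millennium Falcon'] }; // fake payload
}

async function cachedFetch(url) {
  if (!responseCache.has(url)) {
    // Cache the promise itself, so concurrent calls share one request.
    responseCache.set(url, fetchJson(url));
  }
  return responseCache.get(url);
}
```

Libraries like TanStack Query and SWR add the hard parts on top of this idea: cache invalidation, revalidation on focus, retries, and so on.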
I hope you enjoyed this article about caching. You're now free to play all the "Chrome dino" you wish.
The "Chrome dino" game is shown in Chrome when you're offline. If you want to make your website/app work offline (turning it into a PWA), one of the ways you can store data is by using a cache. You can read more about "offline data" here.