After researching Angular Universal's TransferState for my last post, I wanted to find out how this works in other frameworks.
First, let's look at how the other frameworks handle it:
Angular Universal
See my last post.
What I didn't know is that it does not use cookies. It simply embeds the server data in a script tag, which you can then read later from the browser like so:
<script id="serverApp-state" type="application/json">
// json string of your data
</script>
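As a rough sketch of how that script tag gets written and read, here is what TransferState usage looks like; the 'posts' key and the service name are my own placeholders, not anything from the post above:
// A hypothetical service around TransferState.
import { Injectable } from '@angular/core';
import { TransferState, makeStateKey } from '@angular/platform-browser';

// Any string works as a key, as long as server and browser agree on it.
const POSTS_KEY = makeStateKey<any[]>('posts');

@Injectable({ providedIn: 'root' })
export class PostStateService {
  constructor(private transferState: TransferState) {}

  // Called on the server: the value ends up in the serverApp-state script tag.
  savePosts(posts: any[]): void {
    this.transferState.set(POSTS_KEY, posts);
  }

  // Called in the browser: reads the value back instead of refetching.
  getPosts(): any[] | null {
    return this.transferState.get(POSTS_KEY, null);
  }
}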
You also see this pattern with libraries like URQL when passing SSR data.
NextJS
NextJS uses getServerSideProps to transfer state. First it runs the function on the server, then it automatically passes your data to the page component as props and also embeds it in a script tag. On the client, it re-initializes the Virtual DOM with the same data from the script tag, so the UI does not change. The markup looks like:
<script id="__NEXT_DATA__" type="application/json">
// json string of your data
</script>
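For reference, a minimal getServerSideProps sketch looks something like this (the page name and the API URL are placeholders):
// pages/article.js
export async function getServerSideProps() {
  // Runs only on the server, on every request.
  const res = await fetch('https://example.com/api/article');
  const article = await res.json();

  // Returned props are passed to the page AND serialized into __NEXT_DATA__.
  return { props: { article } };
}

export default function Article({ article }) {
  // On the client, Next re-renders with the same props read from the
  // script tag, so the server-rendered HTML does not change.
  return <h1>{article.title}</h1>;
}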
NuxtJS
NuxtJS uses the same approach, but it does not stringify your data; it assigns the data object directly to a global inside the script tag. The server function is asyncData, and the output looks like:
<script>
window.__NUXT__= // json data
</script>
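A minimal asyncData sketch might look like this (I'm assuming the @nuxtjs/axios module and a made-up /api/article endpoint):
// pages/article.vue (script block)
export default {
  async asyncData({ $axios }) {
    // Runs on the server for the first request; the returned object is
    // merged into the component data and embedded in window.__NUXT__.
    const article = await $axios.$get('/api/article');
    return { article };
  }
};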
So what about SvelteKit?
SvelteKit takes a different approach. It assumes you need to fetch something on the server, and it uses the load function to accomplish this. Instead of the data you return from load being put inside a JSON script tag, the response of each fetch call made inside load is put inside a JSON script tag.
<script
type="application/json"
data-type="svelte-data"
data-url="/api/something"
>
// json string of your data
</script>
Load Function Example:
export async function load({ fetch }) {
  const res = await fetch('/api/me');
  return {
    status: res.status,
    props: {
      article: res.ok && await res.json()
    }
  };
}
As with NextJS and NuxtJS, the data is automatically added back to the DOM through Svelte's compile-time change detection.
There is just one problem: what if you're not using fetch?
Solution 1 - Pass Svelte's Fetch to Your Function
Some APIs allow you to pass a custom fetch function. You can do this in Apollo, URQL, or even Supabase. I wrote my j-dgraph package to use a custom fetch for URQL.
One of the many problems with this approach is that you have to initialize your client code within SvelteKit's load function.
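Here is a rough sketch of wiring URQL's createClient to SvelteKit's fetch inside load; the GraphQL endpoint and the query are made up for illustration:
// routes/articles.svelte (context="module" script)
import { createClient, gql } from '@urql/core';

export async function load({ fetch }) {
  // The client has to be created here so it uses SvelteKit's fetch,
  // whose responses are the ones serialized into the page.
  const client = createClient({ url: '/api/graphql', fetch });

  const result = await client
    .query(gql`query { articles { id title } }`, {})
    .toPromise();

  return {
    props: { articles: result.data ? result.data.articles : [] }
  };
}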
Solution 2 - API Endpoint
But what if your API does NOT use fetch? A perfect example of this is Firebase. You cannot pass a custom fetch to the Firebase API. This means that if you fetch your Firebase data inside the load function, you will be reading the database twice, because load runs once on the server and again in the browser. Firebase charges per read, so this is definitely not adequate.
The accepted solution to this, although not documented anywhere except by Rich Harris on GitHub, is to create an endpoint.
routes/resources.ts
import { getFirebaseDoc } from "../modules/posts";

export async function get() {
  return {
    body: await getFirebaseDoc()
  };
}
routes/content.svelte
export async function load({ fetch }) {
  const res = await fetch('/resources');
  if (res.ok) {
    return {
      props: { resources: await res.json() }
    };
  }
  return {
    status: res.status,
    error: new Error()
  };
}
By doing it this way, you're basically forcing the built-in load fetch to work the way you want. REST API endpoints are also extremely fast.
My Take?
Let's just do what getServerSideProps does in NextJS: whatever we return from load should be put in the script tag to save state, not just whatever was fetched inside it. I love how SvelteKit thinks outside of the box, but I should not need an API endpoint to accomplish this. It is more code and more fetching.
Also, SvelteKit is still in public beta, and Rich Harris now works for Vercel. NextJS puts its API endpoints in their own Serverless Functions, and at this rate SvelteKit will follow. As Rich Harris pointed out in an interview somewhere, Serverless Functions create longer cold start times, which is not good for your app's loading time. The other problem is that AWS Lambda, like Google Cloud Functions but unlike Google Cloud Run, has extremely small storage limits (10MB and 50MB). Maybe Rich Harris can invent a way around the cold start problem. Either way, I don't want my fetching done in a separate endpoint when I don't have to. However, this is how it is currently done.
J