Again, I procrastinated away much of my time, researching and fiddling around with details that simply didn't matter.
TL;DR: Somehow I didn't write much about Preact, sorry. My main insight: load your data before rendering anything on the server with Preact, and simply pass it down via props.
Streaming HTML
First, I tried to get HTTP streaming responses running on Cloudflare Workers. My idea was, I could chunk up HTML parts and send them to the client in small packages so a slow mobile connection could start rendering the page before it finished receiving it.
Turns out, the underlying JS engine decides by itself when it has enough data to send to the client; otherwise, it simply waits and gathers more chunks. The only way to force it to send what it already has is to finish the response, which doesn't allow sending anything afterward.
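For reference, my attempt looked roughly like this (a sketch, not the exact code; the function name and chunk contents are placeholders). A TransformStream connects a writer to the Response body:

```typescript
// Sketch of the streaming attempt (hypothetical handler; the chunks are
// placeholders). Written chunks flow through a TransformStream into the
// Response body, but the runtime decides when they actually get flushed.
function streamHtml(): Response {
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();

  // Write chunks asynchronously while the Response is already returned.
  (async () => {
    await writer.write(encoder.encode("<html><head><title>Streamed</title></head><body>"));
    await writer.write(encoder.encode("<p>Early content for slow connections</p>"));
    await writer.write(encoder.encode("</body></html>"));
    // Only closing the writer guarantees the client receives everything.
    await writer.close();
  })();

  return new Response(readable, { headers: { "Content-Type": "text/html" } });
}
```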
After I found a GitHub issue by a React core dev struggling with the same problem, I stopped investigating.
Probably was premature optimization anyway.
Typing TypeScript
My next procrastination target was TypeScript. I started to use it everywhere to save myself from some bugs in the future.
I use it for infrastructure as code with Pulumi, on the backend with Cloudflare Workers, and eventually on the browser.
The autocomplete is lovely, but I hit the limits of my skills when I tried to fiddle around with some complex generics that had to be passed along a function chain. I think I lost about a week to that, and in the end, I simply used the unknown type. That way, I have to write more types manually, but at least TypeScript reminds me to do so.
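To illustrate (a made-up example, not my actual code): with unknown, the compiler refuses to touch the value until you give it a type yourself.

```typescript
// Made-up example: returning `unknown` instead of a tangled generic.
// Callers must narrow or assert the type, so nothing slips through untyped.
function parseJson(text: string): unknown {
  return JSON.parse(text);
}

interface Post {
  title: string;
}

const raw = parseJson('{"title":"Hello"}');
// raw.title;            // compile error: 'raw' is of type 'unknown'
const post = raw as Post; // manual type assertion, but at least it's explicit
```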
Yesterday, I had a nice moment where I uploaded assets to Workers KV with Pulumi and could reuse the interface I had defined for them in the Worker code that loads these assets later.
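Roughly like this (the names are mine, not from my actual code): one shared interface describes the asset both when Pulumi writes it and when the Worker reads it.

```typescript
// Hypothetical shared module: the same interface types the Pulumi upload
// script and the Worker that later serves the assets from KV.
export interface UploadedAsset {
  key: string;         // the KV key the Pulumi script writes to
  contentType: string; // stored with the body so the Worker can serve it correctly
}

// Worker side: build response headers from the shared type.
export function headersFor(asset: UploadedAsset): Record<string, string> {
  return { "Content-Type": asset.contentType };
}
```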
Server(less) Side Rendering with Preact
I planned to split up the SaaS into two services. One for customer signup, payment, etc., and one for the actual microblogging service. So I started with the customer-management service and tried to set up server(less) side rendering.
Why SSR? Because I always struggle with slow internet connections on the train, and I wanted to build something that can simply be read without a colossal bundle download on the first visit. It would also be cool if it worked without any client-side JavaScript at all.
I struggled a bit with rendering Preact on the server because I wanted to use it the same way as on the client. That doesn't work, since I can't call asynchronous code inside the rendering pipeline.
I had the impression my stack got much more complex in the endeavor to get this to work.
In the end, I threw it all away and built everything like I would build a REST API. The only difference is that I use Preact to render HTML around the JSON data whenever the Accept header doesn't include application/json.
This is the whole pipeline:
import { h } from "preact";
import { renderToString } from "preact-render-to-string";

async function handleRequest(request) {
  const query = generateQuery(request);
  const data = await queryDb(query);
  const app = (
    <html>
      <title>{data.title}</title>
    </html>
  );
  const html = renderToString(app);
  return new Response(html, {
    status: 200,
    headers: { "Content-Type": "text/html" },
  });
}
It simply serves out static HTML, so there's no client-side hydration, etc. The client doesn't even know Preact is involved. I just wanted to see how far I can get without all the client-side JS goodies.
This isn't as cool as every component loading its own data, but at least it's a straightforward design and predictable in terms of performance.
I read this article about Django for startups and will use the ideas here to structure my API design.
The nice thing is, I can later reuse the stack for client-side data fetching if that ever becomes a need.
api
  .match(request => true)
  .auth(async request => true)
  .load(async request => rawData)
  .format(rawData => data)
  .render(data => response);
I think this is a flexible design. I can match a route on anything in the request, authentication/authorization and data-loading can be async if needed, and I can drop out before rendering HTML if the Accept header requires it.
With TypeScript, I can make it more evident that these methods are all mandatory, and this way, all route definitions should look the same. I will add some other methods as the design grows.
Since everything is a function, I already wrote small helper functions I can plug in.
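A minimal sketch of how such a route could be executed (the stage names come from my design above; the dispatch logic itself is just one way to wire it up):

```typescript
// Sketch of executing one route definition. The stage names (match, auth,
// load, format, render) come from my design; the dispatcher is illustrative.
type Route<Raw, Data> = {
  match: (req: Request) => boolean;
  auth: (req: Request) => Promise<boolean>;
  load: (req: Request) => Promise<Raw>;
  format: (raw: Raw) => Data;
  render: (data: Data) => Response;
};

async function handle<Raw, Data>(route: Route<Raw, Data>, req: Request): Promise<Response | null> {
  if (!route.match(req)) return null; // not this route, try the next one
  if (!(await route.auth(req))) return new Response("Forbidden", { status: 403 });
  const raw = await route.load(req);
  const data = route.format(raw);
  // Drop out before HTML rendering when the client asks for JSON.
  if (req.headers.get("Accept")?.includes("application/json")) {
    return new Response(JSON.stringify(data), {
      headers: { "Content-Type": "application/json" },
    });
  }
  return route.render(data);
}
```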
Cloudflare Worker Limits
The limits: a single Worker can only contain 1MB of code, and all the Preact code counts toward this. The number of Workers per account is limited to 30, so overall I can only have 30MB of backend code per account. Right now, I'm under 100KB of code in one Worker; let's see when I'll hit that limit and have to split things up.
I'm using ESBuild to bundle everything up to get the most out of it. No gzip magic helping here; the size limit applies to what I upload to a worker, haha.
ESBuild is quite lovely; it works with TypeScript, doesn't need much configuration, and (at least at the moment) takes only about 10ms to do all the work.
I'm storing assets that are only needed on the client in Workers KV, a fast and cheap key-value store. It has unlimited space and very fast reads. Its limit is 25MB per file, but this shouldn't be an issue for this project. Might as well use it for storing customer media files later.
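Serving such an asset from a Worker could look like this (a sketch; the content-type mapping is a simplification, and I model the KV binding as a minimal interface so the snippet is self-contained):

```typescript
// Sketch of serving a client asset from Workers KV. `KVNamespaceLike`
// mirrors the KV binding's get(key, "text") shape; the key-based
// content-type mapping is an assumption for illustration.
interface KVNamespaceLike {
  get(key: string, type: "text"): Promise<string | null>;
}

async function serveAsset(kv: KVNamespaceLike, key: string): Promise<Response> {
  const body = await kv.get(key, "text");
  if (body === null) return new Response("Not found", { status: 404 });
  const contentType = key.endsWith(".js") ? "application/javascript" : "text/plain";
  return new Response(body, { headers: { "Content-Type": contentType } });
}
```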
Anything Else?
I'm currently using GitHub Codespaces (VSCode in the cloud) and GitHub Copilot (AI code suggest tool), and they are pleasant to work with.
With Codespaces, I can work on my code from multiple machines without committing anything to the repository. It's like AWS Cloud9, which I've been using for years, except it runs VSCode, which I like much more because of its big ecosystem. (Also, it's free in preview, hurr hurr)
Copilot isn't perfect, but it's also not intrusive. Every now and then, it hits me with a big suggestion that really saves typing, but it just drops small useless snippets most of the time.
I also checked out SonarLint as a free alternative to Snyk Code, but at the moment it didn't give much better advice than ESLint, so I'll probably set up Snyk's VSCode plugin in the next few days and see how that goes.
Next
Adding authentication with Auth0 and connecting the whole thing to Fauna. Then I should be ready to build the first features.