Intro
I wanted to migrate a static website from a VPS to a CDN to improve loading time and SEO performance. After a few searches, I discovered a sleek new CDN called BunnyCDN, which tops the latency performance charts with an average of around 40ms. That's exactly what I was looking for!
Note: BunnyCDN was still a young, fresh company back in 2020, and in 2024 it is still leading the global latency rankings.
It provides the following key advantages:
- Automatic replication across 15 Storage Regions
- Delivers content through 153 CDN PoPs (Points of Presence)
- Cheap price
- Company based in Europe (Slovenia) 🇪🇺
To set up a project on BunnyCDN, you need to create a Storage Zone, similar to an S3 bucket or an OpenStack Swift container. Then you link the storage to a CDN, also called a Pull Zone.
There are three methods to upload/download files to the Bunny Storage:
- Manage files through the web interface: not ideal for uploading hundreds of files
- Manage files through FTP: when you create a storage zone, the interface gives you FTP credentials. Better for bulk uploads, but still not ideal for automating the process.
- Manage files through the Bunny API: the best solution for handling thousands of files and automating the process.
The last option is the way. *Mandalorian flute sound*
Request the Bunny Storage API
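Under the hood, the Storage API is plain HTTPS: you upload, download, or delete a file with a PUT, GET, or DELETE request on storage.bunnycdn.com, authenticated with the storage zone's AccessKey header. Here is a minimal raw-upload sketch (Node 18+ for the global fetch; the zone name, key and file are placeholders) to show what any SDK has to wrap:
// Raw upload to the Bunny Storage API (default endpoint, Node 18+ global fetch).
// STORAGE_ZONE and ACCESS_KEY are placeholders for your own credentials.
const fs = require('fs');

const STORAGE_ZONE = 'my-storage-zone';
const ACCESS_KEY = 'xxxxxxxx-xxxx-xxxx';

async function rawUpload(remotePath, fileName, buffer) {
  // PUT https://storage.bunnycdn.com/{zone}/{path}/{file}
  const res = await fetch(`https://storage.bunnycdn.com/${STORAGE_ZONE}${remotePath}${fileName}`, {
    method: 'PUT',
    headers: { AccessKey: ACCESS_KEY, 'Content-Type': 'application/octet-stream' },
    body: buffer
  });
  console.log(res.status, await res.text());
}

rawUpload('/', 'superPicture.jpeg', fs.readFileSync('./superPicture.jpeg'));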
As my project was powered by NuxtJS, I created a Node SDK to communicate with the Bunny Storage API, named bunnycdn-storage-api-node-sdk.
Yes, that's a super long name; I couldn't find a better one. I should have gone with something like BSANSDK.
When I update and build the static website for production, the SDK automatically uploads the new files to the storage. BunnyCDN then replicates the files into each Storage Region.
The Node SDK was created in 2020. Here is what matters most about it:
- 🦄 Simple to use: only 4 methods: `putFile`, `getFile`, `deleteFile` and `getFiles`.
- 🚀 Vanilla JavaScript with only one dependency: rock-req, used to make HTTP requests. Rock-req is faster than `fetch` or `axios`.
- ✅ 100% tested
First, install the package with:
npm install --save bunnycdn-storage-api-node-sdk
Then create a storage instance by providing your Bunny Storage API key and the Storage Zone name:
const storage = require('bunnycdn-storage-api-node-sdk')(API_STORAGE_KEY, API_STORAGE_ZONE);
To upload a file, use the `putFile` function and provide a callback as an argument:
const fs = require('fs');
const PATH = '/path/';
const FILE_NAME = 'superPicture.jpeg';
const FILE_BUFFER = fs.readFileSync('./' + FILE_NAME);
// callback version
storage.putFile(PATH, FILE_NAME, FILE_BUFFER, (err, resp, data) => {
  console.log(data.toString()); // { "HttpCode" : 201, "Message" : "File uploaded." }
});
Each function supports both callbacks and Promises; the following `putFile` receives the API result as a Promise:
// Promise version
storage.putFile(PATH, FILE_NAME, FILE_BUFFER).then(data => {
  console.log(data.toString()); // { "HttpCode" : 201, "Message" : "File uploaded." }
}).catch(err => {
  console.log(err);
});
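And since a Promise is returned when no callback is passed, the same call drops straight into async/await code; a quick sketch reusing the constants from above:
// async/await version of the same upload
async function upload() {
  try {
    const data = await storage.putFile(PATH, FILE_NAME, FILE_BUFFER);
    console.log(data.toString()); // { "HttpCode" : 201, "Message" : "File uploaded." }
  } catch (err) {
    console.log(err);
  }
}

upload();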
To download a file, execute `getFile` and pass a callback, or use the returned Promise:
// callback version
storage.getFile(PATH, FILE_NAME, (err, resp, data) => {
  // data is a buffer
});
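The Promise form works the same way; assuming it resolves with the same buffer the callback receives, saving the download to disk is a one-liner:
// Promise version: download the file and write it next to the script
storage.getFile(PATH, FILE_NAME).then((data) => {
  fs.writeFileSync('./downloaded-' + FILE_NAME, data); // data is a buffer
}).catch((err) => {
  console.log(err);
});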
To delete a file, call the `deleteFile` function:
// callback version
storage.deleteFile(PATH, FILE_NAME, (err, resp, data) => {
  console.log(data); // { HttpCode: 200, Message: 'File deleted successfuly.' }
});
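`deleteFile` also accepts a bare path: judging from the deploy script later in this post, deleting a directory removes everything under it, which is how the whole Storage Zone gets cleared:
// Deleting the root path empties the Storage Zone
// (the deploy script below uses exactly this call in clearStorage)
storage.deleteFile('/').then((data) => {
  console.log(data);
}).catch((err) => {
  console.log(err);
});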
Finally, to get the list of files and directories, call the `getFiles` function:
storage.getFiles('/').then(data => {
  console.log(data) // data is an array of objects, a list of files
}).catch(err => {
  console.log(err);
});
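Each entry describes a file or a directory; the Storage API listing exposes fields such as ObjectName and IsDirectory (check the raw response if the field names differ in your zone), so filtering the result is straightforward:
// Print the names of the files at the root of the zone, skipping directories
storage.getFiles('/').then((files) => {
  files
    .filter((entry) => !entry.IsDirectory)
    .forEach((entry) => console.log(entry.ObjectName));
}).catch((err) => {
  console.log(err);
});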
The Node SDK is limited to the Edge Storage; it does not yet support the Bunny.net API or the Stream API. That could be a great evolution!
Finally, to automate the update of the website when a new build is created, I wrote a little JS script that is executed at the end of the build process:
require('dotenv').config()
const fs = require('fs')
const path = require('path')

// Initialise the SDK with the Storage Zone credentials stored in .env
const storage = require('bunnycdn-storage-api-node-sdk')(
  process.env.CDN_STORAGE_API_KEY,
  process.env.CDN_STORAGE_ZONE
)

uploadNuxtFiles()

async function uploadNuxtFiles() {
  // The static build generated by Nuxt lives in the dist directory
  const PATH = path.join(__dirname, '..', 'dist')
  // Remove the previous build from the Storage Zone
  await clearStorage()
  // Upload every generated file, keeping the directory structure
  const files = getFiles(PATH)
  files.forEach((filePath) => {
    const FILE_PATH = path.dirname(filePath.replace(PATH, ''))
    const FILE_NAME = path.basename(filePath)
    const FILE_BUFFER = fs.readFileSync(filePath)
    storage.putFile(FILE_PATH, FILE_NAME, FILE_BUFFER, (err, resp, data) => {
      if (err) {
        throw new Error(err)
      }
      // eslint-disable-next-line no-console
      console.log(JSON.parse(data.toString()), filePath.replace(PATH, ''))
    })
  })
}

/** UTILS */

// Recursively walk a directory and return the list of file paths
function getFiles(dir, files_) {
  files_ = files_ || []
  const files = fs.readdirSync(dir)
  for (const i in files) {
    const name = dir + '/' + files[i]
    if (fs.statSync(name).isDirectory()) {
      getFiles(name, files_)
    } else {
      files_.push(name)
    }
  }
  return files_
}

// Delete the whole content of the Storage Zone by deleting the root directory
function clearStorage() {
  return storage
    .deleteFile('/')
    .then((data) => {
      // eslint-disable-next-line no-console
      console.log(data)
    })
    .catch((err) => {
      throw new Error(err)
    })
}
Code break-down:
- First, it loads the environment variable file `.env`.
- The Node SDK used to request the Bunny API is initialised.
- The uploadNuxtFiles function is executed:
  - Old files are deleted from the Bunny Storage with the `clearStorage` function.
  - The `dist` build directory is read to get the list of new files.
  - New files are uploaded using the `putFile` function.
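To run the script at the end of the build, it can be hooked into the package.json scripts. A sketch (the script path and names are only an example, adapt them to your project):
{
  "scripts": {
    "generate": "nuxt generate",
    "deploy": "npm run generate && node ./scripts/upload-to-bunny.js"
  }
}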
Voilà 🎉
Conclusion
The Node SDK and BunnyCDN do a great job of delivering my static website worldwide, simply and automatically.