Just a quick reminder that you can check the demo of what we are building here, and the source code here.
We did it: we are on the finish line of building our e-commerce store with Nuxt.js. A few important things still need to be handled: generating pages, setting up a sitemap, and configuring robots.txt. By generating server-side rendered (SSR) pages, we can improve the initial load time and search engine crawlability of our store. Additionally, a sitemap will give search engines a comprehensive overview of our website's structure and content, while the robots.txt file will let us control which pages and directories should be crawled or excluded. These final steps are crucial for optimizing our e-commerce store's search engine optimization (SEO) and ensuring a smooth experience for both visitors and web crawlers.
1. Generating static and dynamic pages with Nuxt.js SSR
Okay, how does page generation work in Nuxt.js SSR? First, Nuxt scans your pages folder and builds a routes array, from which it creates static webpages and their routes without any additional configuration. Then Nuxt finds the dynamic pages (the [product] folder in our case), but there are no matching entries in the routes array, and that is the main problem: we have to tell Nuxt which pages should be generated before the generation process starts. For that, we will use Nitro hooks in our Nitro server. Nitro is the server engine introduced in Nuxt 3 that aims to improve server-side rendering (SSR) performance and the development experience, and Nitro hooks are a way to extend and customize its behavior: they let you hook into different stages of the server's lifecycle and modify its configuration or behavior as needed. Let's modify our generation process step by step:
- create a "getProductRoutes" function that calls our "getAllProducts" service and fetches the product list from the database, then builds an array of product routes and returns it;
```typescript
const getProductRoutes = async () => {
  try {
    // Collect the product IDs from the database.
    const data: string[] = [];
    const productsList = await getAllProducts();
    productsList.forEach((doc: any) => {
      data.push(doc.id);
    });
    // Turn each product ID into a product route.
    return data.map((doc: string) => `/shop/${doc}`);
  } catch (error: any) {
    console.error('Error:', error.message);
    return [];
  }
};
```
- use the 'nitro:config' hook to call the "getProductRoutes" function, and push the resulting array into the prerendered routes;
```typescript
hooks: {
  async 'nitro:config'(nitroConfig: any) {
    try {
      // Add every product route to the list of pages to prerender.
      const slugs = await getProductRoutes();
      nitroConfig.prerender.routes.push(...slugs);
    } catch (error) {
      console.error('Unhandled promise rejection:', error);
    }
  },
},
```
- set the ssr config value to true.
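Putting the steps above together, the relevant part of nuxt.config.ts looks roughly like this (a sketch: your config will contain other options as well, and the optional chaining is a defensive assumption in case the prerender object is not yet defined):

```typescript
// nuxt.config.ts (sketch: only the parts relevant to page generation)
export default defineNuxtConfig({
  ssr: true,
  hooks: {
    async 'nitro:config'(nitroConfig: any) {
      try {
        // Fetch product routes and add them to the prerender list.
        const slugs = await getProductRoutes();
        nitroConfig.prerender?.routes?.push(...slugs);
      } catch (error) {
        console.error('Unhandled promise rejection:', error);
      }
    },
  },
});
```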
Okay, and now we can run the new "npm run generate" command in the command line; it will generate the static and dynamic pages and create an output folder containing all of them. You can open that folder and check the numerous product folders, each named after a product ID, and inside each one you will find a ready-to-use index.html file and its JS files. To check our generated e-commerce store we can use the "npm run preview" command, and Nuxt will launch our built project at the localhost:3000 address. Open that address in your browser window and check our website.
By the way, previously we added meta tags directly to each page, including our dynamic page; now we can verify them. Simply press "Ctrl + U" while on a product page and check the page head: it will contain the dynamically rendered meta tags.
That's great, but meta tags are not the only thing that makes your e-commerce store visible to search engines and users, so let's move on.
2. Implementing Sitemap to our Nuxt.js project
A sitemap is a file that provides information about the pages, videos, and other files on a website and their respective locations. It acts as a roadmap for search engines, helping them crawl and index the website's content more efficiently. Let's implement a sitemap into our e-commerce store.
First of all, we install the Nuxt.js sitemap module with the "npx nuxi@latest module add sitemap" command. Then we open nuxt.config.ts, make sure the sitemap module is in the modules array, add the site URL value (our base site), and that is it.
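In nuxt.config.ts that can look roughly like this (a sketch: '@nuxtjs/sitemap' is the package the command above installs, 'https://example.com' is a placeholder for our base site URL, and the exact option names may vary between module versions):

```typescript
// nuxt.config.ts (sketch: sitemap configuration)
export default defineNuxtConfig({
  modules: ['@nuxtjs/sitemap'],
  site: {
    // Base URL used to build absolute links in the generated sitemap.
    url: 'https://example.com',
  },
});
```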
To check that our sitemap was generated correctly, we simply rebuild our project and open /sitemap.xml as a route of our website.
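The generated sitemap.xml should look roughly like this (the URLs below are illustrative placeholders, not the actual output of our store):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/shop/a1</loc>
  </url>
</urlset>
```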
We are almost there; the last step is to install and configure the robots.txt file.
3. Configuring robots.txt for Better Crawlability and SEO
Robots.txt is a text file that provides instructions to web crawlers or robots on which pages or directories of a website should be crawled or ignored.
We will use the nuxt-simple-robots module to configure the robots.txt file. Run "npx nuxi@latest module add simple-robots" to install it, then include "nuxt-simple-robots" in the modules array, and that's it, we are done with this part as well. But if, for example, I do not want crawlers to see my checkout page, I can add a robots config to the nuxt.config.ts file and put the checkout page into the disallow array.
```typescript
robots: {
  disallow: [
    '/checkout',
  ],
},
```
Let's rebuild our e-commerce store project and check the result.
Great, our robots.txt file was successfully generated; it even contains information about our sitemap and the route that we blocked for crawlers. Good job, guys!
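For reference, the generated robots.txt should look roughly like this (the sitemap URL is a placeholder for our base site):

```txt
User-agent: *
Disallow: /checkout

Sitemap: https://example.com/sitemap.xml
```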
By taking the time to implement these essential components, we have taken our e-commerce store to the next level, ensuring that it is not only visually appealing and functional but also optimized for discoverability and visibility on search engines. With this foundation in place, our e-commerce store is well-positioned to attract and retain customers, ultimately driving growth and success in the competitive online marketplace.
If you need the source code for this tutorial, you can get it here.
This article was the last one in the series where we built our e-commerce store with Nuxt.js. I hope you liked it. Please subscribe to the newsletter to be informed about the next series and tutorials. See you, and Happy Coding!