Why You Must Use Nx Monorepos with Existing Workspaces

tkssharma - Mar 15 '23 - Dev Community

Recently I was exploring Nx monorepos and how they can be useful for today's development with all the latest stacks.

I created a playlist to learn and explore things in Nx.
Here is the playlist:

Earlier I was just using Lerna, which is purely a workspace-based solution; the latest versions of Lerna have also adopted Nx under the hood for task running.

Most of today's development is happening around monorepos, and there are different solutions that enable a monorepo for your repository, like:

  • Lerna
  • Turborepo
  • Nx Monorepo
  • ...and many more

How and Why You Must Use the Nx Monorepo Solution

I will not talk about what a monorepo is and how it works; if you are interested, please explore more from here.

Now coming to the point: why Nx, and why should I use it? I would say it is because of its ecosystem and the features it provides:

  • Integrated monorepo
  • Package-based monorepo
  • Standalone app-based monorepo

Integrated monorepos are managed more fully by Nx, so you can focus on adding logic to your services and applications while Nx automates all the linking and script execution.
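
As a rough sketch (the workspace and app names below are made up, and the plugin package is @nrwl/nest on Nx 15 and earlier, @nx/nest on newer versions), an integrated workspace starts from a single command:

```sh
# Scaffold an integrated Nx workspace; the interactive prompts let you
# pick a preset (React, Nest, etc.) and a package manager.
npx create-nx-workspace@latest my-org

# Inside the workspace, plugin generators create apps/libs with the
# wiring (tsconfig, lint, test targets) handled by Nx.
npx nx generate @nx/nest:application orders-api
```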

Package-based monorepos are more useful where we want more flexibility: we keep a plain npm, pnpm, or Yarn workspace (existing or new) and add only the Nx capabilities we need, like script automation and caching.
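
For the package-based flavour, a minimal sketch looks like this (run from the root of an existing npm/pnpm/Yarn workspace; `my-lib` is a hypothetical package with a `build` script):

```sh
# Add Nx to the existing workspace: this creates nx.json and lets Nx
# orchestrate your existing package scripts without restructuring anything.
npx nx@latest init

# Run a package's script through Nx so the result can be cached.
npx nx build my-lib
```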

Nx speeds up development of both existing and brand new monorepos.

In this video series I have covered how we can build a pnpm workspace and add Nx capabilities to it.
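
For reference, the pnpm side of such a setup is just a pnpm-workspace.yaml at the repo root; the folder names here are only an example:

```yaml
# pnpm-workspace.yaml — tells pnpm which folders contain workspace packages
packages:
  - 'apps/*'
  - 'packages/*'
```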

I really liked how it automates and caches script execution; it also has a nice VS Code plugin (Nx Console) for running commands and works with Nx plugins.
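
The caching is driven by nx.json. A sketch of what that might look like (the exact shape depends on your Nx version: newer versions mark targets cacheable under targetDefaults, while older ones list cacheableOperations under tasksRunnerOptions):

```jsonc
// nx.json (newer-style config; treat this as an illustrative sketch)
{
  "targetDefaults": {
    "build": { "cache": true },
    "test": { "cache": true }
  }
}
```

With that in place, repeated runs of commands like `npx nx run-many --target=build` or `npx nx affected --target=test` replay cached outputs instead of re-executing the scripts (the exact flags vary a little between Nx versions).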

I would say it's worth exploring, and here is the playlist; I will be adding more examples where we have multiple NestJS microservices (a pnpm workspace with an Nx monorepo).

Happy coding, learn something new every day!
