Over the course of my 30+ year career in Information Technology, I have encountered my fair share of proof-of-concept or prototype applications that found their way into production. This has always been in direct conflict with the original intent of the work.
While this should be considered a compliment for a job well done, more often than not decisions are made during those prototyping exercises that are not production-friendly. To some degree, this very same scenario exists with the Heroku-based SaaS fitness solution I originally created for my sister-in-law.
Now, it is time to complete this part of the journey.
Recapping the Fitness Journey (so far)
For those who are not familiar, I started building a fitness application for my sister-in-law (who is a personal trainer) in 2020 during the onset of the pandemic. This idea has since turned into a SaaS solution running in Heroku – providing a multi-tenant design for personal trainers across the United States.
Below is a listing of each article in this series:
- Using Heroku to Quickly Build a Multi-Tenant SaaS Product - introduction and initial design
- Integrating Twilio Into My SaaS Solution In Heroku - introduces SMS functionality for end-users
- Leveraging CloudAMQP Within My Heroku-Based SaaS Solution - adds message-based processing for invoicing
These articles helped pave the way for the features and functionality noted in the product road map for the application.
The 1.0.5 release is going to feature some exciting functionality that uses machine learning and artificial intelligence, but a database conversion needs to happen first to better prepare the RESTful API.
Destination PostgreSQL
Initially, I used MySQL for the underlying data source, because my focus was on gathering the necessary requirements for the application's initial release. Since I was already using MySQL for another application I had running in Heroku, it was easy to leverage my existing knowledge.
I knew my end state for the data layer was going to be Heroku Postgres because of the following benefits over MySQL:
- Free & Open Source (FOSS) - my project has leveraged some amazing OSS frameworks; it only makes sense for the database to reap the benefits of open-source
- Object-Relational Database - can define objects and table inheritance (advanced data structures)
- Excellent for Complex Queries - will be the focus of fitness automation features currently being designed
- Multi-Version Concurrency Control (MVCC) - readers and writers do not block each other, eliminating the need for read-write locks when interacting with data
- Shared Across Dynos - possible to share a paid Heroku Postgres instance with multiple applications (like the application I wrote for my mother-in-law)
Migrating to Heroku Postgres
The first step in the process is to put the fitness client (written in Angular and served on Heroku via Node.js) and the service (written in Spring Boot) into maintenance mode. This can be completed using the following Heroku CLI command for each application:
heroku maintenance:on
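When the command is executed outside of an application's Git directory, the target app can be specified with the -a flag. Below is a minimal sketch; the app names are hypothetical placeholders for the actual client and service apps:
# place the Angular client and the Spring Boot API into maintenance mode (app names are placeholders)
heroku maintenance:on -a fitness-client-app
heroku maintenance:on -a fitness-api-app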
The following steps walk through the entire process of converting from MySQL to Heroku Postgres.
1. Establishing Heroku Postgres Add-On
The first step in migrating from MySQL to Heroku Postgres is to add the Heroku Postgres add-on to my fitness API. This is as simple as executing the following Heroku CLI command:
heroku addons:create heroku-postgresql:hobby-dev
The hobby-dev plan will be temporary until I am ready to shut down my existing MySQL database. As an example, let's assume the following output was returned when the hobby-dev instance was created:
Creating heroku-postgresql:hobby-dev on ⬢ sushi... free
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
Created postgresql-acute-52767 as DATABASE_URL
Using the heroku config CLI command, the full DATABASE_URL can be retrieved from Heroku, which includes the system account ID and password required to access the database. As an example, let's assume the following information was retrieved:
postgres://user:password@somehost.compute-1.amazonaws.com:5432/someDatabase
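If only this single value is needed, the config:get subcommand can retrieve it directly. A minimal sketch, using a hypothetical app name:
# retrieve just the DATABASE_URL config variable (app name is a placeholder)
heroku config:get DATABASE_URL -a fitness-api-app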
Next, I am going to use my MacBook Pro to work on the data conversion locally. The first step is to set DATABASE_URL in my local environment:
export DATABASE_URL=postgres://user:password@somehost:5432/someDatabase
Running echo $DATABASE_URL will validate that the environment variable is available.
2. Installing PostgreSQL Locally
To perform the migration locally, the Postgres.app (with PostgreSQL version 13 support) was installed on my MacBook Pro.
The Postgres command line tools were installed next:
sudo mkdir -p /etc/paths.d && echo /Applications/Postgres.app/Contents/Versions/latest/bin | sudo tee /etc/paths.d/postgresapp
I restarted my terminal session, then validated the Postgres command line tools were available using:
which psql
The command responded with:
/Applications/Postgres.app/Contents/Versions/latest/bin/psql
I verified I could access the Postgres information from Heroku using the following command:
heroku pg:info
The following response was received:
=== DATABASE_URL
Plan: Hobby-dev
Status: Available
Connections: 0/20
PG Version: 13.2
Created: 2021-05-03 04:13 UTC
Data Size: 7.9 MB
Tables: 0
Rows: 0/10000 (In compliance)
Fork/Follow: Unsupported
Rollback: Unsupported
Continuous Protection: Off
Add-on: postgresql-acute-52767
3. Using pgloader To Migrate Data
pgloader is an open-source tool for migrating data from MySQL into PostgreSQL. Using Homebrew, the installation was quick and easy:
brew install pgloader
Once installed, I validated that pgloader was ready using the following command:
pgloader --version
The command returned the following information:
pgloader version "3.6.2"
compiled with SBCL 2.0.11
Next, I created the PostgreSQL version of the fitness database using the following command:
createdb fitness
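As an optional sanity check, the new (still empty) database should now appear in the local database listing:
# confirm the empty fitness database exists locally
psql -l | grep fitness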
Using the MySQL database URL and the new PostgreSQL database I just created, the conversion was started using the following command:
pgloader mysql://userId:password@hostname/oldFitnessDatabase postgresql://localhost/fitness
From there, the pgloader command completed the migration effortlessly. It provided the following information:
2021-05-03T06:05:38.013630+01:00 LOG pgloader version "3.6.2"
2021-05-03T06:05:38.087483+01:00 LOG Migrating from #<MYSQL-CONNECTION mysql://userId:password@hostname/oldFitnessDatabase {1004FE7A53}>
2021-05-03T06:05:38.087632+01:00 LOG Migrating into #<PGSQL-CONNECTION pgsql://john.vester@localhost:5432/fitness {1005142EA3}>
2021-05-03T06:05:42.536036+01:00 LOG report summary reset
... report information here ...
Total import time ✓ 4830 231.6 kB 3.150s
I was able to establish a connection to the local PostgreSQL database and confirm that all the expected data was there.
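For anyone following along, the spot checks can be as simple as listing the tables and counting rows; the clients table below is a hypothetical placeholder for one of the actual fitness tables:
# list the migrated tables in the local fitness database
psql -d fitness -c '\dt'
# count rows in a table (table name is a placeholder)
psql -d fitness -c 'SELECT COUNT(*) FROM clients;'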
4. Getting Data Into Heroku Postgres
With the local PostgreSQL database containing the real data from MySQL, the next step is to push these changes into the Heroku Postgres instance.
In order to load the data, a compressed version of the database can be created using the following CLI command:
PGPASSWORD=mypassword pg_dump -Fc --no-acl --no-owner -h localhost -U userId fitness > fitness.dump
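Before uploading anything, the dump file itself can be sanity-checked; pg_restore is able to list the archive's contents without restoring it:
# inspect the custom-format dump without restoring it
pg_restore --list fitness.dump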
Using my existing (but rarely utilized) AWS S3 account, I uploaded the fitness.dump onto AWS S3 and generated a pre-signed URL using the following command:
aws s3 presign s3://myBucket/myKey
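For completeness, the upload itself happens before the presign step. A minimal sketch, reusing the same hypothetical bucket and key (the --expires-in value is in seconds; pre-signed URLs default to one hour):
# upload the dump file, then generate a time-limited pre-signed URL
aws s3 cp fitness.dump s3://myBucket/myKey
aws s3 presign s3://myBucket/myKey --expires-in 3600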
The last step was to restore the data into the newly created Heroku Postgres instance:
heroku pg:backups:restore '<SIGNED URL>' DATABASE_URL
Here, DATABASE_URL refers to the config variable created when the Heroku Postgres add-on was established in the "1. Establishing Heroku Postgres Add-On" section above.
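If the restore takes a few minutes, its progress can be monitored from the CLI. A quick sketch, using a hypothetical app name:
# check the status of backups and restores for the app (app name is a placeholder)
heroku pg:backups -a fitness-api-app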
I was able to establish a connection to the Heroku Postgres database and make sure all of the expected data was there.
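One convenient way to perform that spot check is heroku pg:psql, which opens a psql session against the attached Heroku Postgres database (app name is hypothetical):
# open an interactive psql session against the Heroku Postgres instance
heroku pg:psql -a fitness-api-app
From there, the same \dt and row-count queries used against the local database can be repeated against the Heroku instance.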
5. Updating Spring Boot Service
With the database updated, the next step is to update the pom.xml in the Spring Boot RESTful service to remove this dependency:
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>
And then replace it with this:
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <scope>runtime</scope>
</dependency>
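Depending on how the service resolves its datasource settings, the connection configuration may also need to point at PostgreSQL. As a minimal sketch only (not necessarily how my project is wired), Heroku's Java buildpack derives JDBC-friendly environment variables from DATABASE_URL, which an application.properties file could reference:
# hypothetical application.properties sketch - Heroku's Java buildpack supplies these variables at runtime
spring.datasource.url=${JDBC_DATABASE_URL}
spring.datasource.username=${JDBC_DATABASE_USERNAME}
spring.datasource.password=${JDBC_DATABASE_PASSWORD}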
Starting the Spring Boot service resulted in no errors, and the converted data was served without any issues.
The final step in the process is to disable maintenance mode for the fitness client and service. This can be completed using the following Heroku CLI command for each application:
heroku maintenance:off
Conclusion
This article demonstrated how easy it is to migrate an existing MySQL database to Heroku Postgres. Of course, I highly recommend executing and validating these steps in a non-production environment first. While the tooling appears to be top-notch and focused on converting data without any issues, it is always nice to execute such tasks in an environment that will not impact customers.
For my scenario, the cost to convert from MySQL to Heroku Postgres was a non-issue, since the price difference was only a few cents per month. With the Heroku Postgres implementation in place, it is also possible for me to migrate my mother-in-law's application away from its free version of MySQL and onto the same Heroku Postgres instance.
For that same price, I take comfort in knowing I have a database focused on meeting the challenging demands ahead as my fitness application journey continues with a SaaS solution running in Heroku.
Have a really great day!