If you're reading this, you probably got a surprisingly steep bill from Neon after finding yourself on their "Scale" plan. If you do want to stay with Neon but avoid surprise bills, go to the Plans page and choose the plan you actually want.
If you're not using Neon's branching features, my two favorite options for alternative Postgres hosts are:
1) Render.com currently offers Postgres databases for $7 a month. The $7 instance is pretty weak in terms of RAM and CPU, and their prices get pretty unreasonable beyond that tier. However, it's a quick setup and a cheaper alternative to Neon.
2) Self host with Digital Ocean + Docker
Note that this setup is not recommended for production data!
First, set up a droplet running Ubuntu (version 24.04). The $6 "Regular" 1GB/1CPU box is plenty for what I need.
Once you have access to the machine, you'll need to install Docker. Just follow the steps here: https://docs.docker.com/engine/install/ubuntu/
Then create three files. The first is the Dockerfile. Modify the list of extensions to match what you need; this one includes pgvector, pg_cron, and PostGIS.
# Use the official PostgreSQL image as the base image
FROM postgres:15

# Install the extensions and clean up the apt cache in a single
# layer, so the cleanup actually reduces the image size
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        postgresql-contrib \
        postgresql-15-pgvector \
        postgresql-15-cron \
        postgresql-15-postgis \
    && rm -rf /var/lib/apt/lists/*

# Expose the PostgreSQL port
EXPOSE 5432
Next, create docker-compose.yml:
version: '3'
services:
  postgres:
    env_file:
      - .env
    image: selfhosted_postgres:latest
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5432:5432"
    volumes:
      - ./data/db:/var/lib/postgresql/data
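Optionally, you can add a healthcheck under the postgres service so Docker can report when the database is actually ready to accept connections (pg_isready ships with the official Postgres image):

```yaml
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5
```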
Finally, create .env (don't use the same password I have here, of course):
POSTGRES_USER=postgres
POSTGRES_PASSWORD=PutASecurePasswordHere
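A quick way to generate a strong password for the .env file (openssl comes preinstalled on Ubuntu):

```shell
# 24 random bytes, base64-encoded -> a 32-character password
openssl rand -base64 24
```

Paste the output in as the value of POSTGRES_PASSWORD.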
Then, start up the server with
docker compose up -d
Get the ID of the container:
docker ps
Next, shell into the postgres container (the ID for mine was 98e6fe75cd4d) and access the database:
docker exec -it 98e6fe75cd4d bash
psql -h localhost -p 5432 -U postgres
Replace "postgres" in the psql command if you used something custom for POSTGRES_USER in the .env file.
Then use SQL commands like these to set up a database:
create database el_cheapo;
create user stingy with encrypted password 'PASSWORD';
grant all privileges on database el_cheapo to stingy;
-- On Postgres 15+ the public schema is no longer writable by every
-- user, so also make the new user the owner of the database:
alter database el_cheapo owner to stingy;
ALTER ROLE stingy SUPERUSER;
\q
In the example commands above, "el_cheapo" is the database name, "stingy" is the database username, and "PASSWORD" is the user's password. Avoid the SUPERUSER command if you can; I only used it to avoid errors from "CREATE EXTENSION" while importing data.
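If you'd rather not grant SUPERUSER at all, one alternative (a sketch, assuming the extensions from the Dockerfile above) is to pre-create the extensions yourself as the postgres superuser, so the import's CREATE EXTENSION statements become no-ops:

```sql
\c el_cheapo
CREATE EXTENSION IF NOT EXISTS vector;
CREATE EXTENSION IF NOT EXISTS postgis;
-- Note: pg_cron additionally requires shared_preload_libraries = 'pg_cron'
-- in postgresql.conf before CREATE EXTENSION pg_cron will work
```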
Finally, copy the data from Neon into your database. Run this in the same shell that you opened inside the postgres container, and adjust the psql connection string to match the credentials you used when creating the database.
pg_dump NEON_CONNECTION_STRING --no-owner | psql postgresql://stingy:PASSWORD@localhost:5432/el_cheapo
Get NEON_CONNECTION_STRING from your dashboard in Neon.
You may see a few errors like this; I believe they are safe to ignore: ERROR: role "neon_superuser" does not exist
Once the data import is done, you can connect to the new instance, verify that all the data was copied over correctly, and then delete your Neon databases. Don't forget to delete your account or downgrade your plan to Free, or you may keep getting charged even with no databases!
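For the verification step, one quick spot-check is to run the same query against both the Neon database and the new one and compare the output:

```sql
-- The lists should match between the two databases
SELECT table_name FROM information_schema.tables
WHERE table_schema = 'public' ORDER BY table_name;
```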
Note that, per docker-compose.yml, the database files will be stored in a data/db directory next to the compose file.
Use this command to stop the database (run in same folder as docker-compose.yml):
docker compose down
If you want to automatically back up the data, then I recommend just turning on snapshots for the droplet.
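Droplet snapshots capture the whole disk; if you also want SQL-level dumps, a nightly pg_dump from cron is a simple sketch (the /root paths here are hypothetical — point them at wherever docker-compose.yml lives, and create the backups directory first):

```
0 3 * * * docker compose -f /root/docker-compose.yml exec -T postgres pg_dump -U postgres el_cheapo | gzip > /root/backups/el_cheapo_$(date +\%F).sql.gz
```

The -T flag disables TTY allocation so the command works without an interactive terminal, and the % in the date format must be escaped with a backslash inside crontab.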
Finally, sit back, relax, and enjoy the peace of knowing you won't get a stupidly high bill for your development databases.