A repo that can set up all of my applications (presently AwardIt, my personal site, MissingLink, StarterTab, and Umami) in the cloud, keeping in mind the 'cattle, not pets' concept
- kubernetes
- packer
- terraform
| application | frontend | backend | database |
|---|---|---|---|
| startertab | nextjs on vercel | nextjs api routes on vercel | postgres on neon |
| awardit | react on aws s3 | ubuntu vm on digital ocean | postgres on digital ocean |
| missinglink | react on vercel | nextjs api routes on vercel | postgres on digital ocean |
| personal site | nextjs on vercel | n/a | n/a |
- non-prod environments for all sites
- taking backups for my dbs into an s3 bucket (configured and stood up with Terraform)
- I should be able to bootstrap all my websites from these scripts with minimal click-ops (exceptions for things like dns)
current monthly costs:
- $20 for Vercel
- $6 for Digital Ocean
- $2.50 for AWS
Can I get this down with containerization and free tiers?
- sets up an ubuntu image using packer
- configures that image with ansible
- deploys that image, among other things, using terraform
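End to end, that flow is roughly the following; the template name ubuntu.pkr.hcl, the terraform/ directory, and the env variable are assumptions about this repo's layout, not confirmed names:

```bash
# hypothetical layout: an ubuntu.pkr.hcl template and a terraform/ directory
packer init ubuntu.pkr.hcl       # install the plugins the template needs
packer build ubuntu.pkr.hcl      # bake the image; the ansible provisioner runs inside this step
terraform -chdir=terraform init
terraform -chdir=terraform apply -var="env=nonprod"   # stand the image up, plus firewall/dns/etc.
```

Baking ansible into the packer step keeps the droplets disposable: any change means a new image, not a patched pet.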
-
next actions
- only push up the DB backups if it's PROD; don't install the backup scripts on a non-prod build (guard sketch below)
  - the backup cron entry should be deleted out of crontab on non-prod, but currently isn't
- start getting Cloudflare working for the StarterTab front end
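A minimal sketch of that guard, assuming the provisioner exports a DEPLOY_ENV variable and the backup job is a pg_backup.sh script (both names hypothetical):

```bash
# install the backup cron entry only on prod; on non-prod, scrub any stale entry
if [ "${DEPLOY_ENV:-nonprod}" = "prod" ]; then
  crontab -l 2>/dev/null | grep -q 'pg_backup.sh' || \
    ( crontab -l 2>/dev/null; echo '0 3 * * * /usr/local/bin/pg_backup.sh' ) | crontab -
else
  # this is the "should be deleted out of crontab, isn't" fix
  crontab -l 2>/dev/null | grep -v 'pg_backup.sh' | crontab -
fi
```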
-
move from Vercel to Cloudflare Pages for startertab
- get a baseline of speed for comparison (timing sketch below)
- build out a non-prod env in Cloudflare Pages using terraform
- swap out prod
- Cloudflare link
- Terraform link
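For the speed baseline, raw curl timings against the current deployment are probably enough for a before/after comparison (URL assumed):

```bash
# ten requests; compare TTFB and total time before and after the move
for i in $(seq 1 10); do
  curl -so /dev/null -w 'ttfb=%{time_starttransfer}s total=%{time_total}s\n' https://startertab.com
done
```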
-
use ansible to set up my ubuntu image with the following
- firewall rules
  - hbfw (ssh and https, nothing else)
- nginx
- nodejs and .net installs
- redis
- configuring the swap (common)
- systemd units for missinglink & awardit
- cron job to call the missinglink backend
- s3 bucket backups for the databases (something I need to get going on the extant VM; script sketch after this list)
  - procure an s3 api key
  - use terraform to build an s3 bucket
  - get a script working that takes a postgres backup to a directory
    - need to create a new disk volume; the current 20gb one is running out of space (backups need 5gb now, more over time...)
  - use a cron job to take the backup and push it up to s3 daily
  - include those scripts (one to pull a backup down once, one to keep pushing backups up in the future)
- postgres setup
  - install the db
  - accounts
  - edit the PostgreSQL pg_hba.conf file so that it allows logins from localhost, so I can take backups (sketch after this list)
  - restore the databases (comes from s3; have a command-line flag or variable to disable this so that it doesn't use too much bandwidth)
  - firewall rules
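As flagged above, a sketch of the backup-and-push script; the bucket name, database names, and backup directory are all assumptions:

```bash
#!/usr/bin/env bash
# pg_backup.sh (hypothetical name): dump each db, compress, push to the terraform-built bucket
set -euo pipefail

BUCKET="s3://my-db-backups"   # assumption: the bucket terraform stands up
BACKUP_DIR="/mnt/backups"     # on the new volume, since the root disk is nearly full
STAMP=$(date +%F)

for DB in missinglink awardit; do
  # assumes a ~/.pgpass entry (or peer auth) for the postgres role
  pg_dump -U postgres "$DB" | gzip > "${BACKUP_DIR}/${DB}-${STAMP}.sql.gz"
  aws s3 cp "${BACKUP_DIR}/${DB}-${STAMP}.sql.gz" "${BUCKET}/${DB}/"
done
```

Driven by a daily cron entry like `0 3 * * * /usr/local/bin/pg_backup.sh`, with the s3 api key already configured for the aws cli.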
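And for the pg_hba.conf step, one option is appending an md5 rule for localhost and reloading; the version in the path depends on the install:

```bash
# allow password logins from 127.0.0.1 so pg_dump can connect locally
echo 'host all all 127.0.0.1/32 md5' | sudo tee -a /etc/postgresql/16/main/pg_hba.conf
sudo systemctl reload postgresql
```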
-
how do I do DNS to set this up as a non-prod environment?
-
use terraform for the following
- standing up the image
- get the IP showing after creation (output/dig sketch below)
- nbfw: create a new rule and attach it
- edit aws route53 to point the non-prod dns entry to the host's IP, using provisioners
- certificates (letsencrypt)
  - generate letsencrypt certs (sudo certbot --nginx; sketch below)
- set up nginx for non-prod
- set up dns records for non-prod using route 53
- set up certificates
- get this all working with non-prod front ends
  - set up startertab with cloudflare?
  - how do I manage the non-prod envs linking in with Vercel? I need to get my backend urls from env variables
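A sketch of the IP/DNS checks above, assuming a terraform output named droplet_ip and a non-prod hostname (both hypothetical):

```bash
IP=$(terraform -chdir=terraform output -raw droplet_ip)
echo "droplet is at ${IP}"
# once the route53 A record exists, this should print the same address
dig +short nonprod.example.com
```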
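And the certificate step once DNS resolves; domain and email are placeholders:

```bash
# certbot rewrites the nginx server blocks in place and installs a renewal timer
sudo certbot --nginx -d nonprod.example.com --non-interactive --agree-tos -m admin@example.com
sudo certbot renew --dry-run   # confirm renewal will actually work
```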
-
long term todos/clean ups
- build a diagram to show what these scripts are doing, or just a list
- clean up this README as a piece of documentation
- I need to know if the database backups into s3 fail somehow; monitoring scripts? (freshness-check sketch below)
- hosting and deployment of the awardit front end
- hosting and deployment of allistergrange.com front end
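For the backup-failure item in the list above, a crude freshness check could run from cron and email when the newest object is stale (bucket, prefix, and address are assumptions; mail needs mailutils):

```bash
#!/usr/bin/env bash
# alert if the newest backup in the prefix is older than ~25 hours
LATEST=$(aws s3 ls s3://my-db-backups/missinglink/ | sort | tail -n 1 | awk '{print $1" "$2}')
if [ -z "$LATEST" ] || [ $(( $(date +%s) - $(date -d "$LATEST" +%s) )) -gt 90000 ]; then
  echo "latest backup: ${LATEST:-none}" | mail -s 'db backup looks stale' admin@example.com
fi
```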