
DaVinci Resolve PostgreSQL Workflow Tools

Effortlessly set up automatic backups and automatic optimizations of DaVinci Resolve's PostgreSQL databases

Here are some workflow tools for macOS and Linux systems that run as PostgreSQL servers for DaVinci Resolve.

This repository includes:

  • For macOS Ventura:
    • A bash script that will let you effortlessly create, load, and start launchd daemons that automatically back up and optimize your PostgreSQL 13 databases
    • A bash script to uninstall the above tools
  • For Red Hat Enterprise Linux 9:
    • A bash script that will let you effortlessly create and start systemd units and timers that automatically back up and optimize your PostgreSQL 13 databases
    • A bash script to uninstall the above tools

How to use on macOS

Download the macos-install.sh file and execute the script with sudo permissions:

sudo sh macos-install.sh

The script will then:

  1. Prompt you for the name of your PostgreSQL database;
  2. Prompt you for the path of the folder where your backups will go;
  3. Prompt you for how often you want to back the database up, in seconds; and
  4. Prompt you for how often you want to optimize the database, in seconds.
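
The installer expects these intervals as plain integers in seconds. As a quick reference (the variable names here are just for illustration):

```shell
# Common scheduling intervals expressed in seconds
DAILY=$((24 * 60 * 60))        # back up once a day
WEEKLY=$((7 * 24 * 60 * 60))   # optimize once a week
echo "daily=${DAILY} weekly=${WEEKLY}"
```

Entering 86400 at the backup prompt, for example, schedules one backup per day.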

Once you run through this script, you will be automatically backing up and optimizing your database according to whatever parameters you entered.

The script creates macOS launchd daemons, so these automatic backups and automatic database optimizations will continue on schedule, even after the system is rebooted. It's neither necessary nor desirable to run the script more than once per individual Resolve database.
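
The generated daemons are ordinary launchd property lists, typically installed under /Library/LaunchDaemons. As a rough sketch of what one might contain (the label, script path, and interval below are illustrative, not the exact values the script writes), a daemon scheduled on a rolling interval uses the StartInterval key:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique identifier for the daemon; name is illustrative -->
    <key>Label</key>
    <string>com.example.backup-mydb</string>
    <!-- The backup script the daemon runs; path is illustrative -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/DaVinci-Resolve-PostgreSQL-Workflow-Tools/backup-mydb.sh</string>
    </array>
    <!-- Run every 86400 seconds, i.e. once a day -->
    <key>StartInterval</key>
    <integer>86400</integer>
</dict>
</plist>
```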

To verify that everything is in working order, you can periodically check the log files located in /Users/Shared/DaVinci-Resolve-PostgreSQL-Workflow-Tools/logs/.

zsh vs. bash

macOS Ventura's default shell is zsh. However, these scripts' shebangs still specify bash, which Apple has continued to include since the default shell switched to zsh back in macOS Catalina. The scripts do not use any incompatible word splitting or array indexing, so they should be easy to convert to native zsh in future releases. For more information, see Scripting OS X.

How to use on Red Hat Enterprise Linux

From an administrative user account, download the enterprise-linux-install.sh file and then execute the script:

sudo sh enterprise-linux-install.sh

The script will then:

  1. Prompt you for the name of your PostgreSQL database;
  2. Prompt you for the path of the folder where your backups will go;
    • Be sure to use the absolute path
  3. Prompt you for how often you want to back the database up; and
  4. Prompt you for how often you want to optimize the database.

Once you run through this script, you will be automatically backing up and optimizing your database according to whatever parameters you entered.

The script creates systemd units and timers, so these automatic backups and automatic database optimizations will continue on schedule, even after the system is rebooted. It's neither necessary nor desirable to run the script more than once per individual Resolve database.

To verify that everything is in working order, you can periodically check the log files located in /usr/local/DaVinci-Resolve-PostgreSQL-Workflow-Tools/logs.

System requirements

This script has been tested and works for PostgreSQL 13 servers for:

  • DaVinci Resolve 18

macOS

  • macOS Ventura
  • EnterpriseDB PostgreSQL 13, as included from Blackmagic Design's DaVinci Resolve Project Server app

Red Hat Enterprise Linux

  • Red Hat Enterprise Linux 7
  • Red Hat Enterprise Linux 8
  • Red Hat Enterprise Linux 9
  • PostgreSQL 13 from the PostgreSQL Yum Repository

DaVinci Resolve 18 terminology for "Project Library"

Beginning with DaVinci Resolve 18, the Project Manager window, as well as the Project Server GUI app, refers to "project libraries." These are just individual PostgreSQL databases, which previous versions of DaVinci Resolve called "databases." These scripts refer to the names of the "databases" you want to back up and optimize.

Background

Jathavan Sriram wrote a great article back in 2014 about how to use pgAdmin III tools in bash, instead of having to use the psql shell.

The core insights from his 2014 article still apply, but several crucial changes need to be made for modern systems:

  1. Apple deprecated cron in favor of launchd.
  2. From DaVinci Resolve 12.5.4 through 17, DaVinci Resolve used PostgreSQL 9.5. For DaVinci Resolve 18 and onward, PostgreSQL 13 is recommended.
  3. The locations of the pg_dump, reindexdb, and vacuumdb binaries in PostgreSQL 13 are different from what they were in PostgreSQL 8.4 and 9.5.
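
For reference, these are the usual default locations of the PostgreSQL 13 client utilities on the two supported platforms (verify the paths on your own system, since custom installs may differ):

```shell
# Usual PostgreSQL 13 binary directories (platform installer defaults)
echo "macOS (EnterpriseDB): /Library/PostgreSQL/13/bin"
echo "RHEL (PGDG packages): /usr/pgsql-13/bin"
```

pg_dump, reindexdb, and vacuumdb all live alongside each other in these directories.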

What this script does

On macOS, this script creates and installs bash scripts and launchd daemons that, together, regularly and automatically back up and optimize the PostgreSQL databases that DaVinci Resolve uses.

On Red Hat Enterprise Linux, this script creates and installs bash scripts, systemd units, and systemd timers that, together, regularly and automatically back up and optimize the PostgreSQL databases that DaVinci Resolve uses. After a reboot, each systemd timer will be delayed by a random number of seconds, up to 180 seconds, so as to stagger the database utilities for optimal performance.
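
As a sketch, a generated timer might look something like the following. The unit name and interval values are illustrative, and RandomizedDelaySec is one way systemd expresses a random stagger of up to 180 seconds:

```ini
# Hypothetical /etc/systemd/system/backup-mydb.timer
[Unit]
Description=Timer for backing up the mydb PostgreSQL database

[Timer]
# Fire on a rolling interval after the service last ran (value illustrative)
OnUnitActiveSec=86400
# Also fire shortly after boot so a reboot doesn't skip a cycle
OnBootSec=5min
# Add a random delay of up to 180 seconds to stagger the utilities
RandomizedDelaySec=180

[Install]
WantedBy=timers.target
```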

Configuration

macOS

The .pgpass file that the script creates assumes that the password for your PostgreSQL database is DaVinci, which is a convention from Blackmagic Design.
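
The .pgpass format is one line per connection, hostname:port:database:username:password, so the entry the script writes presumably looks something like this (the database name mydb is a placeholder):

```
localhost:5432:mydb:postgres:DaVinci
```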

Make sure that you create the directory where your backups are going to go before running the script.

If you have any spaces in the full path of the directory where your backups are going, be sure to escape them with \ when you run the script.
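
A backslash before each space keeps the path as a single word when it reaches the script. A quick illustration with a hypothetical path:

```shell
# The backslash escapes the space, so the shell sees one argument
printf '%s\n' /Users/Shared/Backup\ Folder
```

This prints /Users/Shared/Backup Folder as one path.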

The script can be run from any admin user so long as it's run with sudo so as to have root user permissions.

Because the script generates launchd daemons, the backups and optimizations will occur if the machine is running, even without any user being logged in.

Red Hat Enterprise Linux

The .pgpass file that the script creates assumes that the password for your PostgreSQL database is DaVinci, which is a convention from Blackmagic Design.

Make sure that you create the directory where your backups are going to go before running the script.

Be sure to use the absolute path for the directory into which the backups will go.

The script can be run from any admin user so long as it's run with sudo so as to have root user permissions.

Restoring from backup

The *.backup files that this script generates can be restored into a new, totally blank PostgreSQL database in the event of a disk failure. These *.backup files are also handy even just to migrate entire databases from one PostgreSQL server to another.

In the event of a disk failure hosting the PostgreSQL database, the procedure to restore from these *.backup files to a new PostgreSQL server is as follows:

  1. Set up a new, totally fresh PostgreSQL server

  2. Create a fresh PostgreSQL database on the server, naming your database whatever you want it to be named

    1. If the version of Resolve you're using is the same version you were using when the *.backup file was created, you can just connect your client workstation and create a new blank database via the GUI;
    2. But if your *.backup file was created for some earlier version of Resolve, you'll need to become the postgres user with root permissions and create a completely blank database:
      $ sudo su - postgres
      $ createdb <newdatabasename>
      
  3. Run the command:

    $ pg_restore --host localhost --username postgres --password --single-transaction --clean --if-exists --dbname=<dbname> <full path to your backup file>
    

    You'll need to enter the password for the postgres user. This is the password for the PostgreSQL database user postgres, not the OS user.

  4. If the version of Resolve you're using is the same version you were using when the *.backup file was created, you should be good to go. But if your *.backup file was created for an earlier version of Resolve, you should now be able to connect to the database via the GUI on the client and then upgrade it for your current version.

Uninstall

Uninstall on macOS

If you wish to stop automatically backing up and optimizing a particular database, you can run macos-uninstall.sh:

sudo sh macos-uninstall.sh

The script will ask you what database you want to stop backing up and optimizing. The database you specify will then stop being backed up, stop being optimized, and all relevant files will be safely and cleanly removed from your system. The database itself, as well as the backup files that have already been generated, will remain untouched.

Uninstall on Red Hat Enterprise Linux

If you wish to stop automatically backing up and optimizing a particular database, you can run enterprise-linux-uninstall.sh:

sudo sh enterprise-linux-uninstall.sh

The script will ask you what database you want to stop backing up and optimizing. The database you specify will then stop being backed up, stop being optimized, and all relevant files will be safely and cleanly removed from your system. The database itself, as well as the backup files that have already been generated, will remain untouched.

davinci-resolve-postgresql-workflow-tools's People

Contributors

sethgoldin and walterfilms

davinci-resolve-postgresql-workflow-tools's Issues

Error message concerning .pgpass on Linux

On Linux, upon running the centos-automate-workflow.sh script with sudo, the script seems to check for .pgpass in the wrong spot, in /home/root/, and throws the following error:

touch: cannot touch ‘/home/root/.pgpass’: No such file or directory
./centos-automate-workflow.sh: line 79: /home/root/.pgpass: No such file or directory
chmod: cannot access ‘/home/root/.pgpass’: No such file or directory

It might be unnecessary though, since the pg_hba.conf file is already set up to just use trust authentication.

Will see about completely eliminating the step for creating a .pgpass file.

Feature Request: Support Initial Start Time For Recurring Backup/Optimize Tasks

I would love to see a change to the installer that allows specifying the time at which the recurring tasks begin. In my use case, I want a daily backup. However, since I would install it during work hours, the interval would commence from that point, if my understanding is correct, meaning our DB will be locked for a good while while the dump is underway once the 24-hour window cycles. I would like this to be configurable so the job can commence at night, where it won't affect availability or latency.

I'm open to alternatives, as long as they get the same result. I could do it with a cronjob, but I'd rather use your solution if it can support it.
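
If the recurring jobs are driven by launchd's rolling StartInterval key, the count does start from when the daemon is loaded. launchd also supports fixed wall-clock scheduling through the StartCalendarInterval key; a plist fragment like this (hour and minute illustrative) would fire at 02:00 every night:

```xml
<key>StartCalendarInterval</key>
<dict>
    <key>Hour</key>
    <integer>2</integer>
    <key>Minute</key>
    <integer>0</integer>
</dict>
```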

0 bytes

For some reason, when the launchd daemon runs, the backup has 0 bytes. If I run the backup .sh manually, it is a normal size. The log says that it ran successfully. Any ideas?

Some sort of permissions issue on Big Sur is preventing the agents from having permission to run the backup and optimize shell scripts

When running the .sh scripts manually, not through the LaunchAgent, I see:
pg_dump: [custom archiver] could not open output file "/Users/testuser/Desktop//test1_2021_03_10_11_24.backup": Permission denied

Giving Terminal Full Disk Access doesn't resolve the issue.

This is on macOS Big Sur 11.2.2.

The .backup file that gets generated is 0 bytes, and the logs aren't getting touched.

Ubuntu Support

Hi Seth,

this is great, thanks! I have my Postgres DB installed on ubuntu instead of centOS, will your script work on Ubuntu?

Scheduled backup time seems to concatenate backup process time as backups go on...

I have been using this script to do nightly Resolve DB backups for the last month.

Thanks so much for creating it!

All is working well, but I can see in the logs and files that the backup is slowly slipping forward in time as the backups go on... This seems like something worth flagging if anyone plans to use this long term.

Please see my log:

ResolveDB_23_Q2 was backed up at 2023_10_02_23_05 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was optimized at 2023_10_02_23_06.
ResolveDB_23_Q2 was backed up at 2023_10_03_23_09 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_04_23_14 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_05_23_18 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_06_23_22 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_07_23_26 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_08_23_30 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was optimized at 2023_10_09_23_06.
ResolveDB_23_Q2 was backed up at 2023_10_09_23_34 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_10_23_38 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_11_23_42 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_12_23_46 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_13_23_50 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_14_23_54 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_15_23_59 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was optimized at 2023_10_16_23_06.
ResolveDB_23_Q2 was backed up at 2023_10_17_00_03 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_18_00_07 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_19_00_11 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_20_00_15 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_21_00_20 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_22_00_24 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_23_00_28 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was optimized at 2023_10_23_23_06.
ResolveDB_23_Q2 was backed up at 2023_10_24_00_33 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_25_00_37 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_26_00_41 into /Users/bkup/Desktop/Resolve-database-backups.
ResolveDB_23_Q2 was backed up at 2023_10_27_00_46 into /Users/bkup/Desktop/Resolve-database-backups.

We started the first backup at 23:05 and now we are at 00:46 on backup number 25, so it looks like each backup takes about 5 minutes, and then the counter starts again, essentially adding the time the backup takes to the 24-hour timer.

I worry that if this script is left unattended, it will ultimately spill into the working day and could cause DB slowdowns or interfere with archive/backup software, etc.

Fallback destination directory

From @macvfx, a suggestion to check to make sure the folder exists:

#!/bin/bash

# The backup destination directory to check
backup_dir="/path/to/folder"

# Check that the directory exists and that we have write permissions on it
if [ -d "$backup_dir" ] && [ -w "$backup_dir" ]; then
  echo "The directory exists and is writable. Do the regular backup step here."
else
  echo "The directory either does not exist or is not writable. Do the fail-safe backup."
fi

Change time of backup

I installed the script early in the morning, to run every 24 hours, but every day it seems to start a little later, and it now runs in the middle of the day, causing issues for people using the db. How do I adjust the time?

Full disk access is required if the destination for the `.backup` files is set to a remote network share like NFS or SMB

A couple of users noticed that if a remote network share is specified as the destination directory for the .backup files, like an NFS or SMB share, then the launchd daemon won't have permission to execute pg_dump.

A couple of similar cases on Stack Exchange:

So a line of documentation should be added to README.md to let users know to grant the /Library/PostgreSQL/13/bin/pg_dump binary Full Disk Access if they intend to save to a remote network share.

pg_restore error when restoring from another computer

Hello

I'm trying to restore a database backup on another computer, with the following command:

/usr/local/Cellar/postgresql@13/13.10/bin/pg_restore --host localhost --username postgres --password --single-transaction --clean --if-exists --dbname=test_restore_resolve /Users/me/Desktop/resolve_2023_02_21_09_00_From_Rocky.backup

I end up with the following error in the Terminal:

pg_restore: while INITIALIZING: pg_restore: error: could not execute query: ERROR: unrecognized configuration parameter "idle_in_transaction_session_timeout"
Command was: SET idle_in_transaction_session_timeout = 0;

Do you know if (and where and how) this is a parameter I need to set on the initial server (the one that hosts the database and does the backup) or on the client I'm trying to restore on?

illegal time format

We've tested this on two different workstations with the Resolve DB server installed, and we are getting an "illegal time format" error when running sudo sh macos-install.sh.

Any idea why?

systemd timers don't have permissions to start after a reboot on CentOS

This seems to be some sort of permissions issue. Running sudo systemctl start backup-<database>.timer and sudo systemctl start optimize-<database>.timer, and then authorizing with a password after each, seems to get everything working properly again.

The timers somehow need to be given permission to start after a reboot.

Zero-byte backups

I have installed the script on two different servers.

One is working perfectly fine (macOS with DaVinci 15).
The other only creates zero-byte backups (macOS with DaVinci 16).

I can't find the reason why.
The password and db are ok.
I checked .pgpass.
I added "trust" in the .conf.

Is there also a way to back up multiple databases?

Cheers!

-BP
