craft-scripts

Shell scripts to manage database backups, asset backups, file permissions, asset syncing, cache clearing, and database syncing between Craft CMS environments

Overview

There are several scripts included in craft-scripts, each of which performs a different function. They all use a shared .env.sh file to function. This .env.sh should be created in each environment where you wish to run craft-scripts, and it should be excluded from your git repo via .gitignore.

Craft-Scripts works with both Craft 2.x & Craft 3.x, and has built-in support for both mysql and postgres databases.

Installation

  • Copy the scripts folder into the root directory of your Craft CMS project
  • Duplicate the example.env.sh file, and rename it to .env.sh
  • Add .env.sh to your .gitignore file
  • Then open the .env.sh file in your favorite editor, and replace REPLACE_ME with the appropriate settings.
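For example, a filled-in .env.sh might start like this (all values below are illustrative placeholders, not recommendations; the variable names are documented in the Settings sections later in this README):

```shell
# Illustrative .env.sh values; replace with your own settings
GLOBAL_CRAFT_PATH="craft/"
GLOBAL_DB_TABLE_PREFIX="craft_"
LOCAL_ROOT_PATH="/home/forge/example.com/"
LOCAL_DB_NAME="examplecraft"
LOCAL_DB_USER="craftuser"
LOCAL_DB_PASSWORD="REPLACE_ME"
```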

Upgrading

To upgrade to a later version of Craft-Scripts, replace the contents of your scripts folder with the newest Craft-Scripts, while preserving your existing .env.sh file.

Craft-Scripts comes with defaults so that even older .env.sh files should work with the latest Craft-Scripts.

set_perms.sh

The set_perms.sh script sets the Craft CMS install file permissions in a strict manner, to assist in hardening Craft CMS installs.

See Hardening Craft CMS Permissions for a detailed writeup.

Note: if you use git, please see the Permissions and Git section below.

clear_caches.sh

The clear_caches.sh script clears the Craft CMS caches by removing all of the craft/storage/runtime/ cache dirs, as well as emptying the craft_templatecaches db table.

It can also clear Redis db caches if LOCAL_REDIS_DB_ID is set, and it can clear FastCGI Cache if LOCAL_FASTCGI_CACHE_DIR is set.

If you want to add this to your Forge / DeployBot / Buddy.works deploy script so that caches are auto-cleared on deploy, set up the .env.sh on your remote server(s) and then add this to your deploy script:

scripts/clear_caches.sh

The above assumes that the current working directory is the project root already.
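For example, a minimal Forge-style deploy script incorporating the cache clear might look like this (the site path and branch are illustrative, not part of craft-scripts):

```shell
# Illustrative deploy script; adjust the path and branch for your site
cd /home/forge/example.com
git pull origin master
composer install --no-interaction --prefer-dist --optimize-autoloader

# Clear Craft's caches now that the new code is deployed
./scripts/clear_caches.sh
```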

pull_db.sh

The pull_db.sh script pulls down a database dump from a remote server, and then dumps it into your local database. It backs up your local database before doing the dump.

The db dumps that craft-scripts creates exclude temporary/cache tables that we don't want in our backups/restores, such as the templatecaches table.

See Database & Asset Syncing Between Environments in Craft CMS for a detailed writeup.

N.B.: The pull_db.sh script can be used even if the local and remote are on the same server.

pull_assets.sh

The pull_assets.sh script pulls down an arbitrary number of asset directories from a remote server, since we keep client-uploadable assets out of the git repo. The directories it will pull down are specified in LOCAL_ASSETS_DIRS.

It will also pull down the Craft userphotos and rebrand directories from craft/storage by default. The directories it will pull down are specified in LOCAL_CRAFT_FILE_DIRS.

See Database & Asset Syncing Between Environments in Craft CMS for a detailed writeup.

N.B.: The pull_assets.sh script can be used even if the local and remote are on the same server.

pull_backups.sh

The pull_backups.sh script pulls down the backups created by craft-scripts on a remote server, and syncs them into LOCAL_BACKUPS_PATH.

For database backups, a sub-directory REMOTE_DB_NAME/db inside the REMOTE_BACKUPS_PATH directory is used for the database backups.

For asset backups, a sub-directory REMOTE_DB_NAME/assets inside the REMOTE_BACKUPS_PATH directory is used for the asset backups.

Because rsync is used for these backups, you can put a .rsync-filter in any directory to define files/folders to ignore. More info

See Mitigating Disaster via Website Backups for a detailed writeup.

sync_backups_to_s3.sh

The sync_backups_to_s3.sh script syncs the backups from LOCAL_BACKUPS_PATH to the Amazon S3 bucket specified in REMOTE_S3_BUCKET.

If you have defined the optional REMOTE_S3_PATH, the backups will be synced to that path within the bucket.

This script assumes that you have already installed awscli and have configured it with your credentials.

It's recommended that you set up a separate user with access to only S3, and set up a private S3 bucket for your backups.

You can set LOCAL_AWS_PROFILE to determine which AWS profile to connect with.

See Mitigating Disaster via Website Backups for a detailed writeup.

backup_db.sh

The backup_db.sh script backs up the local database into a timestamped, gzip-compressed archive in the directory set via LOCAL_BACKUPS_PATH. It will also automatically rotate out (delete) any backups that are older than GLOBAL_DB_BACKUPS_MAX_AGE days.

The database backups exclude temporary/cache tables, and are stored in the sub-directory LOCAL_DB_NAME/db, inside of LOCAL_BACKUPS_PATH.

The numbers at the end of the backup archive are a timestamp in the format of YYYYMMDD-HHMMSS.
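The naming scheme can be sketched like this (the database name is illustrative, and the exact filename pattern backup_db.sh uses may differ slightly):

```shell
#!/bin/sh
# Sketch of how a timestamped backup archive name is built
LOCAL_DB_NAME="examplecraft"             # illustrative value from .env.sh
TIMESTAMP=$(date "+%Y%m%d-%H%M%S")       # YYYYMMDD-HHMMSS
BACKUP_FILE="${LOCAL_DB_NAME}-db-backup-${TIMESTAMP}.sql.gz"
echo "$BACKUP_FILE"
```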

See the Automated Script Execution section below for details on how to run this automatically.

See Mitigating Disaster via Website Backups for a detailed writeup.

backup_assets.sh

The backup_assets.sh script backs up an arbitrary number of asset directories to the directory specified in LOCAL_BACKUPS_PATH. The directories it backs up are specified in LOCAL_ASSETS_DIRS, just as they were for the pull_assets.sh script.

It will also back up the Craft userphotos and rebrand directories from craft/storage by default. The directories it will back up are specified in LOCAL_CRAFT_FILE_DIRS.

Because rsync is used for these backups, you can put a .rsync-filter in any directory to define files/folders to ignore. More info

For example, if you don't want any Craft image transforms backed up, your .rsync-filter file in each assets directory might look like this:

# This file allows you to add filter rules to rsync, one per line, preceded by either
# `-` or `exclude` and then a pattern to exclude, or `+` or `include` and then a pattern
# to include. More info: http://askubuntu.com/questions/291322/how-to-exclude-files-in-rsync
- _*/**

See the Automated Script Execution section below for details on how to run this automatically.

See Mitigating Disaster via Website Backups for a detailed writeup.

backup_dirs.sh

The backup_dirs.sh script backs up an arbitrary number of directories to the directory specified in LOCAL_BACKUPS_PATH. The directories it backs up are specified in LOCAL_DIRS_TO_BACKUP.

This script is provided in case you have other files outside of your project that need backing up. For example, you might have a separate wiki or directory of config files.

Because rsync is used for these backups, you can put a .rsync-filter in any directory to define files/folders to ignore. More info

For example, if you have a wiki with data/cache and data/tmp directories that you don't want backed up, your .rsync-filter file in the wiki directory might look like this:

# This file allows you to add filter rules to rsync, one per line, preceded by either
# `-` or `exclude` and then a pattern to exclude, or `+` or `include` and then a pattern
# to include. More info: http://askubuntu.com/questions/291322/how-to-exclude-files-in-rsync
- public/data/cache
- public/data/tmp

See the Automated Script Execution section below for details on how to run this automatically.

See Mitigating Disaster via Website Backups for a detailed writeup.

restore_db.sh

The restore_db.sh script restores the local database from the database dump passed in via command line argument. It backs up your local database before doing the restore.

You can pass in either a path to a .sql file or .gz file to restore_db.sh, and it will do the right thing based on the file type.
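The file-type dispatch can be sketched like this (db_restore_kind is a hypothetical helper for illustration, not the actual restore_db.sh source; the real script also backs up the local database first):

```shell
#!/bin/sh
# Hypothetical sketch of branching on the dump's file type
db_restore_kind() {
    case "$1" in
        *.gz)  echo "gzip" ;;      # gunzip first, then pipe into the database
        *.sql) echo "sql"  ;;      # plain dump, pipe straight in
        *)     echo "unknown" ;;   # anything else is rejected
    esac
}
db_restore_kind "examplecraft-db-backup-20240101-120000.sql.gz"
```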

See Mitigating Disaster via Website Backups for a detailed writeup.

restore_assets.sh

The restore_assets.sh script restores the assets from a backup created with backup_assets.sh.

restore_dirs.sh

The restore_dirs.sh script restores the directories from a backup created with backup_dirs.sh.

Setting it up

  1. Download or clone the craft-scripts git repo
  2. Copy the scripts directory into the root directory of your Craft CMS project
  3. In the scripts directory, duplicate the craft2-example.env.sh (for Craft 2.x projects) or craft3-example.env.sh (for Craft 3.x projects) file, and rename it to .env.sh. These *-example.env.sh files are largely the same, just with some different defaults for Craft 2.x and Craft 3.x.
  4. Add .env.sh to your .gitignore file
  5. Then open the .env.sh file in your favorite editor, and replace REPLACE_ME with the appropriate settings.

All configuration is done in the .env.sh file, rather than in the scripts themselves. This is so that the same scripts can be used in multiple environments such as local dev, staging, and live production without modification. Just create a .env.sh file in each environment, and keep it out of your git repo via .gitignore.

Global Settings

All settings that are prefaced with GLOBAL_ apply to all environments.

GLOBAL_DB_TABLE_PREFIX is the Craft database table prefix, usually craft_

GLOBAL_CRAFT_PATH is the path of the craft folder, relative to the root path. This should normally be craft/, unless you have moved it elsewhere. Paths should always have a trailing /

GLOBAL_DB_BACKUPS_MAX_AGE is the maximum age of local backups in days; backups older than this will be automatically rotated out (removed).
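Age-based rotation like this is typically done with find; here is a runnable sketch using a throwaway directory (the real script's implementation may differ):

```shell
#!/bin/sh
# Sketch of age-based rotation: delete backups older than the max age
GLOBAL_DB_BACKUPS_MAX_AGE=90                       # days, as in .env.sh
BACKUP_DIR=$(mktemp -d)                            # stands in for your backups dir
touch -t 202001010000 "$BACKUP_DIR/old-backup-20200101-000000.sql.gz"
touch "$BACKUP_DIR/fresh-backup.sql.gz"
find "$BACKUP_DIR" -type f -name '*.sql.gz' \
    -mtime "+$GLOBAL_DB_BACKUPS_MAX_AGE" -delete
ls "$BACKUP_DIR"
```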

GLOBAL_DB_DRIVER is the database driver for this Craft install (mysql or pgsql)

Local Settings

All settings that are prefaced with LOCAL_ refer to the environment where the script is being run, which is not necessarily your local dev environment.

LOCAL_ROOT_PATH is the absolute path to the root of your local Craft install, with a trailing / after it.

LOCAL_ASSETS_PATH is the relative path to your local assets directories, with a trailing / after it.

LOCAL_CHOWN_USER is the user that is the owner of your entire Craft install.

LOCAL_CHOWN_GROUP is your webserver's group, usually either nginx or apache.

LOCAL_WRITEABLE_DIRS is a quoted list of directories relative to LOCAL_ROOT_PATH that should be writeable by your webserver.

LOCAL_ASSETS_DIRS is a quoted list of asset directories relative to LOCAL_ASSETS_PATH that you want to pull down from the remote server. It's done this way in case you wish to sync some asset directories, but not others. If you want to pull down all asset directories in LOCAL_ASSETS_PATH, just leave one blank quoted string in this array.
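For example (the directory names are illustrative):

```shell
# Pull down only these asset directories
LOCAL_ASSETS_DIRS=(
    "images"
    "documents"
)

# ...or pull down everything in LOCAL_ASSETS_PATH
LOCAL_ASSETS_DIRS=(
    ""
)
```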

LOCAL_CRAFT_FILE_DIRS is a quoted list of Craft file directories relative to LOCAL_CRAFT_FILES_PATH that you want to pull down from the remote server. By default, it will pull down the userphotos and rebrand directories in craft/storage, which typically are not kept in git. If you don't want it to sync anything, just leave the setting empty, e.g.: LOCAL_CRAFT_FILE_DIRS=()

LOCAL_DIRS_TO_BACKUP is an array of absolute paths to directories to back up, in addition to LOCAL_ASSETS_DIRS and LOCAL_CRAFT_FILE_DIRS.

LOCAL_FASTCGI_CACHE_DIR is the local FastCGI Cache path; leave it empty ("") if you're not using FastCGI Cache; paths should always have a trailing /. The clear_caches.sh script will delete everything in this directory when it is executed (say, on deploy)

LOCAL_REDIS_DB_ID is the local Redis database ID; leave it empty ("") if you're not using Redis. The clear_caches.sh script will purge this Redis database when it is executed (say, on deploy)

LOCAL_DB_NAME is the name of the local mysql Craft CMS database

LOCAL_DB_PASSWORD is the password for the local mysql Craft CMS database

LOCAL_DB_USER is the user for the local mysql Craft CMS database

LOCAL_DB_HOST is the host name of the local mysql database host. This is normally localhost

LOCAL_DB_PORT is the port number of the local mysql database host. This is normally 3306 for mysql, and 5432 for postgres.

LOCAL_MYSQL_CMD is the command for the local mysql executable, normally just mysql. It is provided because some setups like MAMP require a full path to a copy of mysql inside of the application bundle.

LOCAL_MYSQLDUMP_CMD is the command for the local mysqldump executable, normally just mysqldump. It is provided because some setups like MAMP require a full path to a copy of mysqldump inside of the application bundle.

LOCAL_PSQL_CMD is the command for the local postgres executable, normally just psql.

LOCAL_PG_DUMP_CMD is the command for the local pg_dump executable, normally just pg_dump.

LOCAL_DB_LOGIN_PATH if this is set, it will use --login-path= for your local db credentials instead of sending them in via the commandline (see below)

LOCAL_BACKUPS_PATH is the absolute path to the directory where local backups should be stored. For database backups, a sub-directory LOCAL_DB_NAME/db will be created inside the LOCAL_BACKUPS_PATH directory to store the database backups. Paths should always have a trailing /

LOCAL_AWS_PROFILE is an AWS named profile you can set to determine which profile to connect to S3 with.

Using mysql within a local docker container

LOCAL_MYSQL_CMD, which is normally just mysql, should be prefixed with docker exec -i CONTAINER_NAME so that the command executes within the container (example: docker exec -i container_mysql_1 mysql).

LOCAL_MYSQLDUMP_CMD, which is normally just mysqldump, should be prefixed with docker exec CONTAINER_NAME so that the command executes within the container (example: docker exec container_mysql_1 mysqldump).
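In .env.sh that looks like this (the container name container_mysql_1 is illustrative):

```shell
# Run mysql/mysqldump inside the Docker container instead of on the host
LOCAL_MYSQL_CMD="docker exec -i container_mysql_1 mysql"
LOCAL_MYSQLDUMP_CMD="docker exec container_mysql_1 mysqldump"
```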

Remote Settings

All settings that are prefaced with REMOTE_ refer to the remote environment where assets and the database will be pulled from.

REMOTE_SSH_LOGIN is your ssh login to the remote server, e.g.: [email protected]

REMOTE_SSH_PORT is the port to use for ssh on the remote server. This is normally 22

REMOTE_DB_USING_SSH determines whether the database connection needs to be made over ssh, or whether the database can be connected to directly (such as for Heroku or Amazon RDS services). This is normally yes

REMOTE_ROOT_PATH is the absolute path to the root of your Craft install on the remote server, with a trailing / after it.

REMOTE_ASSETS_PATH is the relative path to the remote assets directories, with a trailing / after it.

REMOTE_DB_NAME is the name of the remote mysql Craft CMS database

REMOTE_DB_PASSWORD is the password for the remote mysql Craft CMS database

REMOTE_DB_USER is the user for the remote mysql Craft CMS database

REMOTE_DB_HOST is the host name of the remote mysql database host. This is normally localhost

REMOTE_DB_PORT is the port number of the remote mysql database host. This is normally 3306 for mysql, and 5432 for postgres.

REMOTE_MYSQL_CMD is the command for the remote mysql executable, normally just mysql.

REMOTE_MYSQLDUMP_CMD is the command for the remote mysqldump executable, normally just mysqldump.

REMOTE_PSQL_CMD is the command for the remote postgres executable, normally just psql.

REMOTE_PG_DUMP_CMD is the command for the remote pg_dump executable, normally just pg_dump.

REMOTE_DB_LOGIN_PATH if this is set, it will use --login-path= for your remote db credentials instead of sending them in via the commandline (see below)

REMOTE_BACKUPS_PATH is the absolute path to the directory where the remote backups are stored. For database backups, a sub-directory REMOTE_DB_NAME/db inside the REMOTE_BACKUPS_PATH directory is used for the database backups. Paths should always have a trailing /

REMOTE_S3_BUCKET is the name of the Amazon S3 bucket to backup to via the sync_backups_to_s3.sh script

REMOTE_S3_PATH is an optional path relative to the Amazon S3 bucket; if specified, the sync_backups_to_s3.sh script will store the backups at that path.

Setting up SSH Keys

Normally when you ssh into a remote server (as some of the craft-scripts do), you have to enter your password. Best practice from a security POV is to not allow password-based logins, but instead to use SSH keys.

The day-in, day-out benefit of setting up SSH keys is that you never have to enter your password again, which allows for automated execution of the various craft-scripts. Use the excellent How To Set Up SSH Keys article as a guide for setting up your SSH keys.

Permissions and Git

If you use git, a sample .gitignore file that you can modify & use for your Craft CMS projects is included in craft-scripts as example.gitignore. If you wish to use it, the file should be copied to your Craft CMS project root, and renamed .gitignore

If you change file permissions on your remote server, you may encounter git complaining about overwriting existing local changes when you try to deploy. This is because git considers changing the executable flag to be a change in the file, so it thinks you changed the files on your server (and the changes are not checked into your git repo).

To fix this, we just need to tell git to ignore permission changes on the server. You can change the fileMode setting for git on your server, telling it to ignore permission changes of the files on the server:

git config --global core.fileMode false

See the git-config man page for details.

The other way to fix this is to set the permission using set_perms.sh in local dev, and then check the files into your git repo. This will cause them to be saved with the correct permissions in your git repo to begin with.

The downside to the latter approach is that you must have matching user/groups in both local dev and on live production.

Automated Script Execution

If you want to run any of these scripts automatically at a set schedule, here's how to do it. We'll use the backup_db.sh script as an example, but the same applies to any of the scripts.

Please see the Setting up SSH Keys section and set up your SSH keys before you set up automatic script execution.

On Linux

If you're using Forge you can set the backup_db.sh script to run nightly (or whatever interval you want) via the Scheduler. If you're using ServerPilot.io or are managing the server yourself, just set the backup_db.sh script to run via cron at whatever interval you desire.

craft-scripts includes a crontab-helper.txt that you can add to your crontab to make configuring cron easier. Remember to use full, absolute paths to the scripts when running them via cron, as cron does not have access to your environment paths, e.g.:

/home/forge/nystudio107.com/scripts/backup_db.sh
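For example, a crontab entry that runs the database backup nightly at 2:15 AM might look like this (the site path is illustrative):

```shell
# min hour dom mon dow  command
15 2 * * * /home/forge/example.com/scripts/backup_db.sh
```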

On a Mac

If you're using a Mac and you want to execute the script locally, Apple uses Launch Daemons instead of cron.

N.B.: Even if you are on a Mac, if you run your local dev in a VM like Vagrant/Homestead, you'll want to execute the craft-scripts from inside of the VM itself, not on your local Mac. If you use something like Valet or Mamp, read on.

Included in craft-scripts is a com.example.launch_daemon.plist to help you get started. This file is an XML file, and the name should be a unique, reverse-DNS-style name suffixed with .plist. This file is analogous to a single line in a crontab file.

Rename com.example.launch_daemon.plist to something unique to your project/script, e.g.: com.clientdomain.backup_db.plist and place it in /Library/LaunchDaemons/ (you'll need to sudo to do this).

The Launch Daemon .plist file is an XML file with a series of <key></key>s followed by some type that is a value for that key. The value for the <key>Label</key> should match the name of the file, minus the .plist extension, e.g.: <string>com.clientdomain.backup_db</string>. The value for the <key>UserName</key> should be the user name that you want the task to run as, e.g.: <string>andrew</string>

The value for the <key>Program</key> is a path to the command to execute, e.g.: <string>/Users/andrew/webdev/sites/nystudio107/scripts/backup_db.sh</string>
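Putting those keys together, a minimal .plist might look like this (the label, user name, script path, and the daily 2:15 AM StartCalendarInterval schedule are all illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.clientdomain.backup_db</string>
    <key>UserName</key>
    <string>andrew</string>
    <key>Program</key>
    <string>/Users/andrew/webdev/sites/nystudio107/scripts/backup_db.sh</string>
    <!-- Run daily at 02:15 -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>2</integer>
        <key>Minute</key>
        <integer>15</integer>
    </dict>
</dict>
</plist>
```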

Launch Daemons offer any number of ways to schedule when and how they execute; please see the Launch Daemon documentation for details.

Once the file has been created in /Library/LaunchDaemons/, it'll need to be loaded (you only need to do this once) via launchctl, e.g.:

sudo launchctl load /Library/LaunchDaemons/com.clientdomain.backup_db.plist

For more information on configuring Launch Daemons, please see the excellent launchd.info website.

Using login-path with mysql 5.6

If you're using mysql 5.6 or later, you'll see this warning from mysql (this is not an issue if you're using MariaDB):

mysql: [Warning] Using a password on the command line interface can be insecure.

What craft-scripts does isn't any less secure than if you typed it on the command line yourself; everything sent over the wire is always encrypted via ssh. However, you can set up login-path to store your credentials in an encrypted file, as per the Passwordless authentication using mysql_config_editor with MySQL 5.6 article.

If you set LOCAL_DB_LOGIN_PATH or REMOTE_DB_LOGIN_PATH it will use --login-path= for your db credentials on the respective environments instead of sending them in via the commandline.

For example, for my local dev setup:

mysql_config_editor set --login-path=localdev --user=homestead --host=localhost --port=3306 --password

...and then enter the password for that user. And then in the .env.sh I set it to:

LOCAL_DB_LOGIN_PATH="localdev"

...and it will use my stored, encrypted credentials instead of passing them in via the commandline. You can also set this up on your remote server, and then set it via REMOTE_DB_LOGIN_PATH.

Brought to you by nystudio107

craft-scripts's People

Contributors

angrybrad, dennisfrank, jackbewley, jdsimcoe, kboduch, khalwat, martinherweg, mildlygeeky, preposthuman, qbunt, rostockahoi, sjelfull

craft-scripts's Issues

You have an error in your SQL syntax

I've just tried to backup a site (which I've done before with Craft Scripts) and I'm getting this error message:

ERROR 1064 (42000) at line 3802: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '' at line 1

Not expecting that this is directly related to Craft Scripts, but any idea what could be causing this?

mysql commands on a nitro environment

hello,

I just switched from using laradock and docker to nitro on my local dev environment. Now I'm wondering what I need to set in the config for:

LOCAL_MYSQL_CMD="mysql"
LOCAL_MYSQLDUMP_CMD="mysqldump"

On my previous environment i needed to config it like this:

LOCAL_MYSQL_CMD="docker exec -i laradock_mysql_1 mysql"
LOCAL_MYSQLDUMP_CMD="docker exec laradock_mysql_1 mysqldump"

Do we need to ssh into nitro to use the mysql there? Has anybody got it working with nitro and multipass, and got some hints for me?

greetings
Marten

Unsupported mysqldump args cause scripts to fail with MariaDB

Apparently MariaDB does not recognise the --set-gtid-purged=OFF arg that was added in the last update, so pull_db.sh may fail if the DB in use is MariaDB.

$ ./scripts/pull_db.sh
mysqldump: unknown variable 'set-gtid-purged=OFF'
mysqldump: unknown variable 'set-gtid-purged=OFF'
site-db-dump-20210206.sql.gz       100%   58     0.9KB/s   00:00    
mysqldump: unknown variable 'set-gtid-purged=OFF'

Found a similar issue on the Laravel repo which probably describes it better.

If I comment out that line, it seems to work fine for me again, though my remote is a MariaDB instance on Amazon RDS. Not sure if there's a way around it that also works for those permissions issues you were trying to solve?

Permission set with set_perms.sh on ./craft being 644

Question

set_perms.sh sets the Craft script (./craft) to 644, which ends up preventing any Composer Scripts that use Craft scripts from being executed. I know that changing the Craft script to have executable permissions will resolve this (chmod a+x craft), but I was wondering if setting the Craft script to 644 was intentional.

Without the Craft script being executable (specifically in the production environment), Craft scripts like clear-caches and migrating—which are part of the post-craft-update commands—won't work. So I'm figuring the 644 permission being set on the Craft script is not intentional? Not sure, but figured I'd ask the wizard directly. Thanks!

Add an asset solution for Windows users

So this is obviously not something that affects many people, I know most of us are on Macs, but one thing I came across is that Rsync does not exist for Windows users. There are a few options out there for us to use, but many of them don't have support for all the flags your script in ./pull_assets.sh uses.

It's not an urgent issue at all, but if anyone has a solution for getting around this requirement, it'd be rad. It's the only stumbling block I have in getting this environment up and running in a near 1:1 setup with my Macs.

How does this work with Craft 3.1 project config?

I'm mainly wondering about the scenario where I am using the project.yaml to sync settings/structures between environments but then I use pull_db.sh to sync the actual db content.

Does pull_db.sh only bring in the actual content, or will it also override the db settings that are initially created from the project.yaml file?

Thanks!

Support DDEV DB imports

In DDEV, our database settings are set to nothing more than db, as the system handles all that on its own. This causes a problem, because when I run ./scripts/pull_db.sh, I get an error with those values, and the message:

Unknown suffix 'd' used for variable 'port' (value 'db')
mysqldump: Error while setting value 'db' to 'port'

Describe the solution you would like

I'd like for the script to accommodate this issue when validating, as we don't have direct access to the ports.

Add push_db

When developing a site locally, I usually push to staging all the time. Do you have a script for pushing to remote already, or should I send you a pull request?

Pull DB issue

Hey Andrew!

I've been using these scripts for a long time and rarely have issues but I'm stumped on this one.

I can pull assets without issue, but not the DB. When I try, I get the following error:

scp: /tmp/svrwudhceb-db-dump-20231013.sql.gz: No such file or directory
*** Backed up local database to /tmp/[redacted-local-db-name]-db-backup-20231013.sql.gz
gunzip: can't stat: /tmp/svrwudhceb-db-dump-20231013.sql.gz (/tmp/svrwudhceb-db-dump-20231013.sql.gz.gz): No such file or directory
*** Restored local database from /tmp/svrwudhceb-db-dump-20231013.sql.gz

Production DB version is: mysql Ver 15.1 Distrib 10.4.20-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2

I've double- and triple-checked credentials—they're all correct. Not sure what the issue could be as this is pushing out of my area of expertise. Any thoughts or suggestions?

Thanks!

Commerce Orders & Customers

Wondering what your thoughts are to exclude certain sensitive fields from being imported or obfuscate the data.

I saw in an earlier issue that you prefer to have the whole database locally, does this apply to orders and customers as well?

Thanks!

sync_backups_to_s3 needs PATH with Laravel Forge scheduler (doc suggestion)

In case someone else has this problem, when using sync_backups_to_s3.sh with Laravel Forge scheduler, I needed to add PATH to the script so cron could find the aws command.

PATH=/home/forge/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin

You can also set the command with an absolute path (/home/forge/.local/bin/aws), but that's not the recommended way.

WSL: Host not being used

Describe the bug

A clear and concise description of what the bug is.

I am setting up the scripts on Windows with the WSL subsystem and Craft Nitro. The pull assets script works but I am having a problem pulling the database. I have this set in my env file:

LOCAL_DB_HOST="mysql-5.7-3306.database.nitro"

But when I pull the scripts:

mysqldump: Got error: 1044: Access denied for user 'nitro'@'%' to database 'mydatabase' when selecting the database

This works fine on Mac, I wonder if there is a problem with Nitro. I have no idea why it is using % I've tried on MySQL 5.7 and MySQL 8.0. Any direction or ideas would be much appreciated!

DDEV pull-database not connecting or not properly pulling + restoring remote

I converted some local dev sites from Nitro to DDEV and I'm having issues running the 'pull-database' script. Here are my database details when running ddev describe:

db | OK | InDocker: db:3306 / Host: 127.0.0.1:60810 | mysql:8.0 | User/Pass: 'db/db' or 'root/root'
  1. If I run with the indocker settings that are in my .env file, I get a database connection error
  2. If I customize the .env.sh to use the host settings, I can connect and it looks like it connects and works (see below), but the remote database is not restored over my local database
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: Couldn't execute 'FLUSH TABLES': Access denied; you need (at least one of) the RELOAD or FLUSH_TABLES privilege(s) for this operation (1227)
mysqldump: [Warning] Using a password on the command line interface can be insecure.
Enter passphrase for key '/Users/stevehurst/.ssh/id_rsa': 
av07952-kcraft-db-dump-20230320.sql.gz                                                                  100%   42MB   8.4MB/s   00:05    
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: [Warning] Using a password on the command line interface can be insecure.
*** Backed up local database to /tmp/db-db-backup-20230320.sql.gz
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1062 (23000) at line 41: Duplicate entry '1' for key 'announcements.PRIMARY'
*** Restored local database from /tmp/av07952-kcraft-db-dump-20230320.sql.gz

Are there specific settings changes needed to the .env.sh file for mysql commands when running Craft on DDEV?

# Local database constants; default port for mysql is 3306, default port for postgres is 5432
# This pulls values from your local .env file
LOCAL_DB_NAME="db"
LOCAL_DB_PASSWORD="root"
LOCAL_DB_USER="root"
LOCAL_DB_HOST="127.0.0.1"
LOCAL_DB_PORT="60810"
LOCAL_DB_SCHEMA="public"

# The `mysql` and `mysqldump` commands to run locally
LOCAL_MYSQL_CMD="mysql"
LOCAL_MYSQLDUMP_CMD="mysqldump"

No assets being pulled

I've been using these scripts for years and enjoying them.
Just recently I've been getting issues syncing assets, like below. There are files to consider but nothing comes across.
I've read about some rsync issues on Mac (in this case M1 silicon). I've checked permissions and enabled disk access, but all I get is below.

Is there a cache of rsync that it's comparing to and deciding nothing is required? Is there an internal reference for craft that is being read/misread?

I've tried removing my local assets directory to 'force' syncing, to no avail.
Any pointers welcome :)

domain@MacBook-Pro-2 scripts % bash pull_assets.sh
Ensuring asset directory exists at '/Users/domain/sites/mycraftsite/public_html/assets/'
receiving file list ...
1326 files to consider

Skip tables on pull_db

Hi,

I would like to know if it is possible to skip some tables with the pull_db command?

In fact, I get a warning that I may be able to work around by excluding tables:

mysqldump: Couldn't execute 'SELECT COLUMN_NAME, JSON_EXTRACT(HISTOGRAM, '$."number-of-buckets-specified"') FROM information_schema.COLUMN_STATISTICS WHERE SCHEMA_NAME = 'db_name' AND TABLE_NAME = 'ct_assetindexdata';': Unknown table 'column_statistics' in information_schema (1109)
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: Couldn't execute 'SELECT COLUMN_NAME, JSON_EXTRACT(HISTOGRAM, '$."number-of-buckets-specified"') FROM information_schema.COLUMN_STATISTICS WHERE SCHEMA_NAME = 'db_name' AND TABLE_NAME = 'ct_assets';': Unknown table 'column_statistics' in information_schema (1109)

Thanks !
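
The `column_statistics` error above is a known symptom of a MySQL 8 `mysqldump` client talking to a pre-8 server; the usual workaround is the client's `--column-statistics=0` flag. Since craft-scripts lets you override the dump command in `.env.sh` (the `LOCAL_MYSQLDUMP_CMD` setting is quoted earlier in this thread; whether your copy of `example.env.sh` has a matching `REMOTE_` variant is an assumption to verify), a sketch:

```shell
# Sketch, assuming your .env.sh exposes a REMOTE_MYSQLDUMP_CMD override:
# disable histogram collection (MySQL 8 client vs. older server)...
REMOTE_MYSQLDUMP_CMD="mysqldump --column-statistics=0"
# ...and/or skip specific tables entirely (--ignore-table may be repeated)
REMOTE_MYSQLDUMP_CMD="mysqldump --column-statistics=0 --ignore-table=db_name.ct_assetindexdata"
```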

Compatibility with CraftCMS 4

Question

I noticed that the repository only mentions CraftCMS 3 in its scripts and documentation, and the last commit was made years ago. Can you please clarify the following:

  • Are the scripts still actively maintained?
  • Is it safe to use them in a CraftCMS 4 project?
  • Are they still considered a recommended best practice for the tasks they were designed for?

Syncing assets subdirectories

Thanks for these scripts (and every blog post about Craft!)

I think there is an issue syncing assets that are in subdirectories of the base path. For instance, with this in the env:

LOCAL_ASSETS_PATH=${LOCAL_ROOT_PATH}"public/assets/"
LOCAL_ASSETS_DIRS=(
                "images/products/subfolder1"
                "pdfs/brochures"
                )

Then the pulled assets will end up at

[...]public/assets/subfolder1
[...]public/assets/brochures

(missing out the extra path bit)

I think it's fixable using
tobystokes@26309c7
(it adds the DIR path, but deletes the shortest match after the last "/", which in the case of no subdirectory is everything).
Works For Me; not rigorously tested, and could be cleaned up for a PR, but maybe I'm missing a trick in my setup?
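
A sketch of the path handling the fix above aims for: keep each entry's full relative path when building the destination, rather than only its last component. All values here are illustrative, not from a real install:

```shell
# Illustrative values only; real ones come from .env.sh
LOCAL_ASSETS_PATH="/var/www/public/assets/"
LOCAL_ASSETS_DIRS=(
    "images/products/subfolder1"
    "pdfs/brochures"
)
for DIR in "${LOCAL_ASSETS_DIRS[@]}"; do
    # keep the whole relative path, not just the part after the last "/"
    DEST="${LOCAL_ASSETS_PATH}${DIR}/"
    echo "would rsync remote assets into ${DEST}"
    # e.g.: rsync -a "remote:${REMOTE_ASSETS_PATH}${DIR}/" "${DEST}"
done
```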

Feature Request: Multiple remotes

For sites that have more than two environments (local/dev/staging/prod), it would be cool to be able to run pull_db or pull_assets from any environment. Currently I'm doing a two-step process, where I ssh into dev to pull from staging, then pull from dev to local.

I might set this up eventually - I'll do a PR if I do. 🙂

pull assets ownership issue

I have production on the same server as staging, under a different domain/account.
When I log into staging over SSH and pull assets, I get all the production assets pulled down to my staging site. Great.
But the weird thing is that although I have specified the chown group in env.sh as the staging account owner, after pulling, the assets folder always ends up chown'd to the production account owner.

I can't see that set_perms.sh is involved in this, and I have these scripts running the same way in multiple other projects.

Any ideas why staging assets are being chown'd with the production name after pull_assets.sh is run?

FR: clear assettransformindex as part of clear_caches.sh

We have opted to use .rsync-filter to exclude asset transforms when doing any environment syncing / backup (to save on storage and bandwidth). But as a result, any time we do a remote -> local sync for continuous development, we always have to jump into Craft's CP to manually clear the Asset Transform Index cache. So if the clear_caches.sh script could also include the craft_assettransformindex table, that would really help!
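
Per the README, clear_caches.sh already empties the craft_templatecaches table, so it presumably drives truncation from a list of table names. Until such a feature lands, one could extend that list in a local copy of the script; the array name below is an assumption for illustration, not necessarily the script's actual variable:

```shell
# Hypothetical: extend the list of tables clear_caches.sh empties
# (array name is an assumption; check your copy of the script)
CACHE_DB_TABLES=(
    "templatecaches"
    "assettransformindex"
)
```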

pull_db.sh on local MAMP Pro

Any advice on getting pull_db.sh to work in a local env using MAMP Pro?

Here's the output I get locally when running the ./pull_db.sh script while MAMP is running

$ ./pull_db.sh
scp: /tmp/cpuksbnet-db-dump-20181022.sql.gz: No such file or directory
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: [Warning] Using a password on the command line interface can be insecure.
*** Backed up local database to /tmp/cpuksb-db-backup-20181022.sql.gz
gunzip: can't stat: /tmp/cpuksbnet-db-dump-20181022.sql.gz (/tmp/cpuksbnet-db-dump-20181022.sql.gz.gz): No such file or directory
mysql: [Warning] Using a password on the command line interface can be insecure.
*** Restored local database from /tmp/cpuksbnet-db-dump-20181022.sql.gz

Automatically delete tables that are not in use

Question

I was wondering if it's possible to use a parameter that deletes the whole local database before a pull from the remote database, or a check that clears up any "unused" tables that are not in the remote database. The reasoning is that during updates, running migrations can create new tables, and when those migrations fail, the tables aren't deleted. On the next run of the pull database sync script, they aren't automatically removed either, so they stay in the database as useless tables.
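
craft-scripts doesn't advertise such a flag, but a hedged manual workaround is to drop and recreate the local database before running the pull, so orphaned tables can't survive the import. The values below are illustrative (the real ones live in `.env.sh`), and the reset is destructive by design:

```shell
# Hedged sketch: reset the local schema before pulling (destructive!)
LOCAL_DB_NAME="db"
RESET_SQL="DROP DATABASE IF EXISTS \`${LOCAL_DB_NAME}\`; CREATE DATABASE \`${LOCAL_DB_NAME}\`;"
echo "${RESET_SQL}"
# then: mysql -h "$LOCAL_DB_HOST" -u "$LOCAL_DB_USER" -p"$LOCAL_DB_PASSWORD" -e "$RESET_SQL"
# followed by: ./pull_db.sh
```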

ENV Paths Question

Just want to make sure I'm inputting the settings in my .env.sh file correctly to use the set_perms.sh script. So my file structure looks like this:

/my/server/path/craft
/my/server/path/public
/my/server/path/public/assets

So in the env file I would have these settings:

GLOBAL_CRAFT_PATH="./craft/"

LOCAL_ROOT_PATH="/my/server/path/"
LOCAL_ASSETS_PATH=${LOCAL_ROOT_PATH}"public/assets"

Is that correct? I wasn't sure because I don't see the GLOBAL_CRAFT_PATH variable used in the set_perms.sh file so I don't know if it will update permissions on just my craft files or on everything in my root directory.
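
For comparison, a sketch of the same settings following the trailing-slash convention the other `.env.sh` examples in this thread use for directory paths (whether set_perms.sh requires the trailing slash is an assumption to verify against example.env.sh):

```shell
# Sketch following the trailing-slash convention used in the example configs
GLOBAL_CRAFT_PATH="./craft/"
LOCAL_ROOT_PATH="/my/server/path/"
LOCAL_ASSETS_PATH="${LOCAL_ROOT_PATH}public/assets/"
echo "${LOCAL_ASSETS_PATH}"
```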

Table 'craft_cache' doesn't exist

Thanks for a great set of scripts Andrew.

I am having this issue:

Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/cache
Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/compiled_templates
Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/state
Emptying cache table craft_cache
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1146 (42S02) at line 1: Table 'craft.domainname.craft_cache' doesn't exist
Emptying cache table craft_templatecaches
mysql: [Warning] Using a password on the command line interface can be insecure.
*** Caches cleared

Could it be because the database name craft.domainname has a dot in it?

The craft_templatecaches table empties without issue.

pull_db issue

I have successfully been able to download the assets, and I used to be able to download the database from the server, but now I am getting an error after the db says it was imported.

Craft Error: Craft appears to be installed but the info table is empty.
Command Line Errors: ERROR 1235 (42000) at line 3830: This version of MySQL doesn't yet support 'multiple triggers with the same action time and event for one table'

Restore Assets

As well as restore_db.sh, a restore_assets.sh option would be useful too.

fts_read: Too many open files in system

Hey, I am getting this error when trying to set the permissions. Any solutions other than upping the open-file limit?

find: /Users/me/www/project/src/craft/app/framework/caching/dependencies: Too many open files in system
find: fts_read: Too many open files in system

_z:25: pipe failed: too many open files in system                                                        
git_prompt_info:2: too many open files in system: /dev/null
git_prompt_info:3: too many open files in system: /dev/null

exclude tables

Should the exclude tables list in common .db exclude "_templatecachequeries"?

Sanitise personal and private data for dev environments

Would be nice on the database syncing from LIVE to DEV to be able to sanitize the data to replace user data and sensitive information with dummy data.

If I have any luck getting it to happen, will be in touch. Unless someone else can do it faster.

clear_caches is clearing the wrong tables

Correct me if I'm wrong, but shouldn't the table names in the clear_caches.sh script be:

"templatecachecriteria"
"templatecacheelements"
"templatecaches"

And not:

"cache"
"templatecaches"

?

Issue with pull_db.sh

Describe the bug

When I run pull_db.sh, the script correctly makes a local backup of my database, and then does nothing with the remote DB pull. It just fails silently and never ends. It's mystifying because I can pull the same DB to my local macOS installation with no problems, but it fails on the production server.

Versions

  • Plugin version: Whatever the most recent 2021 version was
  • Craft version: Craft Pro 3.7.37
  • MariaDB 10.6 on production
  • MySQL 5.7 remote
  • PHP 8.0 on production
  • PHP 7.3

One might think that this could be related to a MySQL oddity with 5.7, but I can reproduce it with recent versions of MariaDB and the same versions of PHP as well.

What logs should I look at to determine what is failing and why?

Backup everything

You probably want to back up the database, assets, and files at the same time, so it probably makes sense to have a script that runs all of them, and optionally syncs them to S3 at the same time?

That means you don't have to keep 4 cron jobs for every site.
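
A minimal sketch of such a wrapper, assuming the stock craft-scripts layout; the file name `backup_all.sh` is hypothetical and not part of craft-scripts:

```shell
#!/bin/bash
# backup_all.sh -- hypothetical wrapper, not part of craft-scripts.
# Runs each backup script in turn so one cron entry covers everything.
SCRIPTS_DIR="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
BACKUP_SCRIPTS=("backup_db.sh" "backup_assets.sh" "backup_dirs.sh")
for script in "${BACKUP_SCRIPTS[@]}"; do
    if [ -f "${SCRIPTS_DIR}/${script}" ]; then
        echo "Running ${script}"
        bash "${SCRIPTS_DIR}/${script}" || { echo "${script} failed" >&2; exit 1; }
    else
        echo "Skipping ${script} (not found)" >&2
    fi
done
```

An S3 sync step (e.g. via `aws s3 sync`) could then be appended after the loop.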

FR: Ability to use a remote .env.sh's LOCAL values as REMOTE

When setting up a local .env.sh file, which pulls from a remote source, we have to repeat a lot of values (paths, DB creds, etc) that are (potentially) already set in that remote server's own .env.sh file. So it would be great if we could simply define:

REMOTE_ENV_PATH

And if REMOTE_ENV_PATH is defined, the scripts would use the LOCAL_ values found in that .env.sh file as the REMOTE_ values at runtime.
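
A sketch of how that mapping could work once the remote `.env.sh` has been fetched (e.g. via scp): re-emit its `LOCAL_*` assignments as `REMOTE_*` variables. The helper name and mechanism are hypothetical, and the naive `eval` trusts the env file's contents:

```shell
# Hypothetical helper: re-emit LOCAL_* assignments from an env file as
# REMOTE_* variables in the current shell. (Fetch the remote .env.sh first.)
map_local_to_remote() {
    local line
    while IFS= read -r line; do
        case "$line" in
            LOCAL_*=*) eval "REMOTE_${line#LOCAL_}" ;;
        esac
    done < "$1"
}
```

For example, `map_local_to_remote /tmp/remote.env.sh` would turn `LOCAL_DB_NAME="db"` into `REMOTE_DB_NAME="db"`.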

Question: file/dir path of backup_dirs.sh

Hi,

thank you for these awesome scripts. I tested your backup_dirs.sh and I was wondering why BACKUP_FILES_DIR_PATH does not use the same structure as backup_db.sh and backup_assets.sh, which include the db name in the path via ${LOCAL_DB_NAME}:

backup_dirs.sh:
BACKUP_FILES_DIR_PATH="${LOCAL_BACKUPS_PATH}${FILES_BACKUP_SUBDIR}/"

vs. backup_db.sh and backup_assets.sh
BACKUP_DB_DIR_PATH="${LOCAL_BACKUPS_PATH}${LOCAL_DB_NAME}/${DB_BACKUP_SUBDIR}/"

Is there a reason behind the decision?
