
craft-scripts Issues

Permission set with set_perms.sh on ./craft being 644

Question

set_perms.sh sets the Craft script (./craft) to 644, which prevents any Composer scripts that use the Craft CLI from being executed. I know that giving the Craft script executable permissions resolves this (chmod a+x craft), but I was wondering whether setting it to 644 was intentional.

Without the Craft script being executable (specifically in the production environment), Craft commands like clear-caches and migrating, which are part of the post-craft-update commands, won't work. So I'm figuring the 644 permission being set on the Craft script is not intentional? Not sure, but figured I'd ask the wizard directly. Thanks!
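A minimal sketch of the interim fix, demonstrated on a placeholder file so it is runnable anywhere; in the project you would target ./craft itself:

```shell
# Re-apply the executable bit after set_perms.sh runs.
# "craft-demo" is a stand-in for the real ./craft script.
touch craft-demo
chmod 644 craft-demo   # rw-r--r-- : what set_perms.sh currently leaves
chmod 755 craft-demo   # rwxr-xr-x : lets Composer scripts execute it
test -x craft-demo && echo "craft-demo is executable"
rm craft-demo
```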

No assets being pulled

I've been using these scripts for years and enjoying them.
Just recently I've been running into issues syncing assets, as shown below. There are files to consider, but nothing comes across.
I've read about some rsync issues on Mac (in this case, M1 Apple silicon). I've checked permissions and enabled disk access, but all I get is the output below.

Does rsync keep some cache it's comparing against and deciding nothing is required? Is there an internal reference for Craft that is being read or misread?

I've tried removing my local assets directory to 'force' syncing, to no avail.
Any pointers welcome :)

domain@MacBook-Pro-2 scripts % bash pull_assets.sh
Ensuring asset directory exists at '/Users/domain/sites/mycraftsite/public_html/assets/'
receiving file list ...
1326 files to consider
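One debugging angle, offered as a guess: rsync keeps no cache between runs; it decides per file based on size and modification time. A dry run with --itemize-changes shows why each file is (or isn't) considered up to date. The host and remote path below are placeholders for your own settings:

```shell
# Dry run (-n): nothing is transferred, but each file's decision is itemized.
# Replace user@remote-host and the paths with your real .env.sh values.
rsync -avzn --itemize-changes \
    user@remote-host:/path/to/remote/assets/ \
    /Users/domain/sites/mycraftsite/public_html/assets/
```

If every file shows as up to date, re-running with -c (compare checksums instead of size/mtime) will reveal whether the contents actually differ despite matching timestamps.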

You have an error in your SQL syntax

I've just tried to backup a site (which I've done before with Craft Scripts) and I'm getting this error message:

ERROR 1064 (42000) at line 3802: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '' at line 1

Not expecting that this is directly related to Craft Scripts, but any idea what could be causing this?

Automatically delete tables that are not in use

Question

I was wondering if it's possible to use a parameter that deletes the whole local database before a pull from the remote database, or a check that cleans up any "unused" tables that are not in the remote database. The reasoning: sometimes during updates, running migrations creates new tables, and when those migrations fail, the tables aren't deleted. On the next run of the pull-database sync script, those tables are not automatically removed, so they stay in the local database as useless leftovers.
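A hedged sketch of the requested behavior (this is not an existing craft-scripts feature): drop and recreate the local database before the import, so tables left over from failed migrations disappear. The variable names mirror the .env.sh conventions used elsewhere in this thread:

```shell
# Destructive: wipes the local database entirely before the pull restores it.
LOCAL_DB_NAME="mydb"          # placeholder values; normally sourced from .env.sh
DROP_SQL="DROP DATABASE IF EXISTS \`${LOCAL_DB_NAME}\`; CREATE DATABASE \`${LOCAL_DB_NAME}\`;"
mysql -h "${LOCAL_DB_HOST}" -u "${LOCAL_DB_USER}" -p"${LOCAL_DB_PASSWORD}" -e "$DROP_SQL"
```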

Add an asset solution for Windows users

So this is obviously not something that affects many people, and I know most of us are on Macs, but one thing I came across is that rsync isn't natively available on Windows. There are a few options out there for us to use, but many of them don't support all the flags ./pull_assets.sh uses.

It's not an urgent issue at all, but if anyone has a solution for getting around this requirement, it'd be rad. It's the only stumbling block I have in getting this environment up and running in a near 1:1 setup with my Macs.

pull_db.sh on local MAMP Pro

Any advice on getting pull_db.sh to work in a local environment using MAMP Pro?

Here's the output I get locally when running the ./pull_db.sh script while MAMP is running

$ ./pull_db.sh
scp: /tmp/cpuksbnet-db-dump-20181022.sql.gz: No such file or directory
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: [Warning] Using a password on the command line interface can be insecure.
*** Backed up local database to /tmp/cpuksb-db-backup-20181022.sql.gz
gunzip: can't stat: /tmp/cpuksbnet-db-dump-20181022.sql.gz (/tmp/cpuksbnet-db-dump-20181022.sql.gz.gz): No such file or directory
mysql: [Warning] Using a password on the command line interface can be insecure.
*** Restored local database from /tmp/cpuksbnet-db-dump-20181022.sql.gz

fts_read: Too many open files in system

Hey I am getting this error when trying to set the permissions. Any solutions other than upping the file open limit?

find: /Users/me/www/project/src/craft/app/framework/caching/dependencies: Too many open files in system
find: fts_read: Too many open files in system

_z:25: pipe failed: too many open files in system                                                        
git_prompt_info:2: too many open files in system: /dev/null
git_prompt_info:3: too many open files in system: /dev/null

pull_db issue

I've been able to download the assets successfully, and I used to be able to download the database from the server, but now I'm getting an error even though the script says the db was imported.

Craft Error: Craft appears to be installed but the info table is empty.
Command Line Errors: ERROR 1235 (42000) at line 3830: This version of MySQL doesn't yet support 'multiple triggers with the same action time and event for one table'

pull assets ownership issue

I have production on the same server as staging, under a different domain/account.
When I log into staging over SSH and pull assets, I get all the production assets pulled down to my staging site. Great.
But the weird thing is that although I have specified the chown group in env.sh as the staging account owner, after pulling, the assets folder always ends up chown'd to the production account owner.

I can't see that set_perms.sh is involved in this, and I have these scripts running the same way in multiple other projects.

Any ideas why staging assets are being chown'd with the production name after pull_assets.sh is run?
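One possible cause, stated as an assumption rather than a diagnosis: rsync's -a flag implies -o and -g (preserve owner and group), which can win over a later chown when the transfer has permission to set ownership, as can happen with accounts on the same server. Disabling that preservation keeps the receiving account's ownership; the host and paths here are placeholders:

```shell
# --no-owner / --no-group switch off the ownership preservation implied by -a,
# so pulled files are owned by the account running the transfer.
rsync -avz --no-owner --no-group \
    produser@same-server:/home/produser/public/assets/ \
    /home/staginguser/public/assets/
```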

Syncing assets subdirectories

Thanks for these scripts (and every blog post about Craft!)

I think there is an issue syncing assets that are in subdirectories of the base path; for instance, with this in env:

LOCAL_ASSETS_PATH=${LOCAL_ROOT_PATH}"public/assets/"
LOCAL_ASSETS_DIRS=(
                "images/products/subfolder1"
                "pdfs/brochures"
                )

Then the pulled assets will end up at

[...]public/assets/subfolder1
[...]public/assets/brochures

(missing out the extra path bit)

I think it's fixable using
tobystokes@26309c7
(it adds the DIR path, but deletes the shortest match after the last "/", which in the case of no subdirectory is everything).
Works For Me; not rigorously tested, and it could be cleaned up for a PR, but maybe I'm missing a trick in my setup?
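The parameter-expansion pitfall described above can be sketched like this (assuming the script strips the leading path with a ## expansion):

```shell
# ${dir##*/} deletes the longest match of */ from the front, keeping only
# the last path segment, so nested directories lose their parent folders:
dir="images/products/subfolder1"
echo "${dir##*/}"   # -> subfolder1 : parent folders are discarded

# With no subdirectory there is no "/" to match, so nothing is stripped:
dir="images"
echo "${dir##*/}"   # -> images
```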

FR: Ability to use a remote .env.sh's LOCAL values as REMOTE

When setting up a local .env.sh file, which pulls from a remote source, we have to repeat a lot of values (paths, DB creds, etc) that are (potentially) already set in that remote server's own .env.sh file. So it would be great if we could simply define:

REMOTE_ENV_PATH

And if REMOTE_ENV_PATH is defined, then the scripts use the LOCAL_ values found in that .env.sh file as the REMOTE_ values at runtime.
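A hedged implementation sketch of this feature request; REMOTE_ENV_PATH, REMOTE_SSH_LOGIN, and the LOCAL_-to-REMOTE_ mapping are assumptions, not existing craft-scripts behavior (and eval'ing a fetched file implies you trust the remote host):

```shell
if [ -n "${REMOTE_ENV_PATH}" ]; then
    tmp_env=$(mktemp)
    # Fetch the remote .env.sh, then re-export its LOCAL_ values as REMOTE_
    ssh "${REMOTE_SSH_LOGIN}" "cat '${REMOTE_ENV_PATH}'" > "$tmp_env"
    while IFS= read -r line; do
        case "$line" in
            LOCAL_*=*) eval "REMOTE_${line#LOCAL_}" ;;   # LOCAL_FOO=x -> REMOTE_FOO=x
        esac
    done < "$tmp_env"
    rm -f "$tmp_env"
fi
```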

clear_caches is clearing the wrong tables

Correct me if I'm wrong, but shouldn't the table names in the clear_caches.sh script be:

"templatecachecriteria"
"templatecacheelements"
"templatecaches"

And not:

"cache"
"templatecaches"

?

Pull DB issue

Hey Andrew!

I've been using these scripts for a long time and rarely have issues but I'm stumped on this one.

I can pull assets without issue, but not the DB. When I try, I get the following error:

scp: /tmp/svrwudhceb-db-dump-20231013.sql.gz: No such file or directory
*** Backed up local database to /tmp/[redacted-local-db-name]-db-backup-20231013.sql.gz
gunzip: can't stat: /tmp/svrwudhceb-db-dump-20231013.sql.gz (/tmp/svrwudhceb-db-dump-20231013.sql.gz.gz): No such file or directory
*** Restored local database from /tmp/svrwudhceb-db-dump-20231013.sql.gz

Production DB version is: mysql Ver 15.1 Distrib 10.4.20-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2

I've double- and triple-checked the credentials; they're all correct. Not sure what the issue could be, as this is pushing out of my area of expertise. Any thoughts or suggestions?

Thanks!

Backup everything

You probably want to back up the database, assets, and files at the same time, so it would make sense to have a script that runs all of them, and optionally syncs them to S3 at the same time?

That way you don't have to keep 4 cron jobs for every site.

exclude tables

Should the exclude-tables list in the common db config exclude "_templatecachequeries"?

Unsupported mysqldump args cause scripts to fail with MariaDB

Apparently MariaDB does not recognise the --set-gtid-purged=OFF arg that was added in the last update, so pull_db.sh may fail if the DB in use is MariaDB.

$ ./scripts/pull_db.sh
mysqldump: unknown variable 'set-gtid-purged=OFF'
mysqldump: unknown variable 'set-gtid-purged=OFF'
site-db-dump-20210206.sql.gz       100%   58     0.9KB/s   00:00    
mysqldump: unknown variable 'set-gtid-purged=OFF'

Found a similar issue on the Laravel repo which probably describes it better.

If I comment out that line, it seems to work fine for me again, though my remote is a MariaDB instance on Amazon RDS. Not sure if there's a way around it that also works for the permission issues you were trying to solve?
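A hedged sketch of version-sniffing as a possible middle ground (the --version output format check is an assumption): only add --set-gtid-purged=OFF when the local mysqldump client is not MariaDB's, which rejects the option.

```shell
# MariaDB's mysqldump identifies itself as "MariaDB" in its --version output.
GTID_ARG=""
if ! mysqldump --version 2>/dev/null | grep -qi mariadb; then
    GTID_ARG="--set-gtid-purged=OFF"
fi
# ...then pass $GTID_ARG along with the existing dump arguments.
```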

Table 'craft_cache' doesn't exist

Thanks for a great set of scripts Andrew.

I am having this issue:

Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/cache
Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/compiled_templates
Removing cache dir /home/forge/www.domainname.co.uk/craft/storage/runtime/state
Emptying cache table craft_cache
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1146 (42S02) at line 1: Table 'craft.domainname.craft_cache' doesn't exist
Emptying cache table craft_templatecaches
mysql: [Warning] Using a password on the command line interface can be insecure.
*** Caches cleared

Could it be because the database name craft.domainname has a dot in it?

The craft_templatecaches table empties without issue.

FR: clear assettransformindex as part of clear_caches.sh

We have opted to use .rsync-filter to exclude asset transforms when doing any environment syncing / backup (to save on storage and bandwidth). But as a result, any time we do a remote -> local sync for continuous development, we have to jump into Craft's CP to manually clear the Asset Transform Index cache. So if the clear_caches.sh script could also include the craft_assettransformindex table, that would really help!

Sanitise personal and private data for dev environments

It would be nice, when syncing the database from LIVE to DEV, to be able to sanitize the data, replacing user data and sensitive information with dummy data.

If I have any luck getting it to happen, I'll be in touch. Unless someone else can do it faster.
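A hedged sketch of what a post-import sanitization pass could look like. The table and column names assume Craft's default schema (users / email / firstName / lastName); verify against your own database before relying on this, and the credential variables mirror .env.sh naming:

```shell
# Overwrite personal data with deterministic dummy values after the pull.
mysql -u "${LOCAL_DB_USER}" -p"${LOCAL_DB_PASSWORD}" "${LOCAL_DB_NAME}" <<'SQL'
UPDATE users
   SET email     = CONCAT('user', id, '@example.test'),
       firstName = 'Dev',
       lastName  = CONCAT('User', id);
SQL
```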

Question: file/dir path of backup_dirs.sh

Hi,

thank you for these awesome scripts. I tested your backup_dirs.sh and I was wondering why BACKUP_FILES_DIR_PATH does not use the same structure as backup_db.sh and backup_assets.sh, which include the db name in the path via ${LOCAL_DB_NAME}:

backup_dirs.sh:
BACKUP_FILES_DIR_PATH="${LOCAL_BACKUPS_PATH}${FILES_BACKUP_SUBDIR}/"

vs. backup_db.sh and backup_assets.sh
BACKUP_DB_DIR_PATH="${LOCAL_BACKUPS_PATH}${LOCAL_DB_NAME}/${DB_BACKUP_SUBDIR}/"

Is there a reason behind the decision?

WSL: Host not being used

Describe the bug

I am setting up the scripts on Windows with the WSL subsystem and Craft Nitro. The pull assets script works but I am having a problem pulling the database. I have this set in my env file:

LOCAL_DB_HOST="mysql-5.7-3306.database.nitro"

But when I run the pull script:

mysqldump: Got error: 1044: Access denied for user 'nitro'@'%' to database 'mydatabase' when selecting the database

This works fine on Mac, so I wonder if there is a problem with Nitro. I have no idea why it is using '%'. I've tried MySQL 5.7 and MySQL 8.0. Any direction or ideas would be much appreciated!

mysql commands on a nitro environment

hello,

I just switched from using Laradock and Docker to Nitro in my local dev environment. Now I'm wondering what I need to set in the config for:

LOCAL_MYSQL_CMD="mysql"
LOCAL_MYSQLDUMP_CMD="mysqldump"

In my previous environment I needed to configure it like this:

LOCAL_MYSQL_CMD="docker exec -i laradock_mysql_1 mysql"
LOCAL_MYSQLDUMP_CMD="docker exec laradock_mysql_1 mysqldump"

Do we need to SSH into Nitro to use the mysql there? Has anybody got it working with Nitro and Multipass, and do you have any hints for me?

greetings
Marten

Compatibility with CraftCMS 4

Question

I noticed that the repository only mentions CraftCMS 3 in its scripts and documentation, and the last commit was made years ago. Can you please clarify the following:

  • Are the scripts still actively maintained?
  • Is it safe to use them in a CraftCMS 4 project?
  • Are they still considered a recommended best practice for the tasks they were designed for?

Issue with pull_db.sh

Describe the bug

When I run pull_db.sh, the script correctly makes a local backup of my database, and then does nothing with the remote DB pull. It just fails silently, but never ends. It's mystifying because I can pull the same DB to my local macOS installation with no problems, but it fails on the production server.

Versions

  • Plugin version: Whatever the most recent 2021 version was
  • Craft version: Craft Pro 3.7.37
  • MariaDB 10.6 on production
  • MySQL 5.7 remote
  • PHP 8.0 on production
  • PHP 7.3

One might think that this could be related to a MySQL oddity with 5.7, but I can reproduce it with recent versions of MariaDB and the same versions of PHP as well.

What logs should I look at to determine what is failing and why?

Commerce Orders & Customers

Wondering what your thoughts are on excluding certain sensitive fields from being imported, or obfuscating the data.

I saw in an earlier issue that you prefer to have the whole database locally, does this apply to orders and customers as well?

Thanks!

Add push_db

When developing a site locally, I usually push to staging all the time. Do you have a script for pushing to remote already, or should I send you a pull request?

sync_backups_to_s3 needs PATH with Laravel Forge scheduler (doc suggestion)

In case someone else has this problem: when using sync_backups_to_s3.sh with the Laravel Forge scheduler, I needed to add PATH to the script so cron could find the aws command.

PATH=/home/forge/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin

You can also set the command with an absolute path (/home/forge/.local/bin/aws), but that's not the recommended way.
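For reference, the same PATH fix can live in the crontab itself rather than inside the script. This fragment is illustrative only; the site path, log file, and 2am schedule are placeholder assumptions:

```shell
# Crontab fragment: cron's default PATH often omits ~/.local/bin, where
# pip-installed `aws` typically lives on a Forge server.
PATH=/home/forge/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 2 * * * /home/forge/example.com/scripts/sync_backups_to_s3.sh >> /home/forge/backup.log 2>&1
```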

DDEV pull-database not connecting or not properly pulling + restoring remote

I converted some local dev sites from Nitro to DDEV and I'm having issues running the 'pull-database' script. Here are my database details when running ddev describe:

│ db │ OK │ InDocker: db:3306     │ mysql:8.0          │
│    │    │ Host: 127.0.0.1:60810 │ User/Pass: 'db/db' │
│    │    │                       │ or 'root/root'     │

  1. If I run with the InDocker settings that are in my .env file, I get a database connection error.
  2. If I customize .env.sh to use the Host settings, it looks like it connects and works (see below), but the remote database is not restored over my local database:
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: Couldn't execute 'FLUSH TABLES': Access denied; you need (at least one of) the RELOAD or FLUSH_TABLES privilege(s) for this operation (1227)
mysqldump: [Warning] Using a password on the command line interface can be insecure.
Enter passphrase for key '/Users/stevehurst/.ssh/id_rsa': 
av07952-kcraft-db-dump-20230320.sql.gz                                                                  100%   42MB   8.4MB/s   00:05    
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: [Warning] Using a password on the command line interface can be insecure.
*** Backed up local database to /tmp/db-db-backup-20230320.sql.gz
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1062 (23000) at line 41: Duplicate entry '1' for key 'announcements.PRIMARY'
*** Restored local database from /tmp/av07952-kcraft-db-dump-20230320.sql.gz

Are there specific settings changes needed to the .env.sh file for mysql commands when running Craft on DDEV?

# Local database constants; default port for mysql is 3306, default port for postgres is 5432
# This pulls values from your local .env file
LOCAL_DB_NAME="db"
LOCAL_DB_PASSWORD="root"
LOCAL_DB_USER="root"
LOCAL_DB_HOST="127.0.0.1"
LOCAL_DB_PORT="60810"
LOCAL_DB_SCHEMA="public"

# The `mysql` and `mysqldump` commands to run locally
LOCAL_MYSQL_CMD="mysql"
LOCAL_MYSQLDUMP_CMD="mysqldump"
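One approach, offered as an assumption rather than official craft-scripts or DDEV guidance: run the clients through DDEV itself, so the host port mapping (which can change between `ddev start`s) stops mattering. Inside the containers the DB host is `db` with credentials db/db:

```shell
# Hypothetical .env.sh fragment for DDEV; verify mysqldump is available
# in your project's web container before relying on it.
LOCAL_MYSQL_CMD="ddev mysql"
LOCAL_MYSQLDUMP_CMD="ddev exec mysqldump"
```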

Restore Assets

As well as restore_db.sh, a restore_assets.sh option would be useful too.

ENV Paths Question

Just want to make sure I'm inputting the settings in my .env.sh file correctly to use the set_perms.sh script. So my file structure looks like this:

/my/server/path/craft
/my/server/path/public
/my/server/path/public/assets

So in the env file I would have these settings:

GLOBAL_CRAFT_PATH="./craft/"

LOCAL_ROOT_PATH="/my/server/path/"
LOCAL_ASSETS_PATH=${LOCAL_ROOT_PATH}"public/assets"

Is that correct? I wasn't sure, because I don't see the GLOBAL_CRAFT_PATH variable used in set_perms.sh, so I don't know if it will update permissions on just my Craft files or on everything in my root directory.

Feature Request: Multiple remotes

For sites that have more than two environments (local/dev/staging/prod), it would be cool to be able to run pull_db or pull_assets from any environment. Currently I'm doing a two-step process, where I SSH into dev to pull from staging, then pull from dev to local.

I might set this up eventually - I'll do a PR if I do. 🙂

How does this work with Craft 3.1 project config?

I'm mainly wondering about the scenario where I am using the project.yaml to sync settings/structures between environments but then I use pull_db.sh to sync the actual db content.

Does pull_db.sh only bring in the actual content, or will it also override the db settings that are initially created from the project.yaml file?

Thanks!

Support DDEV DB imports

In DDEV, our database settings are set to nothing more than db, as the system handles all that on its own. This causes a problem: when I run ./scripts/pull_db.sh, I get an error with those values, and the message:

Unknown suffix 'd' used for variable 'port' (value 'db')
mysqldump: Error while setting value 'db' to 'port'

Describe the solution you would like

I'd like the script to accommodate this issue when validating, as we don't have direct access to the ports.

Skip tables on pull_db

Hi,

I would like to know if it is possible to skip some tables with the pull_db command?

The reason is that I get a warning that skipping those tables might avoid:

mysqldump: Couldn't execute 'SELECT COLUMN_NAME, JSON_EXTRACT(HISTOGRAM, '$."number-of-buckets-specified"') FROM information_schema.COLUMN_STATISTICS WHERE SCHEMA_NAME = 'db_name' AND TABLE_NAME = 'ct_assetindexdata';': Unknown table 'column_statistics' in information_schema (1109)
mysqldump: [Warning] Using a password on the command line interface can be insecure.
mysqldump: Couldn't execute 'SELECT COLUMN_NAME, JSON_EXTRACT(HISTOGRAM, '$."number-of-buckets-specified"') FROM information_schema.COLUMN_STATISTICS WHERE SCHEMA_NAME = 'db_name' AND TABLE_NAME = 'ct_assets';': Unknown table 'column_statistics' in information_schema (1109)

Thanks!
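That warning is commonly reported when a MySQL 8.x mysqldump client talks to an older server that has no information_schema.COLUMN_STATISTICS table. If that's the cause here (an assumption), a workaround sketch is to disable the column-statistics query instead of skipping tables; the variable name mirroring .env.sh is also an assumption:

```shell
# --column-statistics=0 stops the MySQL 8.x client from querying
# information_schema.COLUMN_STATISTICS on older servers.
LOCAL_MYSQLDUMP_CMD="mysqldump --column-statistics=0"
```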
