Matthew Musgrove
```
CPAN.pm: Building C/CO/COSIMO/Net-Statsd-Server-0.10.tar.gz
Checking if your kit is complete...
Looks good
Writing Makefile for Net::Statsd::Server
Writing MYMETA.yml and MYMETA.json
(C:\strawberry\perl\bin\perl.exe Makefile.PL exited with 0)
CPAN::Reporter: Makefile.PL result is 'pass',...
```
If you want to use a pre-commit hook without Docker, just use this:
```
repos:
  - repo: local
    hooks:
      - id: shellcheck
        name: shellcheck
        description: Test shell scripts with shellcheck
        entry:...
```
Sorry to hijack the thread, but I'm also seeing the "gzip: stdout: Broken pipe" errors. I'm running mydumper/myloader version v0.16.3-3 on three systems. The first is where I take the mydumper...
```
[mmusgrove@dbase05-west ~]$ mydumper --version
mydumper v0.16.3-3, built against MySQL 5.7.44-48 with SSL support
[mmusgrove@staging02 ~]$ myloader --version
myloader v0.16.3-3, built against MySQL 5.7.44-48 with SSL support
[mmusgrove@qa01 ~]$ myloader --version...
```
```
myloader --host=localhost --database=matthew --enable-binlog --directory=/backup/dump --queries-per-transaction=250000 --threads=12 --compress-protocol --verbose=3 --fifodir=/tmp/bkp --serialized-table-creation
** Message: 14:29:49.690: Using 12 loader threads
** Message: 14:29:49.690: Config file loaded
** Message: 14:29:49.692: Connection via...
```
No, that is the only mention of that table in the output.
I suppose that's possible. I ran it again after clearing out /tmp/bkp and it core dumped. By the way, I changed how I was replacing table names to prevent errors and so...
That shouldn't be possible given that I'm using this to take the backups:
```
d=$(date +%Y-%m-%d)
/usr/bin/mydumper --no-data --compress -B dbname -o /ERAID10/backup/dump/$d
rm -rf /ERAID10/backup/dump/$d/dbname-schema-triggers.sql.gz
/usr/bin/mydumper --dirty --rows 250000...
```
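Since the script above is cut off, here is a minimal sketch of the two-pass layout it describes: a schema-only pass, removal of the triggers file, then a data-only pass into the same date-stamped directory. The `--no-schemas` flag and the exact options on the second pass are assumptions, not the original script; the mydumper invocations are left commented so the sketch stands on its own.

```shell
#!/bin/sh
# Sketch only: the real script is truncated above. Flags on the data-only
# pass (e.g. --no-schemas) are assumptions.
d=$(date +%Y-%m-%d)                   # date-stamped subdirectory name
dest="/ERAID10/backup/dump/$d"

# Pass 1: schema only (as shown in the post), then drop the triggers dump.
# /usr/bin/mydumper --no-data --compress -B dbname -o "$dest"
# rm -f "$dest/dbname-schema-triggers.sql.gz"

# Pass 2: data only into the same directory (assumed flags).
# /usr/bin/mydumper --dirty --rows 250000 --no-schemas --compress -B dbname -o "$dest"

echo "$dest"
```

Note that both passes write into one directory, which is what lets myloader consume the schema and data files together later.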
Thanks @davidducos. I inherited it. I'll make some time to test your suggestion.
> Hi @mrmuskrat, Well... you are mixing 2 backups, that is not a good idea. I don't know why you are splitting between data and schema as that is something...