Getting "Lost connection to MySQL server" when using mysqldump even with max_allowed_packet parameter

Try adding the --quick option to your mysqldump command; it works better with large tables. It streams the rows from the result set to the output rather than slurping the whole table into memory and then writing it out.

 mysqldump -uroot -h my.host -p'mypassword' --quick --max_allowed_packet=512M db_name table_name | \
 gzip  > dump_test.sql.gz

You can also try adding the --compress option to your mysqldump command. That makes it use the more network-friendly compressed connection protocol to your MySQL server. Notice that you still need the gzip pipe; MySQL's compressed protocol doesn't cause the dump to come out of mysqldump compressed.
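For example, combining both options (same placeholder host and credentials as above):

 mysqldump -uroot -h my.host -p'mypassword' --quick --compress \
 --max_allowed_packet=512M db_name table_name | \
 gzip > dump_test.sql.gz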

It's also possible the server is timing out its connection to the mysqldump client. You can try resetting the timeout durations. Connect to your server via some other means and issue these queries, then run your mysqldump job.

These set the timeouts to one calendar day (86,400 seconds).

    SET GLOBAL wait_timeout=86400;
    SET GLOBAL interactive_timeout=86400;
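Note that SET GLOBAL only affects sessions opened after it runs, which suits the fresh connection your next mysqldump job makes. You can confirm the new values took effect with:

    SHOW GLOBAL VARIABLES LIKE '%timeout';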

Finally, if your server is far away from your machine (behind routers and firewalls) something may be disrupting mysqldump's connection. Some inferior routers and firewalls have time limits on NAT (network address translation) sessions. They're supposed to keep those sessions alive while they are in use, but some don't. Or maybe you're hitting a time or size limit configured by your company for external connections.

Try logging into a machine closer to the server and running mysqldump there. Then use some other means (sftp?) to copy your .gz file to your own machine.
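Something along these lines, where nearby-host stands in for a hypothetical machine on the server's network:

 ssh user@nearby-host \
   "mysqldump -uroot -h my.host -p'mypassword' --quick db_name table_name | gzip > dump_test.sql.gz"
 sftp user@nearby-host:dump_test.sql.gz

Giving sftp a remote path like that retrieves the file into your current directory.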

Or, you may have to segment the dump of this table. You can do something like this (not debugged).

mysqldump -uroot -h my.host -p'mypassword' \
          db_name table_name --skip-create-options --skip-add-drop-table \
          --where="id>=0 AND id < 1000000" | \
          gzip ...

Then repeat that with these lines.

          --where="id>=1000000 AND id < 2000000" | \

          --where="id>=2000000 AND id < 3000000" | \
          ...

until you get all the rows. Pain in the neck, but it will work.
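If the id range is large, a short shell loop can generate those segments for you. This is an untested sketch in the same spirit; the chunk size, the maximum id, and the dump_part file names are illustrative:

 step=1000000
 max_id=10000000   # find the real value with: SELECT MAX(id) FROM table_name;
 for ((lo=0; lo<max_id; lo+=step)); do
     mysqldump -uroot -h my.host -p'mypassword' db_name table_name \
         --skip-create-options --skip-add-drop-table \
         --where="id>=$lo AND id<$((lo+step))" | \
         gzip > dump_part_$lo.sql.gz
 done

Each part then restores independently with something like zcat dump_part_0.sql.gz | mysql db_name.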