Ticket #7 (closed enhancement: fixed)
Large MySQL database support - Error 2013 - PATCH attached.
Reported by: rosskevin | Owned by: soergel
Our production database has outgrown the standard backup method. I started experimenting with parameters on the mysqldump command, but had no luck. I then went back to the command line to see whether I had any issues dumping the db with the standard parameters, and had no issues whatsoever.

I believe the crux of this issue is that mysqldump holds the MySQL connection open while attempting to pipe the dump through gzip, which in turn feeds jS3tream for the upload; the connection stays open for the entire upload, and essentially we received a "Lost Connection" (error 2013) from the database. While I could start tuning the db settings to get around this, our production values are good and in line with best practices.

I decided to fix the problem by streaming the dump to a temp file, then triggering jS3tream, followed by a cleanup of the file. The sendToS3 function has an optional third parameter that can generally be used for an interim temp file. I've tested this method, and it works well in our production system. I believe it is the correct way to solve the problem, and it handles larger databases with realistic timeouts, packet sizes, etc.
- Owner changed from dsoergel to soergel
- Status changed from new to assigned