Viewing 6 posts - 1 through 6 (of 6 total)
  • IT Help Needed – MySQL large table transfer
  • sl2000
    Full Member

    I need to transfer a large (12GB in 35,000 rows) MySQL table from one server to another.

    The old source server is running an old Solaris; the new target is Windows Server 2008 with MySQL Community 5.5.

    I’ve no spare disk space on the old server; and it’s live so I can’t make any changes to the MySQL config. I’ve got SSH and 3306 access to it from the new server, and a reasonable network connection (so the transfer took 2 hours using my attempt below). The new server is on my subnet so I’ve got full access.

    I’ve tried using mysqldump (from the new server) and then simply loading the dump file…

    mysqldump.exe --compress=TRUE --default-character-set=utf8 --max_allowed_packet=1G --host=essupportinternal --user=xxx --password=xxx --lock-tables=FALSE --add-locks=FALSE --port=3306 "support" "job_files" > dumpjobfiles.sql

    mysql --user=xxx --password=xxx support < dumpjobfiles.sql

    …but mysql crashes out in the load with an error “ERROR 2006 (HY000) at line 330: MySQL server has gone away”.

    It’s possible that the size of the single INSERT statement is the issue – but I can’t see a way to persuade mysqldump to split the output into multiple INSERT statements.

    My next plan will be to write a program to read the .sql file and split it, but I assume other people have made similar transfers so there has to be a better way.
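    For the record, mysqldump can be persuaded to split the output itself, so a custom splitter shouldn’t be needed. A sketch reusing the connection details above (untested against this setup): --skip-extended-insert writes one INSERT statement per row instead of one giant multi-row INSERT.

    ```shell
    # Same dump as before, but with --skip-extended-insert so each row gets
    # its own INSERT statement - no single statement can then exceed the
    # target server's max_allowed_packet. Slower to reload, but robust.
    mysqldump.exe --compress=TRUE --default-character-set=utf8 --max_allowed_packet=1G --skip-extended-insert --host=essupportinternal --user=xxx --password=xxx --lock-tables=FALSE --add-locks=FALSE --port=3306 "support" "job_files" > dumpjobfiles.sql
    ```

    (There is also a --net_buffer_length option that caps the length of each multi-row INSERT, as a middle ground between one huge statement and one per row.)
    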

    We have no-one in the company (including me) with any significant MySQL knowledge unfortunately.

    STW – I need your help please.

    retro83
    Free Member

    Check Frank Eriksson’s post here (then restart mysql):

    http://bugs.mysql.com/bug.php?id=14236

    Oh and if you’ve got any SQL functions, you’ll probably need to add --routines to the dump command.

    mysqldump is such a barsteward 😡

    I’d guess it’s because you’re dumping with a max packet size of 1GB, but max_allowed_packet on the new server is set to something less than that.

    geoffj
    Full Member

    Edit – I wasn’t helping.

    mst
    Free Member

    Hi,
    Instead of sending it out to a file, why not just pipe it into mysql ?

    Something like this ?

    mysqldump.exe --compress=TRUE --default-character-set=utf8 --max_allowed_packet=1G --host=essupportinternal --user=xxx --password=xxx --lock-tables=FALSE --add-locks=FALSE --port=3306 "support" "job_files" | mysql --user=xxx --password=xxx --host=essupportinternal --port=3306

    allthegear
    Free Member

    I don’t suppose by any chance you have phpmyadmin on there, do you? There’s an option in the export page there to break large inserts into smaller ones…

    Rachel

    sl2000
    Full Member

    @retro83 – You’re a star! Adding max_allowed_packet=1G to my.ini solved it.
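    For anyone hitting the same error later, the change was this fragment in my.ini on the new server, followed by a restart of the MySQL service:

    ```ini
    [mysqld]
    # Accept client packets (and therefore single INSERT statements) up to 1 GB,
    # matching the --max_allowed_packet used on the dump side.
    max_allowed_packet=1G
    ```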

    @mst – It’s the network transfer taking most of the time, so it’s good to have the data on the server in case something goes wrong (as it has here).

    @allthegear – Yep, have phpmyadmin, but don’t fancy my chances of persuading php to play well with a 12GB table.

    Many thanks all!
    Simon


The topic ‘IT Help Needed – MySQL large table transfer’ is closed to new replies.