
How to export huge MyISAM table to InnoDB in chunks?

I am using MySQL Workbench to alter some huge (millions of rows) MyISAM tables to InnoDB. I tried it with ALTER TABLE ... and with INSERT INTO othertable SELECT * FROM mytable after cloning the table, but it seems that the tables are too big. The Workbench quits with an error about a lost connection to the server.

So I thought I could write an SQL script that loops over every 10000 entries or so and copies the content of the MyISAM table to the InnoDB table in chunks.

Since there seem to be no for loops in SQL, I wonder if someone has a suggestion how to realize this?

INSERT INTO `latest`.`AUTHORS` (author_id, author_name)
SELECT
`dblp_author_names`.`author_name_id`,
`dblp_author_names`.`name`
FROM `pubmap_latest`.`dblp_author_names`;

This is the query that I want to run in chunks. A COUNT(*) query returns 985330 rows.

I would use mysqldump (command line tool) to export the data from the old table and then import it into the new table. Something like:

mysqldump database table > file.sql

Then edit the table name and the engine type in the file, and:

mysql database < file.sql
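The dump file is plain SQL, so the part to edit is the CREATE TABLE statement it contains. Roughly (the column definitions here are only illustrative, the real dump will differ), the change would be:

-- As written by mysqldump (old table, MyISAM):
CREATE TABLE `dblp_author_names` (
  `author_name_id` int(11) NOT NULL,
  `name` varchar(255) NOT NULL,
  PRIMARY KEY (`author_name_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

-- After editing (new table name, ENGINE switched to InnoDB):
CREATE TABLE `AUTHORS` (
  `author_id` int(11) NOT NULL,
  `author_name` varchar(255) NOT NULL,
  PRIMARY KEY (`author_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

By default mysqldump writes the INSERT statements without column lists, so renaming the columns in the CREATE TABLE is enough.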

In MySQL you can use the LIMIT keyword in SELECT statements to define how many result rows you want returned. Combined with an offset, this lets you split your table up into chunks. Of course, you should use an ORDER BY clause so that the chunks are taken from a consistent ordering of the result set.

SELECT `dblp_author_names`.`author_name_id`, `dblp_author_names`.`name`
FROM `pubmap_latest`.`dblp_author_names`
ORDER BY `dblp_author_names`.`author_name_id`
LIMIT 10000 OFFSET 0
INTO OUTFILE 'dblp_author_names_0.sql';

For the offset you then have to use 10000, 20000, 30000, and so on for the following chunks.

Load your results with

LOAD DATA INFILE ...
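A matching import for the first chunk could look like this (a sketch, assuming the chunk file was written with the default tab-delimited OUTFILE format and that the InnoDB table `latest`.`AUTHORS` already exists):

LOAD DATA INFILE 'dblp_author_names_0.sql'
INTO TABLE `latest`.`AUTHORS`
(`author_id`, `author_name`);

-- repeat for dblp_author_names_1.sql, dblp_author_names_2.sql, ...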

Refer to the MySQL 5.1 reference manual.
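If you would rather avoid the intermediate files, the same chunking idea can also run entirely inside the server: MySQL stored procedures do have WHILE loops, so the INSERT ... SELECT can be repeated over id ranges. This is only a sketch, assuming `author_name_id` is a numeric column (gaps in the id sequence just make some chunks smaller); the procedure name is made up:

DELIMITER //
CREATE PROCEDURE copy_authors_in_chunks()
BEGIN
  DECLARE v_from INT DEFAULT 0;
  DECLARE v_max  INT DEFAULT 0;
  DECLARE v_step INT DEFAULT 10000;

  -- id range that has to be covered
  SELECT MIN(`author_name_id`) - 1, MAX(`author_name_id`)
    INTO v_from, v_max
  FROM `pubmap_latest`.`dblp_author_names`;

  -- copy at most v_step ids per iteration
  WHILE v_from < v_max DO
    INSERT INTO `latest`.`AUTHORS` (author_id, author_name)
      SELECT `author_name_id`, `name`
      FROM `pubmap_latest`.`dblp_author_names`
      WHERE `author_name_id` > v_from
        AND `author_name_id` <= v_from + v_step;
    SET v_from = v_from + v_step;
  END WHILE;
END //
DELIMITER ;

CALL copy_authors_in_chunks();
DROP PROCEDURE copy_authors_in_chunks;

Each iteration is its own INSERT ... SELECT, so with the default autocommit setting every chunk is committed separately and no single statement has to move all 985330 rows at once.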
