MySQL Workbench table data export extremely slow

I just downloaded the newest version of MySQL Workbench (6.3.6) and attempted to export a remote table (on Google Cloud SQL) to CSV using the new "table data export" wizard. The table had about 600,000 rows and the final downloaded size was about 75 MB. It took 7.5 hours.

I realize I can use Google Developer Console to perform this export (which I did, and it took about 15 seconds), but it seems that something is wrong with MySQL Workbench. Could there be a configuration issue which is causing this to go so slowly?

I know this question is quite old, but I'm answering because I recently had this issue. I was trying to export 2 million+ rows and it had taken 2 days to complete only half, after trying several different ways of exporting. Then I found this:

SELECT * 
FROM my_table
INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my file.csv' 
FIELDS ENCLOSED BY '"' 
TERMINATED BY ';' 
ESCAPED BY '"' 
LINES TERMINATED BY '\r\n';

And it completed in 80 seconds!

Please note: if you hit the secure_file_priv restriction, set the file path to the directory returned by:

SHOW VARIABLES LIKE "secure_file_priv"
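A small sketch of how the pieces above fit together: building the INTO OUTFILE statement with a target inside the secure_file_priv directory. The helper name `outfile_query` and the Windows directory value are my own illustration, not anything Workbench or MySQL provides; MySQL accepts forward slashes in paths even on Windows.

```python
# Sketch: assemble a SELECT ... INTO OUTFILE statement whose target
# lives inside the secure_file_priv directory (MySQL rejects targets
# outside it). The directory below is just an example value that
# SHOW VARIABLES might return on Windows.
import posixpath

def outfile_query(table: str, secure_dir: str, filename: str) -> str:
    # Normalize backslashes; MySQL is happy with forward slashes.
    target = posixpath.join(secure_dir.replace("\\", "/"), filename)
    return (
        f"SELECT * FROM {table} "
        f"INTO OUTFILE '{target}' "
        "FIELDS ENCLOSED BY '\"' TERMINATED BY ';' ESCAPED BY '\"' "
        "LINES TERMINATED BY '\\r\\n';"
    )

print(outfile_query("my_table",
                    "C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\",
                    "my file.csv"))
```

Note that INTO OUTFILE writes on the *server's* filesystem, which is why it sidesteps the wizard's row-by-row transfer entirely.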

Description: Workbench is very slow exporting large datasets through the CSV export wizard, and disproportionately slow compared to a smaller set. However, this is something I've come across before with .NET.

How to repeat: Get a table with around 15k records or more and export it through the wizard. Note how long it takes, then export a subset of that data and see how the time taken does not correlate linearly with the number of rows.

Suggested fix: Something I've noticed when building CSV export applications is that the MS .NET framework can't deal with huge strings very well, and tends to perform badly as a result.

I found a solution though. Instead of building one huge string and writing it to the file all at once when the export is done, I get much better performance by generating only a few hundred rows of CSV at a time, writing them to the file, and flushing the buffer before generating the next batch.

I'd also recommend writing to a temp file, then renaming/moving it to the user-specified path when done. Write-to-temp-then-rename is how Photoshop and some other applications save their data. In my own development, writing a few rows and flushing has proven much faster than trying to get .NET to manage a 20 MB string.
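The chunk-write-and-flush plus temp-file-rename approach described above can be sketched as follows. This is my own illustration in Python rather than .NET (the suggestion was aimed at Workbench's internals); `export_rows` and the chunk size of 500 are assumptions, not anything from the bug report.

```python
# Sketch: stream rows to CSV a few hundred at a time, flushing after each
# chunk instead of accumulating one huge string, then atomically move the
# finished temp file to the destination.
import csv
import os
import tempfile

def export_rows(rows, dest_path, chunk_size=500):
    # Create the temp file in the destination directory so the final
    # rename stays on the same filesystem (and is therefore atomic).
    fd, tmp_path = tempfile.mkstemp(suffix=".csv",
                                    dir=os.path.dirname(dest_path) or ".")
    try:
        with os.fdopen(fd, "w", newline="") as f:
            writer = csv.writer(f, delimiter=";", quoting=csv.QUOTE_ALL)
            buffer = []
            for row in rows:
                buffer.append(row)
                if len(buffer) >= chunk_size:
                    writer.writerows(buffer)  # write one chunk...
                    f.flush()                 # ...and flush it to disk
                    buffer.clear()
            writer.writerows(buffer)          # remaining partial chunk
        os.replace(tmp_path, dest_path)       # rename into place when done
    except BaseException:
        os.remove(tmp_path)                   # don't leave a half-written file
        raise
```

Because the destination file only ever appears via the final rename, a crash mid-export can never leave a truncated CSV at the user's chosen path.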

Try using an ETL tool such as Pentaho ETL

or

https://www.mycli.net/
