
PhpMyAdmin data import performance issues

Originally, my question was about PhpMyAdmin's SQL section not working properly. As suggested in the comments, I realized that the amount of input was simply too large for it to handle. However, that didn't give me a workable way to deal with files that have (in my case) 35 thousand record lines in CSV format:

...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...

The Import option in PhpMyAdmin struggles just as the basic copy-paste input in the SQL section does. This time, same as before, it runs for 5 minutes until the max execution time is hit and then it stops. What is interesting, though, is that it adds about 6-7 thousand records into the table before failing. So the input actually goes through and is almost successful. I also tried halving the amount of data in the file, but nothing changed.
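Since the import dies when the max execution time is hit but several thousand rows do get inserted first, one workaround (a sketch, not something from the original answers) is to split the CSV into smaller pieces and import them one at a time. On a Unix-like system the `split` utility can do this; the file names below are placeholders:

```shell
# Create a small sample CSV standing in for the real 35k-line file
seq 1 12 > data.csv

# Split into 5-line pieces with numeric suffixes: part_00, part_01, part_02
split -l 5 -d data.csv part_

# Show the line count of each piece; import each part_* file separately
wc -l part_*
```

With a real export you would pick a chunk size (say 5000 lines) small enough that each piece imports well within the execution time limit.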

Something is clearly wrong here. It is pretty annoying to have to massage the data in a PHP script when a simple data import doesn't work.

Change your PHP upload max size.

Do you know where your php.ini file is?

First of all, try putting this file into your web root:

phpinfo.php

(see http://php.net/manual/en/function.phpinfo.php)

containing:

<?php

phpinfo();

?>

Then navigate to http://www.yoursite.com/phpinfo.php

Look for "php.ini".

To upload large files you need to raise max_execution_time, post_max_size, and upload_max_filesize.
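The right values depend on your setup, but a php.ini fragment raising those limits might look like this (the numbers are illustrative only; restart your web server or PHP-FPM after editing):

```ini
; Illustrative values -- tune to your actual file sizes
max_execution_time = 300     ; seconds a script may run
post_max_size = 64M          ; max size of the whole POST body
upload_max_filesize = 64M    ; max size of a single uploaded file
```

Note that post_max_size must be at least as large as upload_max_filesize, since the uploaded file travels inside the POST body.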

Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.

EDIT:

Here is the query I use for the file import:

$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` FIELDS TERMINATED BY ',' OPTIONALLY
    ENCLOSED BY '\"' LINES TERMINATED BY '$nl'";

Where $file_name is the temporary filename from the PHP global variable $_FILES, $table_name is a table already prepared for the import, and $nl is a variable for the CSV line endings (it defaults to Windows line endings, but I have an option to select Linux line endings).

The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the CSV to determine column types. Once it has determined appropriate column types, it creates the MySQL table to receive the data.

I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc.). Then try the above query and see how fast it runs. I don't know how much of a factor the MySQL table definition is in the speed.

Also, I don't define any indexes on the table until AFTER the data is loaded. Indexes slow down data loading.
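Putting this advice together, a SQL sketch could look like the following. The table and column names are made up here to match the sample CSV shown above (a YYYYMMDD date, an integer id, a decimal value, and a small count), and it assumes local_infile is enabled on both client and server:

```sql
-- Table definition matching the CSV columns, with NO indexes yet
CREATE TABLE readings (
  reading_date DATE NOT NULL,       -- MySQL accepts 'YYYYMMDD' strings like 20120509
  sensor_id    INT NOT NULL,
  value        DECIMAL(10,2) NOT NULL,
  flag         TINYINT NOT NULL
);

-- Bulk-load the file (path is a placeholder)
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE readings
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';

-- Add indexes only AFTER the data is in, so the load itself stays fast
ALTER TABLE readings ADD INDEX idx_sensor (sensor_id);
```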
