MySQL Is Not Inserting All Successful Insert Queries…Why?
Before I go on, this is purely a question of intuition. That is, I'm not seeking answers to work out specific bugs in my PHP/MySQL code. Rather, I want to understand the range of possible issues I need to consider in resolving my problem. To that end, I will not post code or attach scripts; I will simply explain what I did and what is happening.
I have written a PHP script that reads each CSV file and inserts its records first into a root table and then into a master table.
There are several CSV files that I am processing via separate scheduled cron tasks every 30 minutes. All told, across the various sources, the scheduled tasks perform an estimated 420,000 insert transactions from file to root table, and another 420,000 insert transactions from root table to master table.
One of the tasks involves a CSV file of about 400,000 records by itself. The processing reports no errors, but here's the problem: of the 400,000 records that MySQL indicates have been successfully inserted into the root table, only about 92,000 actually end up stored there; I'm losing about 308,000 records from that scheduled task.
The other scheduled tasks process about 16,000 and 1,000 transactions respectively, and those transactions process perfectly. In fact, if I reduce the number of transactions from 400,000 to, say, 10,000, those process just fine as well. Clearly, that's not the goal here.
To address this issue, I have tried several remedies...
...and none of these remedies has worked as desired.
Given the lack of success in the actions taken so far, what range of remedial actions should be considered at this point? Thanks...
The source data in the CSV may contain duplicate records. Even though there are 400,000 records in the CSV, your 'insert or update' logic collapses them into a reduced set. Running low on memory would tend to produce exceptions and visible errors, not this kind of silent data loss.
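A quick way to test this theory is to compare the total row count with the number of distinct key values in the file before blaming MySQL. A minimal sketch, assuming the unique key sits in the first column of sample.csv (both the filename and the column index are placeholders for whatever your data actually uses):

```php
<?php
// Count how many distinct values of the assumed key column the rows contain.
function count_distinct_keys(array $rows, $keyIndex = 0) {
    $seen = array();
    foreach ($rows as $row) {
        $seen[$row[$keyIndex]] = true; // later duplicates overwrite, so each key counts once
    }
    return count($seen);
}

// Run the check against the CSV (filename is a placeholder).
if (($csv = @fopen('sample.csv', 'r')) !== false) {
    $rows = array();
    while (($item = fgetcsv($csv)) !== false) {
        $rows[] = $item;
    }
    fclose($csv);
    echo 'Total rows: ' . count($rows)
        . ', distinct keys: ' . count_distinct_keys($rows) . PHP_EOL;
}
```

If the distinct-key count comes out near 92,000, the 'insert or update' logic fully explains the missing rows.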
I suspect there are problems in the CSV file.
My suggestion: check MySQL's error state after every insert so you can see exactly which lines fail. It's something like this:
<?php
// Connect first (credentials are placeholders). The original mysql_* API
// was removed in PHP 7, so the mysqli_* equivalents are used here.
$link = mysqli_connect('localhost', 'user', 'password', 'database');

$csv = fopen('sample.csv', 'r');
$line = 1;
while (($item = fgetcsv($csv)) !== false) {
    echo 'Line ' . $line++ . '... ';
    $sql = ''; // build your INSERT query from $item here
    mysqli_query($link, $sql);
    $error = mysqli_error($link);
    if ($error == '') {
        echo 'OK' . PHP_EOL;
    } else {
        echo 'FAILED' . PHP_EOL . $error . PHP_EOL;
    }
}
fclose($csv);
So, if there are any errors, you can see them and find out which lines of the CSV are causing the problem.