
PHP/PDO - Output large Tables as JSON in a high-performance way

I have some queries that will output large tables. Reduced to the essentials, the code looks like this:

$pdo = new PDO(...);
$statement = $pdo->prepare($query);
$statement->execute($data);

$largeArray = $statement->fetchAll(PDO::FETCH_ASSOC);
$largeString = json_encode($largeArray);
echo $largeString;

This works fine. But what if I want to produce some really large output? The code above has to hold the full output in memory.

One alternative would be:

$pdo = new PDO(...);
$statement = $pdo->prepare($query);
$statement->execute($data);

echo '[';
if($line = $statement->fetch(PDO::FETCH_ASSOC)) {
  echo json_encode($line);
  while($line = $statement->fetch(PDO::FETCH_ASSOC)) {
    echo ',';
    echo json_encode($line); 
  }
}
echo ']';
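One caveat with the streaming approach: with the MySQL driver, PDO buffers the entire result set client-side by default, so fetching row by row alone does not cap memory. A minimal, self-contained sketch (the helper name `streamJson` and the SQLite demo data are assumptions made for illustration; for MySQL you would additionally disable buffered queries):

```php
<?php
// Sketch: stream query results as one JSON array without materializing
// the whole payload. streamJson() is a hypothetical helper. With MySQL,
// also pass [PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false] to the PDO
// constructor, since that driver otherwise buffers the full result set
// in PHP memory on execute().
function streamJson(PDOStatement $statement): void
{
    echo '[';
    $first = true;
    while ($line = $statement->fetch(PDO::FETCH_ASSOC)) {
        if (!$first) {
            echo ',';
        }
        echo json_encode($line);
        $first = false;
        flush(); // push each chunk to the client right away
    }
    echo ']';
}

// Self-contained demo against an in-memory SQLite database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO t (name) VALUES ('a'), ('b'), ('c')");

streamJson($pdo->query('SELECT * FROM t'));
```

This keeps only one row in PHP memory at a time; the trade-off is that you cannot set a correct `Content-Length` header and the connection stays open for the duration of the query.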

But does this perform well? What would be more performant solutions?

What I would do is fetch in batches of, let's say, 200 rows.

Why: every time you load a batch, you are loading it into the machine's memory, which is of course quite limited.
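A sketch of the batched fetch (the table, column names, and batch size of 200 are assumptions; keyset pagination with `WHERE id > :last` is used instead of `LIMIT/OFFSET` so later batches don't get progressively slower):

```php
<?php
// Sketch: fetch a large table in batches of 200 rows. Uses an
// in-memory SQLite table as stand-in data; the schema is hypothetical.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE big (id INTEGER PRIMARY KEY, payload TEXT)');
$insert = $pdo->prepare('INSERT INTO big (payload) VALUES (?)');
for ($i = 0; $i < 1000; $i++) {
    $insert->execute(["row $i"]);
}

$batchSize = 200;
$lastId = 0;   // keyset cursor: where the previous batch ended
$total = 0;
do {
    $stmt = $pdo->prepare(
        'SELECT id, payload FROM big WHERE id > :last ORDER BY id LIMIT :limit'
    );
    $stmt->bindValue(':last', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
    $stmt->execute();
    $batch = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($batch as $row) {
        $total++;              // process the row here
        $lastId = $row['id'];  // advance the cursor
    }
} while (count($batch) === $batchSize); // a short batch means we're done
```

Only one batch of 200 rows is ever held in memory at once, and the cursor column should be indexed (a primary key already is).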

I suggest saving the results into files of 2000 rows each.

(The batch or file sizes may vary, and you should probably find the sweet spot for your system, but keep them small.)

Why: even though this is not a must, it helps, especially if you are going to load the data in batches the next time you need it. And if you plan on opening the dump in an editor, you will surely have to wait a while, or may even crash the app.
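Splitting the dump into files could be sketched like this (the 2000-row chunk size, the `dump_N.json` naming scheme, and the use of the system temp directory are all assumptions):

```php
<?php
// Sketch: write a result set into numbered files of at most 2000 rows
// each, one JSON array per file. Demo data is an in-memory SQLite table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)');
$ins = $pdo->prepare('INSERT INTO t (v) VALUES (?)');
for ($i = 0; $i < 5000; $i++) {
    $ins->execute(["v$i"]);
}

$rowsPerFile = 2000;
$dir = sys_get_temp_dir();          // hypothetical target directory
$stmt = $pdo->query('SELECT * FROM t');

$fileNo = 0;
$buffer = [];
$files = [];
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $buffer[] = $row;
    if (count($buffer) === $rowsPerFile) {
        $path = "$dir/dump_$fileNo.json";
        file_put_contents($path, json_encode($buffer));
        $files[] = $path;
        $buffer = [];
        $fileNo++;
    }
}
if ($buffer) { // flush the last, partially filled file
    $path = "$dir/dump_$fileNo.json";
    file_put_contents($path, json_encode($buffer));
    $files[] = $path;
}

echo count($files); // 5000 rows / 2000 per file -> 3 files
```

Each file is then small enough to decode independently later, so a consumer never has to parse the whole dump at once.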

Also, might I suggest saving the data as CSV instead of JSON if possible; CSV has a much smaller footprint for huge dumps. And when loading from a JSON file, you cannot load the data in batches without some hacking around, whereas with CSV you can simply go line by line.
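The line-by-line property is what PHP's built-in `fputcsv()`/`fgetcsv()` give you directly. A small self-contained sketch (the file path and the demo rows are assumptions):

```php
<?php
// Sketch: write rows to CSV, then read them back one line at a time.
// Unlike one big JSON document, a CSV file can be consumed row by row
// with constant memory.
$path = sys_get_temp_dir() . '/dump.csv'; // hypothetical file path

$rows = [
    ['id' => 1, 'name' => 'alice'],
    ['id' => 2, 'name' => 'bob'],
];

// Writing: a header line first, then one fputcsv() call per row.
$out = fopen($path, 'w');
fputcsv($out, array_keys($rows[0]));
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);

// Reading back: fgetcsv() returns one parsed line per call, so only a
// single row is ever held in memory, no matter how large the file is.
$in = fopen($path, 'r');
$header = fgetcsv($in);
$loaded = [];
while (($fields = fgetcsv($in)) !== false) {
    $loaded[] = array_combine($header, $fields);
}
fclose($in);

echo $loaded[1]['name']; // bob
```

The trade-off is that CSV is flat: nested structures would need to be encoded into a column (e.g. as a JSON string), and all values come back as strings.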
