Export BIG mysql table to JSON
I have a MySQL table with 2.8 million records that I want to convert to JSON. I wrote a script to do the conversion, but it stops with a memory warning.
Then I tried to create smaller files (file1 holds records 0 to 100,000, file2 holds 100,000 to 1,000,000, etc.) and combine them with the Windows copy command. It works, but each file is a JSON array (like [{...}]), so the merged file ends up as separate sections like [{}][{}], where what I want is one single array like [{...}].
Is there any better solution to do this?
I would suggest you change 'memory_limit' in your php.ini configuration. Also, if the export takes a long time, you can run it as a cron job (if possible).
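For example, the limit could be raised in php.ini like this (the value is an assumption; size it to your data set):

```ini
; php.ini -- raise the per-script memory ceiling for the export script
memory_limit = 1024M
```

The same setting can also be changed for a single script at runtime with `ini_set('memory_limit', '1024M');`.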
OR
you can decode ALL of your JSON files, merge them into a single array, then encode that array back to JSON and write it to one JSON file.
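That decode-and-merge idea can be sketched as follows (Python is used here only for illustration, since the asker's PHP script was not posted; in PHP the equivalents would be `json_decode`, `array_merge`, and `json_encode`, and the file names below are demo stand-ins). Note that this loads every part file into memory at once, so it can hit the same memory limit as the original script:

```python
import json

def merge_json_arrays(part_paths, out_path):
    """Decode each part file (each one a JSON array), concatenate the
    arrays, and re-encode the combined list into a single JSON file."""
    combined = []
    for path in part_paths:
        with open(path, encoding="utf-8") as f:
            combined.extend(json.load(f))  # each part must itself be a JSON array
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(combined, f)

# Demo with two tiny stand-in part files (real ones come from the MySQL export).
with open("file1.json", "w", encoding="utf-8") as f:
    json.dump([{"id": 1}, {"id": 2}], f)
with open("file2.json", "w", encoding="utf-8") as f:
    json.dump([{"id": 3}], f)

merge_json_arrays(["file1.json", "file2.json"], "combined.json")
with open("combined.json", encoding="utf-8") as f:
    merged = json.load(f)
print(merged)  # one flat array, not [{}][{}]
```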
Finally, I did this. Please see the steps (I am not sure this is the right way, but it works).
In total, I have 2.6 million records in my table. I created a script that selects MySQL rows, converts them to JSON, and writes them to a file.
Select records 0 to 1 million and create file1. Repeat for 1 to 2 million and 2 to 2.6 million to create file2 and file3.
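The per-range export loop can be sketched like this (a minimal sketch, not the asker's actual script: SQLite stands in for MySQL so the example is self-contained, and the table and column names are assumptions; with MySQL you would swap in a driver such as mysqli/PDO in PHP). Fetching in LIMIT/OFFSET batches keeps only one batch in memory at a time:

```python
import json
import sqlite3

BATCH = 2  # tiny for the demo; in practice something like 10_000

def export_range(conn, first, last, out_path):
    """Write rows with first <= id < last to out_path as one JSON array,
    fetching in small batches so a whole range is never held in memory."""
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("[")
        written = 0
        offset = 0
        while True:
            rows = conn.execute(
                "SELECT id, name FROM records WHERE id >= ? AND id < ? "
                "LIMIT ? OFFSET ?",
                (first, last, BATCH, offset),
            ).fetchall()
            if not rows:
                break
            for rid, name in rows:
                if written:
                    f.write(",")
                f.write(json.dumps({"id": rid, "name": name}))
                written += 1
            offset += BATCH
        f.write("]")

# Demo: a stand-in table with 5 rows, exported as two range files.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(5)])
export_range(conn, 0, 3, "part1.json")  # analogous to "records 0 to 1 million"
export_range(conn, 3, 5, "part2.json")  # analogous to the next range
```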
Combine these files using jq ( http://stedolan.github.io/jq/ ) to create a single JSON file.
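The jq step can look like this (file names are assumptions; `-s` slurps all inputs into an array of arrays, and `add` concatenates them into one flat array, which is exactly what the Windows copy command could not do):

```shell
# Stand-in part files; the real ones come from the export script.
echo '[{"id":1},{"id":2}]' > file1.json
echo '[{"id":3}]' > file2.json

# Slurp (-s) the part arrays and concatenate them with `add`,
# turning back-to-back [{...}][{...}] parts into one flat [{...}] array.
jq -s 'add' file1.json file2.json > all.json
```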