
How to encode a large amount of data as JSON

I need to pass JSON from PHP to the client so JavaScript can parse it. Currently, the way I have always done this is:

<?php
    $mysql_query = $mysqli->query(QUERY GOES HERE);
    $array = array();
    while ($row = $mysql_query->fetch_assoc()) {
        array_push($array, $row);
    }
    $json = json_encode($array);
?>
<!-- javascript -->
<script>
    var json = <?php echo $json;?>;
    //...
</script>
<!-- rest of html -->

This usually works. However, the query returns more than 100,000 rows, and PHP currently runs out of memory while building the entire array. I have seen some people suggest AJAX. Is that the only way? If so, how exactly would I go about implementing it? Or is there a more efficient way to encode the MySQL data as JSON without AJAX?
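For the AJAX route, here is a hedged sketch of the client side: request the rows page by page instead of embedding them all in the HTML at once. The endpoint name `data.php` and its `{rows, done}` response shape are assumptions, not anything from the original post, and the request function is passed in so the loop itself stays testable:

```javascript
// Sketch: pull rows from the server in pages instead of one giant payload.
// Assumes a hypothetical endpoint data.php?page=N&size=M that returns
// JSON shaped like { rows: [...], done: true|false }.
async function fetchAllRows(requestPage, pageSize) {
  const rows = [];
  let page = 1;
  for (;;) {
    const res = await requestPage(`data.php?page=${page}&size=${pageSize}`);
    rows.push(...res.rows);  // accumulate this page's rows
    if (res.done) break;     // server says there is nothing left
    page += 1;
  }
  return rows;
}
```

In the browser, `requestPage` would wrap `fetch(url).then(r => r.json())`; each page can also be processed and discarded immediately instead of accumulated, which keeps client memory flat.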

Thank you

Since you said you can send your array in chunks, meaning you can process your data piece by piece, I would consider doing this:

  • Fetch the database data in small chunks (memory-wise) into ordinary PHP arrays, unsetting each previously allocated chunk to free memory.
  • JSON-encode each partial chunk into an HTML element.

Like this:

<input id="json-1" type="hidden" data-json='<?php echo json_encode($chunk[0]); ?>' />
<input id="json-2" type="hidden" data-json='<?php echo json_encode($chunk[1]); ?>' />
<input id="json-3" type="hidden" data-json='<?php echo json_encode($chunk[2]); ?>' />

Yes, it looks ugly, but consider some pros:

  • There's no giant JavaScript JSON object hogging the client's memory
  • You can parse the data progressively, consuming only a portion of memory at a time

JavaScript processing:

$(document).ready(function() {
    $("input[id^='json-']").each(function() {
        var json = $(this).data("json");

        // process the chunked 'json'

        // free the allocated memory (you probably don't need the huge data anymore)
        $(this).removeData("json");
    });
});

If you want to pass something from PHP to JavaScript, you will have to use AJAX, since PHP is server side and JavaScript is client side.

An array of 100,000 rows is quite large and can cause problems. Consider splitting the array into smaller chunks or limiting the query.
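A minimal sketch of the splitting idea on the client side (the helper name `chunkArray` is mine, not from the original post):

```javascript
// Split one large array into fixed-size chunks so each piece can be
// processed and released independently.
function chunkArray(arr, size) {
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}
```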

You can increase PHP's memory limit, but it would be better to optimize the array first.
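For reference, the limit lives in php.ini (the 512M value below is only an example, not a recommendation; pick what the server can afford):

```ini
; php.ini — per-script memory cap (example value)
memory_limit = 512M
```

The same setting can be changed at runtime with `ini_set('memory_limit', '512M')`, but raising it only postpones the problem for a growing dataset.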

You don't need to handle such a large amount of data at once. Separate it into pages.
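The paging suggestion boils down to computing an offset and limit per page (the helper name is hypothetical); the same numbers would feed a SQL `LIMIT ... OFFSET ...` clause on the PHP side:

```javascript
// Translate a 1-based page number into the LIMIT/OFFSET pair a paged
// SQL query would use, e.g. "... LIMIT 1000 OFFSET 2000" for page 3.
function pageBounds(page, pageSize) {
  return { offset: (page - 1) * pageSize, limit: pageSize };
}
```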
