Currently, data is fetched as an array from a PHP script. I find that for 40 rows it takes about 20 seconds to load; I guess that's because the AJAX call has to wait until all the results are gathered. If the data set grows to hundreds of rows, I think it's going to slow down a lot.
So I did some research on how to decrease the loading time and came across parallel AJAX requests. I think that would help for multiple requests, but for a single request that fetches hundreds or thousands of rows, is there a way to increase the speed?
JS :
$(function() {
    $.ajax({
        dataType: "json",
        url: 'showAllTutor.php',
        success: function(data) {
            console.log(data.length);
            // declare i with var to avoid leaking a global
            for (var i = 0; i < data.length; i++) {
                console.log(data[i].name);
            }
        }
    });
});
PHP :
// bind the value instead of interpolating it into the SQL string, to avoid SQL injection
$sql = "SELECT * FROM userinfo, posts
        WHERE userinfo.UUID = posts.UUID AND posts.p_id > :last_msg_id";
$stmt = connection::$pdo->prepare($sql);
$stmt->execute(array(':last_msg_id' => $last_msg_id));
$json = array();
while ($row = $stmt->fetch()) {
    $total = $row['reviewPlus'] + $row['reviewNeg'];
    array_push($json, array(
        "name"       => $row['name'],
        "subject"    => $row['subname'],
        "subid"      => $row['subID'],
        "rate"       => $row['pricing'],
        "dateposted" => $row['datePosted'],
        "location"   => $row['location'],
        "contact"    => $row['phone'],
        "morning"    => $row['morning'],
        "afternoon"  => $row['afternoon'],
        "evening"    => $row['evening'],
        "postId"     => $row['p_id'],
        "total"      => $total,
        "plus"       => $row['reviewPlus'],
        "user"       => $row['UUID']
    ));
}
echo json_encode($json);
The loading time might be due to the multiple loops in your code: the while loop in PHP and the for loop in jQuery both add to the execution time.
I suggest you avoid the while loop in your PHP code and collect all rows at once with fetchAll:
$data = $stmt->fetchAll(PDO::FETCH_ASSOC);
echo json_encode($data);
and do the $total calculation inside your jQuery for loop.
This might help. Happy coding.
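A minimal sketch of that client-side calculation, assuming fetchAll(PDO::FETCH_ASSOC) returns the raw column names (reviewPlus, reviewNeg); the sample data array below is hypothetical and stands in for the JSON the success callback receives:

// Computing "total" on the client instead of in PHP.
// `data` stands in for the array returned by showAllTutor.php.
var data = [
    { name: "Alice", reviewPlus: 4, reviewNeg: 1 },
    { name: "Bob",   reviewPlus: 2, reviewNeg: 0 }
];

for (var i = 0; i < data.length; i++) {
    // coerce to numbers in case the DB driver returns strings
    data[i].total = Number(data[i].reviewPlus) + Number(data[i].reviewNeg);
}

console.log(data[0].total); // 5

This moves the per-row arithmetic out of the PHP loop, so the server can just serialize the result set in one call.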