Keep track of the latest 5 updates of a user, PHP MySQL
I am trying to keep track of the latest 6 location updates of a user. When a 7th record comes in, the code first checks whether that user/location pair is already stored; if it is, only its time is updated. Otherwise it checks how many records the user already has: if there are already 6, the oldest one is overwritten with the new data; if there are fewer, the new record is inserted.

I wrote the code below for this, and it works. But how can the code be optimized? And could anything go wrong with it?
<?php
$servername = "localhost";
$username = "root";
$password = "root";
$dbname = "gpslocation";

$user = "testuser";
$userLat = "914";
$userLng = "111";
$gpsTime = "2015-07-16 12:00:36";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

$sql = "SELECT * FROM gpslocations WHERE userName='".$user."' AND latitude='".$userLat."' AND longitude='".$userLng."'";
$result = $conn->query($sql);
if ($result->num_rows > 0) {
    // location already stored: only update its time
    echo "User location is already stored.";
    $sql = "UPDATE gpslocations SET gpsTime='".$gpsTime."', lastUpdate=now() WHERE userName='".$user."' AND latitude='".$userLat."' AND longitude='".$userLng."'";
    if ($conn->query($sql) === TRUE) {
        echo "updated record time successfully";
    } else {
        echo "Error in updating record time: " . $sql . "<br>" . $conn->error;
    }
} else {
    // count the number of records for this user
    $countSql = "SELECT * FROM gpslocations WHERE userName='".$user."'";
    $countResult = $conn->query($countSql);
    // don't want to store more than 6 records in the DB
    if ($countResult->num_rows > 5) {
        // already 6 records: overwrite the oldest one
        echo "more than 5";
        // get the oldest location record
        $sql = "SELECT * FROM gpslocations WHERE userName='".$user."' ORDER BY lastUpdate ASC LIMIT 1;";
        $result = $conn->query($sql);
        if ($result->num_rows > 0) {
            while ($row = $result->fetch_assoc()) {
                echo "id: " . $row["GPSLocationID"];
                // overwrite the oldest row with the new location data
                $updateLastRecSql = "UPDATE gpslocations SET lastUpdate=now(), gpsTime='".$gpsTime."', latitude='".$userLat."', longitude='".$userLng."' WHERE userName='".$user."' AND GPSLocationID='".$row["GPSLocationID"]."'";
                if ($conn->query($updateLastRecSql) === TRUE) {
                    echo "updated record time successfully";
                } else {
                    echo "Error in updating record time: " . $updateLastRecSql . "<br>" . $conn->error;
                }
            }
        }
    } else {
        // fewer than 6 records: insert a new one
        $sql = "INSERT INTO gpslocations (userName, latitude, longitude, gpsTime)
                VALUES ('$user', '$userLat', '$userLng', '$gpsTime')";
        if ($conn->query($sql) === TRUE) {
            echo "New record created successfully";
        } else {
            echo "Error: " . $sql . "<br>" . $conn->error;
        }
    }
}
$conn->close();
?>
Looking at the code, there are a few simple changes that would optimize it. For example, if you only need to know the number of rows, just use the MySQL COUNT function:
https://dev.mysql.com/doc/refman/5.6/zh-CN/counting-rows.html
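For instance, the per-user row count the code needs could be fetched in a single query instead of pulling every row back with `SELECT *` (a sketch, reusing the table and column names from the question):

```sql
SELECT COUNT(*) AS locationCount
FROM gpslocations
WHERE userName = 'testuser';
```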
The rest of your flow can take up to 4 queries to update a result. If you use an INSERT ... ON DUPLICATE KEY UPDATE query, you can combine the insert and update statements into one:
https://dev.mysql.com/doc/refman/5.6/zh-CN/insert-on-duplicate.html
For that you need a unique key on userName, latitude, longitude. With that in place, the code can be simplified to the example below. INSERT ... ON DUPLICATE KEY UPDATE reports 1 affected row when it inserts and 2 when it updates an existing row.
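The required unique key could be added with a one-time DDL statement along these lines (a sketch; the index name `uq_user_location` is an assumption, not something from the original schema):

```sql
ALTER TABLE gpslocations
    ADD UNIQUE KEY uq_user_location (userName, latitude, longitude);
```

Without such a key, the `ON DUPLICATE KEY UPDATE` clause never fires and every call would simply insert a new row.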
<?php
$servername = "localhost";
$username = "root";
$password = "root";
$dbname = "gpslocation";

$user = "testuser";
$userLat = "914";
$userLng = "111";
$gpsTime = "2015-07-16 12:00:36";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

// Insert or update the record in a single query
// (requires a unique key on userName, latitude, longitude)
$sql = "INSERT INTO gpslocations (userName, latitude, longitude, gpsTime)
        VALUES ('{$user}', '{$userLat}', '{$userLng}', '{$gpsTime}')
        ON DUPLICATE KEY UPDATE gpsTime=VALUES(gpsTime), lastUpdate=NOW()";
if ($conn->query($sql) === false) {
    echo "Error in inserting/updating record: " . $sql . "<br>" . $conn->error;
} else {
    if ($conn->affected_rows === 1) {
        echo "New record created successfully";
    } else {
        echo "updated record time successfully";
    }
}

// Keep only the 6 most recently updated records for this user
// (the derived table x is needed because MySQL cannot delete from
// a table it is selecting from in the same statement)
$sql = "DELETE FROM gpslocations WHERE GPSLocationID IN (
            SELECT GPSLocationID FROM (
                SELECT GPSLocationID FROM gpslocations
                WHERE userName='{$user}'
                ORDER BY lastUpdate DESC LIMIT 6, 100
            ) x
        )";
if ($conn->query($sql) === false) {
    echo "Error in deleting records: {$sql} <br> {$conn->error}";
} else {
    echo "{$conn->affected_rows} rows deleted.";
}
$conn->close();
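The upsert-and-trim approach above can be sketched end to end. The following is a minimal illustration using Python's built-in sqlite3 module rather than PHP/MySQL, so it runs with no server: SQLite (3.24+) spells the upsert `ON CONFLICT ... DO UPDATE` instead of MySQL's `ON DUPLICATE KEY UPDATE`, and unlike MySQL it lets the DELETE subquery the same table directly. The table and column names mirror the ones used above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gpslocations (
        GPSLocationID INTEGER PRIMARY KEY,
        userName TEXT, latitude TEXT, longitude TEXT,
        gpsTime TEXT, lastUpdate TEXT,
        UNIQUE (userName, latitude, longitude)  -- required for the upsert
    )""")

def record_location(user, lat, lng, gps_time, last_update):
    # SQLite's equivalent of MySQL's INSERT ... ON DUPLICATE KEY UPDATE
    conn.execute("""
        INSERT INTO gpslocations (userName, latitude, longitude, gpsTime, lastUpdate)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT (userName, latitude, longitude)
        DO UPDATE SET gpsTime = excluded.gpsTime, lastUpdate = excluded.lastUpdate
    """, (user, lat, lng, gps_time, last_update))
    # Trim: keep only the 6 most recently updated rows for this user
    conn.execute("""
        DELETE FROM gpslocations
        WHERE userName = ?
          AND GPSLocationID NOT IN (
              SELECT GPSLocationID FROM gpslocations
              WHERE userName = ?
              ORDER BY lastUpdate DESC LIMIT 6)
    """, (user, user))
    conn.commit()

# Feed in 8 distinct locations; only the latest 6 should survive
for i in range(8):
    record_location("testuser", str(900 + i), "111",
                    "2015-07-16 12:00:36", f"2015-07-16 12:00:{i:02d}")

count = conn.execute("SELECT COUNT(*) FROM gpslocations WHERE userName = ?",
                     ("testuser",)).fetchone()[0]
print(count)  # 6
oldest = conn.execute("SELECT MIN(latitude) FROM gpslocations WHERE userName = ?",
                      ("testuser",)).fetchone()[0]
print(oldest)  # '902' -- the two oldest rows (900, 901) were trimmed
```

Note that the sketch also uses parameterized queries (`?` placeholders); the string-concatenated SQL in the PHP above would be worth replacing with mysqli prepared statements for the same reason.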