
How to Calculate Average Diskspace per Hour

I need to calculate the average amount of diskspace used per hour. In a database, I log the diskspace every hour or so (but not always exactly every hour, sometimes more often and sometimes less), and I need to calculate the average usage per hour.

I'm stumped, though, since the duration between checks changes. I'm not quite sure of the best way to go about this.

Additional Details

To clarify:

We have a system set up where diskspace is logged at an interval for billing purposes. The diskspace is charged per byte, but obviously the amount changes throughout the month, so we express it as MB per hour.

In the billing summary, I want to show the average diskspace used in an hour.

At first, it was logging every five minutes. Then I changed it to log once per hour. Eventually, if there are enough accounts to log, it'll log every two or more hours. But it needs to be normalized to a per-hour figure.

This is done at the end of the month, so there's really no use case for zero data.

Here is some sample data extracted from MySQL. The first query is for the five-minute logs, and the second for the hourly logs.

mysql> SELECT FROM_UNIXTIME(date) as date, CONCAT(ROUND(bytes / 1048576, 0), ' MB') AS 'disk usage' FROM host_manager_client_resource_usage WHERE type = 'diskspace' LIMIT 25;
+---------------------+------------+
| date                | disk usage |
+---------------------+------------+
| 2013-03-23 22:40:01 | 28130 MB   |
| 2013-03-23 22:45:01 | 28130 MB   |
| 2013-03-23 22:50:01 | 28116 MB   |
| 2013-03-23 22:55:01 | 28111 MB   |
| 2013-03-23 23:00:01 | 28105 MB   |
| 2013-03-23 23:06:58 | 28089 MB   |
| 2013-03-23 23:10:01 | 28089 MB   |
| 2013-03-23 23:15:01 | 28086 MB   |
| 2013-03-23 23:20:01 | 28086 MB   |
| 2013-03-23 23:25:01 | 28086 MB   |
| 2013-03-23 23:30:01 | 28086 MB   |
| 2013-03-23 23:35:01 | 27644 MB   |
| 2013-03-23 23:40:01 | 26902 MB   |
| 2013-03-23 23:45:01 | 26640 MB   |
| 2013-03-23 23:50:02 | 26640 MB   |
| 2013-03-23 23:55:01 | 26640 MB   |
| 2013-03-24 00:00:01 | 25556 MB   |
| 2013-03-24 00:05:01 | 25556 MB   |
| 2013-03-24 00:10:01 | 24798 MB   |
| 2013-03-24 00:15:01 | 24547 MB   |
| 2013-03-24 00:20:01 | 24547 MB   |
| 2013-03-24 00:25:01 | 20701 MB   |
| 2013-03-24 00:30:01 | 20805 MB   |
| 2013-03-24 00:35:02 | 20807 MB   |
| 2013-03-24 00:40:01 | 20613 MB   |
+---------------------+------------+
25 rows in set (0.00 sec)

mysql> SELECT FROM_UNIXTIME(date) as date, CONCAT(ROUND(bytes / 1048576, 0), ' MB') AS 'disk usage' FROM host_manager_client_resource_usage WHERE duration > 2395 AND type = 'diskspace' ORDER BY id LIMIT 25;
+---------------------+------------+
| date                | disk usage |
+---------------------+------------+
| 2013-03-23 22:40:01 | 28130 MB   |
| 2013-03-28 13:20:37 | 772 MB     |
| 2013-03-28 15:00:01 | 772 MB     |
| 2013-03-28 16:00:02 | 772 MB     |
| 2013-03-28 17:00:01 | 771 MB     |
| 2013-03-28 18:00:01 | 771 MB     |
| 2013-03-28 19:00:01 | 771 MB     |
| 2013-03-28 20:00:01 | 771 MB     |
| 2013-03-28 21:00:01 | 771 MB     |
| 2013-03-28 22:00:01 | 771 MB     |
| 2013-03-28 23:00:01 | 771 MB     |
| 2013-03-29 00:00:01 | 771 MB     |
| 2013-03-29 01:00:01 | 771 MB     |
| 2013-03-29 02:00:01 | 771 MB     |
| 2013-03-29 03:00:01 | 771 MB     |
| 2013-03-29 04:00:01 | 771 MB     |
| 2013-03-29 05:00:01 | 767 MB     |
| 2013-03-29 06:00:01 | 767 MB     |
| 2013-03-29 07:00:01 | 767 MB     |
| 2013-03-29 08:00:01 | 767 MB     |
| 2013-03-29 09:00:01 | 769 MB     |
| 2013-03-29 10:00:01 | 769 MB     |
| 2013-03-29 11:00:01 | 769 MB     |
| 2013-03-29 12:00:01 | 769 MB     |
| 2013-03-29 13:00:01 | 766 MB     |
+---------------------+------------+
25 rows in set (0.00 sec)

Why don't you take the usage at the very beginning, the usage at the very end, subtract, and divide by the number of hours that passed?

You would have to do a separate subselect to get the first timestamp + usage and the last timestamp + usage. Then it is a simple equation like:

(end_usage-start_usage) /
(time_to_sec(timediff(end_timestamp, start_timestamp)) / 3600)
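
As a rough sketch of that approach in MySQL, assuming the host_manager_client_resource_usage table from the sample queries above (where date is stored as a Unix timestamp and bytes is the raw usage), the two subselects and the division could look like the query below. @period_start and @period_end are placeholder Unix timestamps for the billing period, and any per-account filter is omitted because the sample queries don't show that column. Since date is already a Unix timestamp, subtracting the two values gives elapsed seconds directly, which is equivalent to the TIME_TO_SEC(TIMEDIFF(...)) form above.

-- Sketch only: column names taken from the sample queries above;
-- @period_start and @period_end are placeholder Unix timestamps.
SELECT
    ((last_row.bytes - first_row.bytes) / 1048576)   -- change in MB
    / ((last_row.date - first_row.date) / 3600)      -- hours elapsed
    AS avg_mb_per_hour
FROM
    (SELECT date, bytes
       FROM host_manager_client_resource_usage
      WHERE type = 'diskspace'
        AND date BETWEEN @period_start AND @period_end
      ORDER BY date ASC
      LIMIT 1) AS first_row,
    (SELECT date, bytes
       FROM host_manager_client_resource_usage
      WHERE type = 'diskspace'
        AND date BETWEEN @period_start AND @period_end
      ORDER BY date DESC
      LIMIT 1) AS last_row;

Note that this gives the average change in diskspace per hour over the period (it comes out negative if space was freed), and only the timestamps of the first and last rows matter, not how often the logger happened to run in between.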
