
AWS EC2 CloudWatch metrics interpretation

How should I interpret the AWS EC2 CloudWatch NetworkIn and NetworkOut metrics?

What does the Statistic: Average in the chart refer to?


The docs state that "the units for the Amazon EC2 NetworkIn metric are Bytes because NetworkIn tracks the number of bytes that an instance receives on all network interfaces".

When viewing the chart below, Network In (Bytes), with Statistic: Average and Period: 5 Minutes (note that the time window is zoomed in to around five hours, not one week), it is not immediately obvious how the average is calculated.

[Chart: Network In, Average]

Instance i-aaaa1111 (orange) at 15:29: 2664263.8

If I change Statistic to "Sum", I get this:

[Chart: Network In, Sum]

The same instance (i-aaaa1111), now at 15:31: 13321319

It turns out 13321319 / 5 = 2664263.8, suggesting that incoming network traffic during those five minutes was, on average, 2664263.8 Bytes/minute.

=> 2664263.8/60 ≈ 44404.4 Bytes/second

=> 44404.4/1024 ≈ 43.4 KB/s

=> 43.4*8 ≈ 347 Kbps
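The conversion chain above can be sketched as a small helper (a minimal sketch; the function name is mine, and the sample value is the Average datapoint from the chart above):

```python
def average_to_throughput(avg_bytes_per_minute):
    """Convert a CloudWatch NetworkIn 'Average' datapoint (bytes/minute,
    since Average = Sum over the period / period length in minutes) into
    bytes/sec, KB/sec, and Kbps."""
    bytes_per_sec = avg_bytes_per_minute / 60
    kb_per_sec = bytes_per_sec / 1024      # kibibytes per second
    kbps = bytes_per_sec * 8 / 1024        # kilobits per second
    return bytes_per_sec, kb_per_sec, kbps

bps, kbs, kbps = average_to_throughput(2664263.8)
print(f"{bps:.1f} B/s, {kbs:.1f} KB/s, {kbps:.0f} Kbps")
# -> 44404.4 B/s, 43.4 KB/s, 347 Kbps
```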

I tested this by repeatedly copying a large file from one instance to another, transferring at an average speed of 30.1 MB/s. The CloudWatch Average was 1916943925 Bytes, i.e. 1916943925 / 60 / 1024 / 1024 ≈ 30.5 MB/s.
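For reference, the same datapoints can be pulled programmatically. This is a sketch using boto3's get_metric_statistics; the instance ID is the one from the charts above, the fetch helper name is mine, and the call assumes AWS credentials are configured:

```python
from datetime import datetime, timedelta, timezone

def sum_to_bytes_per_second(sum_bytes, period_seconds):
    """A 'Sum' datapoint divided by its period length gives true bytes/sec."""
    return sum_bytes / period_seconds

def fetch_network_in(instance_id, hours=5, period=300):
    """Fetch NetworkIn Sum/Average datapoints (requires AWS credentials)."""
    import boto3  # deferred so the helper above works without boto3 installed
    cloudwatch = boto3.client("cloudwatch")
    end = datetime.now(timezone.utc)
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="NetworkIn",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - timedelta(hours=hours),
        EndTime=end,
        Period=period,                  # 300 s = 5 minutes
        Statistics=["Sum", "Average"],
    )
    return sorted(resp["Datapoints"], key=lambda d: d["Timestamp"])

# e.g. for the Sum datapoint from the chart above:
# sum_to_bytes_per_second(13321319, 300) gives ~44404.4 bytes/sec
```

Dividing each Sum by its period in seconds sidesteps the bytes/minute ambiguity of the Average statistic entirely.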

The metric, "Network In (Bytes)", refers to bytes/minute.

It appears in my case that the average is computed over the period specified. In other words: for '15 Minutes', it divides the sum of bytes for the 15-minute period by 15, for '5 Minutes', it divides the sum for the 5-minute period by 5.

Here is why I believe this: I used this chart to debug an upload where rsync was reporting ~710kB/sec (~727,000 bytes / sec) when I expected a faster upload. After selecting lots of different sum values in the EC2 plot, I determined that the sums were correct numbers of bytes for the period specified (selecting a 15 minute period tripled the sum compared to a 5 minute period). Then viewing the average and selecting different periods shows that I get the same value of ~45,000,000 when I select a period of "5 Minutes", "15 Minutes", or "1 Hour".

45,000,000 (bytes/???) / 730,000 (bytes/sec) is approximately 60, so ??? is a minute (60 seconds). In fact, ~45,000,000 / 1024 / 60 = ~730 kB/sec and this is within 3% of what rsync was reporting.
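The arithmetic in that check can be reproduced directly (a minimal sketch; the ~45,000,000 figure is the Average datapoint quoted above, and 727,000 bytes/sec is rsync's ~710 kB/s):

```python
# The Average datapoint turns out to be bytes/minute, so dividing by the
# rsync rate should give ~60, and dividing by 60 and 1024 should give
# roughly the kB/sec figure rsync reported.
avg_bytes_per_minute = 45_000_000
rsync_bytes_per_sec = 727_000        # rsync reported ~710 kB/s

ratio = avg_bytes_per_minute / rsync_bytes_per_sec
kb_per_sec = avg_bytes_per_minute / 60 / 1024

print(round(ratio, 1), round(kb_per_sec))
# -> 61.9 732
```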

Incidentally, my 'bug' was user error - I had failed to pass the '-z' option to rsync and therefore was not getting the compression boost I expected.
