
GCP Storage Bucket Monitoring

Is it possible to set up and generate usage reports for a Google Cloud Coldline bucket?

I am looking to track items such as: filename, file size, download URL, requester IP, requester geolocation, download status, etc.

You can definitely track some of the things you've mentioned, such as filename, download URL, and requester IP, out of the box. Other elements, like requester geolocation and download status, will require additional processing.


When it comes to Google Cloud Storage, you can enable logging for any kind of bucket you may have. There are currently two options for logging access to buckets, namely Cloud Audit Logging and Access & Storage logging. The first is more generic, in the sense that it tracks RESTful requests in real time, while the second is specific to Cloud Storage and can record more information about each access to a bucket. Given what you said you're looking for, Access & Storage logging seems to be the way to go.

Access & Storage logging will create CSV files with plenty of information about each access. You can find exactly what is stored in these files here. One interesting thing to note is that there is a field named c_ip_region that is currently unused, but may one day contain geolocation information.
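As an illustration, pulling a few of those fields out of a usage-log line might look like this. The line below is synthetic, with field positions following the documented usage-log layout; real log files quote every field, so a proper CSV parser is safer in practice than awk:

```shell
#!/bin/sh
# A synthetic usage-log line (field order per the GCS usage-log docs:
# time_micros, c_ip, c_ip_type, c_ip_region, cs_method, cs_uri, sc_status,
# cs_bytes, sc_bytes, time_taken_micros, cs_host, cs_referer, cs_user_agent,
# s_request_id, cs_operation, cs_bucket, cs_object).
line='1700000000000000,203.0.113.5,IPV4,,GET,/example-bucket/report.pdf,200,0,1048576,12000,storage.googleapis.com,,,req123,GET_Object,example-bucket,report.pdf'

# Field 2 = requester IP, field 7 = HTTP status, field 17 = object name.
ip=$(echo "$line" | awk -F, '{print $2}')
status=$(echo "$line" | awk -F, '{print $7}')
object=$(echo "$line" | awk -F, '{print $17}')

echo "$ip $status $object"
```

The HTTP status (sc_status) is what you would post-process to derive a "download status", since the logs record the response code rather than a success/failure flag.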

You can find information on how to enable Access & Storage logging on a bucket via gsutil here. It basically comes down to a few commands:

  1. Create a bucket for the CSV files (gsutil mb gs://example-logs-bucket);
  2. Give write permission to the analytics account (gsutil acl ch -g cloud-storage-analytics@google.com:W gs://example-logs-bucket);
  3. Set the default ACL of the logs bucket to something more restrictive (gsutil defacl set project-private gs://example-logs-bucket);
  4. Turn logging on (gsutil logging set on -b gs://example-logs-bucket [-o log_object_prefix] gs://example-bucket).
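Taken together, the steps above can be sketched as a short script. The bucket names are placeholders, and the script only echoes each command as a dry run; drop the echo in run() to execute for real (which requires gsutil and the right permissions):

```shell
#!/bin/sh
# Hypothetical bucket names; substitute your own.
LOGS_BUCKET=gs://example-logs-bucket
DATA_BUCKET=gs://example-bucket

# Print each command instead of executing it, so this is a safe dry run.
run() { echo "+ $*"; }

run gsutil mb "$LOGS_BUCKET"
run gsutil acl ch -g cloud-storage-analytics@google.com:W "$LOGS_BUCKET"
run gsutil defacl set project-private "$LOGS_BUCKET"
run gsutil logging set on -b "$LOGS_BUCKET" -o AccessLog "$DATA_BUCKET"
```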

Seeing as this produces CSV files, you can easily import them into BigQuery, and from there query their contents. This import can be done via Dataflow or Cloud Functions (the latter is the best option if you want to customize the data before importing it; it can also be triggered by Storage events).
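For a one-off import, the bq CLI is often enough; a sketch follows. The dataset/table names and the log-object wildcard are assumptions, and the schema file refers to the one Google publishes for usage logs; as before, commands are echoed as a dry run:

```shell
#!/bin/sh
# Echo each command instead of running it; remove the echo to execute.
run() { echo "+ $*"; }

# Hypothetical dataset/table; the schema file is Google's published
# usage-log schema, assumed to be present in the working directory.
run bq mk storage_logs
run bq load --skip_leading_rows=1 storage_logs.usage \
    'gs://example-logs-bucket/example-bucket_usage_*' \
    ./cloud_storage_usage_schema_v0.json
```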

GCP Monitoring provides charts for metrics such as request counts and data volume in bytes.

To enable GCP monitoring, you have to enable the Monitoring API.

Then go to:

Monitoring -> Dashboards -> Cloud Storage.

It will show you charts for GCP storage bucket activity.
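If you prefer the CLI over the console, enabling the API might look like this (again echoed as a dry run; the real command needs gcloud installed and authenticated against your project):

```shell
#!/bin/sh
# Dry-run wrapper; remove the echo to execute for real.
run() { echo "+ $*"; }

run gcloud services enable monitoring.googleapis.com
```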
