How Do I Enable Object-Level Logging for an S3 Bucket using boto3
I'm trying to create an Amazon CloudWatch rule that triggers whenever an object is uploaded into a bucket. I know that to do this I need to trigger on the PutObject event; however, as best I can tell, that requires enabling object-level logging on the bucket.

I will be using a multitude of buckets and want to be able to automate that process, and because of how most of the system is set up, using boto3 seems to make the most sense. So how can I turn object-level logging on using boto3?
The only official AWS resource I've been able to find so far is: How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?, which explains how to enable object-level logging through the GUI. I've also looked through the boto3 library documentation. Based on my understanding, neither has ultimately been helpful.

My chief goal is to enable object-level logging through boto3, if that's something that can be done.
You can configure an Amazon S3 Event Notification so that, when a new object is created, it triggers a target such as an AWS Lambda function, an SQS queue, or an SNS topic.

See: Configuring Amazon S3 Event Notifications
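If an S3 Event Notification is enough for your use case (reacting to uploads directly, without CloudTrail), it can also be configured from boto3 via `put_bucket_notification_configuration()`. A minimal sketch, where the bucket name and Lambda ARN are placeholder assumptions you would replace with your own:

```python
def upload_notification_config(lambda_arn):
    """Build a notification configuration that fires a Lambda on object creation.

    's3:ObjectCreated:*' covers PutObject, multipart uploads, and copies.
    """
    return {
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': lambda_arn,
                'Events': ['s3:ObjectCreated:*'],
            }
        ]
    }

def enable_upload_notification(bucket, lambda_arn):
    # Deferred import so the config builder above is usable (and testable)
    # without the AWS SDK or credentials.
    import boto3
    boto3.client('s3').put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=upload_notification_config(lambda_arn),
    )
```

Note that S3 validates the destination when the configuration is applied, so the Lambda function must already permit invocation from S3 or the call is rejected.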
You can use the put_event_selectors() function of the CloudTrail client.
import boto3

# put_event_selectors() belongs to the CloudTrail client, not the S3 client
client = boto3.client('cloudtrail')
client.put_event_selectors(
    TrailName='TrailName',
    EventSelectors=[
        {
            'ReadWriteType': 'All',
            'IncludeManagementEvents': True,
            'DataResources': [
                {
                    'Type': 'AWS::S3::Object',
                    'Values': [
                        'arn:aws:s3:::your_bucket_name/',
                    ]
                },
            ]
        },
    ]
)
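Since the question mentions automating this across a multitude of buckets: the Values list of a single event selector can hold several bucket ARNs, so one put_event_selectors() call can enable object-level logging for all of them. A sketch along those lines (the trail name and bucket names are placeholders):

```python
def selectors_for_buckets(bucket_names):
    """Build one event selector covering object-level events for many buckets.

    The trailing '/' scopes logging to all objects in each bucket.
    """
    return [
        {
            'ReadWriteType': 'All',
            'IncludeManagementEvents': True,
            'DataResources': [
                {
                    'Type': 'AWS::S3::Object',
                    'Values': [f'arn:aws:s3:::{name}/' for name in bucket_names],
                }
            ],
        }
    ]

def enable_object_logging(trail_name, bucket_names):
    # Deferred import so the selector builder above is usable (and testable)
    # without the AWS SDK or credentials.
    import boto3
    client = boto3.client('cloudtrail')
    client.put_event_selectors(
        TrailName=trail_name,
        EventSelectors=selectors_for_buckets(bucket_names),
    )
```

The trail itself must already exist (for example, created with create_trail()); put_event_selectors() only updates the selectors attached to it.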