
Using boto3 to query AWS CloudTrail to determine which IAM user uploaded a file to S3?

I'm trying to develop a way of breaking down S3 usage by user/project using CloudTrail. Does CloudTrail offer the ability to see which IAM user uploaded a particular object to a bucket?

UPDATE:

I have a trail turned on that monitors object-level activities (for all S3 buckets, including read and write activities). However, when I try to list "PutObject" events with the code below, it doesn't work (i.e., the list of events comes back blank).

import boto3
from datetime import datetime

ct_client = boto3.client('cloudtrail')

response = ct_client.lookup_events(
    LookupAttributes=[
        {
            'AttributeKey': 'EventName',
            'AttributeValue': 'PutObject'
        }],
    StartTime=datetime(2018, 3, 1),
    EndTime=datetime.now(),
    MaxResults=50
)

UPDATE 2:

Images of my bucket properties and CloudTrail settings in the console:

[Image: CloudTrail settings]

[Image: Bucket properties]

Yes, you can use CloudTrail to monitor which IAM users upload objects to S3. The amount of information that CloudTrail records is extensive.

This document link will give you an intro to CloudTrail S3 logging:

Logging Amazon S3 API Calls by Using AWS CloudTrail

This document link will give you detailed information on the events logged by CloudTrail:

CloudTrail Log Event Reference

Follow this document link to enable object-level logging for an S3 bucket. This is necessary to see data-event APIs such as PutObject:

How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?
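Object-level logging can also be enabled programmatically with CloudTrail's `put_event_selectors` API. A minimal sketch, assuming an existing trail; the trail and bucket names below are placeholders:

```python
# Sketch: enable S3 data-event logging on an existing trail via boto3.
# 'my-trail' and 'my-bucket' are placeholder names.

S3_OBJECT_SELECTORS = [
    {
        'ReadWriteType': 'All',             # capture both read and write data events
        'IncludeManagementEvents': True,
        'DataResources': [
            {
                'Type': 'AWS::S3::Object',
                # bucket ARN with a trailing "/" means "all objects in this bucket"
                'Values': ['arn:aws:s3:::my-bucket/'],
            }
        ],
    }
]

def enable_object_logging(trail_name='my-trail'):
    import boto3  # imported here so the snippet parses without AWS configured
    client = boto3.client('cloudtrail')
    client.put_event_selectors(
        TrailName=trail_name,
        EventSelectors=S3_OBJECT_SELECTORS,
    )
```

After this call, PutObject/GetObject events for the named bucket start appearing in the trail's log files in S3 (not retroactively).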

CloudTrail has a Python API. However, you will want to directly process the CloudTrail logs stored in S3.
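A rough sketch of that approach: CloudTrail delivers gzipped JSON files to S3, each with a top-level `Records` array. The bucket name, prefix, and the `put_object_uploaders` helper below are illustrative, not part of boto3:

```python
import gzip
import json

def put_object_uploaders(records):
    """Pure helper: map each PutObject record to (who, bucket, key)."""
    out = []
    for rec in records:
        if rec.get('eventName') != 'PutObject':
            continue
        who = rec.get('userIdentity', {})
        params = rec.get('requestParameters') or {}
        out.append((who.get('arn', who.get('type')),
                    params.get('bucketName'),
                    params.get('key')))
    return out

def scan_trail_logs(bucket, prefix):
    # Walk the CloudTrail log files under a (placeholder) bucket/prefix
    # and collect who uploaded what. Requires AWS credentials.
    import boto3
    s3 = boto3.client('s3')
    results = []
    for page in s3.get_paginator('list_objects_v2').paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get('Contents', []):
            body = s3.get_object(Bucket=bucket, Key=obj['Key'])['Body'].read()
            log = json.loads(gzip.decompress(body))
            results.extend(put_object_uploaders(log.get('Records', [])))
    return results
```

This brute-force scan is fine for a day or two of logs; for anything larger, the Athena approach below scales much better.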

CloudTrail Python Boto3 SDK

I prefer to analyze CloudTrail logs using Athena, which makes this process easy.

Querying AWS CloudTrail Logs

I don't believe data events are visible in the same way as management events - that's certainly the case if you view the Event History in the AWS Console, and it would explain why your `lookup_events` call comes back blank.

As suggested elsewhere, laying an Athena table over the S3 location where the data events are stored works well - something like this will then tell you who/what uploaded the object:

SELECT
    useridentity
,   json_extract_scalar(requestparameters,'$.bucketName')
,   json_extract_scalar(requestparameters,'$.key')
FROM cloudtrail_logs
WHERE eventname IN ('PutObject')
AND json_extract_scalar(requestparameters,'$.bucketName') = 'xxx'
AND json_extract_scalar(requestparameters,'$.key') = 'yyy';

Where cloudtrail_logs is created in line with the docs at: https://docs.aws.amazon.com/athena/latest/ug/cloudtrail-logs.html

useridentity will not always be an IAM user - it may be an AWS service, an external account, or an assumed role - you can use the .type element to filter as required, or simply pull all the elements.

Depending on the number of objects you have in S3 / the size of your cloudtrail_logs in S3, you may want to refine the S3 location of the cloudtrail_logs table by date - eg:

s3://<BUCKETNAME>/AWSLogs/<ACCOUNTNUMBER>/CloudTrail/<REGION>/2018/08/17

If you wanted, you could execute the Athena query using boto3, saving the output to an S3 location, and then pull that data from S3, also using boto3.
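A hedged sketch of that flow using Athena's `start_query_execution` / `get_query_execution` / `get_query_results` APIs. The output location and database name are placeholders, and `rows_to_dicts` is an illustrative helper for Athena's row format (first row is the header):

```python
import time

def rows_to_dicts(rows):
    # Athena returns the column headers as the first row of the result set
    header = [c.get('VarCharValue') for c in rows[0]['Data']]
    return [
        dict(zip(header, (c.get('VarCharValue') for c in row['Data'])))
        for row in rows[1:]
    ]

def run_athena_query(sql, output_s3='s3://my-results-bucket/athena/',
                     database='default'):
    # Start the query, poll until it finishes, then fetch the result rows.
    # Requires AWS credentials; names above are placeholders.
    import boto3
    athena = boto3.client('athena')
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={'Database': database},
        ResultConfiguration={'OutputLocation': output_s3},
    )['QueryExecutionId']
    while True:
        state = athena.get_query_execution(
            QueryExecutionId=qid)['QueryExecution']['Status']['State']
        if state in ('SUCCEEDED', 'FAILED', 'CANCELLED'):
            break
        time.sleep(2)
    if state != 'SUCCEEDED':
        raise RuntimeError('Athena query ended in state ' + state)
    # get_query_results returns up to 1000 rows per call; paginate for more
    rows = athena.get_query_results(QueryExecutionId=qid)['ResultSet']['Rows']
    return rows_to_dicts(rows)
```

The full CSV output also lands at the `OutputLocation`, so for large result sets you can fetch that file from S3 instead of paging through `get_query_results`.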
