
How can I restrict IP addresses dynamically and release them after 24 hrs for AWS S3?

I want to block the IP address of anyone who queries my AWS S3 bucket (public read-only) at more than 100 requests/second, for some hours (say 24 hrs), since such requests may be spam intended to harm my business by driving up traffic and my AWS costs. So far I have not found any policy example for this. How can I restrict such IP addresses dynamically and release them after 24 hrs?

There is no out-of-the-box solution for this, but you can build one:

  1. Enable Server Access Logging on the bucket, which ensures that requests made against the bucket are logged. The logs look similar to the following [1]:

     79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:00:38 +0000] 192.0.2.3 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be 3E57427F3EXAMPLE REST.GET.VERSIONING - "GET /DOC-EXAMPLE-BUCKET1?versioning HTTP/1.1" 200 - 113 - 7 - "-" "S3Console/0.4" - s9lzHYrFp76ZVxRcpX9+5cjAnEH2ROuNkd2BHfIa6UkFVdtjf5mKR3/eTPFvsiP/XV/VLi31234= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader DOC-EXAMPLE-BUCKET1.s3.us-west-1.amazonaws.com TLSV1.2 arn:aws:s3:us-west-1:123456789012:accesspoint/example-AP Yes
     79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:00:38 +0000] 192.0.2.3 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be 891CE47D2EXAMPLE REST.GET.LOGGING_STATUS - "GET /DOC-EXAMPLE-BUCKET1?logging HTTP/1.1" 200 - 242 - 11 - "-" "S3Console/0.4" - 9vKBE6vMhrNiWHZmb2L0mXOcqPGzQOI5XLnCtZNPxev+Hf+7tpT6sxDwDty4LHBUOZJG96N1234= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader DOC-EXAMPLE-BUCKET1.s3.us-west-1.amazonaws.com TLSV1.2 - -
     79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:00:38 +0000] 192.0.2.3 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be A1206F460EXAMPLE REST.GET.BUCKETPOLICY - "GET /DOC-EXAMPLE-BUCKET1?policy HTTP/1.1" 404 NoSuchBucketPolicy 297 - 38 - "-" "S3Console/0.4" - BNaBsXZQQDbssi6xMBdBU2sLt+Yf5kZDmeBUP35sFoKa3sLLeMC78iwEIWxs99CRUrbS4n11234= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader DOC-EXAMPLE-BUCKET1.s3.us-west-1.amazonaws.com TLSV1.2 - Yes
     79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:01:00 +0000] 192.0.2.3 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be 7B4A0FABBEXAMPLE REST.GET.VERSIONING - "GET /DOC-EXAMPLE-BUCKET1?versioning HTTP/1.1" 200 - 113 - 33 - "-" "S3Console/0.4" - Ke1bUcazaN1jWuUlPJaxF64cQVpUEhoZKEG/hmy/gijN/I1DeWqDfFvnpybfEseEME/u7ME1234= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader DOC-EXAMPLE-BUCKET1.s3.us-west-1.amazonaws.com TLSV1.2 - -
     79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:01:57 +0000] 192.0.2.3 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DD6CC733AEXAMPLE REST.PUT.OBJECT s3-dg.pdf "PUT /DOC-EXAMPLE-BUCKET1/s3-dg.pdf HTTP/1.1" 200 - - 4406583 41754 28 "-" "S3Console/0.4" - 10S62Zv81kBW7BB6SX4XJ48o6kpcl6LPwEoizZQQxJd5qDSCTLX0TgS37kYUBKQW3+bPdrg1234= SigV4 ECDHE-RSA-AES128-SHA AuthHeader DOC-EXAMPLE-BUCKET1.s3.us-west-1.amazonaws.com TLSV1.2 - Yes

    so they include the IP address of each requester.

  2. Create a DynamoDB table.

  3. Periodically, triggered by a schedule-based EventBridge rule, run a Lambda function that parses the log files of the last n minutes. If a certain IP address is found to be making too many requests, have the Lambda function add the IP to the DynamoDB table (together with a timestamp of the current time) and add/update the bucket policy to restrict that IP [2], e.g.:

     {
       "Id": "SourceIP",
       "Version": "2012-10-17",
       "Statement": [
         {
           "Sid": "SourceIP",
           "Action": "s3:*",
           "Effect": "Deny",
           "Resource": [
             "arn:aws:s3:::DOC-EXAMPLE-BUCKET",
             "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
           ],
           "Condition": {
             "IpAddress": {
               "aws:SourceIp": [
                 "11.11.11.11/32",
                 "22.22.22.22/32"
               ]
             }
           },
           "Principal": "*"
         }
       ]
     }
  4. Periodically, triggered by another schedule-based EventBridge rule, run a second Lambda function that reads the DynamoDB table and checks whether 24 hrs have passed for any of the offending IPs in it. If so, remove those IPs from the bucket policy (and delete their rows from the table).
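The log-parsing side of step 3 can be sketched in Python. This is a minimal illustration, assuming the Lambda has already downloaded the relevant log objects and split them into lines (the function names and the request threshold here are my own, not part of any AWS SDK):

```python
import re
from collections import Counter

# An S3 server access log record starts with:
#   bucket-owner bucket [timestamp] remote-ip ...
# so the requester IP is the 4th field (the bracketed timestamp contains a space).
LOG_PREFIX = re.compile(r'^\S+ \S+ \[[^\]]+\] (?P<ip>\d{1,3}(?:\.\d{1,3}){3}) ')

def count_requests_per_ip(log_lines):
    """Return a Counter mapping remote IP -> number of requests in the window."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PREFIX.match(line)
        if m:
            counts[m.group("ip")] += 1
    return counts

def offending_ips(log_lines, limit):
    """IPs that made more than `limit` requests in the parsed window."""
    return [ip for ip, n in count_requests_per_ip(log_lines).items() if n > limit]
```

Note that to approximate "100 requests/second" you would divide each IP's count by the length of the window the parsed logs cover; also be aware that server access logs are delivered on a best-effort basis and can lag by an hour or more, so detection will never be real-time.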

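The Deny policy in step 3 can be generated programmatically, so the blocking Lambda can rebuild it from the current set of offending IPs rather than editing JSON by hand. A sketch following the same Deny/IpAddress pattern (the function name is hypothetical):

```python
def build_deny_policy(bucket, blocked_ips):
    """Build a bucket policy denying all S3 actions from the given IPs,
    mirroring the Deny/IpAddress statement shown above."""
    return {
        "Id": "SourceIP",
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "SourceIP",
                "Action": "s3:*",
                "Effect": "Deny",
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
                "Condition": {
                    # Each blocked address becomes a /32 CIDR entry.
                    "IpAddress": {"aws:SourceIp": [f"{ip}/32" for ip in blocked_ips]}
                },
                "Principal": "*",
            }
        ],
    }
```

Applying it would go through boto3, e.g. `s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))`; if the bucket already has other policy statements, the Lambda should merge this statement into the existing policy rather than overwrite it.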

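Step 4's expiry check is plain timestamp arithmetic. A sketch, assuming the cleanup Lambda has read the DynamoDB rows into a dict of IP to the time it was blocked (the names here are illustrative):

```python
from datetime import datetime, timedelta, timezone

BLOCK_DURATION = timedelta(hours=24)

def expired_blocks(blocked, now=None):
    """Given {ip: blocked_at}, return the IPs whose 24-hour block has elapsed.
    `blocked` stands in for the rows read from the DynamoDB table."""
    now = now or datetime.now(timezone.utc)
    return [ip for ip, blocked_at in blocked.items()
            if now - blocked_at >= BLOCK_DURATION]
```

The IPs this returns are the ones to drop from the bucket policy's aws:SourceIp list and delete from the table.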
References:

[1]: https://docs.aws.amazon.com/AmazonS3/latest/userguide/LogFormat.html

[2]: https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_aws_deny-ip.html
