AWS SDK v2 AllAccessDisabled error for S3 file copy
I'm in the process of switching over to the new aws-sdk in a Rails app I wrote, and I cannot for the life of me find the corresponding methods in the v2 SDK. I'm also running into access-denied issues I can't work out.
The way I use the v1 SDK is that users upload directly to S3 under an "uploads"-namespaced key, and after they create the object they're working on, a callback moves the file to its long-term key and deletes the old one. Here is an example of that:
def move_file
  old_key = s3_key
  new_key = "#{self.class.table_name}/#{id}/#{Digest::SHA1.hexdigest([Time.now, rand].join)}/#(unknown)"
  AWS.config(access_key_id: ENV['AWS_ACCESS_KEY_ID'], secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'], region: 'us-east-1')
  s3 = AWS::S3.new
  bucket_name = ENV['AWS_S3_BUCKET']
  bucket = s3.buckets[bucket_name]
  object = bucket.objects[old_key]
  begin
    object.move_to new_key, :acl => :public_read
  rescue AWS::S3::Errors::NoSuchKey
    errors.add(:base, "Oops! Something went wrong uploading your file. Please try again, and if the problem persists, open a trouble ticket.")
  end
  if !bucket.objects[old_key].exists? && bucket.objects[new_key].exists?
    update_column(:s3_key, new_key)
  end
end
That works great, but now I'm trying to update to the new SDK. What I've been trying is this:
def move_file
  old_key = file
  new_key = "#{self.class.table_name}/#{id}/#{Digest::SHA1.hexdigest([Time.now, rand].join)}/#(unknown)"
  s3 = Aws::S3::Client.new
  begin
    s3.copy_object(copy_source: old_key, key: new_key, bucket: ENV['AWS_S3_BUCKET'], acl: 'public-read')
    s3.delete_object(bucket: ENV['AWS_S3_BUCKET'], key: old_key)
    update_column(:file, new_key)
  rescue Aws::S3::Errors::ServiceError
    errors.add(:base, "Oops! Something went wrong uploading your file. Please try again, and if the problem persists, open a trouble ticket.")
  end
end
Whenever I try to move the uploaded file, it throws an error: Aws::S3::Errors::AllAccessDisabled: All access to this object has been disabled
I have tried changing the way I handle security credentials. Instead of a bare access key/secret key pair, I created a user in IAM, attached a policy that grants full access to S3, and tried those credentials, to no avail.
What am I doing wrong? And, for anyone familiar with the new SDK: is my copy_object approach even correct?
The error is caused by the :copy_source value you are passing to #copy_object. This value must be the source bucket and source key, separated by a slash (/):
  "#{sourcebucket}/#{sourcekey}"
Your old_key value contains a forward slash. Amazon S3 takes the first path segment of that key and treats it as a bucket name. Because you do not have permission to that bucket, you get an auth error. Your credential configuration is probably fine.
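To see why, here is a quick plain-Ruby sketch (the key and bucket names are hypothetical) of how S3 effectively splits a copy_source string on its first slash:

```ruby
# Hypothetical bare key passed as copy_source by mistake. S3 treats
# everything before the first slash as the bucket, the rest as the key.
copy_source = "uploads/abc123/report.pdf"
bucket, key = copy_source.split("/", 2)
# bucket == "uploads"             (interpreted as someone else's bucket)
# key    == "abc123/report.pdf"

# Prefixing your own bucket name fixes the split:
copy_source = "my-bucket/uploads/abc123/report.pdf"
bucket, key = copy_source.split("/", 2)
# bucket == "my-bucket"
# key    == "uploads/abc123/report.pdf"
```

Since "uploads" is a real bucket name owned by some other AWS account, the request is rejected with an access error rather than NoSuchBucket.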
To correct this error:
def move_file
  bucket = ENV["AWS_S3_BUCKET"]
  old_key = file
  new_key = "#{self.class.table_name}/#{id}/#{Digest::SHA1.hexdigest([Time.now, rand].join)}/#(unknown)"
  s3 = Aws::S3::Client.new
  begin
    s3.copy_object(bucket: bucket, key: new_key, copy_source: "#{bucket}/#{old_key}", acl: 'public-read')
    s3.delete_object(bucket: bucket, key: old_key)
    update_column(:file, new_key)
  rescue Aws::S3::Errors::ServiceError
    errors.add(:base, "Oops! Something went wrong uploading your file. Please try again, and if the problem persists, open a trouble ticket.")
  end
end
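One related caveat: as far as I know, the v2 client does not URL-encode copy_source for you, so if your keys can contain spaces or other unsafe characters, percent-encode each key segment yourself before building the string (the bucket and key below are hypothetical):

```ruby
require 'erb'  # for ERB::Util.url_encode (strict percent-encoding)

bucket  = "my-bucket"                       # hypothetical bucket
old_key = "uploads/2016 report+final.pdf"   # key with unsafe characters

# Encode each segment, keeping the slashes that separate them.
encoded_key = old_key.split("/").map { |seg| ERB::Util.url_encode(seg) }.join("/")
copy_source = "#{bucket}/#{encoded_key}"
# copy_source == "my-bucket/uploads/2016%20report%2Bfinal.pdf"
```

If your keys are generated the way new_key is above (hex digest plus table name), they are already URL-safe and this step is unnecessary.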