Using nodejs and knox how do I see the files in an S3 Bucket
I upload files in many different ways to my S3 bucket.
In Python I could use boto like this:
from boto.s3.connection import S3Connection
conn = S3Connection('access-key','secret-access-key')
bucket = conn.get_bucket('bucket')
for key in bucket.list():
    print key.name
In Node I have used knox to connect to buckets to get URLs, but how can I iterate through the keys in Node to see all the files in my bucket?
If your buckets get big, best stream those keys! Check out knox-copy:
var knoxCopy = require('knox-copy');

var client = knoxCopy.createClient({
  key: '<api-key-here>',
  secret: '<secret-here>',
  bucket: 'mrbucket'
});

client.streamKeys({
  // omit the prefix to list the whole bucket
  prefix: 'buckets/of/fun'
}).on('data', function(key) {
  console.log(key);
});
You can do it with AwsSum. It is actively maintained and can perform all of the S3 operations provided by Amazon.
There is a fully featured example of exactly what you're looking for in the node-awssum-scripts repo. It fetches the first 1000 keys, then keeps issuing new requests using the 'marker' parameter to the operation until there are no more keys, so you may want to look at that:
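The marker loop described above can be sketched independently of any particular S3 client. In this sketch, `listPage` is a hypothetical stand-in for whatever function your library exposes for a single ListObjects request (against real S3, each page is capped at 1000 keys):

```javascript
// Collect every key from a paginated listing by repeating the request
// with the 'marker' parameter until a page comes back non-truncated.
// `listPage(marker, callback)` is a hypothetical stand-in for one S3
// ListObjects call; it yields { keys: [...], isTruncated: boolean }.
function listAllKeys(listPage, callback) {
  var all = [];
  (function fetchPage(marker) {
    listPage(marker, function (err, page) {
      if (err) return callback(err);
      all = all.concat(page.keys);
      if (page.isTruncated) {
        // S3 resumes listing after the last key of the previous page
        fetchPage(page.keys[page.keys.length - 1]);
      } else {
        callback(null, all);
      }
    });
  })(undefined);
}
```

The same loop works for any cursor-style API: feed the last item of each page back in as the next request's marker, and stop when the response says it is no longer truncated.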
If you need any help, give me a shout on GitHub. Disclaimer: I'm chilts, author of AwsSum. :)