Apache Beam/Google Dataflow - Exporting Data from Google Datastore to File in Cloud Storage
I need to create a file report for each user request. Each user selects the filters for the report, and my application should generate a file in Cloud Storage and send a notification with a link to the generated file.
This is the application workflow:
Is it possible to create a file for each Pub/Sub message?
How do I create a file with a custom name?
Is this architecture correct?
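For context, this is a minimal sketch of the kind of pipeline I have in mind, using Beam's Python SDK; the bucket name, topic, and message fields are placeholders:

```python
import json

import apache_beam as beam
from apache_beam.io.gcp.gcsio import GcsIO
from apache_beam.options.pipeline_options import PipelineOptions


class WriteReportToGcs(beam.DoFn):
    """Writes one file per Pub/Sub message, naming it from fields in the message."""

    def process(self, message):
        request = json.loads(message.decode("utf-8"))
        # Hypothetical message fields: 'user_id', 'report_id', 'content'.
        path = (f"gs://my-report-bucket/reports/"
                f"{request['user_id']}/{request['report_id']}.csv")
        with GcsIO().open(path, "w") as f:
            f.write(request["content"].encode("utf-8"))
        yield path  # a downstream step could publish the notification


options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (p
     | "ReadRequests" >> beam.io.ReadFromPubSub(
         topic="projects/my-project/topics/report-requests")
     | "WriteFiles" >> beam.ParDo(WriteReportToGcs()))
```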
Your use case sounds as if it would be a better fit for Google Cloud Storage than Cloud Datastore. Google Cloud Storage is meant for opaque, file-like blobs of data, and it provides a way to receive Pub/Sub notifications on file updates: https://cloud.google.com/storage/docs/pubsub-notifications .
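Setting up such a notification is a one-time configuration on the bucket. A minimal sketch using the google-cloud-storage Python client, with placeholder bucket and topic names:

```python
from google.cloud import storage
from google.cloud.storage.notification import (
    JSON_API_V1_PAYLOAD_FORMAT,
    OBJECT_FINALIZE_EVENT_TYPE,
)

client = storage.Client()
bucket = client.bucket("my-report-bucket")  # placeholder bucket name

# Publish to the 'report-files' topic whenever a new object is created.
# The topic must already exist, and the GCS service agent needs
# publish permission on it.
notification = bucket.notification(
    topic_name="report-files",
    event_types=[OBJECT_FINALIZE_EVENT_TYPE],
    payload_format=JSON_API_V1_PAYLOAD_FORMAT,
)
notification.create()
```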
However, it's a bit unclear why you're using the indirection of Pub/Sub and Datastore in this case. Could the server handling the client request instead call the Google Cloud Storage API directly?
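If the server generates the report itself, the whole flow can collapse into one upload plus a signed URL to include in the notification. A minimal sketch, again with placeholder names and assuming CSV reports:

```python
import datetime

from google.cloud import storage


def upload_report(user_id: str, report_id: str, csv_content: str) -> str:
    """Uploads the report under a custom name and returns a time-limited link."""
    client = storage.Client()
    bucket = client.bucket("my-report-bucket")  # placeholder bucket name
    blob = bucket.blob(f"reports/{user_id}/{report_id}.csv")  # custom object name
    blob.upload_from_string(csv_content, content_type="text/csv")
    # A signed URL lets the notified user download the file without extra auth.
    return blob.generate_signed_url(expiration=datetime.timedelta(hours=24))
```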