
Spring Cloud DataFlow using launch task after new file in AWS S3 Bucket source

I'm trying to create a batch process that is started whenever a new file arrives in AWS S3.

So the flow is:

1 - A new file is uploaded to an AWS S3 bucket
2 - SCDF detects the new file
3 - SCDF launches the task (a Spring Batch application)
4 - The Spring Batch application processes the file and stores the data in a DB.

Something similar to this, but with an S3 bucket: https://dataflow.spring.io/docs/recipes/batch/sftp-to-jdbc/
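Roughly, this is the shape I'm after in the SCDF shell (everything here is a placeholder: the task name, the Maven coordinates of my batch app, and the stream name; I also don't know yet which options the s3 source needs):

    # Register my Spring Batch app as a Data Flow task (placeholder coordinates)
    app register --name fileIngestBatch --type task --uri maven://com.example:file-ingest-batch:0.0.1-SNAPSHOT
    task create fileIngestTask --definition "fileIngestBatch"

    # S3 source piped to the task-launcher sink, like the SFTP recipe does.
    # The option that turns "new file in bucket" into a task launch request is
    # exactly the part I'm unsure about for the s3 source.
    stream create s3ToBatch --definition "s3 | task-launcher-dataflow"
    stream deploy --name s3ToBatch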

Maybe I'm misunderstanding the concept, but with the SFTP source I could set the port, host, user, and password, whereas the S3 source doesn't seem to expose region and credentials properties.

Where do I set the AWS properties?

There's an Amazon AWS common options section in the README (see: old-app / new-app), which lists the common AWS-specific properties you can override.

You can pass them as inline properties in the stream definition or when deploying the stream by following the deployer properties convention.
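For example, with the app-starters based s3 source, a stream definition with the AWS properties inline could look roughly like this (bucket name, credentials, and region are placeholders; check the README of the s3-source version you actually have registered for the exact property names):

    # AWS properties inline in the stream definition (older app-starters style)
    stream create s3ToBatch --definition "s3 --cloud.aws.credentials.accessKey=<ACCESS_KEY> --cloud.aws.credentials.secretKey=<SECRET_KEY> --cloud.aws.region.static=us-east-1 --cloud.aws.stack.auto=false --s3.remote-dir=my-bucket | task-launcher-dataflow"

    # Or keep the definition generic and supply the same properties at deploy
    # time, prefixed with app.<app-name>. (here the app name is "s3")
    stream create s3ToBatch --definition "s3 | task-launcher-dataflow"
    stream deploy --name s3ToBatch --properties "app.s3.cloud.aws.credentials.accessKey=<ACCESS_KEY>,app.s3.cloud.aws.credentials.secretKey=<SECRET_KEY>,app.s3.cloud.aws.region.static=us-east-1,app.s3.s3.remote-dir=my-bucket"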

I provided a detailed example of this use case here (with Minio S3). This works out of the box with the latest release of stream applications, but will require some customization if you are using previous versions.
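For reference, against Minio the source-side properties look roughly like the sketch below. I'm taking the property names from the newer stream-applications generation of the s3 source (s3.common.* and s3.supplier.* prefixes), so verify them against the version you have registered; the task-launcher sink name also differs between releases:

    # Minio endpoint instead of AWS, path-style access, newer property prefixes
    stream create s3ToBatch --definition "s3 --cloud.aws.credentials.access-key=minio --cloud.aws.credentials.secret-key=minio123 --cloud.aws.region.static=us-east-1 --s3.common.endpoint-url=http://minio:9000 --s3.common.path-style-access=true --s3.supplier.remote-dir=my-bucket | task-launcher-dataflow"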
