
Pass an input stream over HTTP in Mule using the HTTP connector

I have a requirement to pass an input stream over HTTP. I am reading the file with the File connector in Mule and passing the input stream to the HTTP connector. The files will be huge, ranging from 250 MB to ~10 GB. When I try with a 700 MB file, the HTTP connector runs out of memory, so it seems the connector is loading the whole file into memory. Why is it not passing it through as a stream? What is the best way to do this?
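A minimal Mule 3 sketch of the kind of flow being described — all names, hosts, and paths here are made-up placeholders for illustration, not the asker's actual config:

```xml
<!-- Hedged sketch: connector names, host, and paths are assumptions -->
<http:request-config name="HTTP_Request_Configuration"
                     host="example.com" port="8081"/>

<flow name="fileToHttpFlow">
    <!-- Poll a directory for the incoming file -->
    <file:inbound-endpoint path="/data/in" responseTimeout="10000"/>
    <!-- Push the payload to the remote service -->
    <http:request config-ref="HTTP_Request_Configuration"
                  path="/upload" method="POST"/>
</flow>
```

With a flow like this, whether the payload travels as a stream or as a fully materialized byte array depends on how the File connector is configured, which is what the answer below addresses.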

I believe this is more of an architect-level decision: design the integration so it does not suffer a performance impact. For huge transfers at the GB level, prefer a file-to-file transfer protocol. In your case you are already reading the file (via the File connector), processing it, and uploading it again to FTP, Amazon S3, or SFTP. Make sure you compress the file on upload.

A few options:

1. Write custom code in a Java component for the HTTP call and use a multipart upload; or
2. Split the file and push it part by part through the HTTP connector; or
3. Use the Batch component in Mule (very fast). Batch is built to process huge amounts of data quickly, and you can split the payload and push it to HTTP part by part; the batch block size makes this easy to control.

Before choosing option 1, try the Batch component. See https://docs.mulesoft.com/mule-user-guide/v/3.6/batch-processing — Batch is specifically designed to handle large payloads. Also decide up front whether you need to push everything in one go or can split and send.
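To illustrate the split-and-push idea behind options 1 and 2, here is a hedged Java sketch of streaming a payload in fixed-size buffers so memory use stays constant regardless of file size. The class and method names are my own, not part of any Mule API; the commented-out `HttpURLConnection` lines show where `setChunkedStreamingMode` (a standard JDK call) would keep the JVM from buffering the whole request body:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {

    /**
     * Copy an input stream to an output stream in fixed-size buffers.
     * Only one buffer's worth of data is ever held in memory, so this
     * works the same for a 250 MB file or a 10 GB file.
     * Returns the total number of bytes copied.
     */
    static long copyInChunks(InputStream in, OutputStream out, int bufferSize)
            throws IOException {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In a real upload, the sink would be the connection's output stream:
        //   HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        //   conn.setDoOutput(true);
        //   conn.setChunkedStreamingMode(8192); // never buffer the full body
        //   copyInChunks(fileStream, conn.getOutputStream(), 8192);
        // Here we just demonstrate with an in-memory sink.
        byte[] payload = new byte[1_000_000];
        for (int i = 0; i < payload.length; i++) {
            payload[i] = (byte) (i % 251);
        }
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copyInChunks(new ByteArrayInputStream(payload), sink, 8192);
        System.out.println("copied " + copied + " bytes");
    }
}
```

The key design point is that nothing in the copy loop depends on the total file size, which is exactly what breaks when a connector materializes the whole payload first.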

Please share your config XML. Make sure that you have enabled streaming on your File connector; if it is not enabled, the entire file content is loaded into memory, which will kill the JVM.
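For reference, a hedged sketch of what enabling streaming on a Mule 3 File connector looks like (the connector name here is an assumption):

```xml
<!-- streaming="true" makes the payload an InputStream instead of
     loading the whole file content into memory -->
<file:connector name="streamingFileConnector" streaming="true"/>
```

The inbound endpoint then points at it with `connector-ref="streamingFileConnector"` so the flow receives a stream rather than a byte array.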

Cheers!
