
Sending an image base64-encoded and uploading it to S3 after decoding

I am trying to upload very large files to S3. Right now the image first goes to a temp directory on the server, and then I upload it to S3. This process is time-consuming, and if multiple users upload large images at the same time it consumes a lot of bandwidth, which affects the production site. I also need three more sizes of each image, so I resize the uploaded image and send the three resized copies to S3 along with the original.

I researched this and found three possible solutions:

  1. Upload the image directly to S3. This is ruled out because we would have to expose our AWS keys, which we can't do.
  2. Add all images to a queue and run background jobs. The jobs would resize the images and upload them from the server to S3 later. This is ruled out because it would still consume bandwidth and affect server performance; also, since we use AWS Elastic Beanstalk, any images still sitting on the instance would be lost as soon as we deploy, so they would be of no use.
  3. Use JavaScript. We would base64-encode the image in the browser, send it to the server via AJAX, decode it there, save it somewhere, and then upload it to S3.

So my question is: will the third option save time when uploading a large image? I can send the image to the server encoded, but will it actually save the user any upload time?

If not, what else can I do to save time and server bandwidth while uploading large images to S3? Please help.

If at all possible, you should upload directly to Amazon S3. This avoids 'double-handling' of the files and is a much more scalable solution.

You are concerned about "exposing AWS keys". There is no need to be concerned: you can generate temporary, time-limited credentials with a limited set of permissions using the AWS Security Token Service (STS). It works this way:

  • Your application authenticates users and checks to see that they are authorised to upload objects to Amazon S3
  • Your application calls the Security Token Service and requests a set of temporary credentials with permissions that only allow upload to a specific bucket and subdirectory within Amazon S3
  • Pass those credentials to your users, or use them in a web page, to allow them to upload objects
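The steps above can be sketched with boto3. This is a minimal sketch, assuming a per-user key prefix (`uploads/<user_id>/`) that is not part of the question; the policy and duration values are example choices:

```python
import json


def scoped_upload_policy(bucket, user_prefix):
    """IAM policy allowing PutObject only under one prefix of one bucket.

    The per-user prefix layout is an assumption for illustration.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/{user_prefix}/*",
        }],
    }


def temporary_upload_credentials(bucket, user_id):
    # boto3 is imported lazily so the pure policy helper above
    # can be used (and tested) without the AWS SDK installed.
    import boto3

    sts = boto3.client("sts")
    resp = sts.get_federation_token(
        Name=f"upload-{user_id}"[:32],  # federated user name, max 32 chars
        Policy=json.dumps(scoped_upload_policy(bucket, f"uploads/{user_id}")),
        DurationSeconds=900,            # shortest lifetime STS allows
    )
    # Hand AccessKeyId / SecretAccessKey / SessionToken to the client;
    # they expire automatically and can only PutObject under the prefix.
    return resp["Credentials"]
```

Your application would call `temporary_upload_credentials` after authenticating the user, and the browser or mobile client would then upload with those throwaway credentials instead of your real keys.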

The key concept with AWS is to design for massive scale. Uploading directly to Amazon S3 makes this possible, whereas uploading to your own server first causes a bottleneck (and costs more money!).

Number 2 is actually the recommended option when dealing with image processing. You can also use Lambda functions to do the image conversion ( http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser.html )

The job will be triggered as soon as an image is added to a specific S3 bucket.
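A Lambda handler for that trigger could look like the sketch below. The `resized/` output prefix, the size labels, and the key-naming scheme are assumptions for illustration (and the trigger should be restricted to the upload prefix so the function doesn't re-fire on its own output):

```python
import posixpath

# Example output sizes; the question needs three resized copies.
SIZES = {"thumb": 128, "medium": 512, "large": 1024}


def resized_key(key, label):
    """Derive an output key, e.g. 'uploads/cat.jpg' -> 'resized/cat-thumb.jpg'.

    The naming scheme is an assumption, not from the question.
    """
    base, ext = posixpath.splitext(posixpath.basename(key))
    return f"resized/{base}-{label}{ext}"


def handler(event, context):
    # Imported inside the handler so the pure helper above is testable
    # without boto3/Pillow; Pillow must be bundled in the deployment package.
    import io
    import boto3
    from PIL import Image

    s3 = boto3.client("s3")
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        for label, size in SIZES.items():
            img = Image.open(io.BytesIO(body))
            img.thumbnail((size, size))  # resize in place, keeping aspect ratio
            out = io.BytesIO()
            img.save(out, format=img.format or "JPEG")
            s3.put_object(Bucket=bucket, Key=resized_key(key, label),
                          Body=out.getvalue())
```

With this in place your web server never touches the image bytes at all: the browser uploads the original to S3, and the resizing happens inside AWS.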

If need be, you can also have the browser upload images directly to S3 with an HTTP POST form ( http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingHTTPPOST.html )
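With boto3 the server can generate a presigned POST form for that, so the browser uploads straight to S3 without ever seeing your AWS keys. A minimal sketch; the `uploads/` prefix and the 20 MB cap are example values, not from the question:

```python
def upload_conditions(prefix, max_bytes):
    """POST policy conditions restricting what the browser form may upload.

    The prefix and size limit here are illustrative assumptions.
    """
    return [
        ["starts-with", "$key", prefix],
        ["content-length-range", 1, max_bytes],
    ]


def make_browser_upload_form(bucket, key,
                             prefix="uploads/", max_bytes=20 * 1024 * 1024):
    # Lazy import so the pure conditions helper works without boto3.
    import boto3

    s3 = boto3.client("s3")
    # Returns {"url": ..., "fields": {...}}: render "fields" as hidden
    # form inputs and POST the file to "url" from the browser.
    return s3.generate_presigned_post(
        Bucket=bucket,
        Key=key,
        Conditions=upload_conditions(prefix, max_bytes),
        ExpiresIn=300,  # the form is valid for 5 minutes
    )
```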

I'm not sure if you have tried this yet, but I have used the JavaScript SDK to do exactly this. I was doing what you are doing, saving the images on another server and then shifting them to S3, but with the JavaScript SDK you can go directly to S3, and you can add events to process the images after they arrive.

  • Use Cognito to get unauthenticated/authenticated user access.
  • Grant permissions as an authenticated user on the relevant bucket.
  • Edit the CORS settings in the bucket's configuration editor.
  • Use the code in the JavaScript SDK to do whatever you want.

I hope this helps.
