
Laravel secure Amazon s3 bucket files

I am using Amazon S3, but I am facing two problems:

1. I can't upload files directly to the Amazon server when I submit the form. I have to upload images to an upload folder on my PHP server first, then retrieve them from there and upload them to the S3 server. Is there a way to upload images directly to S3 when we click submit?

2. If I pass 'public' in the S3 put object call, I can access and view the files, but making them public means everyone can view them. I need to protect all files and show them only to authenticated users. Can anyone suggest how to fix this?

try {
    $s3 = \Storage::disk('s3');
    $s3->put($strFileName, file_get_contents($img_path.$strFileName), 'public');
} catch (Aws\S3\Exception\S3Exception $e) {
    // PHP string concatenation uses ".", not "+"
    echo "There was an error uploading the file.\n" . $e;
}

Before asking, I read many answers on Stack Overflow, but none of them helped me fix this issue. Thanks.

For your first question: you can upload images directly to AWS S3 like this.

$s3 = \Storage::disk('s3')->getDriver();
$s3->put($filePath, file_get_contents($file), [
    'ContentDisposition' => 'attachment; filename=' . $file_name,
    'ACL'                => 'public-read',
]);

Specify your target file path and the file you receive from the form.
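
Note that this still passes the bytes through PHP, but it removes the intermediate upload folder: you can stream the uploaded temp file straight from the request to S3. A minimal sketch, assuming a Laravel controller and a form field named file (the method and field names are illustrative):

public function upload(\Illuminate\Http\Request $request)
{
    $file = $request->file('file');

    // Stream the PHP temp file straight to S3 -- no local upload folder needed
    \Storage::disk('s3')->put(
        $file->getClientOriginalName(),
        fopen($file->getRealPath(), 'r'),
        'public'
    );

    return back();
}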

1. Is there a way to upload images directly when we click on submit?

Yes

How:

You need to do this using JavaScript (with AJAX) in two parts:

a) when the user clicks "submit", you trap the event and upload the file first (see http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/browser-examples.html for an example), and b) then you submit the form through AJAX and handle the response.

However:

This allows the user to upload anything, which may cause problems. There are tips (just below the example) for creating URLs that are authenticated for 15 minutes (configurable), but what happens if a user takes longer than 15 minutes, tries to upload 100 files within 15 minutes, uploads something other than an image file, or uploads a badly formatted image file?

It's much safer to pull the files onto your server, verify that they are images of the type/size you need, and then upload them to S3 from the server.

Of course, if this is a simple admin tool and you control who accesses the code, then go for it - hopefully you'll only upload what you expect.
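
If you do route uploads through your server, Laravel's validator can reject anything that isn't an image before you touch S3. A short sketch using the framework's built-in rules (the field name and size limit are illustrative):

$this->validate($request, [
    'file' => 'required|image|max:2048', // "image" checks the MIME type; "max" is in kilobytes
]);

// Only now push the verified file up to S3
\Storage::disk('s3')->put($strFileName, file_get_contents($request->file('file')->getRealPath()));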

2. I need to protect all files and show them only to the authenticated user

By "authenticated user": if you mean "the user that uploaded the image" then s3 alone does not provide the functionality, but CloudFront does. You can issue pre-authorised URLs or signed cookies: http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-choosing-signed-urls-cookies.html

If "authenticated user" means the person that uploaded the file, then according to the docs, this is not possible in Laravel class without getting access to the underlying client. public and private are your only visibility options by default, which are translated to public-read , but you need authenticated-read , bucket-owner-read or one of the other canned grants (ref: http://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl ) If the authenticated-read or other canned ACL grants don't give the permissions profile you need, you can create your own (details further up on that same page).

The solution is to grab the underlying client and call putObject directly. (And if you go that far, you may as well ditch the Laravel library, pull in the S3 SDK, and do it all yourself - then you have full control of everything.)

// Reach through Laravel/Flysystem to the underlying Aws\S3\S3Client
$client = $disk->getDriver()->getAdapter()->getClient();

$client->putObject([
    'Bucket' => $bucket,
    'Key'    => $fileName,
    'Body'   => $fileBody,
    'ACL'    => 'authenticated-read'  /* Or whichever canned ACL suits */
]);
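
With the object stored as authenticated-read (or private), a browser can no longer load it directly, so your app has to mint a temporary URL for each authenticated request. A minimal sketch using the same underlying client, assuming AWS SDK for PHP v3:

// Build a GetObject command and pre-sign it for a short window
$command = $client->getCommand('GetObject', [
    'Bucket' => $bucket,
    'Key'    => $fileName,
]);

// Hand this URL only to users your application has authenticated
$presignedUrl = (string) $client->createPresignedRequest($command, '+10 minutes')->getUri();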

I recently tackled this problem. First off, yes, you can upload directly to S3; here is what I used for background: http://docs.aws.amazon.com/AmazonS3/latest/dev/HTTPPOSTExamples.html

First you need to create a policy and signature server-side to add to your HTML form for uploading files.

$policy = base64_encode(json_encode([
    "expiration" => "2100-01-01T00:00:00Z",
    "conditions" => [
        ["bucket" => "bucketname"],
        ["starts-with", '$key', "foldername"],
        ["acl" => "public-read"],
        ["starts-with", '$Content-Type', "image/"],
        ["success_action_status" => '201'],
    ]
]));
$signature = base64_encode(hash_hmac('sha1', $policy, getenv('S3_SECRET_KEY'), true));
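
The upload form below is a Blade template that expects $policy and $signature, so pass both from the controller; something like this (the view name is illustrative):

return view('upload', compact('policy', 'signature'));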

On the frontend, my form doesn't use a submit button. You could use one, but you would need to catch the submit event and prevent the form from actually submitting until after the upload finishes.

When we click save, the script generates an MD5 hash (an md5 function installed via npm) for the filename, so that file names can't easily be guessed, then uses AJAX to upload the file to S3. Once that finishes, it puts the file data and the returned AWS data into a hidden input and submits the form. It should look something like this:

<form action="/post/url" method="POST" id="form">
    <input type="text" name="other_field" />
    <input type="file" class="form-control" id="image_uploader" name="file" accept="image/*" />
    <input type="hidden" id="hidden_medias" name="medias" value="[]" />
</form>
<input type="button" value="save" id="save" />
<script>
$(document).ready(function(){
    $('#save').click(function(){
            uploadImage(function () {
                $('#form').submit();
            });
    });
});
var uploadImage = function(callback) {
    var file = $('#image_uploader')[0].files[0];
    if(file !== undefined) {
        var data = new FormData();
        var filename = md5(file.name + Math.floor(Date.now() / 1000));
        var filenamePieces = file.name.split('.');
        var extension = filenamePieces[filenamePieces.length - 1];
        data.append('acl',"public-read");
        data.append('policy',"{!! $policy !!}");
        data.append('signature',"{!! $signature !!}");
        data.append('Content-Type', file.type); // must start with "image/" to satisfy the policy
        data.append('success_action_status',"201");
        data.append('AWSAccessKeyId',"{!! getenv('S3_KEY_ID') !!}");
        data.append('key',filename + '.' + extension);
        data.append('file', file);

        var fileData = {
            type: file.type,
            name: file.name,
            size: file.size
        };

        $.ajax({
            url: 'https://{bucket_name}.s3.amazonaws.com/',
            type: 'POST',
            data: data,
            processData: false,
            contentType: false,

            success: function (awsData) {
                var xmlData = new XMLSerializer().serializeToString(awsData);
                var currentImages = JSON.parse($('#hidden_medias').val());
                currentImages.push({
                    awsData: xmlData,
                    fileData: fileData
                });
                $('#hidden_medias').val(JSON.stringify(currentImages));
                callback();
            },
            error: function (errorData) {
                console.log(errorData);
            }
        });
    }
};
</script>

The controller listening for the submit then parses the JSON from that input field and creates an instance of Media (a model I created) and it stores the awsData and fileData for each image.
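
That controller isn't shown here; a rough sketch of what it could look like, assuming Media casts its info column to an array (S3's 201 response is XML containing Location, Bucket, Key and ETag):

public function store(\Illuminate\Http\Request $request)
{
    foreach (json_decode($request->input('medias'), true) as $upload) {
        // Pull the object's URL out of the serialized S3 response
        $xml = simplexml_load_string($upload['awsData']);

        Media::create([
            'info' => [
                'aws'  => ['Location' => (string) $xml->Location],
                'file' => $upload['fileData'],
            ],
        ]);
    }

    return back();
}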

Then instead of pointing html image tags to the s3 file like this:

<img src="https://{bucketname}.s3.amazonaws.com/filename.jpg" />

I do something like this:

<img src="/medias/{id}" />

Then the route can go through the normal auth middleware and whatever else you need in Laravel. Finally, that route points to a controller that does this:

public function getResponse($id)
{
    $media = Media::find($id);
    return (new Response('',301,['Location' => $media->info['aws']['Location']]));
}

So all this does is issue a 301 redirect with the Location header set to the actual AWS file. Since we generate an MD5 filename when we upload the file to AWS, each filename is an MD5 hash, so people can't simply guess file URLs in the bucket.
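
The route itself just sits behind the normal auth middleware; for example (the controller name is illustrative):

// Only authenticated users can reach the redirect above
Route::get('/medias/{id}', 'MediaController@getResponse')->middleware('auth');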

Here is a solution to your question, which I'm also using, though without Laravel.

1. To upload any file directly to a specific folder in an Amazon AWS S3 bucket, you can do it like this.

HTML

<form action="upload.php" enctype="multipart/form-data">
    <input type="file" name="file" id="file" />
    <button type="submit">Upload file</button>
</form>

PHP - upload.php

Include the AWS PHP SDK

require "vendor/autoload.php";

Initialize the S3 client

$credentials = new Aws\Credentials\Credentials(
    '<AWS ACCESS KEY>',
    '<AWS ACCESS SECRET>'
);

$s3Client = Aws\S3\S3Client::factory(
    [
        'credentials' => $credentials,
        'region' => 'us-east-1',
        'version' => 'latest'
    ]
);

Create the file upload entity

$uploadEntity = array(
    'Bucket' => '<S3 Bucket Name>',
    'Key'    => '<Upload_Folder_If_Any>/<FileName>',
    'Body'   => fopen($_FILES['file']['tmp_name'], 'r+'),
    //'ContentDisposition' => 'attachment; filename="<FileName>"' // <-- if you need to allow downloading
);

Upload to the S3 bucket

try {
    // $result will be Aws\Result object
    $result = $s3Client->putObject($uploadEntity);  
} catch (Aws\S3\Exception\S3Exception $exception) {
    // S3 Exception
}
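
On success, the returned Aws\Result exposes the uploaded object's details; for instance:

// The SDK includes the object's URL in the putObject result
$objectUrl = $result['ObjectURL'];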

2. Now, to serve the uploaded files to authenticated users only

Firstly, you will need to create a private bucket policy for the S3 bucket.

Bucket policy - to generate the bucket policy you can use the AWS Policy Generator. With it you can generate a policy like the one below (copied from the Improve.dk website):

{
  "Id": "Policy1319566876317",
  "Statement": [
    {
      "Sid": "Stmt1319566860498",
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::<BucketName>/*",
      "Principal": {
        "CanonicalUser": [
          "<CannoicalUserId_of_OAI>"
        ]
      }
    }
  ]
}

Secondly, you will need to set up a private CloudFront web distribution in front of the S3 bucket. Only then will you be able to serve your content exclusively to authenticated users, through AWS signed URLs or signed cookies.

Thirdly, to generate the signed URL you will need the .pem private key file, which you can only obtain from the AWS console.

Generate the signed URL

$cdnName = '<AWS CLOUDFRONT WEB DISTRIBUTION CDN>';
$assetName = '<UPLOAD_FOLDER_IF_ANY>/<FILENAME>';
$expiry = time() + 300;     // 5 mins expiry time, ie. the signed url will be valid only for 5 mins

$cloudFront = Aws\CloudFront\CloudFrontClient::factory(array(
    'credentials' => $credentials,
    'region' => 'us-east-1',
    'version' => 'latest'
));

// Use this for creating signed url
$signedUrl = $cloudFront->getSignedUrl([
    'url'         => $cdnName . '/' . $assetName,
    'expires'     => $expiry,
    'private_key' => '/path/to/your/cloudfront-private-key.pem',
    'key_pair_id' => '<cloudfront key pair id>'
]);

// Use this for signed cookie    
$signedCookie = $cloudFront->getSignedCookie([
    'url'         => $cdnName . '/' . $assetName,
    'expires'     => $expiry,
    'private_key' => '/path/to/your/cloudfront-private-key.pem',
    'key_pair_id' => '<cloudfront key pair id>'
]);
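
getSignedCookie returns an array of cookie name/value pairs; to use it you send each one to the browser. A plain-PHP sketch, matching the rest of this answer (the cookie domain must cover your CloudFront domain and is illustrative here):

foreach ($signedCookie as $name => $value) {
    // Secure, HTTP-only cookies scoped to the CDN domain
    setcookie($name, $value, $expiry, '/', '<your-cookie-domain>', true, true);
}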
