Good morning. I have a desktop app that uploads files to a WCF service, and the WCF service then uploads them to Amazon S3.
This is my WCF method that receives the file and uploads it to S3:
public void UploadFile(RemoteFileInfo request)
{
    config = new AmazonS3Config();
    config.CommunicationProtocol = Protocol.HTTP;
    accessKeyID = "XXXXXXX";
    secretAccessKeyID = "YYYYYYYY";
    client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKeyID, secretAccessKeyID, config);
    int chunkSize = 2048;
    byte[] buffer = new byte[chunkSize];
    using (System.IO.MemoryStream writeStream = new System.IO.MemoryStream())
    {
        do
        {
            // read bytes from the incoming WCF stream
            int bytesRead = request.FileByteStream.Read(buffer, 0, chunkSize);
            if (bytesRead == 0) break;
            // simulates a slow connection
            System.Threading.Thread.Sleep(3);
            // write bytes to the buffer stream
            writeStream.Write(buffer, 0, bytesRead);
        } while (true);
        // report end of the transfer from the client
        Console.WriteLine("Done!");
        // rewind the buffer before handing it to S3,
        // otherwise PutObject reads from the end and uploads zero bytes
        writeStream.Position = 0;
        // start the upload to S3
        PutObjectRequest fileRequest = new PutObjectRequest();
        fileRequest.InputStream = writeStream;
        fileRequest.Key = "testfile.pdf";
        fileRequest.BucketName = "tempbucket";
        fileRequest.CannedACL = S3CannedACL.Private;
        fileRequest.StorageClass = S3StorageClass.Standard;
        client.PutObject(fileRequest);
    }
}
On the client I get the upload progress in real time while the file is being sent to the WCF service, but reaching 100% only means the transfer to the service is complete, not the upload to S3. So I would like to know if it's possible to upload the file to S3 while I'm still writing the stream, i.e. inside the
using (System.IO.MemoryStream writeStream = new System.IO.MemoryStream())
block.
Is this possible? Any guidelines on how to do it?
Thanks in advance.
You can use the InputStream property of PutObjectRequest:
public void UploadFile(RemoteFileInfo request)
{
    config = new AmazonS3Config();
    config.CommunicationProtocol = Protocol.HTTP;
    accessKeyID = "XXXXXXX";
    secretAccessKeyID = "YYYYYYYY";
    client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKeyID, secretAccessKeyID, config);
    int chunkSize = 2048;
    byte[] buffer = new byte[chunkSize];
    PutObjectRequest fileRequest = new PutObjectRequest();
    fileRequest.Key = "testfile.pdf";
    fileRequest.BucketName = "tempbucket";
    fileRequest.CannedACL = S3CannedACL.Private;
    fileRequest.StorageClass = S3StorageClass.Standard;
    using (fileRequest.InputStream = new System.IO.MemoryStream())
    {
        do
        {
            // read bytes from the incoming WCF stream
            int bytesRead = request.FileByteStream.Read(buffer, 0, chunkSize);
            if (bytesRead == 0) break;
            // simulates a slow connection
            System.Threading.Thread.Sleep(3);
            // write bytes into the request's input stream
            fileRequest.InputStream.Write(buffer, 0, bytesRead);
        } while (true);
        // report end of the transfer from the client
        Console.WriteLine("Done!");
        // rewind so PutObject reads the stream from the beginning
        fileRequest.InputStream.Position = 0;
        client.PutObject(fileRequest);
    }
}
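Note that both versions still buffer the entire file in a MemoryStream before PutObject runs, so the S3 upload only starts once the WCF transfer has finished. If you want S3 to receive data while it is still arriving, one option is S3's multipart upload API. The following is only a sketch, not the original poster's code: it assumes a later AWS SDK for .NET (member names such as InitiateMultipartUpload, UploadPart, and PartETag vary by SDK version) and reuses the example names client, "tempbucket", and "testfile.pdf" from above.

```csharp
// Sketch: stream the incoming WCF data to S3 as multipart upload parts.
// S3 requires each part (except the last) to be at least 5 MB, so chunks
// from the WCF stream are accumulated into a part buffer first.
var init = client.InitiateMultipartUpload(new InitiateMultipartUploadRequest
{
    BucketName = "tempbucket",
    Key = "testfile.pdf"
});

var partETags = new List<PartETag>();
int partNumber = 1;
const int partSize = 5 * 1024 * 1024; // S3 minimum part size (last part may be smaller)
byte[] partBuffer = new byte[partSize];
int filled = 0;

while (true)
{
    int n = request.FileByteStream.Read(partBuffer, filled, partSize - filled);
    if (n > 0) filled += n;
    bool endOfStream = (n == 0);

    // upload a part as soon as the buffer is full, or flush the remainder at the end
    if (filled == partSize || (endOfStream && filled > 0))
    {
        var partResponse = client.UploadPart(new UploadPartRequest
        {
            BucketName = "tempbucket",
            Key = "testfile.pdf",
            UploadId = init.UploadId,
            PartNumber = partNumber,
            PartSize = filled,
            InputStream = new MemoryStream(partBuffer, 0, filled)
        });
        partETags.Add(new PartETag(partNumber, partResponse.ETag));
        partNumber++;
        filled = 0;
    }
    if (endOfStream) break;
}

// tell S3 the upload is complete so it assembles the parts into one object
client.CompleteMultipartUpload(new CompleteMultipartUploadRequest
{
    BucketName = "tempbucket",
    Key = "testfile.pdf",
    UploadId = init.UploadId,
    PartETags = partETags
});
```

With this approach, each part is on its way to S3 while the client is still sending the next one, so the two transfers overlap instead of running back to back.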
I would recommend uploading the file to the WCF service in chunks instead of as a single stream. I did so and it works just fine. You also need to return the actual number of bytes written to Amazon, so that later on you can advance the progress bar based on it. I know this forces you to write a while loop in the client application, but it lets you show progress with 100% accuracy for large files. Your WCF function should take a parameter like this:
[DataContract]
public class RemoteFileInfo
{
    [DataMember]
    public byte[] myChunk;
    [DataMember]
    public long myOffset;
    // other fields you think need to be sent each time
}
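The client-side while loop mentioned above could look like the following sketch. It assumes the RemoteFileInfo members are public, and the proxy class ServiceClient, its UploadChunk operation (returning total bytes written to Amazon so far), and the 4 KB chunk size are all hypothetical names and values, not part of the original code.

```csharp
// Sketch of the client-side chunked upload loop.
// "ServiceClient" and "UploadChunk" are hypothetical WCF proxy names.
const int chunkSize = 4096; // arbitrary example size
var proxy = new ServiceClient();

using (FileStream file = File.OpenRead(@"C:\path\to\testfile.pdf"))
{
    byte[] buffer = new byte[chunkSize];
    long offset = 0;
    int bytesRead;
    while ((bytesRead = file.Read(buffer, 0, chunkSize)) > 0)
    {
        // copy only the bytes actually read, since the last chunk may be short
        byte[] chunk = new byte[bytesRead];
        Array.Copy(buffer, chunk, bytesRead);

        var info = new RemoteFileInfo { myChunk = chunk, myOffset = offset };

        // the service reports how many bytes have reached Amazon so far,
        // which drives the progress bar accurately
        long writtenToAmazon = proxy.UploadChunk(info);
        Console.WriteLine("Progress: {0:P0}", (double)writtenToAmazon / file.Length);

        offset += bytesRead;
    }
}
```

Because the progress figure comes from the service's report of bytes written to Amazon rather than bytes sent over WCF, 100% really does mean the file is in S3.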