I would like to upload a file to Amazon S3 from a .NET Core project. Is there any reference on how to create and use an AmazonS3 client? All I can find in the Amazon S3 documentation for .NET Core is this (http://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-config-netcore.html), which is not very helpful.
I did it using IFormFile, like this (you need to install the AWSSDK.S3 package):
using System.IO;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;
using Microsoft.AspNetCore.Http;

public async Task UploadFileToS3(IFormFile file)
{
    using (var client = new AmazonS3Client("yourAwsAccessKeyId", "yourAwsSecretAccessKey", RegionEndpoint.USEast1))
    {
        using (var newMemoryStream = new MemoryStream())
        {
            // Copy the uploaded file into memory so TransferUtility can read it.
            file.CopyTo(newMemoryStream);

            var uploadRequest = new TransferUtilityUploadRequest
            {
                InputStream = newMemoryStream,
                Key = file.FileName,
                BucketName = "yourBucketName",
                CannedACL = S3CannedACL.PublicRead
            };

            var fileTransferUtility = new TransferUtility(client);
            await fileTransferUtility.UploadAsync(uploadRequest);
        }
    }
}
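If it helps, a minimal controller action calling this method could look like the sketch below (route and action name are just placeholders, not part of the original answer):
[HttpPost("upload")]
public async Task<IActionResult> Upload(IFormFile file)
{
    if (file == null || file.Length == 0)
        return BadRequest("No file was supplied.");

    // Delegates to the UploadFileToS3 method shown above.
    await UploadFileToS3(file);
    return Ok();
}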
For simple file uploading in a .NET Core project, I first followed a basic file-upload tutorial.
After finishing the simple file upload procedure, I followed the relevant documentation, which was very helpful. The following two links were also helpful for a quick start:
https://github.com/awslabs/aws-sdk-net-samples/blob/master/ConsoleSamples/AmazonS3Sample/AmazonS3Sample/S3Sample.cs
http://www.c-sharpcorner.com/article/fileupload-to-aws-s3-using-asp-net/
This was my final code in the controller for file upload (I skipped the view part, which is explained in detail in the links shared above).
[HttpPost("UploadFiles")]
public IActionResult UploadFiles(List<IFormFile> files)
{
long size = files.Sum(f => f.Length);
foreach (var formFile in files)
{
if (formFile.Length > 0)
{
var filename = ContentDispositionHeaderValue
.Parse(formFile.ContentDisposition)
.FileName
.TrimStart().ToString();
filename = _hostingEnvironment.WebRootPath + $#"\uploads" + $#"\{formFile.FileName}";
size += formFile.Length;
using (var fs = System.IO.File.Create(filename))
{
formFile.CopyTo(fs);
fs.Flush();
}//these code snippets saves the uploaded files to the project directory
uploadToS3(filename);//this is the method to upload saved file to S3
}
}
return RedirectToAction("Index", "Library");
}
This is the method to upload files to Amazon S3:
private IHostingEnvironment _hostingEnvironment;
private AmazonS3Client _s3Client = new AmazonS3Client(RegionEndpoint.EUWest2);
private string _bucketName = "mis-pdf-library";//this is my Amazon Bucket name
private static string _bucketSubdirectory = String.Empty;
public UploadController(IHostingEnvironment environment)
{
_hostingEnvironment = environment;
}
public void uploadToS3(string filePath)
{
    try
    {
        var fileTransferUtility = new TransferUtility(_s3Client);

        string bucketName;
        if (string.IsNullOrEmpty(_bucketSubdirectory))
        {
            bucketName = _bucketName; // no subdirectory, just the bucket name
        }
        else
        {
            // bucket name plus subdirectory (used as a key prefix)
            bucketName = _bucketName + @"/" + _bucketSubdirectory;
        }

        // 1. Upload a file; the file name is used as the object key name.
        fileTransferUtility.Upload(filePath, bucketName);
        Console.WriteLine("Upload 1 completed");
    }
    catch (AmazonS3Exception s3Exception)
    {
        Console.WriteLine(s3Exception.Message);
        Console.WriteLine(s3Exception.InnerException);
    }
}
That was all for uploading files to an Amazon S3 bucket. I worked with .NET Core 2.0. Also, don't forget to add the necessary dependencies for using the Amazon API:
AWSSDK.Core
AWSSDK.Extensions.NETCore.Setup
AWSSDK.S3
Hope this helps.
I wrote a complete sample for uploading a file to Amazon AWS S3 with ASP.NET Core MVC.
You can see my sample project on GitHub:
https://github.com/NevitFeridi/AWS_Upload_Sample_ASPCoreMVC
There is a function for uploading a file to S3 using the Amazon.S3 SDK in the HomeController.
In this function, UploadFileToAWSAsync, you can find everything you need:
// You must set your access key and secret key.
// To get them, go to your AWS console.
string AWS_accessKey = "xxxxxxx";
string AWS_secretKey = "xxxxxxxxxxxxxx";
string AWS_bucketName = "my-uswest";
string AWS_defaultFolder = "MyTest_Folder";

protected async Task<string> UploadFileToAWSAsync(IFormFile myfile, string subFolder = "")
{
    var result = "";
    try
    {
        var s3Client = new AmazonS3Client(AWS_accessKey, AWS_secretKey, Amazon.RegionEndpoint.USWest2);
        var bucketName = AWS_bucketName;

        // Build the object key from the default folder, an optional subfolder, and the file name.
        var keyName = AWS_defaultFolder;
        if (!string.IsNullOrEmpty(subFolder))
            keyName = keyName + "/" + subFolder.Trim();
        keyName = keyName + "/" + myfile.FileName;

        var fs = myfile.OpenReadStream();
        var request = new Amazon.S3.Model.PutObjectRequest
        {
            BucketName = bucketName,
            Key = keyName,
            InputStream = fs,
            ContentType = myfile.ContentType,
            CannedACL = S3CannedACL.PublicRead
        };

        await s3Client.PutObjectAsync(request);
        result = string.Format("http://{0}.s3.amazonaws.com/{1}", bucketName, keyName);
    }
    catch (Exception ex)
    {
        result = ex.Message;
    }
    return result;
}
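As a usage example, a controller action (hypothetical name and route, not taken from the sample project) could call it and return the resulting URL roughly like this:
[HttpPost("UploadToAws")]
public async Task<IActionResult> UploadToAws(IFormFile myfile)
{
    // Returns either the public object URL or the exception message from UploadFileToAWSAsync.
    var resultUrl = await UploadFileToAWSAsync(myfile, "images");
    return Ok(resultUrl);
}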
In addition to @Tiago's answer: the AWS S3 SDK has changed a bit, so here is the updated method:
public async Task UploadImage(IFormFile file)
{
    var credentials = new BasicAWSCredentials("access", "secret key");
    var config = new AmazonS3Config
    {
        RegionEndpoint = Amazon.RegionEndpoint.EUNorth1
    };

    using var client = new AmazonS3Client(credentials, config);
    await using var newMemoryStream = new MemoryStream();
    file.CopyTo(newMemoryStream);

    var uploadRequest = new TransferUtilityUploadRequest
    {
        InputStream = newMemoryStream,
        Key = file.FileName,
        BucketName = "your-bucket-name",
        CannedACL = S3CannedACL.PublicRead
    };

    var fileTransferUtility = new TransferUtility(client);
    await fileTransferUtility.UploadAsync(uploadRequest);
}
Per the AWS SDK docs, .NET Core support was added in late 2016.
https://aws.amazon.com/sdk-for-net/
So the instructions for uploading files to S3 should be identical to any other instructions for .NET.
The "getting started" guide for the AWS SDK for .NET covers literally the case you describe, connecting and uploading a file to S3, and it is included as a sample project ready to run if you've installed the "AWS Toolkit for Visual Studio" (which should be installed with the .NET AWS SDK).
So all you need to do is open Visual Studio and find their sample S3 project, or you can look at it here:
// simple object put
PutObjectRequest request = new PutObjectRequest()
{
ContentBody = "this is a test",
BucketName = bucketName,
Key = keyName
};
PutObjectResponse response = client.PutObject(request);
This assumes you have instantiated an Amazon.S3.AmazonS3Client after including the namespace, and configured it with your own credentials.
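For reference, a minimal sketch of creating such a client (the keys and region below are placeholders; credentials can also come from the SDK credential store or environment variables):
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

// Hard-coding keys is only for illustration; prefer the credential store in real projects.
var client = new AmazonS3Client("yourAccessKeyId", "yourSecretAccessKey", RegionEndpoint.USEast1);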
You first need to install these packages in the Package Manager Console:
Install-Package AWSSDK.Extensions.NETCore.Setup
Install-Package AWSSDK.S3
Then you need to have the credentials file in the directory:
C:\Users\username\.aws\credentials
The credential file should have this format:
[default]
aws_access_key_id=[Write your access key in here]
aws_secret_access_key=[Write your secret access key in here]
region=[Write your region here]
I uploaded to GitHub an example of basic CRUD operations for S3 buckets in ASP.NET Core.
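If you prefer to resolve the client through dependency injection, AWSSDK.Extensions.NETCore.Setup can register it in Startup.cs. A minimal sketch, assuming an injected IConfiguration named Configuration and the default settings location (the "AWS" section or the credentials file above):
using Amazon.S3;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    // Picks up profile/region settings from configuration or the credentials file above.
    services.AddDefaultAWSOptions(Configuration.GetAWSOptions());

    // Registers IAmazonS3 so it can be constructor-injected into controllers.
    services.AddAWSService<IAmazonS3>();

    services.AddMvc();
}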
We came across an issue when using the high-level API in a .NET Core solution. When clients had low bandwidth (approximately 3 Mb/s), Amazon S3 threw an error ("The XML you provided was not well-formed"). To resolve this issue we had to switch to the low-level API:
https://docs.aws.amazon.com/en_us/AmazonS3/latest/dev/LLuploadFileDotNet.html
// Create a list to store the upload part responses.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();

// Set up the information required to initiate the multipart upload.
InitiateMultipartUploadRequest initiateRequest = new InitiateMultipartUploadRequest
{
    BucketName = bucketName,
    Key = pathbucket
};

// Prepare a timestamp for optional object metadata (not attached in this snippet).
string newDate = DateTime.Now.ToString("dd/MM/yyyy HH:mm:ss");

// Initiate the upload.
InitiateMultipartUploadResponse initResponse = await s3Client.InitiateMultipartUploadAsync(initiateRequest);

// Upload parts.
int uploadmb = 5;
long contentLength = new FileInfo(zippath).Length;
long partSize = uploadmb * (long)Math.Pow(2, 20); // 5 MB

try
{
    long filePosition = 0;
    for (int i = 1; filePosition < contentLength; i++)
    {
        UploadPartRequest uploadRequest = new UploadPartRequest
        {
            BucketName = bucketName,
            Key = pathbucket,
            UploadId = initResponse.UploadId,
            PartNumber = i,
            PartSize = partSize,
            FilePosition = filePosition,
            FilePath = zippath
        };

        // Track upload progress.
        uploadRequest.StreamTransferProgress += new EventHandler<StreamTransferProgressArgs>(UploadPartProgressEventCallback);

        // Upload a part and add the response to our list.
        uploadResponses.Add(await s3Client.UploadPartAsync(uploadRequest));
        filePosition += partSize;
    }

    // Set up to complete the upload.
    CompleteMultipartUploadRequest completeRequest = new CompleteMultipartUploadRequest
    {
        BucketName = bucketName,
        Key = pathbucket,
        UploadId = initResponse.UploadId
    };
    completeRequest.AddPartETags(uploadResponses);

    // Complete the upload.
    CompleteMultipartUploadResponse completeUploadResponse = await s3Client.CompleteMultipartUploadAsync(completeRequest);
}
catch (Exception exception)
{
    Console.WriteLine("An AmazonS3Exception was thrown: {0}", exception.Message);

    // Abort the upload so the already-uploaded parts are not left orphaned in the bucket.
    AbortMultipartUploadRequest abortMPURequest = new AbortMultipartUploadRequest
    {
        BucketName = bucketName,
        Key = pathbucket,
        UploadId = initResponse.UploadId
    };
    await s3Client.AbortMultipartUploadAsync(abortMPURequest);
}
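The StreamTransferProgress handler wired up above is not shown in the snippet; a minimal version, along the lines of the AWS sample it follows, could be:
public static void UploadPartProgressEventCallback(object sender, StreamTransferProgressArgs e)
{
    // Reports how many bytes have been transferred so far for the current part.
    Console.WriteLine("{0}/{1} bytes transferred", e.TransferredBytes, e.TotalBytes);
}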
I am new to .NET and I have to implement this. Assuming we have the connection string and the environment variable set up, could someone give me resources, code, or guidance on how to do it?
I just need to upload a PDF file to Azure Blob Storage using a Minimal API.
From the Minimal API documentation, we can see that Minimal APIs do not support binding IFormFile:
No support for binding from forms. This includes binding IFormFile. We plan to add support for IFormFile in the future.
So, to upload a file in a Minimal API, you can get the uploaded file from the HttpRequest form. Refer to the following code:
app.MapPost("/upload", (HttpRequest request) =>
{
if (!request.Form.Files.Any())
return Results.BadRequest("At least one fie is need");
//Do something with the file
foreach(var item in request.Form.Files)
{
var file = item;
//insert the file into the Azure storage
}
return Results.Ok();
});
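To actually push each file into Blob Storage inside that loop, a minimal sketch using the v12 Azure.Storage.Blobs client could look like this (the connection string and container name are placeholders):
using Azure.Storage.Blobs;

app.MapPost("/upload", async (HttpRequest request) =>
{
    if (!request.Form.Files.Any())
        return Results.BadRequest("At least one file is needed");

    // Assumes the Azure.Storage.Blobs package and a valid storage connection string.
    var containerClient = new BlobContainerClient("<your-storage-connection-string>", "uploads");
    await containerClient.CreateIfNotExistsAsync();

    foreach (var file in request.Form.Files)
    {
        await using var stream = file.OpenReadStream();
        // Uploads each form file under its original file name.
        await containerClient.UploadBlobAsync(file.FileName, stream);
    }

    return Results.Ok();
});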
Then, to upload the file to Azure Blob Storage, refer to the following tutorial:
Upload images/files to blob azure, via web api ASP.NET framework Web application
Code like this:
CloudStorageAccount storageAccount;
Dictionary<string, object> dict = new Dictionary<string, object>();
string strorageconn = ConfigurationManager.AppSettings.Get("MyBlobStorageConnectionString");
if (CloudStorageAccount.TryParse(strorageconn, out storageAccount))
{
try
{
// Create the CloudBlobClient that represents the
// Blob storage endpoint for the storage account.
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
// Create a container called 'quickstartblobs' and
// append a GUID value to it to make the name unique.
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("images");
await cloudBlobContainer.CreateIfNotExistsAsync();
// Set the permissions so the blobs are public.
BlobContainerPermissions permissions = new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
};
await cloudBlobContainer.SetPermissionsAsync(permissions);
var httpRequest = HttpContext.Current.Request;
DateTime serverTime = DateTime.Now; // not declared in the original snippet; used below to build a unique image name
foreach (string file in httpRequest.Files)
{
HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.Created);
var postedFile = httpRequest.Files[file];
string imageName = ("images" + serverTime.Year.ToString() + serverTime.Month.ToString() + serverTime.Day.ToString() +
serverTime.Hour.ToString() + serverTime.Minute.ToString() + serverTime.Second.ToString() + serverTime.Millisecond.ToString()
+ postedFile.FileName );
if (postedFile != null && postedFile.ContentLength > 0)
{
int MaxContentLength = 1024 * 1024 * 1; //Size = 1 MB
IList<string> AllowedFileExtensions = new List<string> { ".jpg", ".gif", ".png" };
var ext = postedFile.FileName.Substring(postedFile.FileName.LastIndexOf('.'));
var extension = ext.ToLower();
if (!AllowedFileExtensions.Contains(extension))
{
var message = string.Format("Please Upload image of type .jpg,.gif,.png.");
dict.Add("error", message);
return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
}
else if (postedFile.ContentLength > MaxContentLength)
{
var message = string.Format("Please Upload a file upto 1 mb.");
dict.Add("error", message);
return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
}
else
{
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(imageName);
cloudBlockBlob.Properties.ContentType = postedFile.ContentType;
await cloudBlockBlob.UploadFromStreamAsync(postedFile.InputStream);
}
}
var message1 = string.Format("Image Updated Successfully.");
return Request.CreateErrorResponse(HttpStatusCode.Created, message1);
}
var res3 = string.Format("Please upload an image.");
dict.Add("error", res3);
return Request.CreateResponse(HttpStatusCode.NotFound, dict);
}
catch (Exception ex)
{
HttpResponseMessage response2 = Request.CreateResponse(HttpStatusCode.BadRequest, ex.InnerException.ToString());
return response2;
}
}
else
{
var res = string.Format("Did not connect successfully.");
dict.Add("error", res);
return Request.CreateResponse(HttpStatusCode.NotFound, dict);
}
I am trying to upload a file to an S3 bucket using AWSSDK.S3. I am trying to use the TransferUtility.UploadAsync() method, as this is what we are using to upload files to other buckets, using other AWS credentials. However, when I use that here I am getting AccessDenied.
var credentials =
new BasicAWSCredentials(accessKey, secretKey);
var s3Client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);
// Initiate the upload.
try
{
var transferUtility = new TransferUtility(s3Client);
await transferUtility.UploadAsync(filePath, bucketName, keyName+"_2.mpg",
CancellationToken.None);
}
catch (Exception e)
{
Console.WriteLine(e);
}
This gets AccessDenied.
However, if I attempt to use a multipart upload, the file is uploaded successfully.
var credentials =
new BasicAWSCredentials(accessKey, secretKey);
var s3Client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);
long _fileSizeMbGrowth = 0;
long _fileSizeTotal = 0;
DateTime _startTime = DateTime.Now;
// Create list to store upload part responses.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();
// Setup information required to initiate the multipart upload.
InitiateMultipartUploadRequest initiateRequest = new InitiateMultipartUploadRequest
{
BucketName = bucketName,
Key = keyName+"_3.mpg",
CannedACL = S3CannedACL.BucketOwnerFullControl
};
InitiateMultipartUploadResponse initResponse =
await s3Client.InitiateMultipartUploadAsync(initiateRequest, cancellationToken);
// Upload parts.
_fileSizeTotal = new FileInfo(filePath).Length;
long _fileSizeTotalDisplay = _fileSizeTotal / 1048576;
string _fileName = new FileInfo(filePath).Name;
long partSize = 5 * (long)Math.Pow(2, 20); // 5 MB
try
{
string hashMD5;
Console.WriteLine("Uploading parts");
#pragma warning disable SCS0006 // Weak hashing function
using (var md5 = MD5.Create())
#pragma warning restore SCS0006 // Weak hashing function
{
using (var stream = File.OpenRead(filePath))
{
hashMD5 = Convert.ToBase64String(md5.ComputeHash(stream));
}
}
long filePosition = 0;
for (int i = 1; filePosition < _fileSizeTotal; i++)
{
UploadPartRequest uploadRequest = new UploadPartRequest
{
BucketName = bucketName,
Key = keyName+"_3.mpg",
UploadId = initResponse.UploadId,
PartNumber = i,
PartSize = partSize,
FilePosition = filePosition,
FilePath = filePath,
ServerSideEncryptionCustomerProvidedKeyMD5 = hashMD5
};
uploadResponses.Add(await s3Client.UploadPartAsync(uploadRequest, cancellationToken));
filePosition += partSize;
}
// Setup to complete the upload.
CompleteMultipartUploadRequest completeRequest = new CompleteMultipartUploadRequest
{
BucketName = bucketName,
Key = keyName,
UploadId = initResponse.UploadId
};
completeRequest.AddPartETags(uploadResponses);
// Complete the upload.
CompleteMultipartUploadResponse completeUploadResponse =
await s3Client.CompleteMultipartUploadAsync(completeRequest, cancellationToken);
Console.WriteLine($" : Completed in {DateTime.Now.Subtract(_startTime).TotalSeconds} Second(s)");
}
catch (Exception exception)
{
Console.WriteLine("An AmazonS3Exception was thrown: { 0}", exception.Message);
// Abort the upload.
AbortMultipartUploadRequest abortMPURequest = new AbortMultipartUploadRequest
{
BucketName = bucketName,
Key = keyName,
UploadId = initResponse.UploadId
};
await s3Client.AbortMultipartUploadAsync(abortMPURequest, cancellationToken);
}
Is there a bucket policy, or set of access permissions that would allow a multipartupload request but not a PutObject request?
In this case I ended up asking the wrong question here. It isn't an issue of access permissions or bucket policy; rather, since we are uploading to someone else's bucket, we need to set an ACL.
try
{
var transferUtility = new TransferUtility(s3Client);
var transferUtilityRequest = new TransferUtilityUploadRequest()
{
BucketName = bucketName,
Key = keyName+"_5.mpg",
FilePath = filePath,
CannedACL = S3CannedACL.BucketOwnerFullControl
};
await transferUtility.UploadAsync(transferUtilityRequest, cancellationToken);
}
catch (Exception e)
{
Console.WriteLine(e);
}
Adding the CannedACL allowed the upload request to work.
I am new to .NET Core and C#. I have a file upload requirement in my project, and I am trying to upload the file to AWS S3. I first save the file to a folder under my project root, upload it from there, and then delete it. But when I try to upload to S3 I get the following error:
The process cannot access the file because it is being used by another process.
Here is my code to upload the file:
string fileFolder = Path.Combine(hostingEnvironment.WebRootPath, "TempFiles");
uniqueFileName1 = Guid.NewGuid().ToString() + "_" + cm.UDocument1.FileName;
string filePath = Path.Combine(fileFolder, uniqueFileName1);
cm.UDocument1.CopyTo(new FileStream(filePath, FileMode.Create));
var tempPath = Path.Combine(hostingEnvironment.WebRootPath, "TempFiles", Path.GetFileName(uniqueFileName1));
UploadFile(cm.UDocument1,tempPath);
[HttpPost]
public ActionResult UploadFile(IFormFile file,string path)
{
var s3Client = new AmazonS3Client(accesskey, secretkey, bucketRegion);
var fileTransferUtility = new TransferUtility(s3Client);
try
{
if (file.Length > 0)
{
var fileTransferUtilityRequest = new TransferUtilityUploadRequest
{
BucketName = bucketName,
FilePath = path,
StorageClass = S3StorageClass.Standard,
Key = file.FileName
};
fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
fileTransferUtilityRequest.Metadata.Add("param2", "Value2");
fileTransferUtility.Upload(fileTransferUtilityRequest);
fileTransferUtility.Dispose();
string[] tmp = { "" };
tmp = fileName.Split("-");
}
ViewBag.Message = "File Uploaded Successfully!!";
if (System.IO.File.Exists(path))
{
System.IO.File.Delete(path);
}
}
return ViewBag.Message;
}
What can I do about it?
On this line:
cm.UDocument1.CopyTo(new FileStream(filePath, FileMode.Create));
You create a file stream but you don't dispose it.
Wrap it in a using statement instead:
using (var iNeedToLearnAboutDispose = new FileStream(filePath, FileMode.Create))
{
cm.UDocument1.CopyTo(iNeedToLearnAboutDispose);
}
I have a very newbie question.
I am following the docs "Azure Blob storage client library v12 for .NET": https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
When I tested it in a console app against my Azure storage account, it worked.
But I was wondering if I can make a controller out of the suggested Main method,
because I want these get and post actions against the server to be triggered when the user input changes on the front-end side.
This is what the Main method inside Program.cs looks like, based on the docs:
static async Task Main()
{
Console.WriteLine("Azure Blob storage v12 - .NET quickstart sample\n");
string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
string containerName = "quickstartblobs" + Guid.NewGuid().ToString();
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
string localPath = "./data/";
string fileName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
Console.WriteLine("Listing blobs...");
// List all blobs in the container
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
Console.WriteLine("\t" + blobItem.Name);
}
Console.Write("Press any key to begin clean up");
Console.ReadLine();
string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOAD.txt");
Console.WriteLine("\nDownloading blob to\n\t{0}\n", downloadFilePath);
// Download the blob's contents and save it to a file
BlobDownloadInfo download = await blobClient.DownloadAsync();
using (FileStream downloadFileStream = File.OpenWrite(downloadFilePath))
{
await download.Content.CopyToAsync(downloadFileStream);
downloadFileStream.Close();
}
}
So, for example, in my HomeController, can I use POST-related functions like this?
[HttpPost]
public void Post([FromBody]string value)
{
//Create a unique name for the container
string containerName = "filedata" + Guid.NewGuid().ToString();
// Create the container and return a container client object
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "./data/";
string fileName = "textfile" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
}
Or is this a no-go?
Thanks for helping this super newbie!
So, for example, in my HomeController, can I use POST-related functions? Or is this a no-go?
Yes, you can achieve it.
You can use Postman to send a POST request locally to test it. Remember to disable SSL in the web server settings.
Also, change public void Post to public async Task Post and remove the using from this code:
FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
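Putting it together, the corrected action might look roughly like the sketch below, assuming you post an IFormFile instead of a string body and keep the connection string in an environment variable (both are assumptions, not from the original code):
[HttpPost]
public async Task<IActionResult> Post(IFormFile file)
{
    // Assumption: the connection string comes from an environment variable, as in the quickstart.
    string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
    var blobServiceClient = new BlobServiceClient(connectionString);

    // Create a unique container for this upload.
    string containerName = "filedata" + Guid.NewGuid().ToString();
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);

    // Upload the posted file straight from its stream.
    BlobClient blobClient = containerClient.GetBlobClient(file.FileName);
    using (var stream = file.OpenReadStream())
    {
        await blobClient.UploadAsync(stream, overwrite: true);
    }

    return Ok(blobClient.Uri.ToString());
}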
I am trying to upload an image from my WCF service to Amazon S3. I have Amazon S3 code that works in a web project, and an image upload method in the WCF service that saves the image to Uploded\test.jpg. I am not sure how to use the Amazon S3 code from the WCF service. First, I don't know how to put the Amazon credentials in the web.config; when I add these lines of code, it does not upload:
<appSettings>
<add key="AWSAccessKey" value="myaccessKey"/>
<add key="AWSSecretKey" value="MySecretKey"/>
</appSettings>
And this is my method that uploads to the WCF server. I guess I have to add the AWS code where I wrote //AWS Code:
[WebInvoke(UriTemplate = "UploadImage", Method = "POST")]
Stream UploadImage(Stream request)
{
Stream requestTest = request;
StreamWriter sw = null;
string logpath = HttpContext.Current.Server.MapPath("Uploded\\logtest.txt");
logpath = logpath.Replace("SSGTrnService\\", "");
HttpMultipartParser parser = new HttpMultipartParser(request, "file");
string filePath = "";
string passed = parser._content;
string sLogFormat = DateTime.Now.ToShortDateString().ToString() + " " + DateTime.Now.ToLongTimeString().ToString() + " ==> ";
sw = new StreamWriter(logpath);
sw.Flush();
if (parser.Success)
{
// Save the file somewhere
//File.WriteAllBytes(FILE_PATH + title + FILE_EXT, parser.FileContents);
// Save the file
//SaveFile( mtp.Filename, mtp.ContentType, mtp.FileContents);
FileStream fileStream = null;
BinaryWriter writer = null;
try
{
filePath = HttpContext.Current.Server.MapPath("Uploded\\test.jpg"); // BuildFilePath(strFileName, true);
filePath = filePath.Replace("SSGTrnService\\", "");
fileStream = new FileStream(filePath, FileMode.Create);
fileStream.Write(parser.FileContents, 0, parser.FileContents.Length);
// return filePath;
}
catch (Exception ex)
{
return "Error: " + ex.Message;
}
finally
{
if (fileStream != null)
fileStream.Close();
if (writer != null)
writer.Close();
//AWS Code
}
}
//
// returning text for html DOM
//
string text = "Image uploaded: " + parser.Filename + " / " + parser.ContentType + filePath + passed;
System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
MemoryStream ms = new MemoryStream(encoding.GetBytes(text));
WebOperationContext.Current.OutgoingResponse.ContentType = "text/html";
return ms;
}
Any guide on calling Amazon S3 from a WCF service would be great.
There is an Amazon DLL that you would need to reference (AWSSDK.dll); then use the lines of code below:
var transferUtility = new TransferUtility(accessKey, secretKey);
var bucketName = "Files";
transferUtility.Upload(filePath, bucketName, Guid.NewGuid().ToString());
NOTE: Please make sure that the Amazon S3 bucket "Files" exists; otherwise you need to check whether the bucket exists before performing the upload call. Hope that helps.
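To tie this to the appSettings block in the question, something along these lines could go where the //AWS Code comment is (the region, bucket name, and object key are assumptions for illustration):
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;
using System.Configuration;

// Read the keys that were added to <appSettings> in web.config.
string accessKey = ConfigurationManager.AppSettings["AWSAccessKey"];
string secretKey = ConfigurationManager.AppSettings["AWSSecretKey"];

var s3Client = new AmazonS3Client(accessKey, secretKey, RegionEndpoint.USEast1);
var transferUtility = new TransferUtility(s3Client);

// Upload the image that was just written to disk; the bucket must already exist.
transferUtility.Upload(filePath, "your-bucket-name", "test.jpg");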