How to move a blob document from one container to another with metadata from an Azure Function?

I am trying to copy a blob document from one container to another along with its metadata. I have tried the following code in an Azure Function, but I am getting the error noted in the comment near the end.
HTTP Request:
{
"SourceUrl": "https://devstorage.blob.core.windows.net/test-docs/123.jpeg",
"DestinationUrl": "https://devstorage.blob.core.windows.net/test-docs-completed/123.jpeg"
}
Azure Function Code:
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req, ILogger log)
{
string reqAsString = await req.Content.ReadAsStringAsync();
MoveProcessedDocumentRequest blobCopyRequest = JsonConvert.DeserializeObject<MoveProcessedDocumentRequest>(reqAsString);
CloudBlockBlob cloudBlockSource = new CloudBlockBlob(new Uri(blobCopyRequest.SourceUrl));
await cloudBlockSource.FetchAttributesAsync();
CloudBlobContainer cloudBlockDestinationContainer = new CloudBlockBlob(new Uri(blobCopyRequest.DestinationUrl)).Container;
string name = cloudBlockSource.Uri.Segments.Last();
CloudBlockBlob cloudBlockDestination;
cloudBlockDestination = cloudBlockDestinationContainer.GetBlockBlobReference(name);
// Copy metadata
foreach (var meta in cloudBlockSource.Metadata)
{
cloudBlockDestination.Metadata.Add(new KeyValuePair<string, string>(meta.Key, meta.Value));
}
await cloudBlockDestination.StartCopyAsync(cloudBlockSource);
// Exception: Microsoft.Azure.Storage.Common: The specified resource does not exist.
return req.CreateResponse(HttpStatusCode.OK);
}

You should modify your code to construct the destination CloudBlobContainer instance with credentials.
Change:
CloudBlobContainer cloudBlockDestinationContainer = new CloudBlockBlob(new Uri(blobCopyRequest.DestinationUrl)).Container;
To:
var uri = new Uri(blobCopyRequest.DestinationUrl);
var storage = new StorageCredentials("your account name", "your storage key");
CloudBlobContainer cloudBlockDestinationContainer = new CloudBlobContainer(uri, storage);
And the DestinationUrl is now the destination container URL, not the blob URL.
HTTP Request:
{
"SourceUrl": "https://devstorage.blob.core.windows.net/test-docs/123.jpeg",
"DestinationUrl": "https://devstorage.blob.core.windows.net/test-docs-completed"
}
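Putting it together, here is a minimal sketch of the whole function with that change applied. The account name and key are placeholders and the rest mirrors the code from the question; note that, as in the question, the source blob is addressed without credentials, so it must be publicly readable or the URL must carry a SAS token.
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequestMessage req, ILogger log)
{
    string reqAsString = await req.Content.ReadAsStringAsync();
    MoveProcessedDocumentRequest blobCopyRequest = JsonConvert.DeserializeObject<MoveProcessedDocumentRequest>(reqAsString);

    // Source blob, with its attributes (including metadata) loaded
    CloudBlockBlob cloudBlockSource = new CloudBlockBlob(new Uri(blobCopyRequest.SourceUrl));
    await cloudBlockSource.FetchAttributesAsync();

    // Destination container built from the container URL plus explicit credentials
    var storage = new StorageCredentials("your account name", "your storage key");
    CloudBlobContainer cloudBlockDestinationContainer = new CloudBlobContainer(new Uri(blobCopyRequest.DestinationUrl), storage);

    string name = cloudBlockSource.Uri.Segments.Last();
    CloudBlockBlob cloudBlockDestination = cloudBlockDestinationContainer.GetBlockBlobReference(name);

    // Copy metadata before starting the copy
    foreach (var meta in cloudBlockSource.Metadata)
    {
        cloudBlockDestination.Metadata.Add(meta.Key, meta.Value);
    }

    await cloudBlockDestination.StartCopyAsync(cloudBlockSource);

    // To "move" rather than copy, delete the source once the copy has completed
    // await cloudBlockSource.DeleteIfExistsAsync();

    return req.CreateResponse(HttpStatusCode.OK);
}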

Related

How to perform POST using .NET Core 6 Minimal API in Azure Blob Storage

I am new to .NET and I have to implement this. Assuming we have the connection string and the environment variable set up, could someone give me resources, code, or guidance on how to do it?
I just need to upload a PDF file to Azure Blob Storage using a Minimal API.
From the Minimal API documentation, we can see that Minimal APIs do not support binding IFormFile:
No support for binding from forms. This includes binding IFormFile. We plan to add support for IFormFile in the future.
So, to upload a file in a Minimal API, you can get the uploaded file from the HttpRequest form. Refer to the following code:
app.MapPost("/upload", (HttpRequest request) =>
{
if (!request.Form.Files.Any())
return Results.BadRequest("At least one file is needed");
//Do something with the file
foreach(var item in request.Form.Files)
{
var file = item;
//insert the file into the Azure storage
}
return Results.Ok();
});
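For the "insert the file into the Azure storage" step, a minimal sketch using the current Azure.Storage.Blobs (v12) package could look like the following; the configuration key and the "uploads" container name are assumptions for illustration:
// Requires the Azure.Storage.Blobs NuGet package and "using Azure.Storage.Blobs;"
app.MapPost("/upload", async (HttpRequest request) =>
{
    var form = await request.ReadFormAsync();
    if (!form.Files.Any())
        return Results.BadRequest("At least one file is needed");

    // Assumed configuration key; replace with wherever you keep the connection string
    var connectionString = app.Configuration["AzureStorage:ConnectionString"];
    var containerClient = new BlobContainerClient(connectionString, "uploads");
    await containerClient.CreateIfNotExistsAsync();

    foreach (var file in form.Files)
    {
        var blobClient = containerClient.GetBlobClient(file.FileName);
        await using var stream = file.OpenReadStream();
        await blobClient.UploadAsync(stream, overwrite: true);
    }

    return Results.Ok();
});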
Then, to upload the file to Azure Blob Storage, you can refer to the following tutorial (it uses the older WindowsAzure.Storage client library):
Upload images/files to blob azure, via web api ASP.NET framework Web application
Code like this:
CloudStorageAccount storageAccount;
Dictionary<string, object> dict = new Dictionary<string, object>();
string strorageconn = ConfigurationManager.AppSettings.Get("MyBlobStorageConnectionString");
if (CloudStorageAccount.TryParse(strorageconn, out storageAccount))
{
try
{
// Create the CloudBlobClient that represents the
// Blob storage endpoint for the storage account.
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
// Create a container called 'quickstartblobs' and
// append a GUID value to it to make the name unique.
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("images");
await cloudBlobContainer.CreateIfNotExistsAsync();
// Set the permissions so the blobs are public.
BlobContainerPermissions permissions = new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
};
await cloudBlobContainer.SetPermissionsAsync(permissions);
var httpRequest = HttpContext.Current.Request;
foreach (string file in httpRequest.Files)
{
HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.Created);
var postedFile = httpRequest.Files[file];
var serverTime = DateTime.Now; // timestamp used to build a unique blob name
string imageName = ("images" + serverTime.Year.ToString() + serverTime.Month.ToString() + serverTime.Day.ToString() +
serverTime.Hour.ToString() + serverTime.Minute.ToString() + serverTime.Second.ToString() + serverTime.Millisecond.ToString()
+ postedFile.FileName);
if (postedFile != null && postedFile.ContentLength > 0)
{
int MaxContentLength = 1024 * 1024 * 1; //Size = 1 MB
IList<string> AllowedFileExtensions = new List<string> { ".jpg", ".gif", ".png" };
var ext = postedFile.FileName.Substring(postedFile.FileName.LastIndexOf('.'));
var extension = ext.ToLower();
if (!AllowedFileExtensions.Contains(extension))
{
var message = string.Format("Please upload an image of type .jpg, .gif or .png.");
dict.Add("error", message);
return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
}
else if (postedFile.ContentLength > MaxContentLength)
{
var message = string.Format("Please upload a file of up to 1 MB.");
dict.Add("error", message);
return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
}
else
{
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(imageName);
cloudBlockBlob.Properties.ContentType = postedFile.ContentType;
await cloudBlockBlob.UploadFromStreamAsync(postedFile.InputStream);
}
}
var message1 = string.Format("Image uploaded successfully.");
return Request.CreateResponse(HttpStatusCode.Created, message1);
}
var res3 = string.Format("Please upload an image.");
dict.Add("error", res3);
return Request.CreateResponse(HttpStatusCode.NotFound, dict);
}
catch (Exception ex)
{
HttpResponseMessage response2 = Request.CreateResponse(HttpStatusCode.BadRequest, ex.InnerException.ToString());
return response2;
}
}
else
{
var res = string.Format("Did not connect successfully.");
dict.Add("error", res);
return Request.CreateResponse(HttpStatusCode.NotFound, dict);
}

ASP.NET Core, Azure storage controller

I have a very newbie question.
I am following this docs "Azure Blob storage client library v12 for .NET" - https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
When I tested it in a console app against my Azure storage account, it worked.
But I was wondering if I can make a controller out of the suggested 'Main' method?
Because I want these get and post actions against the server to be initiated when user input changes on the front-end side.
This is what the Main method inside of the Program.cs looks like based on the docs
static async Task Main()
{
Console.WriteLine("Azure Blob storage v12 - .NET quickstart sample\n");
string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
string containerName = "quickstartblobs" + Guid.NewGuid().ToString();
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
string localPath = "./data/";
string fileName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
Console.WriteLine("Listing blobs...");
// List all blobs in the container
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
Console.WriteLine("\t" + blobItem.Name);
}
Console.Write("Press any key to begin clean up");
Console.ReadLine();
string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOAD.txt");
Console.WriteLine("\nDownloading blob to\n\t{0}\n", downloadFilePath);
// Download the blob's contents and save it to a file
BlobDownloadInfo download = await blobClient.DownloadAsync();
using (FileStream downloadFileStream = File.OpenWrite(downloadFilePath))
{
await download.Content.CopyToAsync(downloadFileStream);
downloadFileStream.Close();
}
}
So, for example, in my HomeController, can I use POST-related functions such as this:
[HttpPost]
public void Post([FromBody]string value)
{
//Create a unique name for the container
string containerName = "filedata" + Guid.NewGuid().ToString();
// Create the container and return a container client object
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "./data/";
string fileName = "textfile" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Open the file and upload its data
using FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
}
Or is this a no-go?
Thanks for helping this super newbie!
So for example, in my HomeController, can I use POST-related functions, or is this a no-go?
Yes, you can achieve it.
You can use Postman to send a POST request locally to test it. Remember to disable SSL in the web server settings when testing locally.
Also, change public void Post to public async Task Post and remove the using from this code:
FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
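Putting that together, here is a sketch of how the controller action could look; it mirrors the quickstart code from the question, and the connection string variable name and the returned value are assumptions:
// Requires "using Azure.Storage.Blobs;" (and System.IO for the file handling)
[HttpPost]
public async Task<IActionResult> Post([FromBody] string value)
{
    // Assumed: the connection string comes from an environment variable, as in the quickstart
    string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);

    // Create a unique container for this upload
    string containerName = "filedata" + Guid.NewGuid().ToString();
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);

    // Write the posted value to a local file, then upload that file as a blob
    string localPath = "./data/";
    Directory.CreateDirectory(localPath);
    string fileName = "textfile" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);
    await System.IO.File.WriteAllTextAsync(localFilePath, value ?? string.Empty);

    BlobClient blobClient = containerClient.GetBlobClient(fileName);
    FileStream uploadFileStream = System.IO.File.OpenRead(localFilePath); // System.IO.File avoids the ControllerBase.File() name clash
    await blobClient.UploadAsync(uploadFileStream, true);
    uploadFileStream.Close();

    return Ok(blobClient.Uri.ToString());
}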

Asp.Net Core - Making API calls from backend

I have an application which is calling APIs from a backend C# class, using IHostedService. With basic API calls ("http://httpbin.org/ip") it is working fine and returning the correct value; however, I now need to call a Siemens API which requires me to set an Authorization header and place "grant_type=client_credentials" in the body.
public async Task<string> GetResult()
{
string data = "";
string baseUrl = "https://<space-name>.mindsphere.io/oauth/token";
using (HttpClient client = new HttpClient())
{
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", {ServiceCredentialID: ServiceCredentialSecret});
using (HttpResponseMessage res = await client.GetAsync(baseUrl))
{
using (HttpContent content = res.Content)
{
data = await content.ReadAsStringAsync();
}
}
}
I think I have the header set up correctly, but I won't know for sure until the full request gets formatted. Is it even possible to set the body of the request to "grant_type=client_credentials"?
As far as I can see from the Siemens API documentation, they expect form data, so it should look like this:
public async Task<string> GetResult()
{
string data = "";
string baseUrl = "https://<space-name>.mindsphere.io/oauth/token";
using (HttpClient client = new HttpClient())
{
// Basic auth is base64("clientId:clientSecret"); requires using System.Text
var basicAuth = Convert.ToBase64String(Encoding.UTF8.GetBytes($"{ServiceCredentialID}:{ServiceCredentialSecret}"));
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", basicAuth);
var formContent = new FormUrlEncodedContent(new[]
{
new KeyValuePair<string, string>("grant_type", "client_credentials")
});
using (HttpResponseMessage res = await client.PostAsync(baseUrl, formContent))
{
using (HttpContent content = res.Content)
{
data = await content.ReadAsStringAsync();
}
}
    }
    return data;
}
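If you then need the token itself, here is a small sketch of deserializing the response with Newtonsoft.Json; the property names follow the standard OAuth2 token response, so check them against the MindSphere documentation:
// Assumed shape of the token response (standard OAuth2 fields)
public class TokenResponse
{
    [JsonProperty("access_token")]
    public string AccessToken { get; set; }

    [JsonProperty("token_type")]
    public string TokenType { get; set; }

    [JsonProperty("expires_in")]
    public int ExpiresIn { get; set; }
}

// Usage: parse the string returned by GetResult() and reuse the token as a Bearer header
var token = JsonConvert.DeserializeObject<TokenResponse>(await GetResult());
apiClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token.AccessToken); // apiClient = the HttpClient used for the actual API calls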

listing all blobs in container

I am trying to create a function that lists all the blobs in a container. I took help from
How to get hold of all the blobs in a Blob container which has sub directories levels(n levels)?, which seems to use an overload that doesn't exist any more, so I added default values for the additional prefix and operationContext parameters:
static internal async Task<List<string>> ListBlobNames(string ContainerName)
{
BlobContinuationToken continuationToken = null;
bool useFlatBlobListing = true;
BlobListingDetails blobListingDetails = BlobListingDetails.None;
var blobOptions = new BlobRequestOptions();
CloudBlobContainer container = Container(ContainerName);
var operationContext = new OperationContext();
var verify = container.GetBlobReference("A_Valid_Name.jpg");
var verify2 = container.GetBlobReference("NotAName.jpg");
using (var a = await verify.OpenReadAsync()) ;
//using (var a = await verify2.OpenReadAsync()); // doesn't work since it doesn't exist
return (await container.ListBlobsSegmentedAsync("", useFlatBlobListing, blobListingDetails, null, continuationToken, blobOptions, operationContext))
.Results.Select(s => s.Uri.LocalPath.ToString()).ToList();
}
The last line gave me an exception:
StorageException: The requested URI does not represent any resource on the server.
I then created the verify and verify2 variables to test whether my container is valid. verify references a valid blob and verify2 references an invalid blob name. Running the code with the second using statement uncommented gave me an error in that second using statement. This shows that verify works and thus the container is valid.
I am trying to create a function for all blobs in a container.
You could leverage the Azure Storage Client Library by installing the WindowsAzure.Storage package, then follow the tutorial List blobs in pages asynchronously to achieve your purpose. For testing, I just created a .NET Core console application as follows:
static void Main(string[] args)
{
MainAsync(args).GetAwaiter().GetResult();
}
static async Task MainAsync(string[] args)
{
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("{your-storage-connection-string}");
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
string containerName = "images";
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
if (await container.ExistsAsync())
await ListBlobsSegmentedInFlatListing(container);
else
Console.WriteLine($"Your container with the name:{containerName} does not exist!!!");
Console.WriteLine("press any key to exit...");
Console.ReadLine();
}
ListBlobsSegmentedInFlatListing:
async public static Task ListBlobsSegmentedInFlatListing(CloudBlobContainer container)
{
//List blobs to the console window, with paging.
Console.WriteLine("List blobs in pages:");
int i = 0;
BlobContinuationToken continuationToken = null;
BlobResultSegment resultSegment = null;
//Call ListBlobsSegmentedAsync and enumerate the result segment returned, while the continuation token is non-null.
//When the continuation token is null, the last page has been returned and execution can exit the loop.
do
{
//This overload allows control of the page size. You can return all remaining results by passing null for the maxResults parameter,
//or by calling a different overload.
resultSegment = await container.ListBlobsSegmentedAsync("", true, BlobListingDetails.All, 10, continuationToken, null, null);
if (resultSegment.Results.Count<IListBlobItem>() > 0) { Console.WriteLine("Page {0}:", ++i); }
foreach (var blobItem in resultSegment.Results)
{
Console.WriteLine("\t{0}", blobItem.StorageUri.PrimaryUri);
}
Console.WriteLine();
//Get the continuation token.
continuationToken = resultSegment.ContinuationToken;
}
while (continuationToken != null);
}
This works for me...
String myContainerName = "images";
CloudBlobContainer blobContainer = blobClient.GetContainerReference(myContainerName);
CloudBlockBlob blockBlob;
blockBlob = blobContainer.GetBlockBlobReference("NotAName.jpg");
bool verify2 = await blockBlob.ExistsAsync();
if (!verify2)
{
// the blob image does not exist
// do something
}
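If you can move to the current Azure.Storage.Blobs (v12) package instead of WindowsAzure.Storage, the flat listing gets much simpler. A minimal sketch, with the connection string and container name as placeholders:
// Requires the Azure.Storage.Blobs NuGet package
using Azure.Storage.Blobs;

static async Task<List<string>> ListBlobNamesAsync(string connectionString, string containerName)
{
    var container = new BlobContainerClient(connectionString, containerName);
    var names = new List<string>();

    // GetBlobsAsync pages through the whole container (flat listing) for you
    await foreach (var blobItem in container.GetBlobsAsync())
    {
        names.Add(blobItem.Name);
    }

    return names;
}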

Amazon S3 .NET Core how to upload a file

I would like to upload a file to Amazon S3 from a .NET Core project. Is there any reference on how to create and use an AmazonS3 client? All I can find in the AmazonS3 documentation for .NET Core is this (http://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-config-netcore.html), which is not very helpful.
I did it using IFormFile, like this:
(You need to install AWSSDK.S3)
public async Task UploadFileToS3(IFormFile file)
{
using (var client = new AmazonS3Client("yourAwsAccessKeyId", "yourAwsSecretAccessKey", RegionEndpoint.USEast1))
{
using (var newMemoryStream = new MemoryStream())
{
file.CopyTo(newMemoryStream);
var uploadRequest = new TransferUtilityUploadRequest
{
InputStream = newMemoryStream,
Key = file.FileName,
BucketName = "yourBucketName",
CannedACL = S3CannedACL.PublicRead
};
var fileTransferUtility = new TransferUtility(client);
await fileTransferUtility.UploadAsync(uploadRequest);
}
}
}
For simple file uploading in a .NET Core project, I followed this link.
After finishing the simple file upload procedure, I followed the documentation at these links, which were very helpful. The following two links were also helpful for a quick start.
https://github.com/awslabs/aws-sdk-net-samples/blob/master/ConsoleSamples/AmazonS3Sample/AmazonS3Sample/S3Sample.cs
http://www.c-sharpcorner.com/article/fileupload-to-aws-s3-using-asp-net/
These were my final code snippets in the controller for the file upload (I skipped the view part, which is explained in detail in the link shared above).
[HttpPost("UploadFiles")]
public IActionResult UploadFiles(List<IFormFile> files)
{
long size = files.Sum(f => f.Length);
foreach (var formFile in files)
{
if (formFile.Length > 0)
{
var filename = ContentDispositionHeaderValue
.Parse(formFile.ContentDisposition)
.FileName
.TrimStart().ToString();
filename = _hostingEnvironment.WebRootPath + @"\uploads" + $@"\{formFile.FileName}";
size += formFile.Length;
using (var fs = System.IO.File.Create(filename))
{
formFile.CopyTo(fs);
fs.Flush();
}//these code snippets saves the uploaded files to the project directory
uploadToS3(filename);//this is the method to upload saved file to S3
}
}
return RedirectToAction("Index", "Library");
}
This is the method to upload files to Amazon S3:
private IHostingEnvironment _hostingEnvironment;
private AmazonS3Client _s3Client = new AmazonS3Client(RegionEndpoint.EUWest2);
private string _bucketName = "mis-pdf-library";//this is my Amazon Bucket name
private static string _bucketSubdirectory = String.Empty;
public UploadController(IHostingEnvironment environment)
{
_hostingEnvironment = environment;
}
public void uploadToS3(string filePath)
{
try
{
TransferUtility fileTransferUtility = new
TransferUtility(new AmazonS3Client(Amazon.RegionEndpoint.EUWest2));
string bucketName;
if (_bucketSubdirectory == "" || _bucketSubdirectory == null)
{
bucketName = _bucketName; //no subdirectory just bucket name
}
else
{ // subdirectory and bucket name
bucketName = _bucketName + "/" + _bucketSubdirectory;
}
// 1. Upload a file, file name is used as the object key name.
fileTransferUtility.Upload(filePath, bucketName);
Console.WriteLine("Upload 1 completed");
}
catch (AmazonS3Exception s3Exception)
{
Console.WriteLine(s3Exception.Message,
s3Exception.InnerException);
}
}
This was all for uploading files to an Amazon S3 bucket. I worked on .NET Core 2.0. Also, don't forget to add the necessary dependencies for using the Amazon API. These were:
AWSSDK.Core
AWSSDK.Extensions.NETCore.Setup
AWSSDK.S3
Hope this helps.
I wrote a complete sample for uploading a file to Amazon AWS S3 with ASP.NET Core MVC.
You can see my sample project at this GitHub link:
https://github.com/NevitFeridi/AWS_Upload_Sample_ASPCoreMVC
There is a function for uploading a file to S3 using the Amazon.S3 SDK in the HomeController.
In this function, UploadFileToAWSAsync, you can find everything you need:
// you must set your accessKey and secretKey
// for getting your accesskey and secretKey go to your Aws amazon console
string AWS_accessKey = "xxxxxxx";
string AWS_secretKey = "xxxxxxxxxxxxxx";
string AWS_bucketName = "my-uswest";
string AWS_defaultFolder = "MyTest_Folder";
protected async Task<string> UploadFileToAWSAsync(IFormFile myfile, string subFolder = "")
{
var result = "";
try
{
var s3Client = new AmazonS3Client(AWS_accessKey, AWS_secretKey, Amazon.RegionEndpoint.USWest2);
var bucketName = AWS_bucketName;
var keyName = AWS_defaultFolder;
if (!string.IsNullOrEmpty(subFolder))
keyName = keyName + "/" + subFolder.Trim();
keyName = keyName + "/" + myfile.FileName;
var fs = myfile.OpenReadStream();
var request = new Amazon.S3.Model.PutObjectRequest
{
BucketName = bucketName,
Key = keyName,
InputStream = fs,
ContentType = myfile.ContentType,
CannedACL = S3CannedACL.PublicRead
};
await s3Client.PutObjectAsync(request);
result = string.Format("http://{0}.s3.amazonaws.com/{1}", bucketName, keyName);
}
catch (Exception ex)
{
result = ex.Message;
}
return result;
}
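A quick usage sketch from a controller action; the action name, the validation, and the "invoices" subfolder are just for illustration:
[HttpPost]
public async Task<IActionResult> Upload(IFormFile myfile)
{
    if (myfile == null || myfile.Length == 0)
        return BadRequest("No file was posted.");

    // Returns the public URL on success, or the exception message on failure
    var url = await UploadFileToAWSAsync(myfile, "invoices");
    return Ok(url);
}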
In addition to Tiago's answer, the AWS S3 SDK has changed a bit, so here is the updated method:
public async Task UploadImage(IFormFile file)
{
var credentials = new BasicAWSCredentials("access", "secret key");
var config = new AmazonS3Config
{
RegionEndpoint = Amazon.RegionEndpoint.EUNorth1
};
using var client = new AmazonS3Client(credentials, config);
await using var newMemoryStream = new MemoryStream();
file.CopyTo(newMemoryStream);
var uploadRequest = new TransferUtilityUploadRequest
{
InputStream = newMemoryStream,
Key = file.FileName,
BucketName = "your-bucket-name",
CannedACL = S3CannedACL.PublicRead
};
var fileTransferUtility = new TransferUtility(client);
await fileTransferUtility.UploadAsync(uploadRequest);
}
Per AWS SDK docs, .Net Core support was added in late 2016.
https://aws.amazon.com/sdk-for-net/
So the instructions for uploading files to S3 should be identical to any other instructions for .Net.
The "getting started" guide for the AWS SDK for .NET is literally the case you describe of connecting and uploading a file to S3, and it is included as a sample project ready for you to run if you've installed the "AWS Toolkit for Visual Studio" (which should be installed with the .NET AWS SDK).
So all you need to do is open Visual Studio and find their sample S3 project, or you can look at it here:
// simple object put
PutObjectRequest request = new PutObjectRequest()
{
ContentBody = "this is a test",
BucketName = bucketName,
Key = keyName
};
PutObjectResponse response = client.PutObject(request);
This assumes you have instantiated an Amazon.S3.AmazonS3Client after including the namespace, and configured it with your own credentials.
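For reference, here is a minimal sketch of that setup in .NET Core; the region, bucket and key names are placeholders, credentials are resolved from the default credential chain, and the async variant is used because the synchronous PutObject is not available on all .NET Core targets:
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

// Credentials come from the default chain (credentials file, environment variables or instance role)
var client = new AmazonS3Client(RegionEndpoint.USEast1);

var request = new PutObjectRequest()
{
    ContentBody = "this is a test",
    BucketName = "my-bucket",
    Key = "hello.txt"
};

PutObjectResponse response = await client.PutObjectAsync(request);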
You first need to install in the Package Manager Console:
Install-package AWSSDK.Extensions.NETCORE.Setup
Install-package AWSSDK.S3
Then you need to have the credentials file in the directory:
C:\Users\username\.aws\credentials
The credential file should have this format:
[default]
aws_access_key_id=[Write your access key in here]
aws_secret_access_key=[Write your secret access key in here]
region=[Write your region here]
I uploaded to GitHub an example of basic CRUD operations for S3 buckets in ASP.NET Core.
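With those packages and the credentials file in place, a minimal sketch of registering the client with ASP.NET Core dependency injection could look like this; the controller name is just for illustration:
// In Startup.ConfigureServices, using AWSSDK.Extensions.NETCore.Setup
public void ConfigureServices(IServiceCollection services)
{
    services.AddDefaultAWSOptions(Configuration.GetAWSOptions());
    services.AddAWSService<Amazon.S3.IAmazonS3>();
    services.AddControllers();
}

// Then inject IAmazonS3 wherever it is needed, e.g. in a controller
public class FilesController : ControllerBase
{
    private readonly Amazon.S3.IAmazonS3 _s3;

    public FilesController(Amazon.S3.IAmazonS3 s3)
    {
        _s3 = s3;
    }
}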
We came across an issue when implementing the high-level API in a .NET Core solution. When clients had low bandwidth (approximately 3 Mb/s), Amazon S3 threw an error ("The XML you provided was not well-formed"). To resolve this issue we had to implement the upload using the low-level API.
https://docs.aws.amazon.com/en_us/AmazonS3/latest/dev/LLuploadFileDotNet.html
// Create list to store upload part responses.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();
// Setup information required to initiate the multipart upload.
InitiateMultipartUploadRequest initiateRequest = new InitiateMultipartUploadRequest{
BucketName = bucketName,
Key = pathbucket
};
//Add metadata to file
string newDate = DateTime.Now.ToString("dd/MM/yyyy HH:mm:ss");
// Initiate the upload.
InitiateMultipartUploadResponse initResponse = await s3Client.InitiateMultipartUploadAsync(initiateRequest);
int uploadmb = 5;
// Upload parts.
long contentLength = new FileInfo(zippath).Length;
long partSize = uploadmb * (long)Math.Pow(2, 20); // 5 MB
try
{
long filePosition = 0;
for (int i = 1; filePosition < contentLength; i++) {
UploadPartRequest uploadRequest = new UploadPartRequest{
BucketName = bucketName,
Key = pathbucket,
UploadId = initResponse.UploadId,
PartNumber = i,
PartSize = partSize,
FilePosition = filePosition,
FilePath = zippath
};
// Track upload progress.
uploadRequest.StreamTransferProgress += new EventHandler<StreamTransferProgressArgs>(UploadPartProgressEventCallback);
// Upload a part and add the response to our list.
uploadResponses.Add(await s3Client.UploadPartAsync(uploadRequest));
filePosition += partSize;
}
// Setup to complete the upload.
CompleteMultipartUploadRequest completeRequest = new CompleteMultipartUploadRequest {
BucketName = bucketName,
Key = pathbucket,
UploadId = initResponse.UploadId
};
completeRequest.AddPartETags(uploadResponses);
// Complete the upload.
CompleteMultipartUploadResponse completeUploadResponse = await s3Client.CompleteMultipartUploadAsync(completeRequest);
}
catch (Exception exception)
{
Console.WriteLine("An AmazonS3Exception was thrown: {0}", exception.Message);
// Abort the upload.
AbortMultipartUploadRequest abortMPURequest = new AbortMultipartUploadRequest {
    BucketName = bucketName,
    Key = pathbucket,
    UploadId = initResponse.UploadId
};
await s3Client.AbortMultipartUploadAsync(abortMPURequest);
}
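For completeness, here is a minimal sketch of the progress handler referenced by StreamTransferProgress above; the AWS sample simply logs the transferred bytes:
// StreamTransferProgressArgs lives in Amazon.Runtime
public static void UploadPartProgressEventCallback(object sender, StreamTransferProgressArgs e)
{
    Console.WriteLine("{0}/{1} bytes transferred", e.TransferredBytes, e.TotalBytes);
}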