Delete a file from Azure Data Lake Store using the .NET SDK - authentication

I want to delete a specific file in Azure Data Lake Store using the .NET SDK.
I used the code below, and it returns the error "Operation returned an invalid status code 'BadRequest'":
var clientCredential = new ClientCredential(CLIENTID, CLIENTSECRET);
var creds = ApplicationTokenProvider.LoginSilentAsync(DOMAINNAME, clientCredential).Result;
_adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(creds);
var fileDeleteResult = _adlsFileSystemClient.FileSystem.Delete(_adlsAccountName, path);

I used to get this error, and I ended up solving it by using the asynchronous methods instead of the synchronous ones.
You might also want to check the file path you are passing to the Delete function; it has to be the full path, including the file name and extension, something like "/rootFolder/subFolder1/subFolder2/DeleteMe.txt".
Try something like this:
private ServiceClientCredentials Authenticate(string _adlsDomain, string _adlsWebClientId, string _adlsClientSecret)
{
    SynchronizationContext.SetSynchronizationContext(new SynchronizationContext());
    /* _adlsDomain       ==> Directory ID (Tenant ID)
       _adlsWebClientId  ==> Application ID
       _adlsClientSecret ==> Active Directory application key
    */
    ClientCredential clientCredential = new ClientCredential(_adlsWebClientId, _adlsClientSecret);
    return ApplicationTokenProvider.LoginSilentAsync(_adlsDomain, clientCredential).Result;
}
private async Task DeleteFile(string path)
{
    string _adlsDomain = "xxxx";
    string _adlsWebClientId = "xxxx";
    string _adlsClientSecret = "xxxx";
    string _subscription_id = "xxxx";
    string _adlsAccountName = "xxxxxxx";
    ServiceClientCredentials _creds = Authenticate(_adlsDomain, _adlsWebClientId, _adlsClientSecret);
    // Create client objects and set the subscription ID
    DataLakeStoreAccountManagementClient _adlsClient = new DataLakeStoreAccountManagementClient(_creds) { SubscriptionId = _subscription_id };
    DataLakeStoreFileSystemManagementClient _adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(_creds);
    await _adlsFileSystemClient.FileSystem.DeleteAsync(_adlsAccountName, path);
}
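For completeness, a minimal call site might look like this; the path below is hypothetical, and per the note above it must be the full path from the root of the store:

// Hypothetical path: must be the full path, including file name and extension.
await DeleteFile("/rootFolder/subFolder1/subFolder2/DeleteMe.txt");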

Related

Azure storage SAS failed to authenticate

Hi, I am trying to create a SAS token for my file on Azure. I am getting the error "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature". The following is my code:
public async Task<IActionResult> TestBlobClient()
{
    try
    {
        string storageAccount = "myaccount";
        string password = "mykey";
        var sharedKeyCredential = new StorageSharedKeyCredential(storageAccount, password);
        var shareClient = new ShareClient(new Uri("https://aaa.file.core.windows.net/zttsox20201027154501"), sharedKeyCredential);
        ShareDirectoryClient directory = shareClient.GetDirectoryClient("Output/14");
        ShareFileClient file = directory.GetFileClient("637655759841727494_main.wav");
        var shareSasBuilder = new ShareSasBuilder
        {
            ShareName = "aaa",
            Protocol = SasProtocol.None,
            ExpiresOn = DateTime.UtcNow.AddYears(50),
            Resource = "b"
        };
        shareSasBuilder.SetPermissions(ShareFileSasPermissions.Read);
        var url = new Uri(file.Uri + "?" + shareSasBuilder.ToSasQueryParameters(sharedKeyCredential).ToString());
        return Ok(url);
    }
    catch (Exception ex)
    {
        return StatusCode(StatusCodes.Status400BadRequest, ex.Message);
    }
}
The issue is with Resource = "b". When you set the resource to b, you are requesting a SAS token for a blob; since you're getting a SAS token for a file share, the value of this parameter should be s. Note also that ShareName must be the share name from the URL (zttsox20201027154501), not the account name (aaa).
Please try something like:
var shareSasBuilder = new ShareSasBuilder
{
    ShareName = "zttsox20201027154501",
    Protocol = SasProtocol.None,
    ExpiresOn = DateTime.UtcNow.AddYears(50),
    Resource = "s"
};
For more details, please see this link: https://learn.microsoft.com/en-us/dotnet/api/azure.storage.sas.sharesasbuilder.resource?view=azure-dotnet#Azure_Storage_Sas_ShareSasBuilder_Resource.
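As a side note (this goes beyond the fix above, based on the same ShareSasBuilder docs): if you want the SAS scoped to the single file rather than the whole share, the builder also has a FilePath property, and the resource value for a file is f. A minimal sketch, with the path taken from the question:

// Sketch only: a file-scoped SAS instead of a share-scoped one.
var fileSasBuilder = new ShareSasBuilder
{
    ShareName = "zttsox20201027154501",
    FilePath = "Output/14/637655759841727494_main.wav", // path of the file within the share
    Resource = "f",                                     // "f" = file, "s" = share
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)       // a short-lived token is usually safer than 50 years
};
fileSasBuilder.SetPermissions(ShareFileSasPermissions.Read);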

ASP.NET Core, Azure storage controller

I have a very newbie question.
I am following this docs "Azure Blob storage client library v12 for .NET" - https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
When I tested it in my console against my Azure storage account, it worked.
But I was wondering: can I make a controller action out of the suggested Main method?
I want these get and post actions against the server to be initiated when the user's input changes on the front-end side.
This is what the Main method inside Program.cs looks like, based on the docs:
static async Task Main()
{
    Console.WriteLine("Azure Blob storage v12 - .NET quickstart sample\n");
    string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    string containerName = "quickstartblobs" + Guid.NewGuid().ToString();
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
    string localPath = "./data/";
    string fileName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);

    // Write text to the file
    await File.WriteAllTextAsync(localFilePath, "Hello, World!");

    // Get a reference to a blob
    BlobClient blobClient = containerClient.GetBlobClient(fileName);
    Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);

    // Open the file and upload its data
    using FileStream uploadFileStream = File.OpenRead(localFilePath);
    await blobClient.UploadAsync(uploadFileStream, true);
    uploadFileStream.Close();

    Console.WriteLine("Listing blobs...");
    // List all blobs in the container
    await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
    {
        Console.WriteLine("\t" + blobItem.Name);
    }

    Console.Write("Press any key to begin clean up");
    Console.ReadLine();

    string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOAD.txt");
    Console.WriteLine("\nDownloading blob to\n\t{0}\n", downloadFilePath);
    // Download the blob's contents and save it to a file
    BlobDownloadInfo download = await blobClient.DownloadAsync();
    using (FileStream downloadFileStream = File.OpenWrite(downloadFilePath))
    {
        await download.Content.CopyToAsync(downloadFileStream);
        downloadFileStream.Close();
    }
}
So for example, in my HomeController, can I use POST-related functions like this?
[HttpPost]
public void Post([FromBody]string value)
{
    // Create a unique name for the container
    string containerName = "filedata" + Guid.NewGuid().ToString();
    // Create the container and return a container client object
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
    // Create a local file in the ./data/ directory for uploading and downloading
    string localPath = "./data/";
    string fileName = "textfile" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    // Get a reference to a blob
    BlobClient blobClient = containerClient.GetBlobClient(fileName);
    Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
    // Open the file and upload its data
    using FileStream uploadFileStream = File.OpenRead(localFilePath);
    await blobClient.UploadAsync(uploadFileStream, true);
    uploadFileStream.Close();
}
Or is this a no-go?
Thanks for helping this super newbie!
So for example, in my HomeController can I use POST-related functions? Or is this a no-go?
Yes, you can achieve it.
You can use Postman to send a POST request locally to test it. Remember to remove SSL from the web server settings.
Also, change public void Post to public async Task Post and remove the using in the code:
FileStream uploadFileStream = File.OpenRead(localFilePath);
await blobClient.UploadAsync(uploadFileStream, true);
uploadFileStream.Close();
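Putting those changes together, a corrected action might look like the sketch below. It is only a sketch: it also fixes two ordering problems in the question's snippet (blobServiceClient must be created before it is used, and connectionString has to come from somewhere; here a hypothetical environment variable), and it writes the posted value to the local file before uploading:

[HttpPost]
public async Task<IActionResult> Post([FromBody] string value)
{
    // Hypothetical source for the connection string.
    string connectionString = Environment.GetEnvironmentVariable("My_CONNECTION_STRING");
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);

    // Create a unique name for the container, then create it.
    string containerName = "filedata" + Guid.NewGuid().ToString();
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);

    // Create a local file in the ./data/ directory and write the posted value to it.
    string localPath = "./data/";
    Directory.CreateDirectory(localPath);
    string fileName = "textfile" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);
    await File.WriteAllTextAsync(localFilePath, value ?? string.Empty);

    // Upload the file's data to a blob of the same name.
    BlobClient blobClient = containerClient.GetBlobClient(fileName);
    FileStream uploadFileStream = File.OpenRead(localFilePath);
    await blobClient.UploadAsync(uploadFileStream, true);
    uploadFileStream.Close();

    return Ok(blobClient.Uri.ToString());
}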

How to move a blob document from one container to another with metadata from azure function?

I am trying to copy a blob document from one container to another along with its metadata. I tried the following code in an Azure Function, but I get the error shown in the code comment below.
HTTP Request:
{
"SourceUrl": "https://devstorage.blob.core.windows.net/test-docs/123.jpeg",
"DestinationUrl": "https://devstorage.blob.core.windows.net/test-docs-completed/123.jpeg"
}
Azure Function Code:
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequestMessage req, ILogger log)
{
    string reqAsString = await req.Content.ReadAsStringAsync();
    MoveProcessedDocumentRequest blobCopyRequest = JsonConvert.DeserializeObject<MoveProcessedDocumentRequest>(reqAsString);
    CloudBlockBlob cloudBlockSource = new CloudBlockBlob(new Uri(blobCopyRequest.SourceUrl));
    await cloudBlockSource.FetchAttributesAsync();
    CloudBlobContainer cloudBlockDestinationContainer = new CloudBlockBlob(new Uri(blobCopyRequest.DestinationUrl)).Container;
    string name = cloudBlockSource.Uri.Segments.Last();
    CloudBlockBlob cloudBlockDestination = cloudBlockDestinationContainer.GetBlockBlobReference(name);
    // Copy metadata
    foreach (var meta in cloudBlockSource.Metadata)
    {
        cloudBlockDestination.Metadata.Add(new KeyValuePair<string, string>(meta.Key, meta.Value));
    }
    await cloudBlockDestination.StartCopyAsync(cloudBlockSource);
    // Exception: Microsoft.Azure.Storage.Common: The specified resource does not exist.
    return req.CreateResponse(HttpStatusCode.OK);
}
You should modify your code to construct the CloudBlobContainer instance directly, with credentials.
Change:
CloudBlobContainer cloudBlockDestinationContainer = new CloudBlockBlob(new Uri(blobCopyRequest.DestinationUrl)).Container;
To:
var uri = new Uri(blobCopyRequest.DestinationUrl);
var storage = new StorageCredentials("your account name", "your storage key");
CloudBlobContainer cloudBlockDestinationContainer = new CloudBlobContainer(uri, storage);
And DestinationUrl is now the destination container URL, not a blob URL.
HTTP Request:
{
"SourceUrl": "https://devstorage.blob.core.windows.net/test-docs/123.jpeg",
"DestinationUrl": "https://devstorage.blob.core.windows.net/test-docs-completed"
}
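Putting the fix together, a sketch of the whole corrected function might look like this (account name and key are placeholders; the source blob is given the same credentials so FetchAttributesAsync also works on non-public containers):

public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequestMessage req, ILogger log)
{
    string reqAsString = await req.Content.ReadAsStringAsync();
    MoveProcessedDocumentRequest blobCopyRequest = JsonConvert.DeserializeObject<MoveProcessedDocumentRequest>(reqAsString);

    // Placeholder credentials; both source and destination need access.
    var storage = new StorageCredentials("your account name", "your storage key");

    CloudBlockBlob cloudBlockSource = new CloudBlockBlob(new Uri(blobCopyRequest.SourceUrl), storage);
    await cloudBlockSource.FetchAttributesAsync();

    // DestinationUrl is the container URL, e.g. https://devstorage.blob.core.windows.net/test-docs-completed
    CloudBlobContainer destinationContainer = new CloudBlobContainer(new Uri(blobCopyRequest.DestinationUrl), storage);

    string name = cloudBlockSource.Uri.Segments.Last();
    CloudBlockBlob cloudBlockDestination = destinationContainer.GetBlockBlobReference(name);

    // Copy metadata from the source blob.
    foreach (var meta in cloudBlockSource.Metadata)
    {
        cloudBlockDestination.Metadata.Add(meta.Key, meta.Value);
    }

    await cloudBlockDestination.StartCopyAsync(cloudBlockSource);
    return req.CreateResponse(HttpStatusCode.OK);
}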

listing all blobs in container

I am trying to create a function that lists all the blobs in a container. I took my cue from "How to get hold of all the blobs in a Blob container which has sub directories levels (n levels)?", which seems to use an overload that doesn't exist any more, so I added default values for the additional parameters prefix and operationContext:
static internal async Task<List<string>> ListBlobNames(string ContainerName)
{
    BlobContinuationToken continuationToken = null;
    bool useFlatBlobListing = true;
    BlobListingDetails blobListingDetails = BlobListingDetails.None;
    var blobOptions = new BlobRequestOptions();
    CloudBlobContainer container = Container(ContainerName);
    var operationContext = new OperationContext();

    var verify = container.GetBlobReference("A_Valid_Name.jpg");
    var verify2 = container.GetBlobReference("NotAName.jpg");
    using (var a = await verify.OpenReadAsync()) ;
    //using (var a = await verify2.OpenReadAsync()); // doesn't work since it doesn't exist

    return (await container.ListBlobsSegmentedAsync("", useFlatBlobListing, blobListingDetails, null, continuationToken, blobOptions, operationContext))
        .Results.Select(s => s.Uri.LocalPath.ToString()).ToList();
}
The last line gave me an exception:
StorageException: The requested URI does not represent any resource on the server.
I then created the verify and verify2 variables to test whether my container is valid: verify references a valid blob and verify2 references an invalid blob name. Running the code with the second using statement uncommented gave me an error in that statement, which shows that verify works and thus the container is valid.
I am trying to create a function that lists all the blobs in a container.
You could leverage the Azure Storage Client Library by installing the package WindowsAzure.Storage, and then follow the tutorial "List blobs in pages asynchronously" to achieve your purpose. As a test, I created a .NET Core console application as follows:
static void Main(string[] args)
{
    MainAsync(args).GetAwaiter().GetResult();
}

static async Task MainAsync(string[] args)
{
    // Retrieve storage account from connection string.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("{your-storage-connection-string}");
    // Create the blob client.
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    string containerName = "images";
    // Retrieve reference to a previously created container.
    CloudBlobContainer container = blobClient.GetContainerReference(containerName);
    if (await container.ExistsAsync())
        await ListBlobsSegmentedInFlatListing(container);
    else
        Console.WriteLine($"Your container with the name:{containerName} does not exist!!!");
    Console.WriteLine("press any key to exit...");
    Console.ReadLine();
}
ListBlobsSegmentedInFlatListing:
async public static Task ListBlobsSegmentedInFlatListing(CloudBlobContainer container)
{
    // List blobs to the console window, with paging.
    Console.WriteLine("List blobs in pages:");
    int i = 0;
    BlobContinuationToken continuationToken = null;
    BlobResultSegment resultSegment = null;
    // Call ListBlobsSegmentedAsync and enumerate the result segment returned, while the continuation token is non-null.
    // When the continuation token is null, the last page has been returned and execution can exit the loop.
    do
    {
        // This overload allows control of the page size. You can return all remaining results by passing null for the maxResults parameter,
        // or by calling a different overload.
        resultSegment = await container.ListBlobsSegmentedAsync("", true, BlobListingDetails.All, 10, continuationToken, null, null);
        if (resultSegment.Results.Count<IListBlobItem>() > 0) { Console.WriteLine("Page {0}:", ++i); }
        foreach (var blobItem in resultSegment.Results)
        {
            Console.WriteLine("\t{0}", blobItem.StorageUri.PrimaryUri);
        }
        Console.WriteLine();
        // Get the continuation token.
        continuationToken = resultSegment.ContinuationToken;
    }
    while (continuationToken != null);
}
This works for me...
string myContainerName = "images";
CloudBlobContainer blobContainer = blobClient.GetContainerReference(myContainerName);
CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference("NotAName.jpg");
bool verify2 = await blockBlob.ExistsAsync();
if (!verify2)
{
    // the blob image does not exist
    // do something
}
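As a further note (not part of the original answers): WindowsAzure.Storage has since been superseded by the Azure.Storage.Blobs v12 package, where flat listing is the default and continuation tokens are handled for you. A minimal sketch, assuming a v12 BlobContainerClient for an existing container:

// Sketch with the newer Azure.Storage.Blobs (v12) SDK.
static async Task<List<string>> ListBlobNamesV12(BlobContainerClient containerClient)
{
    var names = new List<string>();
    // GetBlobsAsync enumerates all blobs flat and follows continuation tokens transparently.
    await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
    {
        names.Add(blobItem.Name);
    }
    return names;
}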

Amazon S3 .NET Core how to upload a file

I would like to upload a file to Amazon S3 from a .NET Core project. Is there any reference on how to create and use an AmazonS3 client? All I can find in the Amazon S3 documentation for .NET Core is this (http://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-config-netcore.html), which is not very helpful.
I did it using IFormFile, like this (you need to install AWSSDK.S3):
public async Task UploadFileToS3(IFormFile file)
{
    using (var client = new AmazonS3Client("yourAwsAccessKeyId", "yourAwsSecretAccessKey", RegionEndpoint.USEast1))
    {
        using (var newMemoryStream = new MemoryStream())
        {
            file.CopyTo(newMemoryStream);
            var uploadRequest = new TransferUtilityUploadRequest
            {
                InputStream = newMemoryStream,
                Key = file.FileName,
                BucketName = "yourBucketName",
                CannedACL = S3CannedACL.PublicRead
            };
            var fileTransferUtility = new TransferUtility(client);
            await fileTransferUtility.UploadAsync(uploadRequest);
        }
    }
}
For simple file uploading in a .NET Core project, I followed this link.
After finishing the simple file upload procedure, I followed the documentation at this and this, which were very helpful. The following two links were also helpful for a quick start:
https://github.com/awslabs/aws-sdk-net-samples/blob/master/ConsoleSamples/AmazonS3Sample/AmazonS3Sample/S3Sample.cs
http://www.c-sharpcorner.com/article/fileupload-to-aws-s3-using-asp-net/
These were my final code snippets in the controller for file upload (I skipped the view part, which is explained in detail in the link shared above).
[HttpPost("UploadFiles")]
public IActionResult UploadFiles(List<IFormFile> files)
{
long size = files.Sum(f => f.Length);
foreach (var formFile in files)
{
if (formFile.Length > 0)
{
var filename = ContentDispositionHeaderValue
.Parse(formFile.ContentDisposition)
.FileName
.TrimStart().ToString();
filename = _hostingEnvironment.WebRootPath + $#"\uploads" + $#"\{formFile.FileName}";
size += formFile.Length;
using (var fs = System.IO.File.Create(filename))
{
formFile.CopyTo(fs);
fs.Flush();
}//these code snippets saves the uploaded files to the project directory
uploadToS3(filename);//this is the method to upload saved file to S3
}
}
return RedirectToAction("Index", "Library");
}
This is the method to upload files to Amazon S3:
private IHostingEnvironment _hostingEnvironment;
private AmazonS3Client _s3Client = new AmazonS3Client(RegionEndpoint.EUWest2);
private string _bucketName = "mis-pdf-library"; // this is my Amazon bucket name
private static string _bucketSubdirectory = String.Empty;

public UploadController(IHostingEnvironment environment)
{
    _hostingEnvironment = environment;
}
public void uploadToS3(string filePath)
{
    try
    {
        TransferUtility fileTransferUtility = new TransferUtility(new AmazonS3Client(Amazon.RegionEndpoint.EUWest2));
        string bucketName;
        if (string.IsNullOrEmpty(_bucketSubdirectory))
        {
            bucketName = _bucketName; // no subdirectory, just the bucket name
        }
        else
        {
            // subdirectory and bucket name
            bucketName = _bucketName + @"/" + _bucketSubdirectory;
        }
        // 1. Upload a file; the file name is used as the object key name.
        fileTransferUtility.Upload(filePath, bucketName);
        Console.WriteLine("Upload 1 completed");
    }
    catch (AmazonS3Exception s3Exception)
    {
        Console.WriteLine(s3Exception.Message, s3Exception.InnerException);
    }
}
That was all for uploading files to an Amazon S3 bucket. I worked on .NET Core 2.0; also, don't forget to add the necessary dependencies for using the Amazon API. These were:
AWSSDK.Core
AWSSDK.Extensions.NETCore.Setup
AWSSDK.S3
Hope this helps.
I wrote a complete sample for uploading a file to Amazon S3 with ASP.NET Core MVC. You can see my sample project on GitHub:
https://github.com/NevitFeridi/AWS_Upload_Sample_ASPCoreMVC
There is a function for uploading a file to S3 using the Amazon.S3 SDK in the HomeController. In the function UploadFileToAWSAsync you can find everything you need:
// You must set your accessKey and secretKey.
// To get your accessKey and secretKey, go to your AWS console.
string AWS_accessKey = "xxxxxxx";
string AWS_secretKey = "xxxxxxxxxxxxxx";
string AWS_bucketName = "my-uswest";
string AWS_defaultFolder = "MyTest_Folder";

protected async Task<string> UploadFileToAWSAsync(IFormFile myfile, string subFolder = "")
{
    var result = "";
    try
    {
        var s3Client = new AmazonS3Client(AWS_accessKey, AWS_secretKey, Amazon.RegionEndpoint.USWest2);
        var bucketName = AWS_bucketName;
        var keyName = AWS_defaultFolder;
        if (!string.IsNullOrEmpty(subFolder))
            keyName = keyName + "/" + subFolder.Trim();
        keyName = keyName + "/" + myfile.FileName;
        var fs = myfile.OpenReadStream();
        var request = new Amazon.S3.Model.PutObjectRequest
        {
            BucketName = bucketName,
            Key = keyName,
            InputStream = fs,
            ContentType = myfile.ContentType,
            CannedACL = S3CannedACL.PublicRead
        };
        await s3Client.PutObjectAsync(request);
        result = string.Format("http://{0}.s3.amazonaws.com/{1}", bucketName, keyName);
    }
    catch (Exception ex)
    {
        result = ex.Message;
    }
    return result;
}
In addition to Tiago's answer: the AWS S3 SDK has changed a bit, so here is the updated method:
public async Task UploadImage(IFormFile file)
{
    var credentials = new BasicAWSCredentials("access", "secret key");
    var config = new AmazonS3Config
    {
        RegionEndpoint = Amazon.RegionEndpoint.EUNorth1
    };
    using var client = new AmazonS3Client(credentials, config);
    await using var newMemoryStream = new MemoryStream();
    file.CopyTo(newMemoryStream);
    var uploadRequest = new TransferUtilityUploadRequest
    {
        InputStream = newMemoryStream,
        Key = file.FileName,
        BucketName = "your-bucket-name",
        CannedACL = S3CannedACL.PublicRead
    };
    var fileTransferUtility = new TransferUtility(client);
    await fileTransferUtility.UploadAsync(uploadRequest);
}
Per the AWS SDK docs, .NET Core support was added in late 2016:
https://aws.amazon.com/sdk-for-net/
So the instructions for uploading files to S3 should be identical to any other instructions for .NET.
The "getting started" guide for the AWS SDK for .NET is literally the case you describe, connecting and uploading a file to S3, and it is included as a sample project ready for you to run if you've installed the "AWS Toolkit for Visual Studio" (which is installed with the .NET AWS SDK).
So all you need to do is open Visual Studio and find their sample S3 project, or you can look at it here:
// simple object put
PutObjectRequest request = new PutObjectRequest()
{
    ContentBody = "this is a test",
    BucketName = bucketName,
    Key = keyName
};
PutObjectResponse response = client.PutObject(request);
This assumes you have instantiated an Amazon.S3.AmazonS3Client after including the namespace, and configured it with your own credentials.
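For illustration, that setup might look like the following; the region is an assumption, and with no explicit keys the client resolves credentials from the SDK's default chain (environment variables, the shared credentials file described in the next answer, and so on):

using Amazon;
using Amazon.S3;

// Assumed region; credentials come from the SDK's default resolution chain.
var client = new AmazonS3Client(RegionEndpoint.USEast1);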
You first need to install in the Package Manager Console:
Install-package AWSSDK.Extensions.NETCORE.Setup
Install-package AWSSDK.S3
Then you need to have the credentials file in the directory:
C:\Users\username\.aws\credentials
The credentials file should have this format:
[default]
aws_access_key_id=[Write your access key in here]
aws_secret_access_key=[Write your secret access key in here]
region=[Write your region here]
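With that file in place, AWSSDK.Extensions.NETCore.Setup lets you register the client for dependency injection, so credentials are resolved from the profile. A minimal sketch, assuming the usual Startup class of an ASP.NET Core project:

// In Startup.ConfigureServices; an optional "AWS" section in appsettings.json
// can override the profile and region.
public void ConfigureServices(IServiceCollection services)
{
    services.AddDefaultAWSOptions(Configuration.GetAWSOptions());
    services.AddAWSService<IAmazonS3>(); // lets you inject IAmazonS3 into controllers
}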
I uploaded to GitHub an example of a basic CRUD in ASP.NET Core for S3 buckets.
We came across an issue when implementing the high-level API in a .NET Core solution: when clients had low bandwidth (approximately 3 Mb/s), Amazon S3 threw an error ("The XML you provided was not well-formed"). To resolve this issue we had to make an implementation using the low-level API.
https://docs.aws.amazon.com/en_us/AmazonS3/latest/dev/LLuploadFileDotNet.html
// Create a list to store the upload part responses.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();

// Set up the information required to initiate the multipart upload.
InitiateMultipartUploadRequest initiateRequest = new InitiateMultipartUploadRequest
{
    BucketName = bucketName,
    Key = pathbucket
};

// Add metadata to the file
string newDate = DateTime.Now.ToString("dd/MM/yyyy HH:mm:ss");

// Initiate the upload.
InitiateMultipartUploadResponse initResponse = await s3Client.InitiateMultipartUploadAsync(initiateRequest);

int uploadmb = 5;

// Upload parts.
long contentLength = new FileInfo(zippath).Length;
long partSize = uploadmb * (long)Math.Pow(2, 20); // 5 MB

try
{
    long filePosition = 0;
    for (int i = 1; filePosition < contentLength; i++)
    {
        UploadPartRequest uploadRequest = new UploadPartRequest
        {
            BucketName = bucketName,
            Key = pathbucket,
            UploadId = initResponse.UploadId,
            PartNumber = i,
            PartSize = partSize,
            FilePosition = filePosition,
            FilePath = zippath
        };

        // Track upload progress.
        uploadRequest.StreamTransferProgress += new EventHandler<StreamTransferProgressArgs>(UploadPartProgressEventCallback);

        // Upload a part and add the response to our list.
        uploadResponses.Add(await s3Client.UploadPartAsync(uploadRequest));

        filePosition += partSize;
    }

    // Set up to complete the upload.
    CompleteMultipartUploadRequest completeRequest = new CompleteMultipartUploadRequest
    {
        BucketName = bucketName,
        Key = pathbucket,
        UploadId = initResponse.UploadId
    };
    completeRequest.AddPartETags(uploadResponses);

    // Complete the upload.
    CompleteMultipartUploadResponse completeUploadResponse = await s3Client.CompleteMultipartUploadAsync(completeRequest);
}
catch (Exception exception)
{
    Console.WriteLine("An AmazonS3Exception was thrown: {0}", exception.Message);

    // Abort the upload.
    AbortMultipartUploadRequest abortMPURequest = new AbortMultipartUploadRequest
    {
        BucketName = bucketName,
        Key = pathbucket,
        UploadId = initResponse.UploadId
    };
    await s3Client.AbortMultipartUploadAsync(abortMPURequest);
}