I want to set the cache control property of all previously uploaded blobs, but the code below throws the exception "The remote server returned an error: (404) Not Found."
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
List<CloudBlobContainer> containers = blobClient.ListContainers().ToList();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
List<IListBlobItem> blobs = container.ListBlobs().ToList();
int count = 0;
foreach (IListBlobItem blob in blobs)
{
CloudBlockBlob b = container.GetBlockBlobReference(blob.Uri.ToString());
b.Properties.CacheControl = "public, max-age=1296000";
b.SetProperties();
Console.WriteLine("cached"+count.ToString());
count++;
}
The error is being thrown at SetProperties.
You are doing a hierarchical listing, which can return virtual directories as well as blobs. For example, if you have a blob named "foo/bar", your listing will return a CloudBlobDirectory named "foo/". When you try to use this as a blob name, the service returns 404 because the blob does not exist.
To accomplish what you want, pass "useFlatBlobListing: true" to the call to ListBlobs and then cast each returned IListBlobItem to CloudBlob.
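For example, a minimal sketch of the corrected loop (assuming the same classic Microsoft.WindowsAzure.Storage SDK and the container variable from the question):
foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
{
    // With a flat listing every returned item is a blob, so the cast is safe.
    CloudBlob b = (CloudBlob)item;
    b.Properties.CacheControl = "public, max-age=1296000";
    b.SetProperties();
}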
Related
I have prepared some C# code to create a container in Azure Storage and then upload a file into that container. The code is below:
var connectionString = _settings.appConfig.StorageConnectionString;
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient blobContainer = blobServiceClient.GetBlobContainerClient("nasir-container");
await blobContainer.CreateIfNotExistsAsync(); // Create the container.
string fileName = "D:/Workspace/Adappt/MyWordFile.docx";
BlobClient blobClient = blobContainer.GetBlobClient(fileName); // Creating the blob
FileStream uploadFileStream = System.IO.File.OpenRead(fileName);
blobClient.Upload(uploadFileStream);
uploadFileStream.Close();
Now I have updated my MyWordFile.docx with more content. Now I would like to upload this updated file to the same blob storage. How can I do this? I also want versioning so that I can get the file content based on the version.
Now I have updated my MyWordFile.docx with more content. Now I would like to upload this updated file to the same blob storage. How can I do this?
To update a blob, you simply upload the same file again (essentially the same code you wrote to upload the file in the first place). With the v12 SDK, pass overwrite: true to the Upload call so that the upload replaces the existing blob instead of failing because the blob already exists.
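For example, a minimal sketch using the v12 SDK and the blobClient from the question (the overwrite flag is the only addition):
using (FileStream updatedStream = System.IO.File.OpenRead("D:/Workspace/Adappt/MyWordFile.docx"))
{
    // overwrite: true replaces the existing blob's content instead of throwing BlobAlreadyExists.
    blobClient.Upload(updatedStream, overwrite: true);
}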
I also want versioning so that I can get the file content based on the version.
There are two ways you can implement versioning for blobs:
Automatic versioning: If you want the Azure Blob Storage service to maintain versions of your blobs, all you need to do is enable versioning on the storage account. Once you enable that, any time a blob is modified a new version of the blob is created automatically for you by the service. Please see this link to learn more about blob versioning: https://learn.microsoft.com/en-us/azure/storage/blobs/versioning-overview.
Manual versioning: While automatic versioning is great, there could be many reasons to opt for manual versioning (e.g. you only want to version a few blobs rather than all of them, or you are not using a V2 storage account). In that case, you can create a version of the blob by taking a snapshot of it before you update it. A snapshot is a read-only copy of the blob as it existed at the time the snapshot was taken; a sketch is shown below. Please see this link to learn more about blob snapshots: https://learn.microsoft.com/en-us/azure/storage/blobs/snapshots-overview.
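A minimal sketch of the manual approach with the v12 SDK (assuming the blobClient and fileName from the question; CreateSnapshot and WithSnapshot are the relevant APIs):
// Take a read-only snapshot of the current content before overwriting it.
Response<BlobSnapshotInfo> snapshotResponse = blobClient.CreateSnapshot();
string snapshotId = snapshotResponse.Value.Snapshot;
using (FileStream updated = System.IO.File.OpenRead(fileName))
{
    blobClient.Upload(updated, overwrite: true);
}
// Later, read the previous version through its snapshot.
BlobClient previousVersion = blobClient.WithSnapshot(snapshotId);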
First you need to enable versioning on the storage account through the portal.
Click Disabled; this takes you to a page where you can select Enabled for blob versioning, then click Save.
After that, whenever you update a blob you have uploaded, a new version is created automatically.
public static async Task UpdateVersionedBlobMetadata(BlobContainerClient blobContainerClient,
string blobName)
{
try
{
// Create the container.
await blobContainerClient.CreateIfNotExistsAsync();
// Upload a block blob.
BlockBlobClient blockBlobClient = blobContainerClient.GetBlockBlobClient(blobName);
string blobContents = string.Format("Block blob created at {0}.", DateTime.Now);
byte[] byteArray = Encoding.ASCII.GetBytes(blobContents);
string initalVersionId;
using (MemoryStream stream = new MemoryStream(byteArray))
{
Response<BlobContentInfo> uploadResponse =
await blockBlobClient.UploadAsync(stream, null, default);
// Get the version ID for the current version.
initalVersionId = uploadResponse.Value.VersionId;
}
// Update the blob's metadata to trigger the creation of a new version.
Dictionary<string, string> metadata = new Dictionary<string, string>
{
{ "key", "value" },
{ "key1", "value1" }
};
Response<BlobInfo> metadataResponse =
await blockBlobClient.SetMetadataAsync(metadata);
// Get the version ID for the new current version.
string newVersionId = metadataResponse.Value.VersionId;
// Request metadata on the previous version.
BlockBlobClient initalVersionBlob = blockBlobClient.WithVersion(initalVersionId);
Response<BlobProperties> propertiesResponse = await initalVersionBlob.GetPropertiesAsync();
PrintMetadata(propertiesResponse);
// Request metadata on the current version.
BlockBlobClient newVersionBlob = blockBlobClient.WithVersion(newVersionId);
Response<BlobProperties> newPropertiesResponse = await newVersionBlob.GetPropertiesAsync();
PrintMetadata(newPropertiesResponse);
}
catch (RequestFailedException e)
{
Console.WriteLine(e.Message);
Console.ReadLine();
throw;
}
}
static void PrintMetadata(Response<BlobProperties> propertiesResponse)
{
if (propertiesResponse.Value.Metadata.Count > 0)
{
Console.WriteLine("Metadata values for version {0}:", propertiesResponse.Value.VersionId);
foreach (var item in propertiesResponse.Value.Metadata)
{
Console.WriteLine("Key:{0} Value:{1}", item.Key, item.Value);
}
}
else
{
Console.WriteLine("Version {0} has no metadata.", propertiesResponse.Value.VersionId);
}
}
The above code is from the Microsoft documentation on enabling and managing blob versioning.
I am trying to get a file from Azure container. I need to read its content.
The file has been uploaded to umbraco media, media are stored in our Azure container.
Its normal (Umbraco) URL would be something like:
~/media/10890/filename.xls
I am trying to retrieve it like this:
var storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["strorageconnstring"]);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("storagemedia");
The thing is, I am not sure how I am supposed to retrieve a particular file. I tried:
1.
CloudBlobDirectory dira = container.GetDirectoryReference("10890"); // file folder within media
var list = dira.ListBlobs(useFlatBlobListing: true).ToList(); // Returns error saying "The requested URI does not represent any resource on the server."
However, the 10890 folder within the media storage exists and I can browse it with the storage browser.
2.
CloudBlockBlob blobFile = container.GetBlockBlobReference("10890/filename.xls");
string text;
using (var memoryStream = new MemoryStream())
{
blobFile.DownloadToStream(memoryStream); // Throws "The specified resource name contains invalid characters." error
var length = memoryStream.Length;
text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
}
Any idea how to read the file? And what am I doing wrong?
Thank you, Gaurav, for providing your suggestion in the comment section.
Thank you, nicornotto, for confirming that your issue was resolved by changing the container name reference in the statement below.
var container = blobClient.GetContainerReference("storagemedia");
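For completeness, a hedged sketch of reading the file once the container reference points at the correct container (the real container name is site-specific; "media" below is only a placeholder):
var container = blobClient.GetContainerReference("media"); // placeholder: use your actual container name
CloudBlockBlob blobFile = container.GetBlockBlobReference("10890/filename.xls");
string text;
using (var memoryStream = new MemoryStream())
{
    blobFile.DownloadToStream(memoryStream);
    text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
}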
This question already has answers here:
How upload blob in Azure Blob Storage with specified ContentType with .NET v12 SDK?
I'm using an ASP.NET Core Web API to upload images to Azure Storage.
I was able to successfully upload an image blob to Azure Storage (using the quickstart). However, the content-type property in Azure is set to application/octet-stream. The problem with this is that the public URL will not load in a browser due to this content type, and I plan to eventually consume this URL/image in my website. Is there any way to specify the content type as image/jpeg? I've also tried the following code but received the error message 404 (The specified blob does not exist.) during the SetHttpHeaders call (the UploadBlob method call that is currently commented out does work, but produces the octet-stream content type).
BlobClient blobClient = containerClient.GetBlobClient(guids[index]);
using (var content = file.OpenReadStream())
{
blobClient.Upload(content);
blobClient.SetHttpHeaders(new BlobHttpHeaders() { ContentType = "image/jpeg" });
//containerClient.UploadBlob(guids[index], content);
}
I don't set a specific content type; when the blob name includes the file name with its extension (filename.[ext]), the download works fine. When you create the BlobClient reference, include the extension in the blob name. Example download:
var fileName = $"{guids[index]}.jpg";
var pathStorage = Path.Combine(path, fileName);
BlobClient blobClient = containerClient.GetBlobClient(pathStorage);
BlobDownloadInfo download = await blobClient.DownloadAsync();
byte[] bytesContent;
using (var ms = new MemoryStream())
{
await download.Content.CopyToAsync(ms);
bytesContent = ms.ToArray();
}
return bytesContent;
Example upload:
var fileName = $"{guids[index]}.jpg";
var pathStorage = Path.Combine(path, fileName);
BlobClient blobClient = containerClient.GetBlobClient(pathStorage);
var stream = new MemoryStream(bytesContent);
var uploadInfo = await blobClient.UploadAsync(stream);
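If you do need a specific content type (as in the question and the linked duplicate), the v12 SDK also lets you set it at upload time instead of calling SetHttpHeaders afterwards. A hedged sketch, reusing blobClient and bytesContent from above:
var uploadOptions = new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = "image/jpeg" }
};
await blobClient.UploadAsync(new MemoryStream(bytesContent), uploadOptions);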
I am having trouble reading an ORC file from S3 with the OrcFile.createReader option. I am using hive-exec-2.2.0.jar at the moment and am wondering if this is supported at all. Am I missing any configuration settings? See the code below. Any help will be appreciated.
String accessKey = "***";
String secretKey = "***";
Configuration configuration = new Configuration();
configuration.set("fs.s3.awsAccessKeyId", accessKey);
configuration.set("fs.s3.awsSecretAccessKey", secretKey);
configuration.set("fs.defaultFS", "s3://<bucket>");
//configuration.set("fs.default.name", "s3://<bucket>");
//configuration.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem");
FileSystem fs = FileSystem.get(configuration);
Reader reader = OrcFile.createReader(new Path("/some/path/file.orc"), OrcFile.readerOptions(configuration).filesystem(fs));
Exception - java.io.IOException: No such file.
The ReaderImpl seems to require either the fileMetadata or the OrcTail (both of which are null). Is there anything I might be missing?
Update: I managed to get past the file-not-found exception by creating the S3 object with additional metadata (and also fixing the key):
--metadata="fs=Hadoop,fs-type=block,fs-version=1". See --metadata.
It seems wrong/weird that the ORC file on S3 has to carry those metadata values for the reader to successfully retrieve the metadata.
Of course, after this it fails when reading the data, probably because the file formats differ(?).
In INode of the org.apache.hadoop.fs.s3 package:
public static INode deserialize(InputStream in) throws IOException {
if (in == null) {
return null;
} else {
DataInputStream dataIn = new DataInputStream(in);
INode.FileType fileType = FILE_TYPES[dataIn.readByte()];
dataIn.readByte() is returning a value outside the array bounds (FILE_TYPES is an array of size 2).
Is it possible to download a specific block from an Azure Block Blob if you know the Block Id?
Yes, you absolutely can; here's an example of how to download the first block:
var storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net");
CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
var container = serviceClient.GetContainerReference("containerName");
var blockBlob = container.GetBlockBlobReference("blobName");
var blocklist = await blockBlob.DownloadBlockListAsync();
var firstBlock = blocklist.First();
var memStream = new MemoryStream();
await blockBlob.DownloadRangeToStreamAsync(memStream, 0, firstBlock.Length);
memStream.Position = 0; // rewind the stream before reading the downloaded bytes
string contents;
using (var streamReader = new StreamReader(memStream))
{
contents = await streamReader.ReadToEndAsync();
}
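To fetch a block other than the first one, its offset is the sum of the lengths of the blocks that precede it in the committed block list. A hedged sketch reusing blockBlob and blocklist from the code above (blockIndex is hypothetical):
int blockIndex = 2; // hypothetical: the index of the block you want
var blocks = blocklist.ToList();
long offset = blocks.Take(blockIndex).Sum(b => b.Length);
var blockStream = new MemoryStream();
await blockBlob.DownloadRangeToStreamAsync(blockStream, offset, blocks[blockIndex].Length);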
You will need the Microsoft.WindowsAzure.Storage package from NuGet; the blob types used above live in the Microsoft.WindowsAzure.Storage.Blob namespace.
You could leverage the Microsoft Azure Storage SDK to get started with Azure Blob Storage quickly; the SDK is a wrapper around the Blob Service REST API. The official Blob Service REST API reference does not document any operation for downloading a specific block by its Block Id. However, you can use Get Blob to download the bytes of your blob in a specified range by providing the offset and the length of the data to download.