Is there a solution to get the information below about an Azure File storage account using the Windows Azure Storage Client Library:
Azure Storage Account Capacity
Azure Storage Free and used Space
Azure Storage Account State (Active, Disabled, Enabled …)
Client-transferred files (MB, GB …) per month, day …
Azure Storage Account Performance
...
Thanks
As far as I know, an Azure standard storage account contains multiple services: Blob, Table, Queue, and File.
If you want to know about the file service, you could use the Windows Azure Storage Client Library. If you want to know about your storage account itself, I suggest you use the Azure management library.
Azure Storage Account Capacity
As far as I know, the Azure storage account capacity is 500 TB.
The max size of a file share is 5 TB.
The max size of a file is 1 TB.
We could create multiple file shares in one storage account; the only limit is the 500 TB storage account capacity.
For more details, you could refer to this article.
Azure Storage Free and used Space
As far as I know, we could only get the quota and usage of a file share by using the Windows Azure Storage Client Library.
We could use the CloudFileShare.Properties.Quota property to get the quota of the file share and the CloudFileShare.GetStats method to get the usage of the file share.
For more details, you could refer to the code below:
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connectionstring");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("fileshare");

// Populate the share's properties, including the quota
share.FetchAttributes();

// Quota of the file share, in GB (null if no quota is set)
int? quota = share.Properties.Quota;

// Current usage of the file share, in GB
ShareStats stats = share.GetStats();

Console.WriteLine(quota);
Console.WriteLine(stats.Usage);
Azure Storage Account State (Active, Disabled, Enabled …)
As far as I know, we couldn't get the storage account state by using the storage SDK. If you want to get this value, I suggest you use the Azure management library, which you could install from its NuGet package. You could get StorageAccount.Properties.Status from the StorageAccounts class.
For more details about how to use the Azure management library to access the storage account, you could refer to this article.
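For illustration, here is a minimal sketch assuming the classic Microsoft.WindowsAzure.Management.Storage package with certificate-based credentials; the subscription id, certificate path, and account name are placeholders:

using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Management.Storage;

// Placeholder subscription id and management certificate
var credentials = new CertificateCloudCredentials(
    "your-subscription-id",
    new X509Certificate2("management-certificate.cer"));

using (var client = new StorageManagementClient(credentials))
{
    // Look up one storage account and print its status (e.g. Created, Creating)
    var response = client.StorageAccounts.Get("mystorageaccount");
    Console.WriteLine(response.StorageAccount.Properties.Status);
}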
Client-transferred files (MB, GB …) per month, day …
As far as I know, the Windows Azure Storage Client Library doesn't contain a method to get the files transferred by clients (MB, GB …) per month or day.
Here is a workaround: you could write code in your application that counts the transferred files and stores this number in Azure table storage per day (when uploading a file to Azure File storage, first get the number from the table, add one, then write the number back to table storage); see the sketch below.
If you want to get the number of transferred files, you could use the Azure table storage SDK to read the result back.
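As a rough illustration of that workaround, here is a minimal sketch assuming the Microsoft.WindowsAzure.Storage.Table API; the table layout and the TransferCount entity are hypothetical:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical counter entity: one row per day, keyed by date
public class TransferCount : TableEntity
{
    public TransferCount() { }
    public TransferCount(string day)
    {
        PartitionKey = "transfers";
        RowKey = day; // e.g. "2017-05-30"
    }
    public int Count { get; set; }
}

public static class TransferCounter
{
    // Call this once per file upload to bump today's counter
    public static void IncrementToday(CloudTable table)
    {
        string today = DateTime.UtcNow.ToString("yyyy-MM-dd");
        var retrieve = TableOperation.Retrieve<TransferCount>("transfers", today);
        var counter = (TransferCount)table.Execute(retrieve).Result
                      ?? new TransferCount(today);
        counter.Count++;
        table.Execute(TableOperation.InsertOrReplace(counter));
    }
}

Note that this read-increment-write is not safe under concurrent uploads; to harden it you could use TableOperation.Replace with the ETag returned by the retrieve and retry when the service answers 412 Precondition Failed.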
Azure Storage Account Performance
As far as I know, if we want to check our Azure storage account performance, we should first enable diagnostics to log how the storage works. Then we could check the storage performance by using its services' metrics.
For more details about how to access metrics data by using the Windows Azure Storage Client Library, I suggest you refer to this article.
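For illustration, here is a minimal sketch that enables hourly metrics on the blob service through the client library (the connection string is a placeholder); once enabled, the metrics data lands in tables such as $MetricsHourPrimaryTransactionsBlob:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

CloudStorageAccount account = CloudStorageAccount.Parse("connectionstring");
CloudBlobClient blobClient = account.CreateCloudBlobClient();

// Read the current service properties, then turn on hourly metrics
ServiceProperties properties = blobClient.GetServiceProperties();
properties.HourMetrics.MetricsLevel = MetricsLevel.ServiceAndApi;
properties.HourMetrics.RetentionDays = 7;
properties.HourMetrics.Version = "1.0";
blobClient.SetServiceProperties(properties);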
Related
We have a classic cloud storage account in Azure. It does not hold any tables, queues, or file shares, only containers, and the containers hold blob data. It does not have any VM disks. The following are a few questions that I have:
Would the access keys change post-migration to ARM (both primary and secondary)?
If there are any SAS tokens generated with the classic cloud storage account, would those also change post-migration to ARM?
Is there any cost increase for the same set of data stored post-migration to ARM?
Would there be any change in the URLs post-ARM migration?
Is there a specific amount of time the ARM migration would take?
I have not tried it yet, but would like to make the decision based on the reply.
We are setting up an active/active configuration using either Front Door or Traffic Manager as our front end. Our services are located in the Central US and East US 2 paired regions. There is an AKS cluster in each region. The AKS clusters will write data to a storage account located in their region. However, the files in the storage accounts must be the same in each region. The storage accounts must be zone redundant and read/writable in each region at all times, thus none of the Microsoft replication strategies work. This replication must be automatic; we can't have any manual process to do the copy. I looked at Data Factory but it seems to be regional, so I don't think that would work, but it's a possibility... maybe. Does anyone have any suggestions on the best way to accomplish this task?
I have tested this in my environment.
Replication between two storage accounts can be implemented using a Logic App.
In the Logic App, we can create two workflows: one for replicating data from storage account 1 to storage account 2, and the other for replicating data from storage account 2 to storage account 1.
I have tried to replicate blob data between storage accounts in different regions.
The workflow is:
When a blob is added or modified in storage account 1, the blob will be copied to storage account 2.
Trigger: When a blob is added or modified (properties only) (V2) (use the connection settings of storage account 1)
Action: Copy blob (V2) (use the connection settings of storage account 2)
In a similar way, we can create another workflow for replicating data from storage account 2 to storage account 1.
Now the data will be replicated between the two storage accounts.
The Metrics can show total transactions for all blobs in a storage account, but I cannot filter by container or blob.
Thanks.
Lidong
The Metrics can show total transactions for all blobs in a storage account, but I cannot filter by container or blob.
We could find the detailed logs in the Azure storage $logs container. For how to enable and access the log data, please refer to this tutorial.
After that, we could use Microsoft Message Analyzer to analyze the log data.
You could then filter for the data you want.
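For example, here is a minimal sketch that lists the raw log blobs for one day, assuming blob-service logging is already enabled (the connection string and the date prefix are placeholders):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount account = CloudStorageAccount.Parse("connectionstring");
CloudBlobClient blobClient = account.CreateCloudBlobClient();

// Logs live in the special $logs container, organized as
// <service>/<year>/<month>/<day>/<hour>/
CloudBlobContainer logs = blobClient.GetContainerReference("$logs");
foreach (IListBlobItem item in logs.ListBlobs("blob/2017/05/30/", useFlatBlobListing: true))
{
    Console.WriteLine(item.Uri);
}

Each log blob is a semicolon-delimited text file, so you could also parse the entries yourself and filter on the requested object key to narrow the results down to one container or blob.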
I have big files ranging in size from 20 GB to 90 GB. I will download the files with Internet Download Manager (IDM) to my Windows server on an Azure Virtual Machine. I will then need to transfer these files to my Azure Storage account to use later. The total size of the files is about 550 GB.
Will Azure Storage Explorer do the job, or is there a better solution?
My Azure account is a BizSpark one with a $150 limit; shall I remove the limit before transferring the files to the storage account?
Any other advice?
Thanks very much in advance.
You should look at the AzCopy tool (http://aka.ms/AzCopy) - it is designed for large transfers of data to and from Azure Storage.
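For example, with the classic AzCopy command-line syntax (the local folder, account name, container, and key are placeholders), /S uploads the folder recursively:

AzCopy /Source:C:\Downloads /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:<account-key> /S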
You will save network egress cost if your storage account is in the same region as the VM you are uploading from.
As for cost, this depends on what you are using. You can use the Azure price calculator (http://azure.microsoft.com/en-us/pricing/calculator/) to help with estimating, or just use the pricing info directly from the Azure website and calculate an estimated usage to see whether you will fit within your $150 limit.
Hello, I have two blobs in my account:
Blob1
Blob2
Blob2 is empty; how can I take all the containers from Blob1 and move them to Blob2?
I am doing this because I would like to use a different subscription to help save some money. It doesn't seem like it's possible any other way.
This is all under the same Windows Live account.
Thank you!
I am glad to hear that Azure Support was able to reassign your subscription. In the future, if you would like to copy Azure Storage blobs from one account to another, you can use the Copy Blob REST API. If you are using Azure Storage Client Library, the corresponding method is ICloudBlob.StartCopyFromBlob. The Blob service copies blobs on a best-effort basis and you can use the value of x-ms-copy-id header to check the status of a specific copy operation.
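For illustration, here is a minimal sketch assuming the older Storage Client Library API; the connection strings, container, and blob names are placeholders, and for a cross-account copy the source blob must be publicly readable or carry a SAS token:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount sourceAccount = CloudStorageAccount.Parse("source-connection-string");
CloudStorageAccount destAccount = CloudStorageAccount.Parse("dest-connection-string");

CloudBlockBlob sourceBlob = sourceAccount.CreateCloudBlobClient()
    .GetContainerReference("mycontainer").GetBlockBlobReference("myblob.txt");
CloudBlockBlob destBlob = destAccount.CreateCloudBlobClient()
    .GetContainerReference("mycontainer").GetBlockBlobReference("myblob.txt");

// The service copies asynchronously; the returned copy id identifies
// the pending operation
string copyId = destBlob.StartCopyFromBlob(sourceBlob);
Console.WriteLine(copyId);

You could then call destBlob.FetchAttributes() and poll destBlob.CopyState.Status until it reports Success.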