I have big files ranging in size from 20 GB to 90 GB. I will download the files with Internet Download Manager (IDM) to my Windows server on an Azure virtual machine, and I will then need to transfer them to my Azure Storage account for later use. The total size of the files is about 550 GB.
Will Azure Storage Explorer do the job, or is there a better solution?
My Azure account is a BizSpark one with a $150 limit; should I remove the limit before transferring the files to the storage account?
Any other advice?
Thanks very much in advance.
You should look at the AzCopy tool (http://aka.ms/AzCopy) - it is designed for large transfers of data to and from Azure Storage.
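For example, a minimal invocation with the classic (pre-v10) AzCopy syntax could look like the following; the source folder, account name, container name, and key are placeholders:
AzCopy /Source:C:\Downloads /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:<storage-account-key> /Pattern:*.*
AzCopy can also resume interrupted transfers, which matters for files in the 20-90 GB range.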
You will save network egress cost if your storage account is in the same region as the VM where you are uploading from.
As for cost, it depends on which services you are using. You can use the Azure price calculator (http://azure.microsoft.com/en-us/pricing/calculator/) to help with estimating, or use the pricing info directly from the Azure website to calculate an estimated usage and see whether you will fit within your $150 limit.
Is there a way to get the following information about an Azure File storage account using the Windows Azure Storage Client Library:
Azure Storage Account Capacity
Azure Storage Free and Used Space
Azure Storage Account State (Active, Disabled, Enabled, …)
Client file transfers (MB, GB, …) per month or day
Azure Storage Account Performance
...
Thanks
As far as I know, an Azure standard storage account contains multiple services: Blob, Table, Queue, and File.
If you want information about the File service, you could use the Windows Azure Storage Client Library. If you want information about your storage account itself, I suggest you use the Azure Management Library.
Azure Storage Account Capacity
As far as I know, the capacity of an Azure storage account is 500 TB.
The maximum size of a file share is 5 TB.
The maximum size of a file is 1 TB.
We can create multiple file shares in one storage account; the only limit is the 500 TB storage account capacity.
For more details, you could refer to this article.
Azure Storage Free and Used Space
As far as I know, we can only get the quota and usage of a file share by using the Windows Azure Storage Client Library.
We can use the CloudFileShare.Properties.Quota property to get the quota of the file share and the CloudFileShare.GetStats method to get its usage.
For more details, refer to the code below:
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    "connectionstring");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("fileshare");

// Populate the share's properties, including its quota.
share.FetchAttributes();

// Quota of the file share, in GB (null if no quota is set).
int? quota = share.Properties.Quota;

// Current usage of the file share, in GB.
ShareStats stats = share.GetStats();

Console.WriteLine(quota);
Console.WriteLine(stats.Usage);
Azure Storage Account State (Active, Disabled, Enabled, …)
As far as I know, we can't get the storage account state by using the storage SDK. If you want this value, I suggest you use the Azure Management Library, which you can install from its NuGet package. You can get StorageAccount.Properties.Status from the StorageAccounts class.
For more details about how to use the Azure Management Library to access storage accounts, you could refer to this article.
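As a rough sketch, assuming the Microsoft.WindowsAzure.Management.Storage NuGet package and certificate-based credentials (the subscription ID, certificate path, and account name below are placeholders):

using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Management.Storage;

// A management certificate that has been uploaded to the subscription.
var cert = new X509Certificate2("management-cert.pfx", "cert-password");
var credentials = new CertificateCloudCredentials("subscription-id", cert);
using (var client = new StorageManagementClient(credentials))
{
    var response = client.StorageAccounts.Get("mystorageaccount");
    // Status is an enum value such as Created, Creating, or Deleting.
    Console.WriteLine(response.StorageAccount.Properties.Status);
}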
Client file transfers (MB, GB, …) per month or day
As far as I know, the Windows Azure Storage Client Library doesn't have a method to get the client file transfers (MB, GB, …) per month or per day.
As a workaround, you could have your application count the transferred files itself and store the count in Azure Table Storage per day (when uploading a file to Azure File storage, first get the number from the table and increment it, then write the number back to the table).
If you then want the number of transferred files, you can use the Azure Table Storage SDK to read the result.
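A rough sketch of that workaround with the WindowsAzure.Storage table SDK (the table layout, entity shape, and helper method are illustrative, not part of the library):

using System;
using Microsoft.WindowsAzure.Storage.Table;

public class TransferEntity : TableEntity
{
    public TransferEntity() { }
    public TransferEntity(string day)
    {
        PartitionKey = "transfers";
        RowKey = day; // e.g. "2016-11-28"
    }
    public int FileCount { get; set; }
    public long Bytes { get; set; }
}

// Call once per uploaded file: read today's counter, increment it, write it back.
static void RecordTransfer(CloudTable table, long fileSizeInBytes)
{
    string day = DateTime.UtcNow.ToString("yyyy-MM-dd");
    var retrieve = TableOperation.Retrieve<TransferEntity>("transfers", day);
    var entity = (TransferEntity)table.Execute(retrieve).Result ?? new TransferEntity(day);
    entity.FileCount++;
    entity.Bytes += fileSizeInBytes;
    table.Execute(TableOperation.InsertOrReplace(entity));
}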
Azure Storage Account Performance
As far as I know, if we want to check our Azure storage account performance, we should first enable diagnostics so that the storage service logs how it is working. Then we can check the storage performance by using the service's metrics.
For more details about how to access metrics data by using the Windows Azure Storage Client Library, you could refer to this article.
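For example, a minimal sketch of enabling hourly metrics for the File service through the client library (the retention period here is arbitrary):

using Microsoft.WindowsAzure.Storage.Shared.Protocol;

// Reuses the fileClient from the earlier snippet.
var serviceProperties = fileClient.GetServiceProperties();
serviceProperties.HourMetrics.MetricsLevel = MetricsLevel.ServiceAndApi;
serviceProperties.HourMetrics.RetentionDays = 7;
fileClient.SetServiceProperties(serviceProperties);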
I am trying to copy 25 TB of data to Azure. Do we have any option to move the data?
I tried to copy it, but it took 1 hour for 1 GB of data. Do we have a better solution so that I can do it more quickly?
The problem statement is very general. I would start by asking: how are you transferring the data?
The speed depends on many factors, a few being:
1. Location of the data.
2. Location of the storage account you're writing to.
3. Network speed and bandwidth on the client side.
4. Network speed and bandwidth on the Azure storage side (expected to be good).
If you're writing the data to an Azure Storage account in a region close to you, you can expect better speed.
As for the options to write the data:
1. Look at AzCopy: https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/
2. Use the Import/Export service: https://azure.microsoft.com/en-us/pricing/details/storage-import-export/
The best way to upload large datasets into the cloud is still the sneakernet.
Azure has an offering called the Azure Import/Export Service. Basically, you buy a SATA hard drive, encrypt it with a numeric BitLocker key, copy your data onto it, create an Azure import job, and then ship the hard drive to Microsoft.
This ends up being considerably quicker than trying to upload.
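To put that in perspective: at the 1 GB/hour rate observed in the question, 25 TB (roughly 25,600 GB) would take on the order of 25,600 hours of continuous uploading, which is about three years.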
An alternative you might want to look into is AWS Import/Export Snowball, for which AWS ships you an appliance to copy the data onto, which you then ship back to them when complete. It might be worth copying the data into AWS via Snowball and then moving it across their much faster internet pipes into Azure, instead of buying the hardware required to transfer that much data.
If you open the target storage account in the Azure Portal, there's now a calculator that will accept basic details (how much data, etc.) and then recommend the best options to you. It's under the heading "Data transfer".
I have a couple of zipped shapefiles with around 100-150 features each. I am trying to add them to ArcGIS Online (which accepts under 1,000 features per shapefile), but it is unable to do so, indicating that the zipped shapefile is too big.
I am not sure why, since the feature counts are well under 1,000.
You may be encountering a problem with file size and/or other data on your account, rather than the record limit.
How much storage space do I get?
Subscriptions provide flexible storage capacity options for your organization. If you have an organizational account, check with your administrator for information about your storage limit. If you are an administrator, you can view detailed reports about your organization's storage of tiles, features, and files. A public account comes with 2 GB of total storage space.
Also note:
Organizational and public accounts can upload items through My Content that are up to 1 GB in size. This is a browser limit; larger file sizes may be supported when uploading through desktop applications such as ArcGIS for Desktop.
I have a query about pricing for the Azure File Service: is it billed per GB of file data actually used, or based on the available size of the file share (e.g., 5 TB per share)?
I am referring to this link, http://azure.microsoft.com/en-us/pricing/details/storage/, but it does not give me the exact pricing.
My scenario is, let's say:
1) For two months I would require 4 GB of file data on the file share, and then for another 4 months I would require 5 GB of file data; how will the cost be calculated?
2) Would I require two VMs to maintain availability, and how will the cost for the two VMs' usage be calculated?
Kindly help me with this.
The price is per GB used.
Regarding your questions:
The share has a maximum size of 5 TB, but you are only billed for what you actually use, not the maximum size. In your example, you will be billed for 4 GB for the first 2 months and 5 GB for the next 4.
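As a purely illustrative calculation, assuming a hypothetical rate of $0.05 per GB per month (check the pricing page for the actual rate for your region and redundancy option): 2 months × 4 GB × $0.05 + 4 months × 5 GB × $0.05 = $0.40 + $1.00 = $1.40 for the six months.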
You do not require any VMs to maintain availability of the shares themselves; Azure Storage offers an SLA on share uptime. Of course, if you have an app running in a VM and you need guaranteed uptime for the app, you have to look at how many VMs you need, but that is one level above storage, and those needs are not based on storage uptime considerations.
I am working on a project using Azure Table Storage. I am trying to document the network latency between my web role and Table storage. Does anyone know where I can find some preliminary numbers I could use for estimation?
Thanks
JThomas
Anecdotally, I expect a latency somewhere between 10 and 30 milliseconds if both the VM and the storage account are in the same data center.
It depends on which generation of Azure Storage your Table entities were created in. Here is information for both:
http://blogs.msdn.com/b/windowsazure/archive/2012/11/02/windows-azure-s-flat-network-storage-and-2012-scalability-targets.aspx
It has scalability targets and some network information. Network latency will be variable, but there are ways to mitigate it: place the web role and Table storage in the same data center.
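If you would rather measure than estimate, here is a minimal sketch using the WindowsAzure.Storage SDK (the connection string, table name, and keys are placeholders):

using System;
using System.Diagnostics;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse("connectionstring");
var table = account.CreateCloudTableClient().GetTableReference("mytable");

// Warm up the connection so the first-request handshake doesn't skew the numbers.
table.Exists();

const int iterations = 100;
var sw = Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    // A point query (PartitionKey + RowKey) is the cheapest round trip.
    table.Execute(TableOperation.Retrieve("pk", "rk"));
}
sw.Stop();
Console.WriteLine("Average round trip: {0} ms", sw.ElapsedMilliseconds / (double)iterations);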