Moving Azure storage containers from one blob to another - azure-storage

Hello I have two blobs in my account:
Blob1
Blob2
Blob2 is empty, how can I take all the containers from Blob1 and move them to Blob2?
I am doing this because I would like to use a different subscription to help save some money. It doesn't seem like it's possible any other way.
This is all under the same Windows Live account.
Thank you!

I am glad to hear that Azure Support was able to reassign your subscription. In the future, if you would like to copy Azure Storage blobs from one account to another, you can use the Copy Blob REST API. If you are using the Azure Storage Client Library, the corresponding method is ICloudBlob.StartCopyFromBlob. The Blob service copies blobs on a best-effort basis, and you can use the value of the x-ms-copy-id header to check the status of a specific copy operation.
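For reference, here is a minimal sketch of a cross-account copy using the Python azure-storage-blob (v12) SDK; the connection strings, container names, blob name and account key are placeholders, and the SAS step assumes the source container is not publicly readable:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

# Placeholders: substitute your own connection strings, containers and blob name.
src_service = BlobServiceClient.from_connection_string("<source-connection-string>")
dst_service = BlobServiceClient.from_connection_string("<destination-connection-string>")

src_blob = src_service.get_blob_client("source-container", "photo.jpg")
dst_blob = dst_service.get_blob_client("destination-container", "photo.jpg")

# A short-lived read SAS lets the destination account fetch the source blob.
sas = generate_blob_sas(
    account_name=src_blob.account_name,
    container_name=src_blob.container_name,
    blob_name=src_blob.blob_name,
    account_key="<source-account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# Server-side, asynchronous copy; the returned copy_id matches x-ms-copy-id.
copy = dst_blob.start_copy_from_url(f"{src_blob.url}?{sas}")
print(copy["copy_id"], copy["copy_status"])
```

You can then poll dst_blob.get_blob_properties().copy.status until the copy reports success.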

Related

Azure Sentinel referencing large sets of data

I've been trying to find the most effective (elegant) solution to achieve what I'm trying to do. I'd like to hear from the community, thank you.
Situation:
Need to geo-enrich IP address records in Sentinel. Example: successful SigninLogs, since the MSFT enrichment sometimes generates "Unknown" results in the IP enrichment maps.
An external reference file (subnet, country_code, country_name) is available publicly; however, the size and number of records are rather large (~12 MB, 200K+ records).
Issue:
Tried using a storage account blob to host the "reference table", but apparently hit the limit on maximum blob size in the storage account.
It looks like Workbooks can read a maximum of 30,000 records from external sources using the 'externaldata' operator, so only part of the reference data can be read and referenced.
Options considered:
Ingest the reference table into the log analytics workspace, do a join/lookup to this custom reference table for enrichment
Export the IP addresses from the SigninLogs table to blob storage, enrich the IP addresses using Logic Apps, and then put the results back into a 'reference' blob storage. Then read the 'reference' blob storage using the 'externaldata' syntax.
Limitation Observed:
I came to the realization that Sentinel can't perform API calls for enrichment from external data (correct me if I'm wrong). I've done similar things with Splunk, where we could enrich the data on the fly by making multiple API calls to an outside database.
Ingest the Data - As you've mentioned, ingest the data and join the tables (a minimal ingestion sketch follows this list). You would need to regularly ingest this, though, to ensure you can look up the data within the desired time range (e.g. if you have an Analytics Rule, it only looks up data for a 14-day period).
Use a Playbook - If you want the Geo-IP lookup post-incident, you can perform this with a Logic App
Use Jupyter Notebooks - These have the flexibility to perform API calls against external locations and join the data to that hosted in Sentinel. An example notebook is the IP Explorer Notebook. Use Jupyter notebooks to hunt for security threats
Threat Intelligence - Microsoft enriches all imported threat intelligence indicators with GeoLocation and WhoIs data, which is displayed together with other indicator details.
Since March 2022, you can upload large CSV files into a Sentinel Watchlist. This way, you can upload a complete GeoIP database and perform ipv4_lookups. This blog post explains how to do this: https://cryptsus.com/blog/enrich-geolocation-sentinel-siem.html
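For the "Ingest the Data" option above, here is a minimal, hedged Python sketch of pushing the reference CSV into a Log Analytics custom table with the azure-monitor-ingestion client; the data collection endpoint, DCR immutable ID, stream name, and column names are placeholders that you would have to set up beforehand:

```python
# pip install azure-identity azure-monitor-ingestion
import csv
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Placeholders: a data collection endpoint, a data collection rule and a custom
# table (the "stream") must already exist for the workspace.
client = LogsIngestionClient(
    endpoint="https://<my-dce>.ingest.monitor.azure.com",
    credential=DefaultAzureCredential(),
)

with open("geoip_reference.csv", newline="") as f:
    rows = [
        {
            "Subnet": r["subnet"],
            "CountryCode": r["country_code"],
            "CountryName": r["country_name"],
        }
        for r in csv.DictReader(f)
    ]

# Push the ~200K reference rows into the custom table; once ingested, they can
# be joined against SigninLogs in KQL like any other table.
client.upload(rule_id="<dcr-immutable-id>", stream_name="Custom-GeoIP_CL", logs=rows)
```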

How to copy many terabytes of data to Azure?

I am trying to copy 25 TB of data to Azure. Do we have any option to move the data?
I tried to copy, but it took 1 hour for 1 GB of data. Do we have any better solution so that I can do it more quickly?
The problem statement is very general. I would start by asking: how are you transferring the data?
The speed is dependent on so many factors, a few being:
1. Location of the data.
2. Location of the storage account you're writing to.
3. Network speed and bandwidth on the client side.
4. Network speed and bandwidth on the Azure Storage side (expected to be good).
If you're writing the data to an Azure Storage account in a region closer to you, you can expect better speed.
As for the options to write the data:
1. Look at AzCopy (a scripted alternative is sketched below):
https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/
2. Use the Import/Export service:
https://azure.microsoft.com/en-us/pricing/details/storage-import-export/
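If you do end up scripting part of the transfer yourself rather than using AzCopy, this hedged Python sketch with azure-storage-blob shows the same idea AzCopy relies on, uploading blocks in parallel; the connection string, container and file names are placeholders:

```python
from azure.storage.blob import BlobClient

# Placeholders: substitute your own connection string, container and file name.
blob = BlobClient.from_connection_string(
    "<destination-connection-string>",
    container_name="backups",
    blob_name="dataset-part-001.bin",
)

with open("dataset-part-001.bin", "rb") as data:
    # max_concurrency uploads multiple blocks in parallel, which is usually the
    # biggest single lever for throughput on a fast uplink.
    blob.upload_blob(data, overwrite=True, max_concurrency=8)
```

Even so, for 25 TB the Import/Export route described next will usually beat any online transfer.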
The best way to upload large datasets into the cloud is still the sneakernet.
Azure does a thing called the Azure Import/Export Service. Basically, you buy a SATA hard drive, encrypt it with a numerical BitLocker key, copy your data to it, create an Azure import job, then ship the hard drive to them.
This ends up being considerably quicker than trying to upload.
An alternative you might want to look into is the AWS Import/Export Snowball, for which they will ship you an appliance to copy the data to, which you then ship back to them when complete. It might be worth copying the data into AWS via Snowball and then moving it across their much faster internet pipes into Azure, instead of buying the hardware required to transfer that much data.
If you open the target storage account in the Azure Portal, there's now a calculator that will accept basic details (how much data, etc.) and then recommend the best options to you. It's under the heading "Data transfer".

Storing Uploaded Files in Azure Web Sites: File System or Azure Storage

When using Azure Web Sites (WAWS), general opinion seems to be that uploaded content such as photos or files should be stored in Azure Storage blobs and not in the WAWS file system.
Clearly, using Azure Storage is a great idea if you have a lot of data and need scale and redundancy; however, for small or simple sites it seems to add another layer of complexity, and it also means you can't easily use things like ImageResizer without purchasing the Azure-compatible licence, etc.
So, given that products like WordPress from the Azure Gallery use "/site/wwwroot/wp-content/uploads/" to store all uploaded files on WAWS, is there anything wrong with using the WAWS file system for storage, or are there other considerations to take into account when using Azure WAWS?
The major drawback to using the WAWS storage is that your data is now intermingled with the application. By saving all of your plugins/images/blobs externally in a database or blob storage, you retain the flexibility to redeploy your application to a new region/datacenter by just pushing your code to the new website and changing connection strings.
If your plugins/images are stored on disk in WAWS, then you need to make sure that you are backing them up appropriately. If anything happens, you need to restore the site along with all of the data that had been uploaded.
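As an illustration of keeping uploads out of the web app's file system, here is a minimal Python sketch using azure-storage-blob; the connection string and container name are placeholders, and the save_upload helper is hypothetical:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; in practice this would come from an app setting,
# so moving to another region only means pointing it at a different account.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("uploads")

def save_upload(filename: str, data: bytes) -> str:
    """Hypothetical helper: store user content outside the site's file system."""
    blob = container.upload_blob(name=filename, data=data, overwrite=True)
    return blob.url  # serve the file straight from blob storage
```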
Azure Web Sites uses Azure Storage as its file storage, so essentially the level of complexity you're talking about is abstracted away.
Another great benefit of this approach is that if you scale your web site to multiple instances, all of them will work with the exact same file content.
Of course, if you want to use pure Azure Storage features like snapshots or sharing specific content with specific users, that is not available as is. But for web site purposes it is quite good.
Hope that helps

Transfer file from one cloud storage to another

I'm wondering if it's possible to transfer files from one cloud storage to another "on the fly"? Specifically, I want to build an app that will transfer my photos from SkyDrive to Box.net without saving the files temporarily to my databases, instead saving them directly to Box.net storage.
Thanks,
Lojza
I'm not aware of any cloud provider that provides an API for transferring data to a competing cloud provider, but I'm not familiar with SkyDrive or Box.net. Cloud providers have no incentive to help you move your data to the competition. You will almost certainly have to read the data to your local machine and then write it to Box.net.
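To illustrate that "relay through your own machine" approach without writing a temporary file, here is a hedged Python sketch that streams an HTTP download straight into an HTTP upload; both URLs are hypothetical, and the real services would require their own authentication and upload APIs:

```python
import requests

# Hypothetical URLs: a download link from the source service and an upload
# endpoint on the destination service, both normally obtained via their APIs.
source_url = "https://example.com/source/photo.jpg"
dest_url = "https://example.com/destination/upload"

# Stream the download straight into the upload: the bytes pass through this
# machine's memory in chunks but are never written to disk or a database.
with requests.get(source_url, stream=True) as resp:
    resp.raise_for_status()
    upload = requests.post(dest_url, data=resp.iter_content(chunk_size=1 << 20))
    upload.raise_for_status()
```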
You can do it with https://app.mover.io/ and http://otixo.com/
These are built just for this purpose.

Notification about azure blob object changes

Can I somehow subscribe to notifications about changes to Azure blob objects?
My purpose is to delegate file uploads to the client using SAS and later (after the upload is complete) update the database. It looks like I need to continuously check the blob's state, but that is quite a resource-consuming process.
You can't be notified by the Blob Storage about a change made to a blob, but as you point out, you can monitor it, requesting the ETag on a scheduled basis to see if it's done.
That being said, the cost to monitor a blob (or even a whole container) can be close to negligible if correctly implemented. Pinging the Blob Storage once per second costs you roughly $2.5 / month. Then, by using some heuristic you can probably lower this cost to $0.25 (one check per 10 s on average). At this point, it's not really worth trying to optimize further.
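As a rough illustration of that scheduled check, here is a minimal Python sketch using azure-storage-blob that polls the blob's ETag every ten seconds; the connection string, container and blob name are placeholders:

```python
import time
from azure.core.exceptions import ResourceNotFoundError
from azure.storage.blob import BlobClient

# Placeholders: substitute your own connection string, container and blob name.
blob = BlobClient.from_connection_string(
    "<storage-connection-string>", container_name="uploads", blob_name="photo.jpg"
)

last_etag = None
while True:
    try:
        etag = blob.get_blob_properties().etag
    except ResourceNotFoundError:
        etag = None  # the client may not have finished (or started) the upload yet
    if etag != last_etag:
        print("Blob appeared or changed:", etag)  # update the database here
        last_etag = etag
    time.sleep(10)  # one check per ~10 s keeps the transaction cost negligible
```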
You can now do this using Azure Functions:
1. Create a blob trigger by specifying your storage account connection string and your container/{name}.
2. In outputs, select the place where you want your notification to go.
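A minimal sketch of such a blob trigger, using the Azure Functions Python v2 programming model; the container name and the connection setting name are placeholders:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# "uploads" and the connection setting name are placeholders.
@app.blob_trigger(arg_name="blob", path="uploads/{name}", connection="AzureWebJobsStorage")
def on_blob_change(blob: func.InputStream):
    # Fires when a blob is added or updated in the container; from here you
    # could call your API or update the database row for the completed upload.
    logging.info("Blob changed: %s (%d bytes)", blob.name, blob.length)
```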
Another option to consider is to have the client notify you when it's done uploading.
I created a file change monitor for monitoring blobs - full details at http://ben.onfabrik.com/posts/monitoring-files-in-azure-blob-storage