I am using Blob storage as the file store for a .NET Core web application hosted on Azure App Service (PaaS).
My requirement is to create .zip files and then attach them to emails, and the attachment requires a UNC path.
One option I have is to use the App Service's local storage for temporary file creation and use that for the attachment.
Is there another option, such as mapping Blob storage to a virtual drive in the cloud and getting its UNC path, or any other approach?
Also, can you please suggest the possible options for mapping Azure Blob storage as a network drive? I know the following ones: App Service local storage, a VM, and a local machine network drive.
First of all, you need to understand the concept of a UNC path: it refers to a shared location on a machine in a network. An Azure web app can essentially be regarded as a virtual machine, but Azure Blob storage does not behave like a machine exposing file shares, so it is not feasible to attach mail files directly from Azure Blob storage.
Suggestions:
1. From what I have checked, you can try Azure Files to store the files and use them; an Azure file share can be mounted and does expose a UNC path.
2. Download the file to the project directory. You can create a temporary folder, such as MailTempFolder, download the file from the blob into that folder, and then use that local path to send the mail. I think this is the fastest way, since it does not require any other Azure products. After the send succeeds, just delete the file so it does not occupy too much of the web app's space; even if the send fails, you still have the zip file without downloading it again. A sketch of this flow is below.
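Here is a minimal sketch of that flow, written in Python for illustration (the question's app is .NET Core, but the steps map directly); it assumes the azure-storage-blob package, and the connection string, container, blob, addresses, and SMTP host are all placeholders:

import os
import smtplib
from email.message import EmailMessage

from azure.storage.blob import BlobClient

TEMP_DIR = "MailTempFolder"  # the temporary folder from the suggestion
os.makedirs(TEMP_DIR, exist_ok=True)

# 1. Download the zip from Blob storage into the temp folder.
blob = BlobClient.from_connection_string(
    conn_str=os.environ["AZURE_STORAGE_CONNECTION_STRING"],  # assumed env var
    container_name="reports",  # hypothetical container
    blob_name="report.zip",    # hypothetical blob
)
local_path = os.path.join(TEMP_DIR, "report.zip")
with open(local_path, "wb") as f:
    f.write(blob.download_blob().readall())

# 2. Attach the local file to the mail and send it.
msg = EmailMessage()
msg["Subject"] = "Report"
msg["From"] = "app@example.com"
msg["To"] = "user@example.com"
with open(local_path, "rb") as f:
    msg.add_attachment(f.read(), maintype="application",
                       subtype="zip", filename="report.zip")
with smtplib.SMTP("smtp.example.com") as server:  # hypothetical SMTP host
    server.send_message(msg)

# 3. Delete the temp file after a successful send.
os.remove(local_path)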
I am trying to create a Pool using Azure Batch. I have uploaded content to Azure Storage using File Shares.
I would like my Pool to mount this Azure File Share as a virtual file system (ref: https://learn.microsoft.com/en-us/azure/batch/virtual-file-mount#mount-a-virtual-file-system-on-a-pool).
I am creating the AzureFileShareConfiguration object using this code:
mount_configuration = batchmodels.MountConfiguration(
    azure_file_share_configuration=batchmodels.AzureFileShareConfiguration(
        account_name="mystorage",
        azure_file_url="https://mystorage.file.core.windows.net/my-share1",
        account_key="mystorage/key==",
        relative_mount_path="S"
    )
)
Using this, I get "CMDKEY: Credentials added successfully" in fsmounts. But when I RDP to the node in the pool, the S drive appears "Disconnected".
My Azure batch package versions are:
azure-batch==8.0.0
azure-common==1.1.24
Can you please help diagnose the issue or suggest the right usage?
Thanks in advance!
I think this is a Windows VM you are trying, just by looking at the drive letter. :)
The key issue here is that RDP permissions are different from the Batch-level security context under which your code runs and performs the mount.
When Batch mounts the drive and you can see it via your start task, then it is working; that is the Batch-level permission model. When you RDP into the node, you are logged in as a separate user, so the drive mounted by Batch shows as disconnected for that user. If you want to see the share from your RDP user, re-run the mount commands from your RDP login so that user also has the key for the drive, as sketched next.
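For reference, re-running the mount from the RDP session would look something like this; these are the standard commands from the Azure Files on Windows doc linked below, using the account and share names from the question with the key as a placeholder:

cmdkey /add:mystorage.file.core.windows.net /user:AZURE\mystorage /pass:<storage-account-key>
net use S: \\mystorage.file.core.windows.net\my-share1 /persistent:yes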
Having said that, do try it with /persistent:Yes as mount_options.
The best test is this: mount the drive, and from your start task go to the mounted directory (e.g. read S:\\Whatever_file.txt, or just dir it); the result will land in the stdout.txt of the Batch node.
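A sketch of that test, assuming the azure-batch Python SDK from the question; per the docs quoted further below, all mounts live under the Batch mounts directory, and "S" is the relative_mount_path from the question:

start_task = batchmodels.StartTask(
    # List the mounted share so the result shows up in the node's stdout.txt.
    command_line='cmd /c "dir %AZ_BATCH_NODE_MOUNTS_DIR%\\S"',
    wait_for_success=True
)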
The rest is extra detail below.
Try it with the mount_options value below.
Also, this will specifically help with SMB version support and related matters: https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows, and I think you already know this one: https://learn.microsoft.com/en-us/azure/batch/virtual-file-mount#azure-files-share
In order to use an Azure file share outside of the Azure region it is hosted in, such as on-premises or in a different Azure region, the OS must support SMB 3.0.
So add this to your API call and give it a try:
MountOptions = "/persistent:Yes", i.e. mount_options="/persistent:Yes" in the Python SDK.
Also: the key needs to be the storage account key, i.e. it should not start with "mystorage/key" :) but you may just be redacting it, so this is only a mention, FYI.
Sample code:
I think the SDK you have is Python?
mount_configuration = batchmodels.MountConfiguration(
    azure_file_share_configuration=batchmodels.AzureFileShareConfiguration(
        account_name="mystorage",
        azure_file_url="https://mystorage.file.core.windows.net/my-share1",
        account_key="mystorage/key==",
        relative_mount_path="S",
        mount_options="/persistent:Yes"
    )
)
Hope this helps!
relative_mount_path: The relative path on the compute node where the file system will be mounted. All file systems are mounted relative to the Batch mounts directory, accessible via the AZ_BATCH_NODE_MOUNTS_DIR environment variable.
Azure Files is the standard Azure cloud file system offering. To learn more about how to get any of the parameters in the mount configuration code sample, see Use an Azure Files share.
I cannot find a way to copy files/folders from Blob storage to a SharePoint document library. So far, I've tried AzCopy and PowerShell:
* AzCopy cannot connect to SharePoint as the destination.
* PowerShell works for local files, but the script cannot connect to Blob storage (Blob storage cannot be mapped as a network drive).
For anyone else who needs to do this, AzCopy worked. I just had to use a different destination. When you map a SharePoint document library as a network drive, it assigns a drive letter, but it also shows the UNC path; that is what you have to use:
/Dest:"\\Tenant.sharepoint.com@SSL\DavWWWRoot\Sites\sitename\library"
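For completeness, the full AzCopy (v8, Windows) invocation would then look something like this; the source account, container, and key are placeholders:

AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer /SourceKey:<storage-account-key> /Dest:"\\Tenant.sharepoint.com@SSL\DavWWWRoot\Sites\sitename\library" /S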
Let's assume I have a Dropbox Pro account, which gives 1 TB of storage, and the storage is fully occupied with data. If my local machine's storage is less than 1 TB, can anyone please explain the behavior of my local Dropbox folder?
I know that all the data cannot be downloaded to my desktop (the local Dropbox folder) due to the lack of storage. I have the following questions:
1. What will happen if I access a file which is not in the local Dropbox folder? Will it be downloaded?
2. Which files are stored locally, out of all the files in cloud storage?
3. Does Dropbox take file access patterns into account?
Thank you in advance.
You can't access a file which is not in the local Dropbox folder.
No, Dropbox does not look at access patterns; it has an option called "selective sync", where the user determines which folders should be local, out of all the folders. If your local storage is not sufficient, it shows a message stating that.
I need to create a set of folders on a client machine or on some network machine. The idea is that a user will upload some files to the server; instead of storing those files on our server, we plan to save them on the client-specified machine. That machine is also on the network and in the same domain. Can anyone guide me on how to create a folder and also upload the files to that network machine?
Thanks in advance.
For Silverlight you can use isolated storage, since you don't have access to the user's complete hard drive. You can use it like this:
// Create a folder inside the application's isolated storage area.
var myStore = IsolatedStorageFile.GetUserStoreForApplication();
myStore.CreateDirectory("theFolder");
No, I don't want to store it in isolated storage. The idea is that once the user uploads the document, instead of storing the doc on our server, we need to save it on some other server.
I have a link to a file (like so: http://example.com/tmp/database.csv). I want to upload it directly into S3, instead of downloading it to my computer first (and then uploading). Is this possible?
The file will have to move through some application you write. Amazon S3 does not have any mechanism to execute code or pull files itself, so the only way to do this is to send it directly from the server where the file is hosted, or from another server, as sketched below.
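A minimal sketch of such an application, assuming the boto3 and requests packages; the bucket name is a placeholder, and the URL is the one from the question:

import boto3
import requests

url = "http://example.com/tmp/database.csv"
s3 = boto3.client("s3")

# Stream the HTTP response straight into S3 so the whole file never
# has to land on local disk.
with requests.get(url, stream=True) as resp:
    resp.raise_for_status()
    s3.upload_fileobj(resp.raw, "my-bucket", "database.csv")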