Accessing Local Storage in Azure - PDF

I have a website in Azure from which I want to create PDFs from templates. I need somewhere to store the PDF while I'm creating it. After some searching around, I thought the best way to handle this would be through local storage. I added a Windows Azure cloud service project to my web app and then added local storage to the role for my web app. Locally I can now create PDFs from templates and store them in blob storage. However, when I publish the app to Azure it no longer works. I thought I might need to create a cloud service in Azure from my local cloud service project, so I created a package to do this. The cloud service is now running, but I still can't access local storage.
The line:
Dim myReportsStorage As LocalResource = RoleEnvironment.GetLocalResource("myReports")
works locally but fails when I publish the website to Azure. I'm not sure whether I somehow need to link my website and my cloud service in Azure, but I can't see how to do this.
If anyone can help with this I would be very grateful.
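For reference, a rough C# sketch of reading that local resource defensively, so the same code still runs when it is not hosted inside a cloud service role; the "myReports" name comes from the question above, while the temp-folder fallback is purely an assumption for illustration:

using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class ReportStorage
{
    // Returns a writable folder for building PDFs before they are pushed to blob storage.
    public static string GetReportsPath()
    {
        // Only available when the code runs inside a cloud service role.
        if (RoleEnvironment.IsAvailable)
        {
            LocalResource reports = RoleEnvironment.GetLocalResource("myReports");
            return reports.RootPath;
        }

        // Assumed fallback for a plain website deployment: a temp directory.
        string fallback = Path.Combine(Path.GetTempPath(), "myReports");
        Directory.CreateDirectory(fallback);
        return fallback;
    }
}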

Related

Azure Gov Cloud and Azure Functions trigger on Storage

I'm having a hard time with Azure Functions on Azure Government. I need to create a C# trigger-based process on Azure Storage. The goal is to automate loading files into Azure SQL DB when a file is dropped into Azure Storage.
Since Azure Functions in Azure Government are not fully comparable to Azure Functions in regular Azure and not all UIs are the same, I can't deploy the function to trigger on a storage file.
I was able to build the process in regular Azure following the instructions at https://github.com/yorek/AzureFunctionUploadToSQL, but since Azure Government is missing the UI for Azure Functions, I'm having a hard time replicating the process there.
Portal UI support is not yet available in Azure Government, but it is coming soon. Additionally, Azure Government currently supports the "App Service plan" (the "Consumption plan" is coming soon).
In the meantime, you can do everything you need. First, provision your Azure Function in Azure Gov via the Azure CLI by following this Quickstart example for Functions on Azure Gov. That same link also shows you how you can use Visual Studio to set up your triggers (in your case, a Blob trigger).
Once complete, deploy your Function to Azure Gov with Visual Studio.
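For illustration, a minimal C# blob-triggered function of the kind described above might look like the sketch below; the container name "incoming", the connection setting "StorageConnection", and the SQL-loading step are assumptions, not details from the linked sample:

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class LoadFileToSql
{
    // Runs whenever a blob lands in the "incoming" container.
    // "StorageConnection" must be an app setting holding the storage connection string.
    [FunctionName("LoadFileToSql")]
    public static void Run(
        [BlobTrigger("incoming/{name}", Connection = "StorageConnection")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing blob {name} ({blob.Length} bytes)");
        // TODO: load the blob contents into Azure SQL DB here
        // (the linked AzureFunctionUploadToSQL sample shows one approach).
    }
}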

Deploy an R Shiny application connected to an SQL database

How can I deploy my Shiny application, developed with RStudio and connected to an SQL database? I created the MySQL database on a local server (WampServer), and the application connects to it perfectly, but I have to deploy it together with the database.
Consider deploying your SQL database in the same place that you will be deploying your application. Otherwise, you will need to look into options for exposing your database in such a way that your app will be able to access it.
For example, you could deploy both the database and app on an AWS EC2 instance. Alternatively, you could deploy the database to an RDS instance and connect to it remotely with your app on EC2. Those examples are only pertinent to Amazon resources, but the logic applies regardless of your platform.

Is There a Local Emulator for the Azure Data Lake Store

When developing for Azure storage accounts, I can run the Microsoft Storage Emulator to locally keep Blobs, Queues, and Tables without having to connect to Azure online.
Is there something equivalent for the Azure Data Lake Store? It would be nice to develop locally for a while without having to connect to Azure online.
Have you tried Visual Studio with the Azure Data Lake Tools plug-in?
As pointed out by David, you can develop Azure Data Lake Analytics (ADLA) projects locally without needing connectivity to Azure for the ADLA account or the associated Azure Data Lake Store (ADLS) account. Is there some other application you would like to use with ADLS?
Thanks,
Sachin Sheth
Azure Data Lake team
Same problem here.
AFAIK the Storage Emulator is not yet able to really handle Data Lake (ADLS Gen2) requests.
This URI works (but it addresses a file, not a directory):
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=b&sp=rcwl&sig=d2SxwYCkJGyx%2BHac9vntYQZOTt5QVs1bKgKb4%2FgcQ9k%3D
This one doesn't:
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=d&sp=rcwl&sdd=2&sig=KU%2Fcu6W0Nsv8CucMgusubo8RbXWabFO8nDMkFxU1tTw%3D
Error: Status: 403 (Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature.)
ErrorCode: AuthorizationFailure
The difference is that the second one uses the resource 'sr=d' (directory) while the first uses 'sr=b' (blob).
Both requests work against real Azure Storage (with ADLS Gen2).
The issue is already tracked here: https://github.com/Azure/Azurite/issues/553
Tested on VS 2022 17.3.6 using Server: Azurite-Blob/3.18.0
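For anyone who wants to probe this from code rather than raw URIs, here is a minimal sketch that points the Azure.Storage.Files.DataLake client at Azurite's well-known development account (shared key rather than the SAS tokens above); the container and directory names are taken from the example URIs, and how far the calls get still depends on Azurite's level of ADLS Gen2 support:

using System;
using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Files.DataLake;

class Program
{
    static async Task Main()
    {
        // Azurite's well-known development account name and key.
        var credential = new StorageSharedKeyCredential(
            "devstoreaccount1",
            "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");

        var service = new DataLakeServiceClient(
            new Uri("http://127.0.0.1:10000/devstoreaccount1"), credential);

        var directory = service
            .GetFileSystemClient("packages-container")
            .GetDirectoryClient("Dir/SubDir");

        // Directory-level operations are the ones Azurite struggles with,
        // while the same calls succeed against a real ADLS Gen2 account.
        Console.WriteLine((await directory.ExistsAsync()).Value);
    }
}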

Debug Azure application without using Azure Storage Emulator

When debugging in Visual Studio, the Azure Storage Emulator always starts automatically, even though my web role configuration points to real storage account credentials (*.blob.core.windows.net) and not to the storage emulator. Is there any way to debug without using the development storage at all?
The emulator probably starts for your web roles. What application is using storage? You can't debug Azure Storage itself; you can only debug an app that is calling storage.
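For context, which storage the role actually talks to is determined by the connection string it reads, not by whether the emulator process happens to be running. A minimal sketch, assuming a hypothetical service configuration setting named "StorageConnectionString" and the classic storage SDK:

using Microsoft.Azure;                      // CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;       // CloudStorageAccount
using Microsoft.WindowsAzure.Storage.Blob;  // CloudBlobClient

public static class StorageClientFactory
{
    public static CloudBlobClient CreateBlobClient()
    {
        // Value comes from ServiceConfiguration.*.cscfg:
        //   "UseDevelopmentStorage=true"                      -> storage emulator
        //   "DefaultEndpointsProtocol=https;AccountName=...;" -> real storage account
        string connectionString = CloudConfigurationManager.GetSetting("StorageConnectionString");
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        return account.CreateCloudBlobClient();
    }
}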

Tool to migrate Azure storage to local development storage

Are there any good tools to take a snapshot of my Azure tables and blob containers and copy them into local development storage?
Developers sometimes need to work in an isolated environment but would like a copy of some "real" application data. Right now we have data creation scripts that we can run to populate local storage, but it would be helpful to be able to grab a snapshot and move it into development storage.
I generally use Cloud Storage Studio for all handling of Azure Storage. Using that, you can easily download from your live blob storage and then upload to your local storage.
You can also use the Azure Storage Synctool to upload local storage to a live storage blob on Azure, or to download in the other direction.
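If neither tool fits, a small copy job with the storage SDK does much the same thing. A rough sketch, assuming the Azure.Storage.Blobs package, a single source container, and the local emulator as the destination (account, key, and container names are placeholders):

using System.Threading.Tasks;
using Azure.Storage.Blobs;

class Program
{
    static async Task Main()
    {
        // Source: the live storage account; destination: local development storage.
        var source = new BlobContainerClient(
            "DefaultEndpointsProtocol=https;AccountName=<live-account>;AccountKey=<key>",
            "my-container");
        var target = new BlobContainerClient("UseDevelopmentStorage=true", "my-container");
        await target.CreateIfNotExistsAsync();

        // Copy every blob in the container down to the emulator.
        await foreach (var item in source.GetBlobsAsync())
        {
            using var stream = await source.GetBlobClient(item.Name).OpenReadAsync();
            await target.GetBlobClient(item.Name).UploadAsync(stream, overwrite: true);
        }
    }
}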