Cannot create queue/container in Azure Storage Emulator - azure-storage

I have a very simple console project with the following code:
// NOTE: We piggyback on web jobs storage for now
var connString = AmbientConnectionStringProvider.Instance
.GetConnectionString(ConnectionStringNames.Storage);
var storageAccount = CloudStorageAccount.Parse(connString);
var queueClient = storageAccount.CreateCloudQueueClient();
var queue = queueClient.GetQueueReference(InputQueueName);
queue.CreateIfNotExists();
When I run it locally against the Azure Storage Emulator (version 4.3), the CreateIfNotExists line throws a 404 Not Found exception: "The specified queue does not exist."
If I manually create the queue in Visual Studio Cloud Explorer (under the (Development) storage account), this code works.
When I use an actual storage account in Azure, the code works.
The same thing happens with blob containers.
I have deleted and recreated the emulator's LocalDB database; the init command runs without errors.
Any ideas?
EDIT
The connection string that ends up in the connString variable is "UseDevelopmentStorage=true;".
EDIT2
I am using version 7.0.0 of the NuGet package WindowsAzure.Storage.

Related

How to get .Net Core 3.1 Azure WebJob to read the AzureWebJobsStorage connection string from the Connected Services setup?

I'm building a WebJob for Azure to run in an App Service using .Net Core 3.1.
The WebJob will be triggered via Timers (it's basically a cronjob).
Timer triggers require the AzureWebJobsStorage connection string as storage is required for Timer events.
When deployed to Azure App Service, I want the WebJob to read the AzureWebJobsStorage value from the properties on the App Service.
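For reference, the timer function involved is of roughly this shape (a minimal sketch; the class and method names match the Functions.Run listener in the error further down, but the cron expression and body are illustrative):
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class Functions
{
    // Fires on the given cron schedule; the WebJobs SDK persists the timer's
    // schedule state in the storage account behind AzureWebJobsStorage,
    // which is why that connection string must resolve.
    public void Run([TimerTrigger("0 */5 * * * *")] TimerInfo timerInfo, ILogger logger)
    {
        logger.LogInformation($"Timer fired at {DateTime.UtcNow:O}");
    }
}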
I have a Resource Manager template that deploys my infrastructure and sets the connection string on my App Service resource:
"connectionStrings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('_StoreAccountName'), ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('_StoreAccountName')), '2019-04-01').keys[0].value,';EndpointSuffix=core.windows.net')]"
}
],
When testing my WebJob locally, I need to set that AzureWebJobsStorage value so that my local builds can connect to storage.
Since I re-deploy the infrastructure all the time as I make tweaks and changes to it, I do not want to manually maintain the long connection string in my appsettings.json or a local.settings.json file.
In Visual Studio, in theory, I can add a Service Dependency to the project for Azure Storage, and that will store the connection string in my local Secrets.json file. Then, when I redeploy the infrastructure, I can use the Visual Studio UI to edit the connection and re-connect it to the newly deployed storage account (i.e. it will create and update the connection string without me having to do it manually).
When I add Azure Storage as a connected service, Visual Studio adds a line like this in my Secrets.json file:
"ConnectionStrings:<LABEL>": "DefaultEndpointsProtocol=https;AccountName=<LABEL>;AccountKey=_____________;BlobEndpoint=https://<LABEL>.blob.core.windows.net/;TableEndpoint=https://<LABEL>.table.core.windows.net/;QueueEndpoint=https://<LABEL>.queue.core.windows.net/;FileEndpoint=https://<LABEL>.file.core.windows.net/",
and this in my ServiceDependencies/serviceDependencies.local.json:
"storage1": {
"resourceId": "/subscriptions/[parameters('subscriptionId')]/resourceGroups/[parameters('resourceGroupName')]/providers/Microsoft.Storage/storageAccounts/<LABEL>",
"type": "storage.azure",
"connectionId": "<LABEL>",
"secretStore": "LocalSecretsFile"
}
and this in my ServiceDependencies/serviceDependencies.json:
"storage1": {
"type": "storage",
"connectionId": "<LABEL>"
}
Where <LABEL> is the name of the Storage Account (in the JSON snippets above).
When I run the WebJob locally, it loads the appsettings.json, appsettings.Development.json, secrets.json, and Environment Variables into the IConfiguration.
However, it then dies with:
Microsoft.Azure.WebJobs.Host.Listeners.FunctionListenerException: The listener for function 'Functions.Run' was unable to start.
---> System.ArgumentNullException: Value cannot be null. (Parameter 'connectionString')
at Microsoft.Azure.Storage.CloudStorageAccount.Parse(String connectionString)
at Microsoft.Azure.WebJobs.Extensions.Timers.StorageScheduleMonitor.get_TimerStatusDirectory() in C:\azure-webjobs-sdk-extensions\src\WebJobs.Extensions\Extensions\Timers\Scheduling\StorageScheduleMonitor.cs:line 77
at Microsoft.Azure.WebJobs.Extensions.Timers.StorageScheduleMonitor.GetStatusBlobReference(String timerName) in C:\azure-webjobs-sdk-extensions\src\WebJobs.Extensions\Extensions\Timers\Scheduling\StorageScheduleMonitor.cs:line 144
at Microsoft.Azure.WebJobs.Extensions.Timers.StorageScheduleMonitor.GetStatusAsync(String timerName) in C:\azure-webjobs-sdk-extensions\src\WebJobs.Extensions\Extensions\Timers\Scheduling\StorageScheduleMonitor.cs:line 93
at Microsoft.Azure.WebJobs.Extensions.Timers.Listeners.TimerListener.StartAsync(CancellationToken cancellationToken) in C:\azure-webjobs-sdk-extensions\src\WebJobs.Extensions\Extensions\Timers\Listener\TimerListener.cs:line 99
at Microsoft.Azure.WebJobs.Host.Listeners.SingletonListener.StartAsync(CancellationToken cancellationToken) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Singleton\SingletonListener.cs:line 72
at Microsoft.Azure.WebJobs.Host.Listeners.FunctionListener.StartAsync(CancellationToken cancellationToken, Boolean allowRetry) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Listeners\FunctionListener.cs:line 69
I have confirmed that if I add the ConnectionStrings:AzureWebJobsStorage value to my appsettings.json then the program runs fine.
So I know it's an issue with the loading of the AzureWebJobsStorage value.
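For clarity, this is the shape that works when placed in appsettings.json (the account name and key are placeholders, not real values):
{
  "ConnectionStrings": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<LABEL>;AccountKey=...;EndpointSuffix=core.windows.net"
  }
}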
Has anyone figured out how to get an Azure WebJob, running locally, to properly read the connection string that Visual Studio configures when adding the Azure Storage as a Connected Service?
What's the point of adding the Connected Service to the WebJob if it won't read the connection string?
(note: I realize that the WebJobs docs https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-how-to#webjobs-sdk-versions state that "Because version 3.x uses the default .NET Core configuration APIs, there is no API to change connection string names." but it's unclear to me whether that means the underlying WebJobs code also refuses to look at the Connected Services setup or whether I'm just missing something)
I found a work-around, but I don't like it... at the end of my ConfigureAppConfiguration code I check whether a ConnectionStrings:AzureWebJobsStorage value exists and, if not, read the connection string that the Connected Service wrote to secrets.json and copy it into ConnectionStrings:AzureWebJobsStorage.
private const string baseAppSettingsFilename = "appsettings.json";
private const string defaultStorageAccountName = "<LABEL>";
...
IHostBuilder builder = new HostBuilder();
...
builder.ConfigureAppConfiguration(c =>
{
    c.AddJsonFile(
        path: baseAppSettingsFilename.Replace(".json", $".{Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT")}.json"),
        optional: true,
        reloadOnChange: true);
    if (Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == "Development")
    {
        c.AddUserSecrets<Program>();
    }
    // Add Environment Variables even though they are already added, because we want them
    // to take priority over anything set in JSON files
    c.AddEnvironmentVariables();
    IConfiguration config = c.Build();
    if (string.IsNullOrWhiteSpace(config["ConnectionStrings:AzureWebJobsStorage"]))
    {
        string storageConnectionString = config[$"ConnectionStrings:{defaultStorageAccountName}"];
        if (string.IsNullOrWhiteSpace(storageConnectionString))
        {
            throw new ConfigurationErrorsException($"Could not find a ConnectionString for Azure Storage account in ConnectionStrings:AzureWebJobsStorage or ConnectionStrings:{defaultStorageAccountName}");
        }
        c.AddInMemoryCollection(new Dictionary<string, string>() {
            { "ConnectionStrings:AzureWebJobsStorage", storageConnectionString }
        });
    }
});
This seems exceedingly dumb but even looking at the Azure SDK source code I'm thinking it's just hard coded to a single key name and the Service Configuration in Visual Studio is simply not supported: https://github.com/Azure/azure-webjobs-sdk-extensions/blob/afb81d66749eb7bc93ef71c7304abfee8dbed875/src/WebJobs.Extensions/Extensions/Timers/Scheduling/StorageScheduleMonitor.cs#L77
I just ran into a similar problem where VS2019 automatically configured Function and Function1 with Connection = "ConnectionStrings:AzureWebJobsStorage" and it couldn't find that. Simply changing it to Connection = "AzureWebJobsStorage" worked like a charm.
FYI - I also had to change BlobTrigger("Path/{name}"... to BlobTrigger("path/{name}"..., which fixed a Microsoft.Azure.StorageException: The specified resource name contains invalid characters (container names must be lowercase).
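In other words, a binding along these lines ends up working (a sketch; the container name "path" and the method body are illustrative):
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class Functions
{
    // The container name must be lowercase, and the Connection value should not
    // carry the "ConnectionStrings:" prefix.
    public void Run(
        [BlobTrigger("path/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
        string name,
        ILogger logger)
    {
        logger.LogInformation($"Processing blob {name} ({blob.Length} bytes)");
    }
}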

Need PCS_AAD_APPID and more to run Azure IoT storage-adapter microservice locally

I'm trying the Azure IoT Accelerators Remote Monitoring solution and trying to follow the instructions here:
https://learn.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remote-monitoring-create-simulated-device
In it, I need to run the storage adapter microservice locally, and for that to work it seems I need three environment variables whose values I don't know how to find:
PCS_AAD_APPID = { Azure service principal id }
PCS_AAD_APPSECRET = { Azure service principal secret }
PCS_KEYVAULT_NAME = { Name of Key Vault resource that stores settings and configuration }
I can create those environment variables, but I have no idea what values to put in them. Anyone?
FYI, right now when I'm running the storage adapter microservice locally, I get this error:
"{"Name":"StorageAdapter","Status":{"IsHealthy":false,"Message":"Storage check failed"}..."
...which is preceded by a caught exception with this message:
"AuthKey = '((Microsoft.Azure.Documents.Client.DocumentClient)this.client).AuthKey' threw an exception of type 'System.ArgumentNullException'"

NServiceBus endpoint is not starting on Azure Service Fabric local cluster

I have a .NET Core stateless Web API service running inside a Service Fabric local cluster.
When I try to start the NServiceBus endpoint with
return Endpoint.Start(endpointConfiguration).GetAwaiter().GetResult();
I get this exception:
Access to the path 'C:\SfDevCluster\Data_App_Node_0\AppType_App10\App.APIPkg.Code.1.0.0.diagnostics' is denied.
How can this be solved? Visual Studio is running as administrator.
The issue you are having is that the folder you are trying to write to is not meant to be written to by your application.
The package folder is used to store your application binaries and can be recreated dynamically whenever an application is hosted on the node.
Also, the binaries are reused by multiple service instances running on the same node, so different instances could end up competing for the same files.
You should instead instruct your application to write to the work folder:
public Stateless1(StatelessServiceContext context) : base(context)
{
    // Per-node, per-application scratch folder that the service account is allowed to write to
    string workdir = context.CodePackageActivationContext.WorkDirectory;
}
The code above will give you a path like this:
'C:\SfDevCluster\Data_App_Node_0\AppType_App10\App.APIPkg.Code.1.0.0.diagnostics\work'
This folder is dynamic and will change depending on which node and instance your application is running on; once it has been created, your application already has permission to write to it.
For more info, see:
how-do-i-get-files-into-the-work-directory-of-a-stateless-service?forum=AzureServiceFabric
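Applying that advice to the NServiceBus diagnostics file specifically, a minimal sketch looks like this (it assumes NServiceBus 7+, where EndpointConfiguration exposes SetDiagnosticsPath; the endpoint name is illustrative):
using System.Fabric;
using System.IO;
using NServiceBus;

static IEndpointInstance StartEndpoint(StatelessServiceContext context)
{
    // Plus whatever transport/persistence configuration the endpoint already uses.
    var endpointConfiguration = new EndpointConfiguration("App.API");

    // Write startup diagnostics into the Service Fabric work folder instead of the code package folder.
    string diagnosticsDir = Path.Combine(context.CodePackageActivationContext.WorkDirectory, "diagnostics");
    Directory.CreateDirectory(diagnosticsDir);
    endpointConfiguration.SetDiagnosticsPath(diagnosticsDir);

    return Endpoint.Start(endpointConfiguration).GetAwaiter().GetResult();
}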
Alternatively, grant the service account write access on the folder that was denied:
Open the folder's Properties and go to the Security tab
Select ServiceFabricAllowedUsers
Add Write permission

Windows Azure Console for Worker Role Cloud Service

I have a worker role cloud service that I have recently developed on my local machine. The service exposes a WCF interface that receives a file as a byte array, recompiles the file, converts it to the appropriate format, then stores it in Azure Storage. I managed to get everything working using the Azure Compute Emulator on my machine and published the service to Azure and... nothing. Running it on my machine again, it works as expected. When I was working on it on my computer, the Azure Compute Emulator's console output was essential in getting the application running.
Is there similar functionality that can be tapped into on the Cloud Service via RDP, such as starting/restarting the role at the command prompt or in PowerShell? If not, what is the best way to debug/log what the worker role is doing (without using IntelliTrace)? I have diagnostics enabled in the project, but it doesn't seem to give me the same level of detail as the Compute Emulator console. I've rerun the role and the corresponding .NET application on localhost and was unable to find any errors in the console.
Edit: The Next Best Thing
Falling back to manual logging, I implemented a class that would feed text files into my Azure Storage account. Here's the code:
public class EventLogger
{
    public static void Log(string message)
    {
        // Resolve the "errors" container in the storage account configured for the role
        CloudBlobContainer cbc = CloudStorageAccount
            .Parse(RoleEnvironment.GetConfigurationSettingValue("StorageClientAccount"))
            .CreateCloudBlobClient()
            .GetContainerReference("errors");
        cbc.CreateIfNotExist();
        // One blob per logged message, named by role instance id and timestamp
        cbc.GetBlobReference(string.Format("event-{0}-{1}.txt", RoleEnvironment.CurrentRoleInstance.Id, DateTime.UtcNow.Ticks))
            .UploadText(message);
    }
}
Calling EventLogger.Log() will create a new text file and record whatever message you pass in. I found an example in the answer below.
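For example, role code can wrap its work like this (a sketch; ProcessIncomingFile is a hypothetical stand-in for whatever the role actually does):
public static void ProcessAndLog(byte[] fileBytes)
{
    try
    {
        // Hypothetical placeholder for the role's real work
        ProcessIncomingFile(fileBytes);
    }
    catch (Exception ex)
    {
        // Persist the full exception to blob storage so it can be inspected later
        EventLogger.Log(ex.ToString());
        throw;
    }
}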
There is no console for worker roles that I'm aware of. If diagnostics isn't giving you any help, then you need to get a little hacky. Try tracing out messages and errors to blob storage yourself. Steve Marx has a good example of this here http://blog.smarx.com/posts/printf-here-in-the-cloud
As he notes in the article, this is not for production, just to help you find your problem.

Running RavenDB as an EmbeddableDocumentStore and accessing RavenDB Management Studio

I'm playing with embedded RavenDB (the RavenDB-Embedded 1.0.499 package, installed via NuGet in Visual Studio 2010). I'm using it in a project I started after reading this excellent MSDN article:
Embedding RavenDB into an ASP.NET MVC 3 Application
Now I'd like to access the RavenDB Management Studio (Web UI).
I followed the steps described here: Is it possible to connect to an embedded DB with Raven Management Studio, and here: Running RavenDB in embedded mode with HTTP enabled, but I couldn't get it working.
This is the code I'm using to initialize the DocumentStore:
_documentStore = new EmbeddableDocumentStore
{
ConnectionStringName = "RavenDB",
UseEmbeddedHttpServer = true
};
and this is the ConnectionString present in Web.config:
<add name="RavenDB" connectionString="DataDir = ~\App_Data\Database" />
I also read the steps described in RavenDB: Embedded Mode. I tried to start the server manually:
// Start the HTTP server manually
var server = new RavenDbHttpServer(documentStore.Configuration,
documentStore.DocumentDatabase);
server.Start();
but the above code seems outdated, since RavenDbHttpServer, documentStore.Configuration, and documentStore.DocumentDatabase no longer exist. I managed to find Raven.Database.Server.HttpServer, but the other members are missing from _documentStore.
So, the question is:
How can I hit the Web UI to visualize my embedded database docs? What's the URL I should put in my browser address bar?
Any advice is appreciated.
EDIT: I've found a way of getting it to work. As I described in my blog post it may not be the best approach but it does work:
RavenDB Embedded with Management Studio UI
Note: one downside of the above approach is that I can't access the database from my app, because once the server has opened it, it gets locked. So I have to stop the server and then reload my app in the browser.
I hope RavenDB's gurus out there have a better/correct approach... just let us know.
I've never had to run the server manually in order to access the Management Studio. The only steps I usually take that haven't been mentioned in your question are:
// Add the following line prior to calling documentStore.Initialize()
Raven.Database.Server.NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(8080);
Copy Raven.Studio.xap into the root folder of my web project.
When my web application is running, the RavenDB Management Studio is then accessible at http://localhost:8080.
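Put together, the initialization looks roughly like this (a sketch against the 1.0-era embedded API discussed above; 8080 is simply the port used here):
using Raven.Client.Embedded;

// Allow the embedded HTTP server to listen on port 8080 without running as admin.
// This must be called before Initialize().
Raven.Database.Server.NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(8080);

var documentStore = new EmbeddableDocumentStore
{
    ConnectionStringName = "RavenDB",   // "DataDir = ~\App_Data\Database" from Web.config
    UseEmbeddedHttpServer = true        // serves the Studio (Raven.Studio.xap in the web project root)
};
documentStore.Initialize();

// The Management Studio should then be reachable at http://localhost:8080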