I'm following the SAP tutorial Create an application using SAP HANA and the Cloud Application Programming model. Steps 1-4 have been successfully completed - I even get the output "1:44:30 PM (HDB) Build of /APP/db completed successfully." at the end of step 4.
When I right-click on the db folder and choose Open HDI Container, the error below occurs.
The applications running in my space are below.
Info about my space.
Below are the service instances in my dev space.
Am I:
Doing something wrong,
Missing some prerequisite, or
Does this not work in the trial account?
Thanks,
Mike
Assuming your HDI container is indeed there (look under service instances, not applications), it may be because of the region. Change the region of your trial to Europe (Frankfurt): https://blogs.sap.com/2019/04/16/how-to-change-the-region-in-your-cloud-foundry-trial/
Cheers,
Lucia.
Related
I have a pay-as-you-go subscription and I am creating an Azure SQL server.
While adding the server, on selecting a location, I get this error:
This location is not available for subscriptions
Please help.
There's an actual issue on Microsoft's side: they have received too many Azure SQL database creation requests and are currently trying to handle the situation. This seems to affect all types of subscriptions, even paid ones. I have a Visual Studio Enterprise subscription and I get the same error (This location is not available for subscriptions) for all locations.
See following Microsoft forum thread for more information:
https://social.msdn.microsoft.com/Forums/en-US/ac0376cb-2a0e-4dc2-a52c-d986989e6801/ongoing-issue-unable-to-create-sql-database-server?forum=ssdsgetstarted
As the other answer states, this is a (poorly handled) restriction on Azure as of now, and there seems to be no ETA on when it will be lifted.
In the meantime, you can still get a SQL database up and running in Azure if you don't mind doing a bit of extra work and don't want to wait - just set up a Docker instance and run MSSQL on it!
In the Azure Portal, create a container instance. Use the following docker image: https://hub.docker.com/r/microsoft/mssql-server-windows-express/
While creating it, you might have to set the ACCEPT_EULA environment variable to "Y".
After it boots up (10-20 minutes for me), connect to it from the portal with the "sqlcmd" command and set up your login. In my case, I just needed a quick demo DB, so I took the "sa" login, ran "alter login SA with password = '{insert your password}'" and "alter login SA enable". See here for details: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-login-transact-sql?view=sql-server-ver15#examples
And voila, you have a SQL instance on Azure. Although it's unmanaged and poorly monitored, it might be enough as a short-term solution. The IP address of the Docker instance can be found in the Properties section of the container instance blade.
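Before pointing SSMS or sqlcmd at the container, it can save a few minutes to confirm that port 1433 on the container's IP is reachable at all. A minimal Python sketch (the IP in the comment is a placeholder; substitute the address from the container's Properties section):

```python
import socket

def can_reach(host: str, port: int = 1433, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder IP -- use the one shown in the container instance blade):
# print(can_reach("20.0.0.1"))
```

If this returns False, the container likely hasn't finished booting yet, or a firewall is in the way - no point troubleshooting the SQL login before the port answers.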
Maybe you can reference this blog: Azure / SQL Server / This location is not available for subscription. It describes the same error you are seeing.
Run this PowerShell command to check whether the location you chose is available:
Get-AzureRmLocation | select displayname
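If you'd rather script that check, you can save the display names to a text file (one per line) and test for the one you need. A hedged Python sketch - the file name and the exact wording of the display names are assumptions, so adapt them to your actual output:

```python
def location_available(wanted: str, display_names: list[str]) -> bool:
    """Case-insensitive check whether a location's display name is in the list."""
    wanted = wanted.strip().lower()
    return any(name.strip().lower() == wanted for name in display_names)

# e.g. with the Get-AzureRmLocation display names saved line by line:
# names = open("locations.txt").read().splitlines()
# print(location_available("West Europe", names))
```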
If the location is available, the best way to resolve this issue is to contact Azure support and have it enabled for you. You can do this for free using the support page in your Azure Portal.
They will contact you and help you solve it.
Hope this helps.
This is how I solved it myself. Let me tell you the problem first, then the solution.
Problem: I created a brand new free Azure account (comes with $250 free credit) for a client. Then upgraded to pay-as-you-go subscription. I was unable to create Azure SQL db. The error was 'location is not available'.
How I solved: I created another pay-as-you-go subscription in the same account. Guess what - I was able to create SQL db in my new subscription right away. Then I deleted the first subscription from my account. And yes, I lost the free credit.
If your situation is similar to mine, you can try this.
PS: I have 3 clients with their own Azure accounts. I was able to create SQL Db in all of their accounts. I think the problem arises only for free accounts and/or for free accounts that upgraded to pay-as-you-go accounts.
EDIT - 2020/04/22
This is still an ongoing problem as of today, but I was told by Microsoft support that on April 24th a new Azure cluster will become available in Europe, so it might finally become possible to deploy SQL Server instances on free accounts around then.
Deploy a docker container running SQL Server
To complement Filip's answer, and given that the problem still remains with Azure SQL Server, a Docker container running SQL Server is a great alternative. You can set one up very easily by running the following command in the Cloud Shell:
az container create --image microsoft/mssql-server-windows-express --os-type Windows --name <ContainerName> --resource-group <ResourceGroupName> --cpu <NumberOfCPUs> --memory <Memory> --port 1433 --ip-address public --environment-variables ACCEPT_EULA=Y SA_PASSWORD=<Password> MSSQL_PID=Developer --location <SomeLocationNearYou>
<ContainerName> : A container name of your choice
<ResourceGroupName> : The name of a previously created Resource Group
<NumberOfCPUs> : Number of CPUs you want to use
<Memory> : Memory you want to use
<Password> : Your password
<SomeLocationNearYou> : A location near you. For example, westeurope
Access SQL Server
Once the container instance is deployed, you will find an IP address in the Overview. Use that IP address and the password you chose in the az container command to connect to the SQL Server, using either Microsoft's SSMS or the sqlcmd utility.
Some documentation regarding the image I have used can be found here.
More information on the command I have used here.
I have set up a Log Analytics workspace and installed MMA on a few computers with the correct workspace ID and workspace key (heartbeats are logged). The location of the workspace is set to North Europe.
I cannot add data sources, as shown here: https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-windows-events
I have this view in "Advanced settings" for the workspace: (screenshot of the Advanced settings blade in the Azure portal)
Provided that you have the required access as per this document, I believe it (the Advanced settings tile) opens up as expected after waiting a little longer, say ~10 seconds or so!
Hope this helps! Cheers!
I need a PowerShell script to trigger an alert whenever an Azure Application Gateway's backend health turns red. I have 2 subscriptions with around 150+ Application Gateways provisioned between them, so the script should be reusable across all the subscriptions. It would be great if a sample script were available for reference.
Thanks
Suri
You can use this command:
Get-AzureRmApplicationGatewayBackendHealth
https://learn.microsoft.com/en-us/powershell/module/azurerm.network/get-azurermapplicationgatewaybackendhealth?view=azurermps-6.13.0
With this command, you can create a runbook inside an Automation account, and if the output of the command is a specific value, it can trigger an alert. Spending a few minutes on Google shows you how to combine these techniques ;)
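Azure Automation also supports Python runbooks, so the "check and alert" decision logic can be sketched independently of PowerShell. The shape of the health data below is an assumption (a flattened list of per-server states rather than the cmdlet's actual object model), and the gateway names are hypothetical; the actual alerting call is left as a placeholder:

```python
def unhealthy_backends(health):
    """Given a list of {'gateway':..., 'server':..., 'health':...} dicts,
    return the entries whose health state is not 'Healthy'."""
    return [h for h in health if h.get("health") != "Healthy"]

def should_alert(health):
    """True if at least one backend server is not healthy."""
    return len(unhealthy_backends(health)) > 0

# Hypothetical data as you might flatten it from the backend-health
# cmdlet's output across your subscriptions:
sample = [
    {"gateway": "agw-prod-01", "server": "10.0.1.4", "health": "Healthy"},
    {"gateway": "agw-prod-02", "server": "10.0.2.5", "health": "Unhealthy"},
]
# should_alert(sample) -> True; at this point you would call your
# alert webhook / action group with the unhealthy_backends() details.
```

Looping this over all gateways in both subscriptions (switching subscription context between passes) gives you the reusable check the question asks for; only the data-gathering half stays cloud-specific.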
I work with SharePoint. I was given a project where I need to call NetBackup web services and download all the failed backup jobs (Backup Status = failed, or something like it).
All I know is that they (the backup team) gave me a URL: http://netbk004/Operation/opscenter.home.landing.action? I have worked with asmx before, but I have no clue how to consume exceptions from NetBackup. Is there an API that comes with NetBackup that I can use to populate a SharePoint list? Or web services - it doesn't matter, as long as I can download the exceptions to a SharePoint list.
Not sure about the web service, but I know you can access the state of backup jobs by running the bpdbjobs command and parsing its output.
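As an illustration of the "parse the output" approach, here is a hedged Python sketch. It assumes the status code is the fourth whitespace-separated column of each bpdbjobs report line, which varies by NetBackup version and options - verify the column layout against your own output before relying on it (NetBackup status 0 = success, > 0 = failure or partial):

```python
def failed_jobs(report_text, status_column=3):
    """Return (job_id, status) pairs for lines whose status code is > 0.
    Column positions are an assumption -- check your bpdbjobs output."""
    failures = []
    for line in report_text.splitlines():
        fields = line.split()
        if len(fields) <= status_column:
            continue  # header, blank, or short line
        job_id, status = fields[0], fields[status_column]
        if status.isdigit() and int(status) > 0:
            failures.append((job_id, int(status)))
    return failures

# Hypothetical output lines (real layout differs between versions):
sample = """\
12345 Backup Done 0 PolicyA
12346 Backup Done 96 PolicyB
12347 Backup Active - PolicyC
"""
# failed_jobs(sample) -> [('12346', 96)]
```

The resulting pairs could then be pushed into a SharePoint list by whatever mechanism you already use (e.g. the SharePoint REST API).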
Go to the NetBackup Activity Monitor, then filter the "Status" column with ">1".
This will give you all the failed jobs.
I'm working on SQL 2012 Enterprise, and I have a set of SSIS package exports which push data out to text files on a shared network folder. The packages aren't complex, and under most circumstances they work perfectly. The problem I'm facing is that they do not work when scheduled - despite reporting that they have succeeded.
Let me explain the scenarios;
1) When run manually from within BIDS, they work correctly: txt files are created and populated with data.
2) When deployed to the SSISDB and run from the Agent job, they also work as expected - files are created and populated with data.
3) When the Agent job is scheduled to run in the evening, the job runs and reports success. The files are created but the data is not populated.
I've checked the reports on the Integration Services Catalogs and compared the messages line by line from the OnInformation events. Both runs report that the Flat File Destination wrote xxxx rows.
The data is there, the Agent account has the correct access. I cannot fathom why the job works when started manually, but behaves differently when scheduled.
Has anyone seen anything similar? It feels like a very strange bug....
Kind Regards,
James
Make sure that the account you have set up as the proxy for the SSIS task has read/write access to the file.
In my experience, when you run a SQL Agent job manually, it appears to use the context of the user who initiates it in some way. I always assumed it was a side effect of impersonation. It's only when it actually runs on the schedule that everything uses the assigned security rights.
Additionally, I think when the user starts the job, the user is impersonating the proxy, but when the job is run via the schedule, the agent's account is impersonating the proxy. Make sure the service account has the right to impersonate the proxy. Take a look at sp_grant_login_to_proxy and sp_enum_login_for_proxy.
Here's a link that roughly goes through the process:
http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/
I also recall this video being useful:
http://msdn.microsoft.com/en-us/library/dd440761(v=SQL.100).aspx
I had the same problem with Excel files. It was a permission-rights issue.
What worked for me was adding the SERVICE account to the folder's security tab. Then the SQL Agent can access the files.