I am trying to create a SQL Azure db (I've done this dozens of times via the portal), but the process hangs at the Creating status and, half an hour later, it finally quits and deletes the database.
Are there any logs I can look at to see what is going on?
From the Azure portal, click on your username in the top right and select "Switch to New Portal".
On the left hand side of the new portal is a Notification tile. Click the notification tile, which expands a blade on the right hand side. Within that blade is a link which states "See all Audit Logs". Click that link to see the basic logs giving more detailed information regarding processes that you have attempted from the admin portal.
Clicking on each log entry expands into more details on individual components of each action. See imagery below for when I created a SQL Azure Elastic DB Pool.
I have had databases hang in a COPYING state on multiple occasions and each time have had to open a case with Microsoft to effect resolution.
I'm attempting this quickstart and have got to this point, but I've come unstuck.
https://learn.microsoft.com/en-us/azure/synapse-analytics/get-started-analyze-sql-pool
The second step says "2. Go to SQLPOOL1 > Tables."
I can't see a database or dbo.NYCtaxi tables in any tab or menu of the UI.
What I expected
-somewhere in the red rectangle below, to see a database directory structure like the first screenshot in this SO question: Synapse Analytics - Can't see the tables list in the dedicated sql pool
What I have tried
-I'm able to query them using a SQL script connected to SQLPOOL1 [server], SQLPOOL1 [database], but can't browse them.
-refreshing all the pages/workspaces
-checking that the status for this pool is online (it is)
-double-clicking the SQLPOOL1 link in the Manage tab of the UI. I thought it might open a list of tables (it only shows the properties)
-following this SO question: Synapse Analytics - Can't see the tables list in the dedicated sql pool
Found it...I was looking at the wrong tab
In an on-prem SQL Server I have the option to set up scheduled jobs with the SQL Server Agent. This feature is not present in Azure. Is there any way to do this easily in Azure, or will I have to rely on automation scripts / PowerShell scripting for this?
The task I want to accomplish is to export a bunch of SQL views to CSV and send them to a remote FTP server.
In Azure you can achieve this through Logic Apps. Please check the steps below.
Go to the Azure portal ( http://portal.azure.com/ ) and search for Logic Apps.
Click Add, fill in the details (Logic App name, Subscription, Resource group, Location), and click Create.
After refreshing the page, click the created Logic App. On its home page, choose Blank Logic App.
In the Logic Apps Designer page, search for Schedule or Recurrence and click it.
Fill in the Interval, Frequency, Time zone, Start time (the format matters – e.g. 2018-10-16T21:00:00Z), At these hours, and At these minutes, then check the preview.
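The Start time value has to be an ISO 8601 UTC timestamp. As a quick sketch (Python here purely to make the format concrete; the portal just wants the literal string), this is how that value can be produced:

```python
from datetime import datetime, timezone

def logic_app_start_time(dt: datetime) -> str:
    """Format an aware datetime as the ISO 8601 UTC string Logic Apps expects."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# The example value from the step above:
print(logic_app_start_time(datetime(2018, 10, 16, 21, 0, 0, tzinfo=timezone.utc)))
# 2018-10-16T21:00:00Z
```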
Choose an action, search for SQL Server, and click Execute Stored Procedure in the list.
Click Add new connection if you want to create a new connection, then click Manually enter connection information.
Otherwise, use one of the existing connections. Fill in the Procedure name with the required stored procedure; the input parameters of the selected SP appear below it.
Choose an action, search for Create CSV Table, and click it. Fill in From (choose the dynamic result set from the right-hand side; choose the first result set alone), Include Headers (Yes), and Columns (Automatic).
Choose an action, search for Office 365 Outlook, then for Send an Email, and click it. Before proceeding, please check the mail id at the bottom and change it to yours. Fill in To (zzzzzz#xxxxxx.com;zzzzz#xxxxx.com), Subject (Demo mail), Body (Please check the test attachment), From & CC & BCC (the email ids you want to send to), Importance (Normal), Is HTML (No), Attachment Name (choose Expression on the right-hand side and type concat('Test_mail',utcnow('dd-MM-yyyy'),'.csv')), and Attachment Content (choose the output from the right-hand side).
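The attachment-name expression uses the Logic Apps workflow expression language. A Python equivalent of concat('Test_mail',utcnow('dd-MM-yyyy'),'.csv'), shown only to make the resulting file name concrete:

```python
from datetime import datetime, timezone

def attachment_name(now: datetime) -> str:
    # Mirrors concat('Test_mail', utcnow('dd-MM-yyyy'), '.csv');
    # 'dd-MM-yyyy' is day-month-year, i.e. %d-%m-%Y in strftime terms.
    return "Test_mail" + now.strftime("%d-%m-%Y") + ".csv"

print(attachment_name(datetime(2018, 10, 16, tzinfo=timezone.utc)))
# Test_mail16-10-2018.csv
```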
Finally, click Save at the top left. Click Designer to edit; after editing completes, save again. Click Run for a demo run.
Click Run to initiate the trigger; only then will the automated mails arrive at the configured intervals.
Select the required Logic App and click Delete to delete it (it will ask for the Logic App name to confirm deletion).
You need sufficient access to make the above changes, and you must be signed in to the Azure portal.
One option is to use Azure Data Factory and create a copy activity that use Azure SQL Database as a source and SFTP as a sink. Use copy activity to copy data from any supported data store to your SFTP server located on-premises or in the cloud. You can schedule execution on Azure Data Factory as shown here.
Another option is using Azure Logic Apps with the Azure SQL Database connector and FTP connector to access/manage SQL Database and FTP server. You can create, schedule, and run recurring tasks with Azure Logic Apps as shown here.
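Whichever scheduler you pick, the per-view work is the same: run a SELECT against the view, serialize the rows to CSV, and push the file to the FTP server. A minimal Python sketch of that pipeline; the CSV step is real and runnable, while the database fetch and FTP upload are only indicated in comments with placeholder names (your_view, ftp.example.com), since those depend on your environment:

```python
import csv
import io

def rows_to_csv(headers, rows):
    """Serialize a header row plus data rows into CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(headers)
    writer.writerows(rows)
    return buf.getvalue()

# In a real job the rows would come from the view, e.g. with pyodbc:
#   cur.execute("SELECT * FROM dbo.your_view")
#   headers = [c[0] for c in cur.description]; rows = cur.fetchall()
# and the upload would use ftplib, e.g.:
#   ftplib.FTP("ftp.example.com", user, password).storbinary("STOR view.csv", file_obj)
data = rows_to_csv(["ItemNumber", "Qty", "Price"],
                   [["A100", 3, 9.99], ["B200", 1, 24.50]])
print(data)
```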
I have tried to delete a SQL database from the Azure portal. It looks like it has failed partway through. The database doesn't show up under the list of SQL servers in the Azure portal. However, if I log in to the server through SSMS, it is still there. I now can't delete the database or create a new one with that name.
I've tried deleting the database with a query and get an error saying the database doesn't exist. If I try to create it, either from the Azure portal or SSMS, I get an error saying it already exists.
I had a similar problem once, with SSL settings, where Azure would report that the resource was linked to the app even though it wasn't, hence I was not able to delete it. After a couple of weeks of back and forth with support, we removed it through Azure Resource Explorer.
How to:
Once you are logged in, set read/write
In the search box, find your resource
Click actions (POST/DELETE) // these should be available now since you have set read/write
Click Delete
Hopefully, this would help anyone who has any corrupted resources in Azure.
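For reference, what Azure Resource Explorer does under the hood is issue a DELETE against the ARM REST endpoint for the resource. A sketch that builds that URL for an Azure SQL database; every identifier here is a placeholder, and the api-version is an assumption, so check the current one before relying on it:

```python
def sql_db_delete_url(subscription_id, resource_group, server, database,
                      api_version="2021-11-01"):
    """Build the ARM REST URL used for a DELETE on an Azure SQL database.

    All arguments are placeholders; api_version is an assumed value.
    """
    return ("https://management.azure.com"
            f"/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Sql/servers/{server}"
            f"/databases/{database}"
            f"?api-version={api_version}")

# You would then send the request with a bearer token, e.g.:
#   requests.delete(url, headers={"Authorization": f"Bearer {token}"})
print(sql_db_delete_url("<sub-id>", "my-rg", "myserver", "stuckdb"))
```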
Through Infor XA Power-Link, I have access to a sales history view with things like quantity, price, item number, etc.
Using System i Navigator, I would like to be able to run a SQL script that produces a similar table, but customized to my liking.
However, I'm having trouble locating the tables involved in producing the Power-Link view. Is there a way to check the source using Power-Link so that I can get the table names?
Sure is! You need to use the SQL Monitor. Start the business object and go to Help, then About Infor. The About screen will appear. Hit Ctrl-D to start the SQL Monitor. You should see 3 columns: Start, Duration and SQL. Refresh the view and the SQL Monitor will populate. Right-click and copy the entry that has the 'SQL SW executeQuery' entry and paste it into your favorite SQL editor. This will be the exact SQL SELECT statement used by Power-Link. Modify it to your needs.
Let me know if you need anything else.
Matt
Note: You can prevent users from accessing any of the diagnostic tools, which include the SQL Monitor. This is done via security task DSPDGNINF, located under the CAS application > Client System Preferences security area.
Load menu AMZM38 - CAS Security Maintenance
Select Option 1 Area and task authorizations
Select CAS as the application
Use Option 2 Change on Client System Preferences security area
Use Option 11 Authorize users to configure who should have access and who should not (revoke)
F12 to go back one screen
Lock task DSPDGNINF
Please note, security is not active until the task is locked. Also, to test, one must use a new Power-Link session as security is cached.
I am new to SSRS and have deployed several reports. When I force windows authentication, everything works fine, but when I do this:
I am getting this error:
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset 'DataSet1'.
(rsErrorExecutingCommand) Login failed for user 'DWH_Reporting_User'.
I've configured the DWH_Reporting_User like this:
here are the details on DWH_Reporting_User for the ReportServer database:
here is the security on the server:
when i try to set security for that specific folder:
i am getting this error msg:
is there something obviously wrong with the way I've configured things?
The answer to this question was a series of comments. I went ahead and put it in the chat as well as in this answer.
Go to the security section of the server, not the database, and map the user to that database.
Can you recycle the application pool on the server where reporting services is running? Or reboot the IIS server if possible.
Another thing that comes to mind: when you launch this report, it goes into the report server "Portal". Does this user have access to the actual report to view it? Click the details section of this "Portal" and assign this user the Content Manager role.
DWH_REPORTING_USER is this an admin account on your domain?
Local admin on the server; so did you grant COMPUTER_NAME\DWH_Reporting_User rights as a Content Manager in the roles section? Why don't you use a domain admin account?
So how do you access the actual report - that is your issue.
Do me a favor aprem: write up a stored procedure or SQL query in the first tab of Reporting Services and run it using that user. Meaning, in the shared data source it should be using this user, DWH_Reporting_User. Test the connection, then write a small SQL statement to retrieve some data. Run the SQL script from within RS; do you see any data?
#Aprem look at the three tabs at the top of RS; it's the first tab, used to define datasets. This is where you can create a SQL query (and run it using the red exclamation mark). – JonH
I defined a new dataset as "select top 1000 * from mytable", rebuilt the project, deployed it, and now what do I do? – Артём Царионов
In the "Shared Datasets" you have a dataset, right? Double-click it and go to "Query Designer" (a button on that form). Click it and "Execute" the query (red exclamation mark). You don't need to deploy it right now; just do it on a test machine.
You specify the user in the data source section, "Shared Data Sources"; that account is the account being used to "pull" the data. You really need to experiment with RS some more, or read some material on it. This is as basic as it gets.
Ok aprem, do you understand your issue now? The user you are using to get the data has no issues; in fact, it is functioning correctly. Now you have to view the report. To view the report is to view the webpage, and to view a web page you need either "Anon" access or Windows authentication. So you need a domain or local Windows account to view the report. This account needs to be set up on the RS portal with the Content Manager role.
You are dealing with two beasts, one is the database (db server) and one is IIS (web server) each serve a specific purpose. The database serves to allow you to pull data while the web server hosts the pages.
The reportserver database is very important: it keeps a listing of all your reports and the metadata associated with them. It also stores job ids and subscriptions associated with your reports. Basically, it is the backend database for all Reporting Services artifacts such as reports, datasets, and data sources. Think of it this way: when you create a new "Report Project" you are allowed to create reports, datasets, and data sources. Without the reportserver database, how would the system remember them? It is the heart of RS.
Your dataset is not used to connect to the Reporting Services database; it is used to connect to the database you are getting the data from to display on your report.
Does that user exist on the database the report is accessing?