Azure Machine Learning: Write output to Azure SQL Database

I am using Azure Machine Learning to cluster data.
The input data is from an Azure SQL Database, and it works fine.
At the end of everything I want to write the output to a table in the same Azure SQL Database, but I get this error:
Error: Error 1000: AFx Library library exception:
Sql encountered an error: Login failed for user
Does anyone have any idea?
Thank you very much!

Please follow the instructions and examine the examples provided here to properly use the Export Data module to save your experiment's output to Azure SQL Database.
How to Export Data to an Azure SQL Database
Add the Export Data module to your experiment. You can find this module in the Data Input and Output group in the experiment items list in Azure Machine Learning Studio.
Connect it to the module that produces the data that you want to export to Azure SQL DB.
For Data destination, select Azure SQL Database. This option supports Azure SQL Data Warehouse as well.
Set the following options specific to Azure SQL Database or Azure SQL Data Warehouse.
Database server name
Type the server name that is generated by Azure. Typically it has the form <generated_identifier>.database.windows.net.
Database name
Type the name of a database on the server you just specified. The database must already exist; Export Data cannot create it.
Server user account name
Type the user name of an account that has access permissions for the database.
Server user account password
Provide the password for the specified user account.
Comma-separated list of columns to be saved
Type the names of the columns in the experiment that you want to write to the database.
Data table name
Type the name of the table where data will be stored.
For Azure SQL Database, if the table does not exist, it will be created. For Azure SQL Data Warehouse, the table must already exist and have the correct schema, so be sure to create it in advance (see the sketch after these steps).
Comma-separated list of datatable columns
Type the names of the columns as you wish them to appear in the destination table. The columns should correspond in order with the column names that you list in Comma-separated list of columns to be saved.
If you are writing to Azure SQL Data Warehouse, the column names must match those already in the destination table schema.
Number of rows written per SQL Azure operation
Indicate how many rows should be written to the destination table in each batch. By default, the value is set to 50, which is the default batch size for Azure SQL Database. However, you should increase this value if you have a large number of rows to write.
TIP:
For Azure SQL Data Warehouse, we recommend that you set this value to 1. If you use a larger batch size, the size of the command string that is sent to Azure SQL Data Warehouse can exceed the allowed string length, causing an error.
If you don't want to write new results each time you run the experiment, select the Use cached results option. If there are no other changes to module parameters, the experiment will write the data the first time the module is run, and thereafter not perform writes.
However, a write will always be performed if any parameters have been changed in Export Data that would change the results.
Run the experiment.
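For the Azure SQL Data Warehouse case, pre-creating the destination table just means running an ordinary CREATE TABLE against the warehouse before the experiment. A minimal sketch, with purely hypothetical table and column names for a clustering result:

CREATE TABLE dbo.ClusterResults
(
    CustomerId        INT,
    ClusterAssignment INT
);

The column names and their order should then line up with what you enter in Comma-separated list of datatable columns.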

Found the issue!
I needed to create a specific user with this SQL code:
CREATE USER AMLApplicationUser WITH PASSWORD = '************';
and then add the user to these roles on the database I want to write to:
ALTER ROLE db_datareader ADD MEMBER AMLApplicationUser;
ALTER ROLE db_datawriter ADD MEMBER AMLApplicationUser;
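To double-check that the role memberships took effect, a query along these lines (assuming the AMLApplicationUser name above) lists the database roles the user belongs to:

SELECT r.name AS role_name
FROM sys.database_role_members rm
JOIN sys.database_principals r ON rm.role_principal_id = r.principal_id
JOIN sys.database_principals m ON rm.member_principal_id = m.principal_id
WHERE m.name = 'AMLApplicationUser';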
I guess the db_datawriter role alone would be enough for writing, but I needed db_datareader too.
So, in conclusion, it seems the database admin account can be used to read data from AML, but not to write data.
Thank you for your help!

Related

How to Parameterize Copy Activity to SQL DB with Azure Data Factory

I'm trying to automatically update tables in Azure SQL Database from another SQLDB with Azure Data Factory. At the moment, the only way to update a table in Azure SQL Database is to physically select the table I want to update in Azure SQL Database, as shown here:
My configuration to automatically select a table from the SQLDB that I want to copy to Azure SQL Database is as follows:
The parameters are as follows:
@concat('SELECT * FROM ', pipeline().parameters.Domain, '.', pipeline().parameters.TableName)
Can someone let me know how to configure my SINK and/or connection to automatically insert the table selected from SOURCE.
My SINK looks like the following:
And my connection looks like the following:
You can use the Edit option in the SQL dataset.
Create a dataset parameter for the sink table name. In the SQL sink dataset, check the Edit checkbox and use the dataset parameter for the table name. If you want, you can use a dataset parameter for the schema name as well; here I have given it directly (dbo).
Now, in the copy activity sink, you can give the table name dynamically from any pipeline parameter (your parameter, in this case) or any variable using dynamic content.
Also, enable Auto create table, which will create a new table if a table with the given name does not exist; if it already exists, it skips creation and copies the data into it.
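For example, assuming the dataset parameter is named SinkTableName (a name chosen here purely for illustration), the dataset's table field and the copy activity sink would use expressions along these lines:

@dataset().SinkTableName
@pipeline().parameters.TableName

The first expression goes in the sink dataset's table name field; the second is the value passed to that dataset parameter from the copy activity sink.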
My sample result:

Export Data from Azure SQL Server SP or table to Excel using Azure Resources

I have some data returned from a stored procedure or table in Azure SQL Server.
The client has a formatted Excel file, and I need to export the data from the SP into that formatted Excel file.
The Excel file has predefined columns with different names than the DB column names.
Is it possible, using Azure resources like Azure Data Factory or Logic Apps, to export data from SQL Server and append it to the Excel file?
This can be achieved through the Azure Data Factory Data Flow activity.
The source here is the SQL Database, where the column names come from the table:
Connect the sink (target), and in the Mapping tab uncheck Auto Mapping and add the output column names required at the target:

Easier way of Doing SQL Data Export - Azure

I have two tables, Table A and B, that are present in an Azure SQL Database. I have a clone of the same database running locally, and I want to populate it with the data that is present in Azure by using the SSMS Export Data option. While using that option I specify the source and destination and then choose the option "Write a query to specify the data to transfer".
And then I add the query "Select * from A where Condition1" and select the destination table here:
The issue is that if I have 5 tables to export data from, I have to do this whole process 5 times; the only differences are the queries and destination tables. Does anyone have any idea how I can do this whole thing faster by some other means? I just need to copy data using some select statements with where clauses.
As per the Official Documentation
When you select Write a query to specify the data to transfer, you can only copy the results of one query to one destination table.
So, you have to repeat the entire process multiple times if you want to export data like that.
You can use the following ways for importing and exporting data:
Use Transact-SQL statements.
Use BCP (Bulk Copy Program) from the command prompt.
If you want to design a custom data import, you can use SQL Server Integration Services.
Use Azure Data Factory.
Use a BACPAC file. Refer to this material by Accu Web Hosting to learn about it. Rather than filtering with a query before exporting the data, you can instead delete the unwanted data in the destination database after exporting, using a delete statement (see the sketch below).
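For that last option, the clean-up is just an ordinary DELETE run against the destination after the import. A minimal sketch, using a hypothetical Status column standing in for the Condition1 filter from the question:

-- Remove the rows the original WHERE clause would have excluded
DELETE FROM dbo.A
WHERE Status <> 'Active';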
REFERENCE:
Import & export data from SQL Server & Azure SQL Database - SQL Server | Microsoft Docs

Move data between two Azure SQL databases without using elastic query

I need suggestions for moving data from a particular table in one Azure SQL database to another Azure SQL database that has the same table structure, without using elastic query.
Using SQL Server Management Studio, connect to the Azure SQL database, right-click the source database, and select Generate Scripts.
During the wizard, after you have selected the tables that you want to output to a query window, click Advanced. About halfway down the properties window there is an option for "Types of data to script". Select that and change it to "Data only", then finish the wizard.
Then check the script, rearrange the inserts to satisfy constraints, and change the USE statement at the top so it runs against the target DB.
Then right click on the target database and select new query, copy the script into it, and run it.
This will migrate the data.
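For reference, the data-only script the wizard produces is essentially a list of INSERT statements, roughly of this shape (the table and column names below are only placeholders):

INSERT [dbo].[TableA] ([Id], [Name]) VALUES (1, N'First row')
INSERT [dbo].[TableA] ([Id], [Name]) VALUES (2, N'Second row')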
Please consider using the Transfer SQL Server Objects task in SSIS. You can learn about all the advantages it provides in this article.
You can use PowerShell to query each database and move data between them as needed. Here's an example article on how to get this done.
Using PowerShell when working with Azure has a number of other benefits in what you can do and can control as well. It's a good choice to spend time learning.
In the source database I created SPs to select the data from the tables.
In the target database I created table types (available under Programmability) for the tables, with the same structure as in the source.
I used an Azure Function to move the data from the source into the table types.
In the target database I created SPs to insert data into the tables from their respective table types.
After ensuring the data had been transferred, I deleted from the source database the records that had been moved to the target, and for this I also created SPs.
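A rough illustration of the table type and insert SP part of this approach, with hypothetical object names:

CREATE TYPE dbo.CustomerTableType AS TABLE
(
    CustomerId   INT,
    CustomerName NVARCHAR(100)
);
GO

CREATE PROCEDURE dbo.usp_InsertCustomers
    @Customers dbo.CustomerTableType READONLY
AS
BEGIN
    INSERT INTO dbo.Customer (CustomerId, CustomerName)
    SELECT CustomerId, CustomerName
    FROM @Customers;
END;

The Azure Function reads rows through the source SP and passes them to this procedure as a table-valued parameter.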

SSRS Report Data Source for Query with Multiple Databases

I have a dataset that pulls from multiple databases on the same server. Historically (without doing research) in this case I would set the data source to ReportServer (the database that houses the execution log for the server, etc.) and noticed the dataset doesn't seem to care what the data source is.
I did a few hours of digging and couldn't find an answer. When using (or in my case, unioning) multiple databases in a dataset, what should the dataset's data source be in Visual Studio?
Specifying the database in the connection string sets the starting, default database for the query. If your permissions are adequate, then there is nothing to stop you from accessing other databases.
The database in the connection string will give your query the context that is used when you don't specify a database name as part of a table name. If your query is simply:
SELECT * FROM vw_Interactions
then this will run against the database specified in your connection string.
For your case, when using a table with the same name across multiple databases, the default database doesn't matter much, as long as the data access account has permissions that let the query work.
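For illustration, a union of that kind would typically use three-part names, something like the following (the database names here are made up):

SELECT * FROM DatabaseA.dbo.vw_Interactions
UNION ALL
SELECT * FROM DatabaseB.dbo.vw_Interactions

Either database then works as the default in the connection string, as long as the data source account can read both.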