SSAS Tabular - Power Query Editor stuck at "Operation in progress" - "identifying schemas"

In Visual Studio, with an SSAS Tabular model open in the Power Query Editor window, when I make a change to a large partition (1 million+ rows) that sources its data from an Azure SQL Database, editing and previewing happen quickly in the Power Query Editor itself. However, when I click "Close & Update" or "Close & Update Without Processing", this message appears for a very long time: "Operation in progress" - "identifying schemas".
At the same time, Task Manager shows Visual Studio downloading at several Mbps the entire time, so I am assuming that Visual Studio is attempting to download the full contents of the table.
Is there a way to prevent this behavior? I was thinking that "Close & Update Without Processing" would prevent this behavior but it does not.
My current workaround is:
Rename the Azure SQL Database table.
Create a new empty Azure SQL Database table with the same name and fields as the original table.
Perform the "Close & Update" letting it use this empty table as a source so it completes instantly.
Delete the new empty Azure SQL Database table.
Rename the Azure SQL Database table back to what it was previously.
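The workaround above can be sketched in T-SQL. All object names here are hypothetical placeholders; note that SELECT ... INTO with TOP (0) copies column names and types but not constraints, indexes, or identity seeds:

```sql
-- 1. Rename the real table out of the way.
EXEC sp_rename 'dbo.FactSales', 'FactSales_orig';

-- 2. Create an empty table with the same name and columns.
SELECT TOP (0) * INTO dbo.FactSales FROM dbo.FactSales_orig;

-- 3. Perform "Close & Update" in Visual Studio; it completes quickly
--    because the schema check reads from the empty table.

-- 4. Drop the empty table and restore the original name.
DROP TABLE dbo.FactSales;
EXEC sp_rename 'dbo.FactSales_orig', 'FactSales';
```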

The workaround that I have found is to use a parameter to limit the number of rows used during development. An example of how to set this up is here:
https://blog.crossjoin.co.uk/2018/02/26/filtering-data-loaded-into-a-workspace-database-in-analysis-services-tabular-2017-and-azure-analysis-services/
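The linked post implements the limit with an M parameter; the same idea expressed on the SQL side of the partition's source query might look like the following sketch (the limit value is an assumption; the table name is taken from the XMLA excerpt further down):

```sql
-- Hypothetical development-time version of the partition's source query:
-- cap the number of rows pulled while developing, and remove the TOP
-- clause before processing the full model.
DECLARE @DevRowLimit int = 100000;

SELECT TOP (@DevRowLimit) *
FROM dbo.factLinksFull;
```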

Related

How to avoid running query twice - ODBC Excel (new query or editing existing query)

I have an ODBC connection to an AWS MySQL database instance. It's extremely frustrating that the Excel UI appears to oblige me to run the query twice.
First, I have to run the query like this:
After this runs and returns a limited number of rows (second image below), I have to run it again to load the data into Excel.
My question is, is there any possible way to skip step 1 or step 2, so that I can input my query and have it load directly into the workbook?
I'm not understanding the problem. You are configuring a query connection. The first execution returns a preview and the "Transform data" option (if you want to further tailor the query). The second execution loads it. From that point on the query is set up; it only needs to be configured once.
To get new/changed data, you just need to do a "Refresh All" or configure it to automatically Refresh Data when the Excel Workbook is opened.
If you are adding a query to many workbooks, you could set one up and then code a query-substitution script.

Move data between two Azure SQL databases without using elastic query

I need suggestions for moving data from a particular table in one Azure SQL database to another Azure SQL database that has the same table structure, without using elastic query.
Use SQL Server Management Studio to connect to the Azure SQL database, right-click the source database, and select Generate Scripts.
In the wizard, after selecting the tables you want to output to a query window, click Advanced. About halfway down the properties window there is an option for "Types of data to script". Change it to "Data only", then finish the wizard.
Then check the script, rearrange the INSERT statements to satisfy any constraints, and change the USE statement at the top so it runs against the target database.
Then right-click the target database, select New Query, paste the script into it, and run it.
This will migrate the data.
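For illustration, a "Data only" script generated this way looks roughly like the following; the database, table, and values here are made up, and SSMS typically emits one INSERT per row:

```sql
USE TargetDb;  -- change this line to run the script against the target database
GO
SET IDENTITY_INSERT dbo.Customers ON;
INSERT INTO dbo.Customers (CustomerID, Name, Country)
VALUES (1, N'Contoso', N'US');
INSERT INTO dbo.Customers (CustomerID, Name, Country)
VALUES (2, N'Fabrikam', N'DE');
SET IDENTITY_INSERT dbo.Customers OFF;
GO
```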
Please consider using the "Transfer SQL Server Objects task" in SSIS. You can learn about all the advantages it provides in this article.
You can use PowerShell to query each database and move data between them as needed. Here's an example article on how to get this done.
Using PowerShell when working with Azure has a number of other benefits in what you can do and can control as well. It's a good choice to spend time learning.
In the source database I created SPs to select the data from the tables.
In the target database I created table types (which would be available in programmability) for the tables with the same structure as in the source.
I used Azure function to move the data into table type from source.
In the target database I created SPs to insert data into the tables from their respective table types.
After ensuring the data has transferred, I delete the moved records from the source database; for this I also created SPs.
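A minimal sketch of the target-side objects described above; all names and columns are hypothetical, and the table type must mirror the source table's structure:

```sql
-- Table type with the same structure as the source table.
CREATE TYPE dbo.CustomerRows AS TABLE
(
    CustomerID int           NOT NULL,
    Name       nvarchar(100) NOT NULL
);
GO

-- Insert procedure taking the table type as a table-valued parameter;
-- the Azure Function passes the source rows in through @rows.
CREATE PROCEDURE dbo.usp_InsertCustomers
    @rows dbo.CustomerRows READONLY
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.Customers (CustomerID, Name)
    SELECT CustomerID, Name
    FROM @rows;
END;
GO
```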

Excel 2016- Load power query directly into power pivot

Is there a way to load power query data directly into power pivot without creating an excel table as an intermediary step?
All the examples I've found reference Excel 2010 and 2013. Although the instructions are similar, it does not work in 2016.
Following the steps I found, when I go to "Existing Connections" in Power Pivot and try to open the Power Query connection, I get this message:
"The connection you’re trying to open was created with Power Query. To change this connection, use Power Query."
Is it possible to clean/transform data using power query and load it directly to power pivot in excel 2016?
I would go to the Data Ribbon and choose Show Queries. Depending on your Office/Excel update schedule, this may be changed to Queries & Connections. Either way, you are trying to open the Workbook Queries pane (appears on the right).
Within the Workbook Queries pane, right-click each Query and choose Load To. Ensure the first option is set to Only Create Connection, and that the Add this data to the Data Model option is checked.
With those options set, Load performance should be a lot faster, and you can exceed Excel's million row per table limit.
On the Excel Workbook Ribbon:
Option 1:
Go to: Power Pivot ↦ Add To Data Model
Option 2:
Go to Data ↦ Queries & Connections
right-click over the query you want to add to PowerPivot ↦ Edit
On the Power Query Editor: File ↦ Options & Settings ↦ Query Options ↦ Check Load to Data Model.
Finally: File ↦ Close & Load

How to transfer data from Tabular model (SSAS Tabular) to Relational Database?

I am new to SSAS and Tabular model cubes. I have a data migration task that aims to import data from a Tabular model into a SQL Server database.
I have tried SSIS without success. Using an OLE DB data source connected to the SSAS instance with the Tabular database, I configured the data source to execute the DAX query EVALUATE('Tabular Table Name'). It returns successfully, but only a partial set of records (around 1000). So I reconfigured the data source to use OPENROWSET and selected the tabular table I want to query, but I always get an error related to column mapping, even though the column mapping is correct.
Okay, I suggest you try this way (I tried it with our PowerPivot databases, which is pretty similar to SSAS Tabular):
Run SSMS and connect to SSAS Tabular instance
Right-click the database you want to dump into SQL Table and select Script -> Create to -> New Query Window. Be patient and wait until SSMS generates XMLA script for you
Find SQL queries that were used to fill SSAS tabular tables by searching XMLA script (keyword = QueryDefinition)
This is what I got for our database:
<Partitions>
<Partition>
<ID>factLinksSeller_484c3291-2123-4391-8627-fd4b584d1726</ID>
<Name>factLinksSeller</Name>
<Annotations>
<Annotation>
<Name>IsQueryEditorUsed</Name>
<Value>True</Value>
</Annotation>
<Annotation>
<Name>QueryEditorSerialization</Name>
</Annotation>
<Annotation>
<Name>TableWidgetSerialization</Name>
</Annotation>
</Annotations>
<Source xsi:type="QueryBinding">
<DataSourceID>15719e99-95fb-44c1-8399-18a769ae1be4</DataSourceID>
<QueryDefinition>select
*
from
dbo.factLinksFull X
where X.signaturePersonID=16
</QueryDefinition>
Now you can use these queries to load data into your SQL Server DB via SSIS.
Even if you filled the Excel files with an external tool, after importing them into SSAS the data has to be stored somewhere, so you can check the XMLA script and decide what to do next.
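As a side note (this is a different technique from the answer above): if a linked server pointing at the SSAS Tabular instance is available, the table can also be pulled directly in T-SQL with OPENQUERY; the linked server and staging table names here are hypothetical:

```sql
-- SSAS_TAB is a hypothetical linked server configured against the
-- Tabular instance; the inner string is a DAX query evaluated by SSAS.
INSERT INTO dbo.StagingFactLinksSeller
SELECT *
FROM OPENQUERY(SSAS_TAB, 'EVALUATE ''factLinksSeller''');
```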
There is a much simpler way to do it, if you have local PowerPivot workbooks and enough memory for their datasets:
Open your PP workbook, run PowerPivot
Switch to Data View
Copy the whole table from the current tab by pressing the icon in its upper-left corner, then press CTRL+C and wait for the data to be copied.
Paste the data into another Excel 2013 file (which can now handle almost any amount of data) or an MS Access table, then import the data into SQL Server.

How to back up SQL Server to a .sql file?

In "Back Up" I only get a .bak file, but I would like to create a .sql file.
Use SQL Server's Generate Scripts command:
right click on the database; Tasks -> Generate Scripts
select your tables, click Next
click the Advanced button
find Types of data to script - choose Schema and Data.
you can then choose to save to file, or put in new query window.
This results in CREATE and INSERT statements for all the table data selected in step 2.
This is a possible duplicate of: SQL script to get table content as "SELECT * FROM tblname"
To do a full database backup to File/Query you can use the 'Generate Scripts...' option on the Database.
Open SQL Server Management studio, right click on the database and choose 'Tasks->Generate Scripts...'
Then use the wizard to back up the database. You can script the whole database or parts of it. Two important options: in the 'Advanced' section, you will probably want to set 'Types of data to script' to 'Schema and data' and turn 'Script Statistics' on.
This will produce a *.sql file that you can use as a backup that includes the schema and table data.
OK, I read through most of these, but I had no "Advanced" button. There is still a way to do it; it's just a little hard to find, so here you go:
You can generate a script from a database, see http://msdn.microsoft.com/en-us/library/ms178078.aspx
If you want to create a script of your database, right-click the database and choose Generate Scripts (it's in different sub-menus depending on which version of SQL Server and Enterprise Manager / SQL Server Management Studio you're using).
That will, however, only get you the database objects. It will not generate scripts for data. Backing up a database will give you all of the database objects as well as the data, depending on what recovery model your database is set to.
This fellow may have achieved what you are trying to do by creating the backup, and then restoring it and giving it a new name.
This approach copies the data along with all of the database objects.
If you want a file with insert statements for your data have a look here:
This procedure generates INSERT statements using existing data from the given tables and views. Later, you can use these INSERT statements to generate the data. It's very useful when you have to ship or package a database application. This procedure also comes in handy when you have to send sample data to your vendor or technical support provider for troubleshooting purposes.
http://vyaskn.tripod.com/code.htm#inserts
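Typical usage of that procedure, assuming it has been installed in your database; the parameter names follow the linked page, so verify them against your copy:

```sql
-- Generate INSERT statements for one table (table and owner hypothetical).
EXEC sp_generate_inserts 'Customers', @owner = 'dbo';
```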