I created a query in BigQuery using a script, to then visualize it in Data Studio.
The query runs correctly in BigQuery and I can open a Data Studio session, but when I save the dashboard I run into the error: Dataset Configuration Error - Data Studio cannot connect to your data set.
My query creates two temporary tables and then joins them inside a BEGIN ... END scripting block, more or less like so:
BEGIN
CREATE TEMP TABLE <table_1> AS SELECT ( ... );
CREATE TEMP TABLE <table_2> AS SELECT ( ... );
SELECT <table_1>.*,
       <table_2>.*
FROM <table_1>
LEFT JOIN <table_2>
ON <table_1>.id = <table_2>.id;
END;
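For reference, the same join written as a single statement with WITH clauses instead of a scripting block would look roughly like this (same placeholder names as above, without the angle brackets):

WITH table_1 AS (
  SELECT ...  -- the query behind the first temp table
),
table_2 AS (
  SELECT ...  -- the query behind the second temp table
)
SELECT table_1.*,
       table_2.*
FROM table_1
LEFT JOIN table_2
ON table_1.id = table_2.id;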
I would like to visualize the results of the final query in Data Studio, but when I save the dashboard I get the error quoted above.
Does it mean that the table doesn't exist anymore, since it only lives in the session? I am stuck; I would like Data Studio to refresh this query so it can be used in a dashboard.
Any help is appreciated!
Related
I built a Data Studio report with a BigQuery data source custom query that fetches data from a given dataset and table name. I want to pass a parameter to the connector and use different datasets and table names, so the visualisation can show data from different tables without duplicating the report for each table.
When I tried to use a parameter in the SQL query for the table name, like this:
select id, name from @tablename
I got:
Query parameters cannot be used in place of table names
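One commonly suggested workaround (not from the original post, just a sketch under the assumption that the tables share a common name prefix) is to query a wildcard table and filter on _TABLE_SUFFIX with the parameter, since parameters are allowed in a WHERE clause:

-- my-project, my_dataset and the events_ prefix are hypothetical names;
-- @table_suffix is the Data Studio parameter passed to the custom query.
SELECT id, name
FROM `my-project.my_dataset.events_*`
WHERE _TABLE_SUFFIX = @table_suffix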
I have a remote database that I want to copy on my local SQL Server.
IMPORTANT: I only want sample data (1k rows, for instance) from each table, and there are about 130 different tables.
I have tried to use the Export Data procedure in SSMS. Put simply, I go to Tasks > Export Data > choose the source (the remote DB) > choose the destination (my local DB) > choose the tables to copy > copy.
What I have tried:
I've tried to write the SQL query in this tool, like:
SELECT TOP 1000 *
FROM TABLE1
GO
...
SELECT TOP 1000 *
FROM TABLE130
But on the mapping step, it puts every result within a single table instead of creating the 130 different output tables.
FYI, the above procedure takes about 2 minutes for one table. Doing it one by one for each table will take 130 * 2 min, about four and a half hours... plus it is so boring.
Do you have any idea for resolving this situation?
Thank you
regards
If you only want a subset you are going to have problems with foreign keys, if there are any in your database.
Possible approaches to extract all data or a subset
Use SSIS to extract the data from the source db and load into your local db
Write a custom application that does the job. (You can use SQL Bulk Copy)
If you purely want to do it in SSMS, you can create a linked server on your local server that points to the remote server.
This way you can do something like the following if the tables are not yet created on your local server:
SELECT TOP 1000 *
INTO [dbo].[Table1]
FROM [yourLinkedServer].[RemoteDatabase].[dbo].[Table1]
-- [RemoteDatabase] stands for the name of the database on the linked server
Changing the INTO table and FROM table for each table you want to copy.
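To avoid repeating that by hand for all 130 tables, you could generate the statements with a query along these lines (a sketch; [yourLinkedServer] and [RemoteDatabase] are placeholders for your linked server and remote database names):

SELECT 'SELECT TOP 1000 * INTO [dbo].' + QUOTENAME(TABLE_NAME) +
       ' FROM [yourLinkedServer].[RemoteDatabase].[dbo].' + QUOTENAME(TABLE_NAME) + ';'
FROM [yourLinkedServer].[RemoteDatabase].INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';

Copy the generated statements into a new query window and run them against your local database.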
Trying to optimize copying a table's contents between two Azure SQL databases.
Currently, one DB has an external table setup:
CREATE EXTERNAL TABLE [dbo].[Database2_TableA] (
[Col1] [varchar](100) NULL,
[ColN] [varchar](200) NULL
)
WITH (
DATA_SOURCE = [Database2],
SCHEMA_NAME = N'dbo',
OBJECT_NAME = N'TableA'
);
Then, inside a stored proc, this statement copies the data
insert into TableA1 select * from Database2_TableA
The table is large (lots of large columns and rows), and the copy takes too long.
Is there a more efficient way of doing this?
If the table has a lot of data, I think you can consider the ways below:
SSMS Import and Export Data; it supports copying data between two Azure SQL databases.
Using SSMS Generate Scripts to get the data:
Launch SQL Server Management Studio and log in to your database
Right click on your database name and click on Generate Scripts
Select Choose Objects on the left hand side menu
Click on Select specific database objects on the right part of the window
Checkmark the tables you wish to copy
Click on Set Scripting Options on the left
Select Save scripts to a specific location and Save to new query window
Click on the Advanced button
When you click on Advanced you will get a list of options; go down to the bottom of the list and select either Data, Schema and Data, or Schema only for Types of data to script.
Reference: Copy Data Between Two Azure Databases.
Hope this helps.
I cannot use a linked server.
Both databases on both servers have the same structure but different data.
I have 10k rows to transfer from the DB on one server to the same DB on the other. I cannot restore the DB on the other server as it will take a huge amount of space that I don't have on the other server.
So, I have 2 options that I don't know how to execute:
Backup and restore only one table - the table is linked to many other tables, but those other tables exist on the other server too. Can I somehow delete or remove the other tables, or make a backup of only one table?
I need to transfer 10k rows. How is it possible to create 10k insert queries based on the selected 10k rows?
Can I somehow delete or remove the other tables, or make a backup of only one table?
No, you cannot do this, unfortunately.
How is it possible to create 10k insert queries based on the selected 10k rows?
Right-click on the database -> Tasks -> Generate Scripts -> (Introduction) Next
Choose Select specific database objects -> Tables, choose the table you need -> Next
Advanced -> look for Types of data to script, change it from Schema only (the default) to Data only -> OK
Choose where to save -> Next -> Next. Wait for the operation to finish.
This will generate the file with 10k inserts.
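For a rough idea, the generated file contains one INSERT per row, along these lines (the table and column names here are made up for illustration):

INSERT [dbo].[Orders] ([OrderId], [CustomerId], [Amount]) VALUES (1, 42, 19.99)
INSERT [dbo].[Orders] ([OrderId], [CustomerId], [Amount]) VALUES (2, 17, 5.00)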
Another way is to use the Import/Export Wizard (the simplest way for a one-time import/export) if you have a link between the databases.
There are many ways to choose from; here is one way using BCP, a tool that ships with SQL Server to import and export bulk data.
The outline of the process:
Export the data from the source server to a file using BCP - BCP OUT for a whole table, or BCP QUERYOUT with a query to select the 10k rows you want exported
Copy the file to the destination server
Import the data using BCP on the destination database - BCP IN.
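A sketch of what those commands could look like from the command line (server, database, table, and file names are placeholders; -T uses Windows authentication and -n keeps the native format):

bcp "SELECT TOP (10000) * FROM dbo.MyTable" queryout C:\temp\MyTable.dat -S SourceServer -d SourceDb -T -n
bcp dbo.MyTable in C:\temp\MyTable.dat -S DestServer -d DestDb -T -n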
My suggestion would be to export these rows to Excel (you can do this by copy-pasting your query output), transfer the file to the other server, and import it there.
This is the official method:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql
And this is the unofficial method:
http://therealdanvega.com/blog/2010/08/04/create-sql-insert-statements-from-a-spreadsheet.
Here I have assumed that you only need to transfer the transactional data and your reference data is the same on both servers. So you will only need to execute one query to export your data.
I would definitely go down the SSIS route; once you use SSIS to do a task like this you will not use anything else, and it is very simple to script up. You can use any version and it will be a simple job and very quick.
Open a new SSIS project in whatever Visual Studio version you have available; there are many different versions, but even a 2008 version will do this simple task. You may have to install Integration Services or something similar (it used to be called BIDS, Business Intelligence Development Studio, in 2008); anything up to 2015 works, although support is nearly there in 2017.
add a data flow task
double click the data flow task
At the bottom of the screen add two connection managers (one to the source and one to the destination database)
add an OLE DB Source pointing to the source database table
add an OLE DB Destination pointing to the destination database table
drag the line between the source and destination (it should auto-map all columns if they have the same names)
hit Start and the data will flow very quickly
You can create a DbInstaller. Using DbInstaller you can share the whole database. DbInstaller works with both ADO.NET and Entity Framework, but I have been using Entity Framework.
You can do it with a SQL Server query.
First, select the first database, like:
USE database1 -- this will be your first database
After selecting the first database, we put our table rows into a temp table with this query:
SELECT * INTO #Temp FROM table1
Now we select the second database and insert the temp table data into the second database's table with this code:
USE secondDatabaseName
INSERT INTO tableNameintoinsert (col1, col2)
SELECT col1, col2 FROM #temp;
I am writing an ETL in BODS to extract data from a HANA table and load it into SQL Server.
My job needs to create a new table on SQL Server every time it runs, named with that day's date. I know we can do that for flat files by using a global variable, but I am not sure how to declare a similar variable in a template table to get the desired result.
Why do you want to use template tables? You can do the same as below:
Load the data into a standard staging table using BODS
Using the DS scripting mechanism, generate a query to create a table
Execute the query using a SQL transform
Generate another query to copy data from the staging table to the table created above
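For illustration, the SQL generated and executed through the SQL transform could look roughly like this on the SQL Server side (the staging and target table names are assumptions):

-- Builds a date-named table (e.g. TBL_20240115) and copies the staging rows into it;
-- SELECT ... INTO creates the table and loads it in one statement.
DECLARE @sql nvarchar(max) =
    N'SELECT * INTO [dbo].[TBL_' + CONVERT(char(8), GETDATE(), 112) + N'] ' +
    N'FROM [dbo].[STG_HANA_DATA];';
EXEC sp_executesql @sql;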
There are several other ways too; for example, you can write a DB procedure to create a table with the desired name and copy over the data from the staging table to that table. This procedure can be called from DS.
Hope this helps.
Cheers.
Shaz