Keeping temp tables in memory after switching servers?

How can I get #tempTable to stay in memory after switching servers? Is this possible?
Select *
into #tempTable
from dbo.table
I have data on server 1 that I want to compare with data on server 2, but I only have read-only access to server 2 (so I can't just move my data there), and the table on server 2 is too big to move to server 1. This is why I want to know how to keep a temp table in memory after connecting to a new server.
Any help would be appreciated, thanks.

What you wrote is possible, but it just creates a temporary table on the server where the source table is.
You probably want:
select *
into #Server2Table
from server2.database.dbo.table
You can then use #Server2Table in additional queries on the same connection where you copied it (such as the same window in SSMS or the same job step or the same stored procedure). If you need it in a more permanent location, either use a global temporary table (starts with ##) or a "real" table.
This requires the ability to link servers, using something like:
sp_addlinkedserver server2
You would run this on server1. Perhaps your DBA will need to set it up.
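For reference, a minimal linked-server setup might look like the following; the server name is a placeholder, and the login mapping is only needed if you want to control credentials explicitly:
EXEC sp_addlinkedserver
    @server = N'server2',
    @srvproduct = N'SQL Server';  -- with 'SQL Server', @server must be the instance's network name

-- Optional: connect to server2 using your own Windows credentials
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'server2',
    @useself = N'TRUE';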
I have found that queries often run faster when cross-server tables are first loaded into a temporary table. This is not because temporary tables are stored in memory; it is because the SQL optimizer has far more information (such as statistics) about a table on the local server to take advantage of.

You could export the data from your query and import it into a database on your first server using the SQL Server Import/Export Wizard. It's a bit convoluted, but if this happens a lot you can use SSIS to automate the move, or even simply tick the 'Save this package' box. Once you have the data on your first server you can do whatever you like with it.

Related

Copy a subset of data from remote database to local SQL Server

I have a remote database that I want to copy onto my local SQL Server.
IMPORTANT: I only want sample data (1k rows, for instance) from each table, and there are about 130 different tables.
I have tried to use the export data procedure in SSMS. Put simply, I go to TASKS > EXPORT DATA > CHOOSE SOURCE (the remote db) > CHOOSE DESTINATION (my local db) > CHOOSE THE TABLES TO COPY > COPY
What I have tried:
I've tried writing the SQL query directly in this tool, like:
SELECT TOP 1000 * FROM TABLE1
GO
...
SELECT TOP 1000 * FROM TABLE130
But on the mapping step, it puts every result into a single table instead of creating 130 different output tables.
FYI, the above procedure takes 2 min for one table. Doing it one by one for each table would take 130 × 2 min = 260 min, well over four hours... plus it is so boring.
Do you have any ideas for resolving this situation?
Thank you
regards
If you only want a subset you are going to have problems with foreign keys, if there are any in your database.
Possible approaches to extract all data or a subset
Use SSIS to extract the data from the source db and load it into your local db
Write a custom application that does the job. (You can use SQL Bulk Copy)
If you purely want to do it in SSMS, you can create a linked server on your local server pointing to the remote server.
That way you can do something like the following when the tables are not yet created on your local server (SELECT ... INTO creates the target table for you):
SELECT TOP 1000 *
INTO [dbo].[Table1]
FROM [yourLinkedServer].[RemoteDatabase].[dbo].[Table1]
Change the INTO table and FROM table for each table you want to copy, or generate the statements as sketched below.
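If writing 130 of these by hand is the pain point, you could let the server generate them; a rough sketch, where [yourLinkedServer] and [RemoteDatabase] stand in for your actual linked server and remote database names:
DECLARE @sql nvarchar(max) = N'';

-- Build one SELECT ... INTO statement per remote table
SELECT @sql += N'SELECT TOP 1000 * INTO [dbo].' + QUOTENAME(name)
             + N' FROM [yourLinkedServer].[RemoteDatabase].[dbo].' + QUOTENAME(name)
             + N';' + NCHAR(13)
FROM [yourLinkedServer].[RemoteDatabase].sys.tables;  -- enumerates the ~130 remote tables

PRINT @sql;                 -- review the generated statements first
-- EXEC sp_executesql @sql; -- then run them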

Copy 10k rows from table to table on other server

I cannot use a linked server.
Both databases on both servers have the same structure but different data.
I have 10k rows to transfer from the DB on one server to the same DB on the other. I cannot restore the DB on the other server as it will take a huge amount of space that I don't have on the other server.
So, I have 2 options that I don't know how to execute:
Backing up and restoring only one table - the table is linked to many other tables, but those other tables exist on the other server too. Can I somehow delete or remove the other tables, or make a backup of only one table?
I need to transfer 10k rows. How is it possible to create 10k INSERT queries based on the selected 10k rows?
Can I somehow delete or remove the other tables, or make a backup of only one table?
No, you cannot do this, unfortunately.
How is it possible to create 10k INSERT queries based on the selected 10k rows?
Right-click the database -> Tasks -> Generate Scripts -> (Introduction) Next
Choose "Select specific database objects" -> Tables, choose the table you need -> Next
Advanced -> find "Types of data to script" and change it from "Schema only" (the default) to "Data only" -> OK
Choose where to save -> Next -> Next. Wait for the operation to end.
This will generate a file with the 10k INSERTs.
Another way is to use the Import/Export Wizard (the simplest option for a one-time import/export) if you have a link between the databases.
There are many ways to choose from; here is one using BCP, a tool that ships with SQL Server for importing and exporting bulk data.
The outline of the process:
Export the data from the source server to a file using BCP - BCP OUT for a whole table, or BCP QUERYOUT with a query to select the 10k rows you want exported
Copy the file to the destination server
Import the data using BCP on the destination database - BCP IN.
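A rough sketch of that round trip from the command line; server, database, table, and file names are all placeholders (-T uses Windows authentication, -n keeps SQL Server's native format):
bcp "SELECT TOP (10000) * FROM SourceDb.dbo.MyTable" queryout C:\temp\rows.dat -S SourceServer -T -n
rem copy rows.dat to the destination server, then run there:
bcp DestDb.dbo.MyTable in C:\temp\rows.dat -S DestServer -T -n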
My suggestion would be to export these rows to Excel (you can do this by copy-pasting your query output), transfer the file to the other server, and import it there.
This is the official method:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql
and this is the unofficial method:
http://therealdanvega.com/blog/2010/08/04/create-sql-insert-statements-from-a-spreadsheet.
Here I have assumed that you only need to transfer the transactional data and that your reference data is the same on both servers, so you will only need to execute one query to export your data.
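One route the linked Microsoft article covers is OPENROWSET; a minimal sketch, assuming the ACE OLE DB provider is installed, 'Ad Hoc Distributed Queries' is enabled, and the file path, sheet name, and target table are placeholders:
-- Read the spreadsheet ad hoc and materialize it as a local table
SELECT * INTO dbo.ImportedRows
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0; Database=C:\Temp\rows.xlsx',
                'SELECT * FROM [Sheet1$]');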
I would definitely go down the SSIS route; once you use SSIS for a task like this you will not use anything else, and it is very simple to script up. You can use any version, and it will be a simple, very quick job.
Open a new SSIS project in whatever Visual Studio version you have available; even a 2008 version will handle this simple task, though you may need to install Integration Services (the tooling used to be called BIDS, Business Intelligence Development Studio, in 2008). Anything up to 2015 works, and 2017 support is nearly there. Then:
Add a Data Flow Task
Double-click the Data Flow Task
At the bottom of the screen, add two connection managers (one to the source and one to the destination database)
Add an OLE DB Source pointing to the source database table
Add an OLE DB Destination pointing to the destination database table
Drag a line between the source and the destination (it should auto-map all columns that have the same names)
Hit Start and the data will flow very quickly
You could also create a DbInstaller and use it to share the whole database. DbInstaller works with both ADO.NET and Entity Framework; I have used it with Entity Framework.
You can also do it with a SQL Server query.
First, switch to the first database:
USE database1 -- this will be your first database
Then put the table's rows into a temp table:
SELECT * INTO #Temp FROM table1
Now switch to the second database and insert the temp table's data into the target table (the temp table survives the USE statement because it lives for the whole session):
USE secondDatabaseName
INSERT INTO tableNameintoinsert (col1, col2)
SELECT col1, col2 FROM #Temp;

How to back up a subset of tables in a SQL Express DB

I have developed a SQL Express database. I need to back up all but one table in that database in an automated way. I was thinking I could write a SQL script to do this and trigger it using sqlcmd from a batch file, but I'm not sure how to write that SQL script.
I was also thinking, if nothing else is possible, I could create a second DB containing the tables I want to back up, write a script that copies data into that second DB, and then do an automatic backup of that entire DB. This has the disadvantage of a slow, deferred unpacking of that backup when I want to use it; it's not a small install script.
Is this a possibility, is it the only option, or are there tools for SQL Express to do this?
There is no option to exclude just one table while backing up. A few things I could think of:
1. Right-click the database -> Tasks -> Generate Scripts -> exclude the table you want to skip, choose to save the script, and run this every time.
2. You could also choose the Export option, but since you are using SQL Express, you won't have the option to save the package.
Keep the large table in a different database and just back up the original database. You can still use the large table even from a different database, i.e.:
SELECT *
FROM MyDb.dbo.SomeTable s
JOIN OtherDb.dbo.LargeTable l
ON (expression);
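With the large table split out like that, the automated backup the asker wanted can be a one-line batch file invoking sqlcmd; a sketch, with the instance name, database name, and backup path as placeholders:
rem -E uses Windows authentication; -Q runs the query and exits
sqlcmd -S .\SQLEXPRESS -E -Q "BACKUP DATABASE MyDb TO DISK = N'C:\Backups\MyDb.bak' WITH INIT"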

Getting data from different database on different server with one SQL Server query

Server1: Prod, hosting DB1
Server2: Dev, hosting DB2
Is there a way to query databases living on two different servers with the same SELECT query? I need to bring all the new rows from Prod to Dev using a query
like the one below. I will be using SQL Server DTS (the import/export data utility) to do this.
Insert into Dev.db1.dbo.table1
Select *
from Prod.db1.dbo.table1
where table1.PK not in (Select table1.PK from Dev.db1.dbo.table1)
Creating a linked server is the only approach that I am aware of for this. If you are simply trying to add all new rows from prod to dev, why not just create a backup of that one particular table, pull it into the dev environment, and then write the query from the same server and database?
Granted, this is a one-time use and a pain for recurring instances, but if it is a one-time thing then I would recommend doing that. Otherwise, make a linked server between the two.
To back up a single table in SQL, use the SQL Server Import and Export Wizard: select the prod database as your data source, select only the prod table as your source table, and create a new table in the dev environment as your destination table.
This should get you what you are looking for.
You say you're using DTS; the modern equivalent would be SSIS.
Typically you'd use a data flow task in an SSIS package to pull all the information from the live system into a staging table on the target, then load it from there. This is a pretty standard operation when data warehousing.
There are plenty of approaches to save you copying all the data across (e.g. use a timestamp column, use rowversion, use Change Data Capture, make use of the fact that your primary key only ever gets bigger, etc.). Or you could just do what you want with a Lookup transformation directly in SSIS...
The best approach will depend on many things: how much data you've got, what data transfer speed you have between the servers, your key types, etc.
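As a sketch of the ever-increasing-key idea mentioned above, you can use the maximum key already in Dev as a high-water mark; Prod here is an assumed linked server name, and the table and column names are placeholders:
-- Run on the Dev server: pull only rows newer than anything we already have
INSERT INTO db1.dbo.table1
SELECT p.*
FROM Prod.db1.dbo.table1 AS p
WHERE p.PK > (SELECT ISNULL(MAX(PK), 0) FROM db1.dbo.table1);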
When your servers are all in one Active Directory domain and you use Windows authentication, all you need is an account that has the proper rights on all the databases!
You can then simply reference all tables as server.database.schema.table
For example:
insert into server1.db1.dbo.tblData1 (...)
select ... from server2.db2.dbo.tblData2;

How do I recreate a VIEW as a local table in SQL Server?

I am using Microsoft SQL Server Management Studio and have access to a bunch of views without the original tables that the views depend on. I have copied some data from this view into a file and would like to import it into a database that I created locally to do some analysis.
The brute-force way of doing this is to manually write a CREATE TABLE statement by looking at the columns in the view, but is there a better way to get a CREATE statement that I can directly use to recreate a similar table on my localhost?
Create a linked server in your localhost to this server. Then use (while connected to localhost)
SELECT * INTO NewTableName FROM LinkedServer.DBName.SchemaName.View
and a new table in your current DB in localhost would be created.
What I typically prefer to do is use SSIS for data transfers. The first step in the package grabs the table definition using a SELECT INTO ... WHERE 1=0 so that it doesn't bring over any data and minimizes the locking time (a SELECT INTO results in database-wide locks). Once you have a local table with the source view's definition, copy the data over.
If you're afraid the view's definition can change, stick with an INSERT INTO ... SELECT * FROM SQL task. Otherwise, save the definition that you retrieved from the SQL above and create the table (if it does not already exist). Then use a data flow task to transfer the data over.
With either of these approaches, you avoid the potential double hop scenario (if you're using Windows authentication). It's also reusable in a SQL agent job if you need to do this periodically. Otherwise, it may be a little overkill.
Or you can just run the first part in SSMS, but I definitely recommend using WHERE 1=0 and then an INSERT INTO rather than a straight SELECT INTO. Again, to minimize database locking.
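Put together, that pattern looks roughly like this (the linked server, database, schema, view, and local table names are all placeholders):
-- Create an empty local table with the view's column definitions; no rows are copied
SELECT * INTO dbo.LocalCopy
FROM LinkedServer.DBName.SchemaName.SomeView
WHERE 1 = 0;

-- Load the data separately, keeping the SELECT INTO itself as short as possible
INSERT INTO dbo.LocalCopy
SELECT * FROM LinkedServer.DBName.SchemaName.SomeView;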