No PK or FK when exporting SQL Server database - sql

We have a Production and a Staging database hosted on Microsoft Azure. We're using Microsoft SQL Server Management Studio 19 to do the migration.
Here are the steps:
Right click the database -> Tasks -> Export Data...
We select SQL Server Native Client 11.0 as the Data Source.
For authentication we select "Use SQL Server authentication" and type in our user name and password.
In "Database", we select our Production database.
After clicking Next, we select our destination. For the Destination we also use "SQL Server Native Client 11.0" with the same setup as above, but with our Staging database as the destination.
Now it's time to "Specify Table Copy or Query", and here we select "Copy data from one or more tables or views".
We then mark all the tables, select Edit Mappings..., and tick "Enable identity insert".
If we then press Next and start the migration, everything seems to complete without any errors or warnings.
BUT - for some reason none of our PK and FK relations are exported to our Staging database.
We have looked at a single table, just to see the SQL query that is generated, and it looks like the image attached.
Can anyone tell us what we're doing wrong? We have no idea why the PKs and FKs aren't migrated. Stored procedures and everything else are working as expected.

There are two ways to export the PKs and FKs:
1. Use SSMS to generate the SQL script. You just need to select the tables, and it will generate a script.sql file on your local PC.
2. Query the system tables and views to build the PK and FK SQL statements yourself.
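As a rough sketch of the second option (not a complete scripter; it assumes STRING_AGG is available, i.e. SQL Server 2017+ or Azure SQL, and ignores options such as clustering and ON DELETE/UPDATE rules), the catalog views can be used to generate ALTER TABLE statements to run against the Staging database:
-- Generate ALTER TABLE statements for primary keys
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
     + ' ADD CONSTRAINT ' + QUOTENAME(kc.name) + ' PRIMARY KEY ('
     + STRING_AGG(QUOTENAME(c.name), ', ') WITHIN GROUP (ORDER BY ic.key_ordinal) + ');'
FROM sys.key_constraints kc
JOIN sys.tables t         ON t.object_id = kc.parent_object_id
JOIN sys.index_columns ic ON ic.object_id = kc.parent_object_id AND ic.index_id = kc.unique_index_id
JOIN sys.columns c        ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE kc.type = 'PK'
GROUP BY t.schema_id, t.name, kc.name;

-- Generate ALTER TABLE statements for foreign keys
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(pt.schema_id)) + '.' + QUOTENAME(pt.name)
     + ' ADD CONSTRAINT ' + QUOTENAME(fk.name) + ' FOREIGN KEY ('
     + STRING_AGG(QUOTENAME(pc.name), ', ') WITHIN GROUP (ORDER BY fkc.constraint_column_id)
     + ') REFERENCES ' + QUOTENAME(SCHEMA_NAME(rt.schema_id)) + '.' + QUOTENAME(rt.name) + ' ('
     + STRING_AGG(QUOTENAME(rc.name), ', ') WITHIN GROUP (ORDER BY fkc.constraint_column_id) + ');'
FROM sys.foreign_keys fk
JOIN sys.foreign_key_columns fkc ON fkc.constraint_object_id = fk.object_id
JOIN sys.tables pt ON pt.object_id = fk.parent_object_id
JOIN sys.columns pc ON pc.object_id = fkc.parent_object_id AND pc.column_id = fkc.parent_column_id
JOIN sys.tables rt ON rt.object_id = fk.referenced_object_id
JOIN sys.columns rc ON rc.object_id = fkc.referenced_object_id AND rc.column_id = fkc.referenced_column_id
GROUP BY pt.schema_id, pt.name, fk.name, rt.schema_id, rt.name;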

Related

Select values from one database to another with SQL Server

Hello, I need to take the result of a SELECT from a database on one IP address and insert it into another (identical) database on a completely different IP. Below is the query; how do I make the transfer?
SQL code:
/* Insert into the database with the same name, into the same table. Address: 172.16.50.98 */
INSERT INTO
/* Select from the database at address: 172.16.50.96 */
SELECT IdUtente,Longitudine,Latitudine,Stato,DataCreazione
FROM Quote.dbo.Marcatura
where DataCreazione>'2019-01-08 18:37:28.773'
Linked Server / OPENQUERY is the way to achieve this. Have a look at this:
including parameters in OPENQUERY
If the data that's being imported isn't large and this won't be a recurring task, a linked server would probably be the better option. Creating one through the SSMS GUI is easier if you haven't done this before, but an example of creating one using the SP_ADDLINKEDSERVER stored procedure through T-SQL is below. If your account doesn't have access to the other server, the SP_ADDLINKEDSRVLOGIN stored procedure will need to be used to configure the linked server with an account that has the appropriate permissions on the source server, as well as the database and any referenced objects.
While the linked server syntax (four-part name) is simpler and easier to read, I'd strongly recommend doing the insert with OPENQUERY instead if only one linked server will be used. This executes the SQL on the source server, applying any filters there and returning only the necessary rows, whereas the linked server syntax returns all the rows before performing the filtering. You can read more about the differences between the two here. You indicated the database name is the same on both servers, and this assumes the same for the table and schema names as well. Make sure to update these accordingly if they differ.
If a large volume of data will be imported, or if this will be a regular process, creating an SSIS package and setting it to run as a SQL Agent job will be the better approach. If you choose to go this route there are a number of things to consider, but the links below will help you get started. SQL Server Data Tools (SSDT) is where the packages can be developed. While not necessary, executing the packages from the SSIS catalog, SSISDB, will be much more beneficial than just using the file system. Either an OLE DB or SQL Server Destination can be used since the table being loaded is on SQL Server, however a SQL Server Destination can only be used on a local database.
Linked Server:
--Create linked server
--SQL product name and SQLNCLI11 provider for SQL Server
EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer', @srvproduct = N'SQL',
    @provider = N'SQLNCLI11', @datasrc = N'ServerIPAddress'
--OPENQUERY insert
INSERT INTO Quote.dbo.Marcatura (IdUtente, Longitudine, Latitudine, Stato, DataCreazione)
SELECT
IdUtente,
Longitudine,
Latitudine,
Stato,
DataCreazione
FROM OPENQUERY(MyLinkedServer, '
SELECT
IdUtente,
Longitudine,
Latitudine,
Stato,
DataCreazione
FROM Quote.dbo.Marcatura')
SSIS:
SSIS
SSDT
SSISDB
Execute SQL Task
Data Flow Task
OLE DB Source
OLE DB Destination
SQL Server Destination
SQL Server Agent SSIS Packages
SSIS solution
I think this can be achieved with a very simple SSIS package:
Create two OLE DB connection managers, one for each server
Add a Data Flow Task
Inside the Data Flow Task, add an OLE DB Source and an OLE DB Destination
In the OLE DB Source (the 172.16.50.96 connection manager), select SQL command as the access mode and use the following command:
SELECT IdUtente,Longitudine,Latitudine,Stato,DataCreazione
FROM Quote.dbo.Marcatura
where DataCreazione >'2019-01-08 18:37:28.773'
Map the source columns to the OLE DB Destination (the 172.16.50.98 connection manager)
Helpful links
Extract Data by Using the OLE DB Source
SSIS OLEDB Source to OLE DB Destination example

How to transfer/migrate tables with schema from SQL Server to Oracle?

I have tables in my SQL Server 2008 R2 database that belong to a schema. I managed to transfer some tables to SQL Developer, but the tables with a schema didn't transfer. How can I do this?
You can use DTS (Data Transformation Services) on the SQL Server side, or you can configure a gateway on the Oracle side. To use DTS you need the Oracle client on the machine where you are executing DTS.
DTS will ask you for the source and destination and their credentials. Since you need to migrate from SQL Server to Oracle: in the source tab, select "Microsoft OLE DB Provider for SQL Server" as the data source and put the IP address / server name in the Server Name dropdown. If you use Windows Authentication, leave it as is, select the database and go to Next. If you have a login ID and password, select SQL Server Authentication, enter the login ID and password, select your database and then click Next.
In the destination tab, select "Microsoft OLE DB Provider for Oracle", then click Properties. Enter the TNS name (the one you configured using NETCA in the Oracle client on that machine) as the server name, along with the login ID and password.
After you connect, it will ask whether to use a query or copy complete tables; just leave the default and click Next.
Then it will display all the tables from the source; select the tables you need to migrate and click Next until Finish.
For gateway configuration, see this link:
https://docs.oracle.com/cd/B28359_01/gateways.111/b31043/conf_sql.htm
Thanks

Copying an SQL table from one Server to another on SQL Server 2000 / 2005

I’m trying to copy a SQL Server table, schema and data, from Server A to Server B. The SQL Server table is just a reference table which hasn't been populated for some reason on Server B. Can anyone advise how the entire table could be copied across please? On SQL Server 2000/2005.
So far we've tried a long-winded approach by copying the .mdf and .ldf files from Server A to Server B with a plan to then copy the table across into the Server B database but we are having some difficulty re-attaching the database to Server B.
Please can anyone help?
Kind Regards
James
Using SQL Server Management Studio (SSMS):
In Object Explorer, right click the source database name, then Tasks... -> Generate Scripts... - this opens the Generate and Publish Scripts dialog. Click Next to choose objects, choose "Select specific database objects", expand Tables, and choose your table. Next, set up the script destination, for example New query window, and (important step!) click Advanced and set "Types of data to script" = "Schema and data" and "Script USE DATABASE" = False. Then OK, Next, Next, .. wait .. Finish. You now have a complete SQL script to reproduce this table with its data. Connect to the destination DB and run it.
Tested with SSMS 2014, but as I recall this feature has been available since SSMS 2005.
You can use the Import/Export Data wizard in Management Studio; the wizard will create a new table on Server B with the same structure as the table on Server A. Before using it you need to have at least one database on Server B.
This confirms why this is one of my favourite forums.
Both of these methods work beautifully:
Generate Scripts (when setting "Types of data to script" = "Schema and data")
Export and Import
Interestingly, Generate Scripts works perfectly with SQL Express, but the Export method does not save unless you have at least SQL Server Standard Edition.
Thanks so much everyone
Cheers
James
Try this:
SELECT * INTO destination FROM source
However, it will not copy the indexes or key information. Alternatively, you can try the Import/Export Data task in SSMS.
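As a minimal sketch, assuming a linked server named ServerA already points at the source instance and using illustrative database/table/column names:
-- SELECT INTO creates and loads the copy on Server B in one step...
SELECT *
INTO dbo.ReferenceTable
FROM ServerA.MyDatabase.dbo.ReferenceTable;

-- ...but it carries no keys or indexes, so re-create them by hand afterwards
ALTER TABLE dbo.ReferenceTable
    ADD CONSTRAINT PK_ReferenceTable PRIMARY KEY (Id);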

Create a SQL Server database with another SQL database's structure (design)

I need to create a SQL database just like another SQL database (without the data) via a SQL script.
Someone told me that Oracle has this ability with code like
create database <your new DB name> as <the old DB name>
so is there a similar statement or workaround in SQL Server?
I use MS SQL Server 2008/2008 R2.
I don't want the 'generate script' solution as this produces a very long script; I just need the Oracle statement (mentioned above) but for SQL Server.
To be more clear, I need the table structure only (no data) and all other objects such as functions, views etc.
Please advise.
In SQL Server, you can right click on the database, then select "Tasks" and "Generate Scripts", and build one script for all your objects sans data.
If you are asking about SQL Server (at least 2005 or newer; not sure about previous versions): if you right click a database in SQL Server Management Studio and go to Tasks and then Generate Scripts, you have the option to generate a script that creates all the objects in the database without data.
Update:
You can also right click on a user database and select Script Database as -> CREATE To just to get the T-SQL to create the DB itself, without any of the tables, stored procedures etc.

SQL Azure - copy table between databases

I am trying to run the following SQL:
INSERT INTO Suppliers ( [SupplierID], [CompanyName])
Select [SupplierID], [CompanyName] From [AlexDB]..Suppliers
and got an error "reference to database and/or server name in is not supported in this version of sql server"
Any idea how to copy data between databases "inside" the server?
I can load data to client and then back to server, but this is very slow.
I know this is old, but I had another manual solution for a one-off run.
Using SQL Management Studio R2 SP1 to connect to Azure, I right click the source database and select Generate Scripts.
During the wizard, after I have selected my tables, I choose output to a query window, then I click Advanced. About half way down the properties window
there is an option for "Types of data to script". I select that and change it to "Data only", then I finish the wizard.
All I do then is check the script, rearrange the inserts to respect the constraints, and change the USE at the top to run it against my target DB.
Then I right click on the target database, select New Query, copy the script into it, and run it.
Done. Data migrated.
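If reordering the inserts by hand gets tedious, a hedged alternative (the table name below is illustrative) is to temporarily disable the foreign key checks on the affected target tables, run the generated data script, and then re-enable and re-validate the constraints:
-- Disable all FK checks on a target table before loading
ALTER TABLE dbo.Suppliers NOCHECK CONSTRAINT ALL;

-- ... run the generated INSERT script here ...

-- Re-enable the constraints and re-check the loaded data
ALTER TABLE dbo.Suppliers WITH CHECK CHECK CONSTRAINT ALL;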
Since 2015, this can be done using elastic database queries, also known as cross-database queries.
I created and used this template; it copies 1.5 million rows in 20 minutes:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';
CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
WITH IDENTITY = '<username>',
SECRET = '<password>';
CREATE EXTERNAL DATA SOURCE RemoteReferenceData
WITH
(
TYPE=RDBMS,
LOCATION='<server>.database.windows.net',
DATABASE_NAME='<db>',
CREDENTIAL= SQL_Credential
);
CREATE EXTERNAL TABLE [dbo].[source_table] (
[Id] BIGINT NOT NULL,
...
)
WITH
(
DATA_SOURCE = RemoteReferenceData
)
SELECT *
INTO target_table
FROM source_table
Unfortunately there is no way to do this in a single query.
The easiest way to accomplish it is to use "Data Sync" to copy the tables. The benefit of this is that it will also work between servers, and keep your tables in sync.
http://azure.microsoft.com/en-us/documentation/articles/sql-database-get-started-sql-data-sync/
In practice, I haven't had that great an experience with "Data Sync" running in production, but it's fine for one-off jobs.
One issue with "Data Sync" is that it creates a large number of "sync" objects in your database, and deleting the actual "Data Sync" from the Azure portal may or may not clean them up. Follow the directions in this article to clean it all up manually:
https://msgooroo.com/GoorooTHINK/Article/15141/Removing-SQL-Azure-Sync-objects-manually/5215
SQL Azure does not support the USE statement and effectively has no cross-database queries, so the above query is bound to fail.
If you want to copy/back up the DB to another SQL Azure DB, you can use the "same-server" or "cross-server" copying in SQL Azure. Refer to this MSDN article.
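As a brief sketch with placeholder names (run against the master database of the destination server; see the linked article for the exact permissions required):
-- Creates a transactionally consistent copy of the source database
CREATE DATABASE MyDatabase_Copy AS COPY OF source_server.MyDatabase;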
You could use a tool like SQL Data Compare from Red Gate Software that can move database contents from one place to another and fully supports SQL Azure. 14-day free trial should let you see if it can do what you need.
Full disclosure: I work for Red Gate Software
An old post, but another option is the Sql Azure Migration wizard
There is no straightforward way to do this, but it can be done with the following trick.
Step 1: Create another table with the same structure as the Suppliers table inside [AlexDB], say SuppliersBackup
Step 2: Create a table with the same structure as the Suppliers table inside DesiredDB
Step 3: Enable Data Sync between AlexDB..Suppliers and DesiredDB..Suppliers
Step 4: Truncate the data from AlexDB..Suppliers
Step 5: Copy the data from AlexDB..SuppliersBackup to AlexDB..Suppliers
Step 6: Now run the sync
The data is copied to DesiredDB.
If you have an on-prem version that has sp_addlinkedsrvlogin, you can set up linked servers for both the source and target databases and then run your INSERT INTO query.
See "SQL Server Support for Linked Server and Distributed Queries against Windows Azure SQL Database" in this blog: https://azure.microsoft.com/en-us/blog/announcing-updates-to-windows-azure-sql-database/
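A rough sketch of the on-prem side, with placeholder server, database and credential names; one linked server is created per Azure database, and the INSERT ... SELECT then uses four-part names:
-- Create a linked server pointing at the Azure SQL database that holds the source table
EXEC sp_addlinkedserver
    @server     = N'AzureSource',
    @srvproduct = N'',
    @provider   = N'SQLNCLI11',
    @datasrc    = N'<server>.database.windows.net',
    @catalog    = N'AlexDB';

-- Map a local login to a SQL login on the Azure side
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AzureSource',
    @useself     = 'FALSE',
    @rmtuser     = N'<username>',
    @rmtpassword = N'<password>';

-- Repeat the two calls above for a second linked server (e.g. AzureTarget)
-- pointing at the database that holds the destination Suppliers table, then:
INSERT INTO AzureTarget.[<targetdb>].dbo.Suppliers (SupplierID, CompanyName)
SELECT SupplierID, CompanyName
FROM AzureSource.AlexDB.dbo.Suppliers;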
OK, I think I found the answer: there is no way. You have to move the data to the client, or do some other trick. Here is a link to an article with explanations: Limitations of SQL Azure: only one DB per connection
But any other ideas are welcome!
You can easily add a "Linked Server" from SQL Server Management Studio and then query the fully qualified table name. No need for flat files or export tables. This method also works from on-prem to an Azure database and vice versa.
e.g.
select top 1 ColA, ColB from [AZURE01_<hidden>].<hidden>_UAT_RecoveryTestSrc.dbo.FooTable order by 1 desc
select top 1 ColA, ColB from [AZURE01_<hidden>].<hidden>_UAT_RecoveryTestTgt.dbo.FooTable order by 1 desc
A few options (rather workarounds):
Generate script with data
Use data sync in Azure
Use MS Access (import and then export), with many exclusions (like no GUIDs in Access)
Use third-party tools like Red Gate.
Unfortunately there is no easy, built-in way to do this so far.
I would recommend the SSMS SQL Server Import and Export feature. It supports multiple connection configurations and cross-server copying of selected tables. I have tried the .NET SQL Server connector, which works very well for Azure SQL databases.