Copying production database setup to development database - sql

I am using Oracle 9i. How can I select data from one remote database and insert it into the local database?
Also, how can data be copied from one remote database to another remote database?

You need to create a database link.
Please refer to this link: http://download.oracle.com/docs/cd/B10501_01/server.920/a96521/ds_concepts.htm#12354
excerpts:
example:
CREATE DATABASE LINK sales.us.americas.acme_auto.com CONNECT TO scott IDENTIFIED BY tiger USING 'sales_us';
query:
For example, using a database link to database sales.division3.acme.com, a user or application can reference remote data as follows:
SELECT * FROM scott.emp@sales.division3.acme.com;   -- emp table in scott's schema
SELECT loc FROM scott.dept@sales.division3.acme.com;
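To answer the original question directly, once the link exists a SELECT-based INSERT can copy the remote data into a local table. A minimal sketch, assuming a link named prod_link and an existing local table with a matching structure:
INSERT INTO local_schema.emp
SELECT * FROM scott.emp@prod_link;
COMMIT;
For a remote-to-remote copy, the same pattern can be run from any database that has links to both: insert into a table over one link while selecting over the other.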

Given the vagueness of the question: make a backup of production and restore it in development.

If you are talking about Microsoft SQL Server, then you can create a Linked Server. Here is an article about doing this in SQL Server 2008, but you can do it in earlier versions as well. You can then select from it using a four-part name: LinkedServer.database.schema.table.
http://msdn.microsoft.com/en-us/library/ff772782.aspx
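A minimal sketch of such a query; the linked server, database, and table names are placeholders:
SELECT TOP 10 *
FROM [ProdLinkedServer].SalesDB.dbo.Customers;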

Define a link from the development server to the production server. You can then use a SELECT-based INSERT to copy data into the development server.
Use the SAMPLE clause on the SELECT to retrieve a percentage of the data. For child tables, use a WHERE EXISTS clause to copy the child rows whose parents were sampled (see the sketch below).
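A sketch of that approach, assuming a database link named prod_link and hypothetical parent/child tables; note that some Oracle versions restrict the SAMPLE clause on remote tables, in which case you can stage the sampled parent rows in a local table first:
INSERT INTO dev_schema.parent
  SELECT * FROM prod_schema.parent@prod_link SAMPLE (10);
INSERT INTO dev_schema.child
  SELECT c.*
  FROM prod_schema.child@prod_link c
  WHERE EXISTS (SELECT 1 FROM dev_schema.parent p WHERE p.parent_id = c.parent_id);
COMMIT;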

Related

Update from linked server (MySQL) to local SQL database

I am looking for a way to set up a scheduled update from a linked server I created to a local DB. I am not familiar with triggers, but from what I've read you have to set them up on the originating server, and I only have read access to the MySQL database. Basically, all I am trying to do is make a local copy of two tables from the MySQL DB. I can do so manually with SELECT INTO statements, but I would like some automation if possible. Any thoughts on how to achieve this? Also, I am using SQL Server 2008 R2. Thanks!
You have several options:
Copy all data from the source table (do not use this if the source table is big).
If you have a column in the source table that can be used to determine which records should be copied, use that (this is usually an auto-updated timestamp column in MySQL).
Set up a trigger to track modifications.
To copy, you can set up a Linked Server or you can use SSIS.
To use a linked server, you can use OPENQUERY() (see the sketch below).
You can schedule your task with SQL Server Agent.
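A sketch of an incremental copy through OPENQUERY() that could be scheduled as a SQL Server Agent job step; the linked server name MYSQL_LNK and the table/column names are placeholders:
INSERT INTO dbo.orders_copy (id, amount, updated_at)
SELECT r.id, r.amount, r.updated_at
FROM OPENQUERY(MYSQL_LNK, 'SELECT id, amount, updated_at FROM remote_db.orders') AS r
WHERE r.updated_at > (SELECT ISNULL(MAX(updated_at), '19000101') FROM dbo.orders_copy);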

T-SQL Script to copy data from one server to another?

Is it possible to copy data from one server to another using a T-SQL script? We have a code promotion process that makes using the import wizard less than optimal for our team so I am looking into a script I can simply have someone run in Management Studio that will do the trick.
Yes.
First create a Linked Server to the other server; then you can access the target server with four-part names, for example:
INSERT INTO Server2.Database2.dbo.MapTable1 SELECT * FROM table1
P.S. You can add a linked server with sp_addlinkedserver.
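A sketch of creating the linked server itself; the server name and credentials are placeholders:
EXEC sp_addlinkedserver @server = N'Server2', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'Server2';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'Server2', @useself = N'False', @locallogin = NULL, @rmtuser = N'remote_login', @rmtpassword = N'remote_password';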

How to query a table to a view and publish to a different database

I have 13 SQL databases, some 2005 and others 2008, on a VPN. I'd like to take all of the data from the "Employees" table in each database and make it a view at each location. I would then like to publish these views to one database on another server, all in one table, marking where each row came from within the original databases. For example, the database where all the information goes would look like this:
User   Name     Location
bik    Bob K    1
JS     John S   2
Etc.
Any help is appreciated.
I assume you want the data on the final server to be viewable, but not modifiable, and to reflect changes made to the source databases?
This would probably not perform all that well, but one do-it-yourself way to do it would be the following (disclaimer: I haven't tried this myself):
Set up all the source servers as linked servers on the final server.
Create a view in this form:
SELECT *, 1 as Location
FROM [Linked Server 1].Database1.dbo.Table1
UNION ALL
SELECT *, 2 as Location
FROM [Linked Server 2].Database2.dbo.Table2
... etc ....
You might want to read this documentation on distributed queries, if you haven't already.
I believe it's also possible to use SSIS as the source of a distributed query, but a quick scan through the documentation didn't find anything about it. I mention it because SSIS would make pulling and transforming data from disparate data sources very easy, and if you could use the final recordset as a data source, you could use an SSIS package as the backend to your view. Again, though, performance would probably require considerable tuning.
If the queries don't have to be real-time, you could look into using SQL Server Integration Services (SSIS) to pull the data into a local DB. You could schedule the job to run hourly/daily/weekly.

How to move a table from system schema to scott schema in Oracle?

A few days ago I was converting a large MySQL database to Oracle 10g R2 using the Oracle SQL Developer database migration tools. Unfortunately it was migrated into the SYSTEM schema, but I need it in the SCOTT schema.
After Googling I found these two links: OraFAQ Forum and Ask Tom Q&A, but I could not find an appropriate answer. Can anyone help me with how this can be done?
Thanks in advance.
IIRC the MySQL backup tool spits out plain SQL. As it would be fairly vanilla SQL -- just CREATE and INSERT statements, I guess -- it ought to be possible to run it against your Oracle schema with a minimum of alteration.
Having said that, in the SQL Developer migration wizard, the second step allows you to select the target schema. If you have a connection setup to scott, why doesn't that work for you?
If the table isn't too large (depending on your system resources, server horsepower, etc.), you could simply rebuild the table in the desired schema with the following.
N.B. You need to be logged in either as the target schema's user (with SELECT permission on the table in the SYSTEM schema) or as SYSTEM:
CREATE TABLE <newschema>.<tablename>
AS
SELECT *
FROM system.<tablename>;
Then remove the original table once the new table has been created.
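For completeness, the cleanup might look like this (same placeholder table name as above; run it only after verifying the copy):
DROP TABLE system.<tablename>;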
If the table is large, you could use Data Pump to export it and import it into the desired schema.
Here is an article on using Data Pump for this purpose:
http://oraclehack.blogspot.com/2010/06/data-pump-moving-tables-to-new-schema.html
Hope this helps.

SQL Azure - copy table between databases

I am trying to run following SQL:
INSERT INTO Suppliers ( [SupplierID], [CompanyName])
Select [SupplierID], [CompanyName] From [AlexDB]..Suppliers
and got an error "reference to database and/or server name in is not supported in this version of sql server"
Any idea how to copy data between databases "inside" the server?
I can load data to client and then back to server, but this is very slow.
I know this is old, but I had another manual solution for a one-off run.
Using SQL Management Studio R2 SP1 to connect to Azure, I right-click the source database and select Generate Scripts.
During the wizard, after I have selected my tables, I choose output to a query window and then click Advanced. About halfway down the properties window
there is an option called "Types of data to script". I select it and change it to "Data only", then I finish the wizard.
All I do then is check the script, rearrange the INSERTs to respect constraints, and change the USE statement at the top so it runs against my target DB.
Then I right-click on the target database, select New Query, copy the script into it, and run it.
Done, data migrated.
Since 2015, this can be done with elastic database queries, also known as cross-database queries.
I created and used this template; it copies 1.5 million rows in 20 minutes:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';
CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
    WITH IDENTITY = '<username>',
    SECRET = '<password>';
CREATE EXTERNAL DATA SOURCE RemoteReferenceData
WITH
(
    TYPE = RDBMS,
    LOCATION = '<server>.database.windows.net',
    DATABASE_NAME = '<db>',
    CREDENTIAL = SQL_Credential
);
CREATE EXTERNAL TABLE [dbo].[source_table] (
    [Id] BIGINT NOT NULL,
    ...
)
WITH
(
    DATA_SOURCE = RemoteReferenceData
);
SELECT *
INTO target_table
FROM source_table;
Unfortunately there is no way to do this in a single query.
The easiest way to accomplish it is to use "Data Sync" to copy the tables. The benefit of this is that it will also work between servers, and keep your tables in sync.
http://azure.microsoft.com/en-us/documentation/articles/sql-database-get-started-sql-data-sync/
In practice, I haven't had that great an experience with "Data Sync" running in production, but it's fine for one-off jobs.
One issue with "Data Sync" is that it creates a large number of "sync" objects in your database, and deleting the actual "Data Sync" from the Azure portal may or may not clean them up. Follow the directions in this article to clean it all up manually:
https://msgooroo.com/GoorooTHINK/Article/15141/Removing-SQL-Azure-Sync-objects-manually/5215
SQL Azure does not support the USE statement and effectively has no cross-database queries, so the above query is bound to fail.
If you want to copy/back up the DB to another SQL Azure DB, you can use "same-server" copying or "cross-server" copying in SQL Azure. Refer to this MSDN article.
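For reference, such a copy is started with a single statement run while connected to the master database; the database and server names below are placeholders:
CREATE DATABASE AlexDB_Copy AS COPY OF AlexDB;
-- Cross-server form: CREATE DATABASE AlexDB_Copy AS COPY OF source_server.AlexDB;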
You could use a tool like SQL Data Compare from Red Gate Software that can move database contents from one place to another and fully supports SQL Azure. 14-day free trial should let you see if it can do what you need.
Full disclosure: I work for Red Gate Software
An old post, but another option is the SQL Azure Migration Wizard.
Use the following steps; there is no straightforward way to do this, but it can be done with a trick (sketched below).
Step 1: Create another table with the same structure as the Suppliers table inside [AlexDB], say SuppliersBackup, and copy the existing Suppliers data into it.
Step 2: Create a table with the same structure as the Suppliers table inside DesiredDB.
Step 3: Enable Data Sync between AlexDB..Suppliers and DesiredDB..Suppliers.
Step 4: Truncate the data from AlexDB..Suppliers.
Step 5: Copy the data from AlexDB..SuppliersBackup back into AlexDB..Suppliers.
Step 6: Now run the sync.
The data is copied to DesiredDB.
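A sketch of the SQL for steps 1, 2, 4, and 5; table names are placeholders, and each statement must be run on a connection to the database noted in its comment:
SELECT * INTO dbo.SuppliersBackup FROM dbo.Suppliers;           -- step 1, connected to AlexDB
-- step 2: run your CREATE TABLE dbo.Suppliers script while connected to DesiredDB
TRUNCATE TABLE dbo.Suppliers;                                   -- step 4, connected to AlexDB
INSERT INTO dbo.Suppliers SELECT * FROM dbo.SuppliersBackup;    -- step 5, connected to AlexDB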
If you have an on-premises version that has sp_addlinkedsrvlogin, you can set up Linked Servers for both the source and target databases and then run your INSERT INTO query.
See "SQL Server Support for Linked Server and Distributed Queries against Windows Azure SQL Database" in this blog: https://azure.microsoft.com/en-us/blog/announcing-updates-to-windows-azure-sql-database/
OK, I think I found the answer: there is no way. You have to move the data to the client or use some other trick. Here is a link to an article with explanations: Limitations of SQL Azure: only one DB per connection.
But any other ideas are welcome!
You can easily add a "Linked Server" from SQL Management Studio and then query on the fully qualified table name. No need for flat files or export tables. This method also works for on-prem to azure database and vice versa.
e.g.
select top 1 ColA, ColB from [AZURE01_<hidden>].<hidden>_UAT_RecoveryTestSrc.dbo.FooTable order by 1 desc
select top 1 ColA, ColB from [AZURE01_<hidden>].<hidden>_UAT_RecoveryTestTgt.dbo.FooTable order by 1 desc
A few options (rather, workarounds):
Generate a script with the data.
Use Data Sync in Azure.
Use MS Access (import and then export), with many exclusions (like no GUIDs in Access).
Use third-party tools like Red Gate.
Unfortunately, there is no easy, built-in way to do this so far.
I would recommend the SSMS SQL Server Import and Export feature. It supports multiple connection configurations and cross-server copying of selected tables. I have tried the .NET SQL Server connector, which works very well for Azure SQL databases.