How to get data from multiple databases in one query?

I'm using a Java driver for SQLite 3 and wanted to find out if there is any way to get data from multiple databases (db1.db, db2.db) in one query.
Does any driver for SQLite3 support this at the moment?
Say db1 has 100 rows and db2 has 100 rows; my requirement is to get all 200 rows with a single query across both.

You'll probably want to look at the ATTACH DATABASE command:
http://www.sqlite.org/lang_attach.html
The ATTACH DATABASE statement adds another database file to the current database connection.
Here's a tutorial on how to use it:
http://longweekendmobile.com/2010/05/29/how-to-attach-multiple-sqlite-databases-together/
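As a sketch of how ATTACH works programmatically (shown here with Python's built-in sqlite3 module rather than a Java driver, but the ATTACH statement itself is plain SQL and is identical in any driver; the table name t is made up for illustration):

```python
import sqlite3

# Create two separate database files, each holding 100 rows.
for name, start in (("db1.db", 0), ("db2.db", 100)):
    con = sqlite3.connect(name)
    con.execute("CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY)")
    con.executemany("INSERT OR IGNORE INTO t VALUES (?)",
                    [(i,) for i in range(start, start + 100)])
    con.commit()
    con.close()

# Open db1, attach db2 under the alias "db2", and query both in one statement.
con = sqlite3.connect("db1.db")
con.execute("ATTACH DATABASE 'db2.db' AS db2")
rows = con.execute(
    "SELECT id FROM main.t UNION ALL SELECT id FROM db2.t"
).fetchall()
print(len(rows))  # 200 rows from the two databases combined
con.close()
```

Once attached, tables in the second file are addressed with the schema prefix (db2.t), so joins across the two files work the same way as the UNION ALL above.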

Related

Postgresql dump with data restriction

I'm working on a fast way to clone a database for application testing. My database has some specific tables that are quite big (50GB+), but the vast majority of the tables are only a few MB. On my current server, the dump + restore takes several hours. These big tables have date fields.
With that context in mind, my question is: is it possible to apply some kind of restriction to table rows so that only selected data is dumped? E.g. on table X, only dump the rows whose date is Y.
If this is possible, how can I do it? If it's not possible, what would be a good alternative?
You can use COPY (SELECT whatever FROM yourtable WHERE ...) TO '/some/file' to limit what you export; note that the SELECT has to be wrapped in parentheses.
COPY command
You could use row level security and create a policy that lets the dumping database user see only those rows that you want to dump (make sure that that user is neither a superuser nor owns the tables, because these users are exempt from row level security).
Then dump the database with that user, using the --enable-row-security option of pg_dump.
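A minimal sketch of the COPY approach: generate one filtered export statement per large table and dump the small tables whole with pg_dump. The table and column names below are made up for illustration:

```python
# Build a filtered COPY statement per large table so that only rows
# after a cutoff date are exported; table/column names are placeholders.
def filtered_copy(table, date_column, cutoff, path):
    return (f"COPY (SELECT * FROM {table} "
            f"WHERE {date_column} >= '{cutoff}') "
            f"TO '{path}' WITH (FORMAT csv)")

big_tables = [("events", "created_at"), ("audit_log", "logged_at")]
statements = [filtered_copy(t, c, "2023-01-01", f"/tmp/{t}.csv")
              for t, c in big_tables]
for s in statements:
    print(s)
```

Each generated statement can then be run via psql against the source database; COPY ... TO writes on the server, so use psql's \copy variant if the file should land on the client instead.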

Copy a subset of data from remote database to local SQL Server

I have a remote database that I want to copy on my local SQL Server.
IMPORTANT: I only want sample data (1k rows, for instance) from each table, and there are about 130 different tables.
I have tried the export data wizard in SSMS. Put simply, I go to TASKS > EXPORT DATA > CHOOSE SOURCE (the remote db) > CHOOSE DESTINATION (my local db) > CHOOSE THE TABLES TO COPY > COPY
What I have tried:
I've tried writing the SQL query in this tool, like:
SELECT TOP 1000 *
FROM TABLE1
GO
...
SELECT TOP 1000 *
FROM TABLE130
But at the mapping step, it puts every result into a single table instead of creating 130 different output tables.
FYI, the above procedure takes about 2 minutes per table. Doing it one by one for each table would take 130 × 2 minutes, well over 4 hours... plus it is so boring.
Do you have any ideas for resolving this situation?
Thank you
regards
If you only want a subset, you are going to have problems with foreign keys, if there are any in your database.
Possible approaches to extract all data or a subset
Use SSIS to extract the data from the source db and load into your local db
Write a custom application that does the job. (You can use SQL Bulk Copy)
If you purely want to do it in SSMS, you can create a linked server on your local server that points to the remote server.
That way, if the tables do not yet exist on your local server, you can do something like this:
SELECT TOP 1000 *
INTO [dbo].[Table1]
FROM [YourLinkedServer].[RemoteDatabase].[dbo].[Table1]
changing the INTO table and FROM table for each table you want to copy (note the four-part name: linked server, database, schema, table).
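Since repeating that by hand for 130 tables is exactly the tedium being complained about, a small script can generate the whole batch of statements to paste into SSMS at once. This is a sketch; the server, database, and table names below are placeholders:

```python
# Generate one SELECT ... INTO statement per table so the whole batch
# can be pasted into SSMS in one go; all names here are placeholders.
def copy_statement(table, linked_server="MyLinkedServer",
                   remote_db="RemoteDb", schema="dbo", top=1000):
    return (f"SELECT TOP ({top}) * INTO [{schema}].[{table}] "
            f"FROM [{linked_server}].[{remote_db}].[{schema}].[{table}];")

tables = [f"Table{i}" for i in range(1, 131)]  # the ~130 tables to sample
script = "\n".join(copy_statement(t) for t in tables)
print(script.splitlines()[0])
```

In practice you would pull the real table list from INFORMATION_SCHEMA.TABLES on the remote server rather than hard-coding it.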

How to transfer one table data to another table data with different environment server in SQL Server

I have 2 different databases (A and B) in SQL Server, on different environment servers.
Database A has around 100 tables, of which I want to synchronize 15 from database A to database B.
So if any add / delete / update happens on those 15 tables, those entries should be updated in database B at the same time.
In the same way, I need to keep track of updates and deletes.
Can anybody suggest what the best solution would be?

Is there a MAX size for a table in MSSQL

I have 150 million records with 300 columns (nchar). I am running a script to import the data into the database, but it always stops when it gets to 10 million.
Is there a SQL Server setting that controls how many records a table can hold? What could be making it stop at 10 million?
Edit:
I have run the script multiple times and it has been able to create multiple tables, but they all max out at the same 10 million records.
It depends on available storage: SQL Server does not impose a fixed limit on the number of rows in a table, so the more available storage you have, the more rows you can have.

Copy (Import) Data from Oracle Database To Another

I want to copy data from one Oracle database to another.
I have looked at the Import/Export utilities, but the problem is that the import utility doesn't support conflict-resolution techniques between rows.
For example, if a table in the source database has a row with the same key as a row in the destination database, and I use the IGNORE parameter with value = y, the destination table will end up with duplicate rows.
I want to ask if there's another way to import data from one Oracle database to another, with some mechanism for detecting conflicts and resolving them.
You might want to consider using a database link from database A to database B. You can then query the data from database B and insert it into your database A tables. You are free to query whatever you want using SQL or PL/SQL.
More on database links:
http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5005.htm
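For the conflict-resolution part, the usual idiom over a database link is a MERGE: matched rows are updated, unmatched rows are inserted, so no duplicates are created. Here is a sketch that builds such a statement; the table, key, column, and link names are all placeholders:

```python
# Build an Oracle MERGE statement that pulls rows over a database link,
# updating rows whose key already exists and inserting the rest.
# All names here are placeholders for illustration.
def merge_over_link(table, key, columns, link="remote_link"):
    non_keys = [c for c in columns if c != key]
    update = ", ".join(f"t.{c} = s.{c}" for c in non_keys)
    cols = ", ".join(columns)
    vals = ", ".join(f"s.{c}" for c in columns)
    return (f"MERGE INTO {table} t "
            f"USING {table}@{link} s ON (t.{key} = s.{key}) "
            f"WHEN MATCHED THEN UPDATE SET {update} "
            f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})")

stmt = merge_over_link("employees", "emp_id", ["emp_id", "name", "salary"])
print(stmt)
```

Running the generated statement on the destination database gives update-on-conflict semantics in a single pass; other resolution policies (e.g. keep the destination row) just change the WHEN MATCHED clause.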