I have a number of .db files that I'd like to merge into one.
Each database has four tables.
'Associated'
'Candidate'
'Picks'
'Picks_modified'
The tables might be empty in some of the files. I would like to merge these files using Python 2.7.
Thanks in advance for your help,
Antoine
This is a general SQLite solution, which should work for you from Python as well. Assuming you had three databases, and you wanted the contents of the tables in the second and third databases appended to the tables of the first, you could try this:
ATTACH 'database1.db' AS db1;
ATTACH 'database2.db' AS db2;
ATTACH 'database3.db' AS db3;
INSERT INTO db1.Associated
SELECT * FROM db2.Associated
UNION ALL
SELECT * FROM db3.Associated
UNION ALL
...
Repeat the above for the other three tables in your databases (Candidate, Picks, Picks_modified).
In other words, we insert the records from the other databases into the tables of the first database. If you wanted to aggregate everything in a different, perhaps new, database, you could easily modify the above code to handle that.
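Since the question asks for Python 2.7, the ATTACH/INSERT approach above can be driven from the standard sqlite3 module. Here is a minimal sketch; the one-column schema, the file names, and the helper functions are purely illustrative assumptions for the demo, not the real Mnemosyne-style schema:

```python
import os
import sqlite3
import tempfile

TABLES = ['Associated', 'Candidate', 'Picks', 'Picks_modified']

def create_db(path, rows):
    # Demo helper: a one-column stand-in for the real four-table schema.
    conn = sqlite3.connect(path)
    for table in TABLES:
        conn.execute("CREATE TABLE %s (val TEXT)" % table)
    conn.executemany("INSERT INTO Associated VALUES (?)", [(r,) for r in rows])
    conn.commit()
    conn.close()

def merge_dbs(target_path, source_paths):
    # Append every row of every table from each source file to the target.
    conn = sqlite3.connect(target_path)
    cur = conn.cursor()
    for i, path in enumerate(source_paths):
        alias = 'src%d' % i
        cur.execute("ATTACH DATABASE ? AS " + alias, (path,))
        for table in TABLES:
            # Empty source tables simply contribute no rows.
            cur.execute("INSERT INTO %s SELECT * FROM %s.%s"
                        % (table, alias, table))
        conn.commit()
        cur.execute("DETACH DATABASE " + alias)
    conn.close()

# Throwaway demo files: merge db2 and db3 into db1.
workdir = tempfile.mkdtemp()
paths = [os.path.join(workdir, n) for n in ('db1.db', 'db2.db', 'db3.db')]
create_db(paths[0], ['a'])
create_db(paths[1], ['b', 'c'])
create_db(paths[2], [])               # an entirely empty source is fine
merge_dbs(paths[0], paths[1:])

conn = sqlite3.connect(paths[0])
merged = conn.execute("SELECT count(*) FROM Associated").fetchone()[0]
conn.close()
```

Note this only works as-is when the tables have no conflicting primary keys across files; if ids must be reassigned, the rows have to be remapped while copying.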
Related
I have a very large data set in GPDB from which I need to extract close to 3.5 million records. I use this for a flat file which is then used to load into different tables. I use Talend, and do a select * from table using the tGreenplumInput component and feed that to a tFileOutputDelimited. However, due to the very large volume of the file, I run out of memory while executing it on the Talend server.
I lack super user permissions and am unable to do a \copy to output it to a csv file. I think something like a do-while loop or a tLoop with a more limited number of rows might work for me, but my table doesn't have any row_id or uid to distinguish the rows.
Please help me with suggestions how to solve this. Appreciate any ideas. Thanks!
If your requirement is to load data from one table into several other tables, then you do not need to go through a file at all.
There is a component named tGreenplumRow which allows you to write SQL queries (both DDL and DML) directly in it.
Below is a sample job.
If you notice, there are three insert statements inside this component; they will be executed one by one, separated by semicolons.
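If writing to a flat file really is required, the limited-row loop idea from the question can also be done in plain DB-API code with cursor.fetchmany, which streams the result in fixed-size chunks instead of materializing all 3.5 million rows at once, and needs no row_id. A sketch using SQLite as a stand-in for Greenplum (with a real Greenplum you would connect through a DB-API driver instead; the table and column names are made up):

```python
import os
import sqlite3
import tempfile

CHUNK = 1000  # rows fetched per round trip; tune to available memory

# Demo stand-in for the large source table.
conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE big (a TEXT, b INTEGER)")
conn.executemany("INSERT INTO big VALUES (?, ?)",
                 [('row%d' % i, i) for i in range(2500)])
conn.commit()

out_path = os.path.join(tempfile.mkdtemp(), 'export.txt')
rows_written = 0
with open(out_path, 'w') as out:
    cur = conn.cursor()
    cur.execute("SELECT a, b FROM big")
    while True:
        chunk = cur.fetchmany(CHUNK)   # only CHUNK rows held in memory
        if not chunk:
            break
        for a, b in chunk:
            out.write('%s|%d\n' % (a, b))
        rows_written += len(chunk)
```

Whether fetchmany actually limits server-side memory depends on the driver (some buffer the whole result client-side unless you use a server-side cursor), so check the driver's documentation.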
In SQL Server, I have two tables with different schema. The tables are in two separate databases on two servers.
How can I copy the content from one table to the other? I have several million rows to move.
Does this query work for tables in different servers?
INSERT INTO table2 (column_name(s))
SELECT column_name(s)
FROM table1;
Thanks
UPDATE:
I realized I'm inserting the data into a non-SQL Server database.
Here's the scenario:
I have a table in SQL Server Database and I want to move the data to an IBM Database.
There are several million rows in my SQL Server table.
I need to unload the data and store it as a flat file. The format can be .csv or .txt.
The "Task->Export" function does not work.
Millions of records transferred across the network will ultimately time out due to latency and the maximum connection time.
So is there any other way to do this?
Thanks
Yes, performing this in a query would be the most efficient way of doing this...
You just have to qualify the name:
myDatabase.dbo.myTable
(assuming dbo is the right schema)
Or do you mean cross server/instance transactions? Then you need to set up a Linked Server and again, fully qualify the name:
myLinkedServer.myDatabase.dbo.myTable
UPDATE: here is where the requirement comes from.
My friend uses Mnemosyne (http://mnemosyne-proj.org/), a Python program that uses SQLite as its database. The issue is that the mobile version works with only one database file, and my friend already has several. So he asked me whether I could merge two databases.
So! I have two SQLite db files with the same schema but different data.
Is there an automated way to merge the data from one file into the other? I just need to insert additional rows into the dictionary tables and correctly insert rows from the other tables based on the new ids.
Unfortunately, there are no foreign keys defined, so I will probably first need to specify the column/table relationships myself. But in general, once the relationship issue is solved, is it possible to merge the DBs?
You can open the database you want to merge into, then attach the other database.
ATTACH DATABASE "foo.database" AS foo;
Then you can access the other database's tables by prefixing them with the database's name and a dot:
INSERT INTO bar (baz) SELECT baz FROM foo.bar;
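When rows cannot simply be appended because primary keys must be reassigned (the Mnemosyne case), the usual pattern is: insert each parent row, record the newly assigned id, then rewrite the references while copying the child rows. A sketch against a made-up two-table schema with no declared foreign keys, as in the question; all table and column names here are hypothetical:

```python
import sqlite3

def make_db():
    # Hypothetical schema: 'word' is a dictionary table, and 'usage' rows
    # point at word.id without a declared foreign key.
    conn = sqlite3.connect(':memory:')
    conn.execute("CREATE TABLE word (id INTEGER PRIMARY KEY, text TEXT)")
    conn.execute("CREATE TABLE usage "
                 "(id INTEGER PRIMARY KEY, word_id INTEGER, note TEXT)")
    return conn

dst = make_db()
src = make_db()
dst.execute("INSERT INTO word (text) VALUES ('alpha')")
src.execute("INSERT INTO word (text) VALUES ('beta')")
src.execute("INSERT INTO usage (word_id, note) VALUES (1, 'from src')")

# Copy parent rows, recording how each old id maps to its new id.
id_map = {}
for old_id, text in src.execute("SELECT id, text FROM word"):
    cur = dst.execute("INSERT INTO word (text) VALUES (?)", (text,))
    id_map[old_id] = cur.lastrowid

# Copy child rows, rewriting each reference through the map.
for word_id, note in src.execute("SELECT word_id, note FROM usage"):
    dst.execute("INSERT INTO usage (word_id, note) VALUES (?, ?)",
                (id_map[word_id], note))
dst.commit()
```

The manual relationship list the question mentions corresponds to knowing, for each child table, which columns must be rewritten through which parent's id map.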
You could try this:
sqlite3 bar.db ".dump t1" | grep -v "^CREATE" | sqlite3 foo.db
That will put the contents of table t1 from bar.db into table t1 in foo.db.
I need to unload around 5-6 million rows into a file from a Sybase ASE database table. What is the best way to do that: bcp'ing out, or select * from ... and storing the output to a file?
The table has some indexes on it. The database server is on a different machine from the one where the file needs to be created.
Any ideas on how it can be made faster?
The BCP utility is designed for exactly that purpose. It should be faster than any select *, particularly if you use native mode rather than character mode.
I have to copy tables with their data from one database to another using a query. I know how to copy tables with data within a database, but I was not sure how to do the same when copying between two databases.
I have to copy a huge number of tables, so I need a fast, query-based method...
Anybody please help out...Thanks in advance...
You can copy the tables the same way you would within one database, with SELECT INTO, but use fully qualified table names (database.schema.object_name) instead, like so:
USE TheOtherDB;
SELECT *
INTO NewTable
FROM TheFirstDB.Schemaname.OldTable
This will create a new table NewTable in the database TheOtherDB from the table OldTable, which belongs to the database TheFirstDB.
Right click on the database, select tasks and click on Generate Scripts.
In the resultant pop-up, choose options as required (click advanced), to drop and create table, drop if exists, etc.
Scroll down and choose "Schema and Data" or "Data Only" ("Types of data to script" in 2008 R2) as required.
Save to file and execute on the destination DB.
Advantages:
- Can be executed against the destination DB, even if it is on another server / instance
- Quickly scripts multiple tables, with data as needed
Warning: it might take quite a while to script if the tables contain a large amount of data.
Rajan
INSERT INTO DB2.dbo.MyOtherTable (Col0, Col1)
SELECT Col0, Col1 FROM DB1.dbo.MyTable
Both tables' columns must have the same data types.
The SQL query below will copy a SQL Server table's schema and data from one database to another. You can always change the table name (SampleTable) in your destination database.
SELECT * INTO DestinationDB.dbo.SampleTable FROM SourceDB.dbo.SampleTable