Oracle SQL dump file: extracting parts to SQL/another dump file - sql

I have an Oracle DB dump file, and I now only need parts of the tables that are included there. Does anyone know how I can extract these parts into a separate dump file (or SQL)?
I thought about using the import statement: import from the dump file (full export) to a dump file (needed parts), something like this, but I don't know if it's possible this way:
import user/pw directory=fullexport_dump dumpfile=part.dmp logfile=import.log status=30

No, it's not possible. You can only limit rows while exporting, using the QUERY parameter:
exp ..... query="where id=10"
You may search further in the Oracle Documentation.
So, import the whole table, and create a new table with only the required parts:
create table NEEDEDPARTS as select * from FULLEXPORT where id=10
Or, import the whole table and re-export it with the QUERY parameter.
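As a rough sketch, such a re-export might look like the following (the table and file names are assumptions taken from the examples above, and the quoting of the QUERY value depends on your shell; putting the parameters in a parameter file via parfile avoids the escaping issues):
exp user/pw tables=FULLEXPORT file=part.dmp log=part.log query=\"where id=10\"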

Related

importing excel table into database

I have the following table in xlsx format which I would like to import into my MySQL database:
The table is pretty complicated and I only want the records after '1)HEADING'
I have been looking at PHP libraries to import into SQL, but they only seem to be for simple Excel files.
You have two ways to do this:
First method:
1) Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
2) Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
3) Look halfway down the page; it gives a good example for tab-separated data (a complete statement is sketched after this list):
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
4) Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
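Putting steps 2 and 3 together, a minimal LOAD DATA statement might look like this (a sketch only; the file path and table name are placeholders, and the IGNORE 1 LINES clause assumes the first row is a header):
LOAD DATA LOCAL INFILE '/path/to/export.tsv'
INTO TABLE my_table
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;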
Second method:
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.

Skip Columns During Teradata Table Import From CSV Using SQL Assistant

I have a CSV file with data I need to import to a Teradata table, but it has a useless column that I would like to exclude from the import. The useless column is the first column, so the CSV rows are set up like:
'UselessData','Data','Data','Data'
Typically, I would import using SQL Assistant by choosing File -> Import Data from the menu and using the basic query:
INSERT INTO TableName VALUES (?,?,?,?)
But this will also pull the useless data into column 1. Is there a way to make an import take only certain columns, or to send the useless column to NULL?
AFAIK you can't do that with SQL Assistant.
Possible workarounds:
Switch to Teradata Studio or TPT for loading (it will also load faster)
Load all columns into a Volatile Table first (and don't forget to increase the Maximum Batch size for simple Imports in Tools -> Options -> Import) and then Insert/Select into the target, as sketched below.
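A rough sketch of the volatile-table approach (the column names and types are assumptions; the staging table's first column matches the useless CSV column):
-- Staging table mirroring the CSV layout, including the useless first column
CREATE VOLATILE TABLE stage_csv
( useless_col VARCHAR(100)
, col1 VARCHAR(100)
, col2 VARCHAR(100)
, col3 VARCHAR(100)
) ON COMMIT PRESERVE ROWS;

-- Point the SQL Assistant import at the staging table:
INSERT INTO stage_csv VALUES (?,?,?,?);

-- Then copy only the wanted columns into the real target:
INSERT INTO TableName SELECT col1, col2, col3 FROM stage_csv;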

Import dump with SQLFILE parameter not returning the data inside the table

I am trying to turn the dump file into a .sql file using the SQLFILE parameter.
I used the command "impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp SQLFILE=sample.sql LOGFILE=sample.log"
I expected this to produce a SQL file with the contents of the tables, but it created a SQL file with only DDL statements.
For the export I used "expdp username/password DIRECTORY=dir DUMPFILE=sample.dmp LOGFILE=sample.log FULL=y".
The dump file size is 130 GB, so I believe the export ran correctly.
Am I missing something in the import command? Is there any other parameter I should use to get the contents?
Thanks in advance!
Your expectation was wrong, I'm afraid. You're asking it to do something it isn't designed for.
The documentation for SQLFILE says:
Purpose
Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
So it will only ever contain DDL.
There isn't a mechanism to turn a .dmp file into a .sql file containing insert statements. If you need to put the data into a table, just use the native import.
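A minimal native import of the data, as a sketch (the schema and table names are assumptions; CONTENT=DATA_ONLY assumes the table already exists, and can be dropped to have Import create it as well):
impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp TABLES=SCOTT.EMP CONTENT=DATA_ONLY LOGFILE=imp.log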
Individual insert statements - if you could generate them, which SQL Developer can do as a separate task unrelated to your Data Pump export - would be slower, would have problems with LOBs, and would have to be run in the right order unless integrity constraints were disabled. Data Pump takes care of all of that for you.

h2 - How to dump selective rows by WHERE clause; SCRIPT dumps the entire database

I need to be able to EXPORT specific rows from selected tables for subsequent IMPORT on another machine. Is there a way to do that via H2's SQL grammar (something like "SCRIPT FROM table WHERE column = value"), or do I need to write custom code to do this?
H2 supports the SCRIPT command to export a SQL script, and it supports CSVWRITE to export a CSV file.
The SCRIPT command does support a table name, but not a condition. What you could do is create a temporary table, and export it.
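As a sketch, that might look like the following (the table, column, and file names are placeholders):
CREATE TABLE EXPORT_TMP AS SELECT * FROM MY_TABLE WHERE MY_COLUMN = 'value';
SCRIPT TO 'export.sql' TABLE EXPORT_TMP;
DROP TABLE EXPORT_TMP;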
Or you could use CSVWRITE to export a CSV file.
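The CSVWRITE route might look like this (again only a sketch with placeholder names; note the doubled single quotes inside the query string):
CALL CSVWRITE('export.csv', 'SELECT * FROM MY_TABLE WHERE MY_COLUMN = ''value''');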
Or, of course, you could create your own user defined function.

How to import pipe delimited text file data to SQLServer table

I have a database table represented as a text file in the following pattern:
0|ALGERIA|0| haggle. carefully f|
1|ARGENTINA|1|al foxes promise|
2|BRAZIL|1|y alongside of the pendal |
3|CANADA|1|eas hang ironic, silent packages. |
I need to import this data into a SQL Server 2008 database table. I have created the table with the types matching the schema.
How do I import this data into the table?
EDIT: Solved by following the answer selected.
Note to anyone stumbling upon this in future: The datatype needs to be converted.
Refer: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/94399ff2-616c-44d5-972d-ca8623c8014e
You could use the Import Data feature by right-clicking the database, then clicking Tasks, then Import Data. This gives you a wizard in which you can specify the delimiters etc. for your file and preview the output before you've inserted any data.
If you have a large amount of data, you can use bcp to bulk import from a file: http://msdn.microsoft.com/en-us/library/ms162802.aspx
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables... Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
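For the pipe-delimited file above, a bcp invocation might look like the following (a sketch only; the database, table, file, and server names are assumptions, -T uses Windows authentication, and the row terminator is set to "|\n" because each line ends with a trailing pipe):
bcp MyDb.dbo.Nation in nation.txt -c -t "|" -r "|\n" -S MYSERVER -T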