Export and insert CLOB data containing SQL statements with SQL Developer

I have a table with a CLOB column containing SQL code. Now I want to transfer the content from the development database into the production one. I use the following script to export the content of the dev table:
set long 100000
set lines 1000
spool d:\export.sql
select /*insert*/* from myTable;
spool off
However, the import into the prod table fails because of the ' characters in the SQL code. A generated insert statement looks like this:
insert into myTable (id, name, sql)
values (1, 'John', 'select * /* this is a hint */
from table1
where attr1 = 'hi,you' and attr2 = 'me, too')
How can I insert this CLOB, or how do I export it in a better way?

I'd use Data Pump if it's available.
If not, I'd use SQL*Loader.
What you can do is use SQL Developer to unload your table to a SQL*Loader setup: each CLOB will be written to its own file, and they can then be loaded without the quoting issues you're seeing.
I've written up how to do this with BLOBs; the process would be the same here.
The output will be all the files you need to move your table over to the new system: the control file, the data stream, and all the LOBs.
Once you have your files, you will need to make sure you have an Oracle Client installed, or have the full Instant Client.
This will give you access to SQL*Loader.
It's a command-line utility, no GUI. It works much like SQL*Plus does. You'll want to make sure your Oracle ENV is set up so you can start it up and connect.
But.
Everything you need is in the ZIP that SQLDev put together for you, the biggest piece is the .ctl (control file).
An example from the docs:
sqlldr scott CONTROL=ulcase1.ctl LOG=ulcase1.log
'scott' is the database username; it'll prompt you for a password. You'll substitute the ctl file you got from SQLDev for ulcase1.ctl. The LOG bit is optional, but IMPORTANT.
By the way, this should run FAST.
If you're running this on your pc, your connect string will be more like
sqlldr hr@server:port/service
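For reference, the control file for a CLOB load like this looks roughly like the sketch below. This is a hand-written approximation, not the exact file SQLDev generates; the column names come from the question, and the file names are hypothetical. Each data record carries the name of the file holding its CLOB, and SQL*Loader pulls that file in via LOBFILE:
LOAD DATA
INFILE 'mytable.ldr'
INTO TABLE myTable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  id,
  name,
  sql_fname FILLER CHAR(255),
  sql       LOBFILE(sql_fname) TERMINATED BY EOF
)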


SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I would then run a select * from myspecialstoragetable to produce a .sql file they can ship to me, which we would import into our copy of the db with some job/SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
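A minimal sketch of the commands such a wrapper would shell out to (database, path, and server names here are hypothetical; -c writes character format, -T uses a trusted connection):
bcp CustomerDb.dbo.myspecialstoragetable out d:\export\storage.dat -c -T -S .\SQLEXPRESS
bcp CustomerDb.dbo.myspecialstoragetable in d:\export\storage.dat -c -T -S .\SQLEXPRESS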
Since it is a function within your application, what language is the application front-end written in? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export. In the last step, you can save the steps into a VB/.NET module; if necessary, modify this file to change table names etc. Integrate this DTS module into your application. While doing the sample export, export to a suitable format such as .CSV or Excel, whichever format you will be able to import into a blank database.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data in the desired format. He can then mail such a file to you.
If your application is not written in .NET, whatever language it is written in will have options to read data from SQL Server and dump it to a .CSV or delimited text file. If it is a primitive language, you may have to do it by looping through the records and concatenating the fields of each one before writing to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it into your location. Also, XML is not really suited if the number of records is too large.
You're probably thinking of a .sql file as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of SQL Server's interface contain table structures/DDL rather than INSERT statements for each record's values.

Hive: create table and write it locally at the same time

Is it possible in Hive to create a table and have it saved locally at the same time?
When I get data for my analyses, I usually create temporary tables to track possible mistakes in the queries/scripts. Some of these are just temporary tables, while others contain the data that I actually need for my analyses.
What I usually do is use hive -e "select * from db.table" > filename.tsv to get the data locally; however, when the tables are big this can take quite some time.
I was wondering if there is some way in my script to create the table and save it locally at the same time. Probably this is not possible, but I thought it is worth asking.
Honestly, doing it the way you are is the better of the two options, but it is worth noting you can perform a similar task in an .hql file for automation.
Using syntax like this:
INSERT OVERWRITE LOCAL DIRECTORY '/home/user/temp' select * from table;
You can run a query and store the result somewhere in the local directory (as long as there is enough space and the correct privileges).
A disadvantage to this is that with a pipe you get the data stored nicely as '|'-delimited, newline-separated text, whereas this method stores the values with Hive's default field delimiter, '^A' (\001).
A workaround is to do something like this:
INSERT OVERWRITE LOCAL DIRECTORY '/home/user/temp'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
select books from table;
But this is only available in Hive 0.11 or higher.
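For the automation route mentioned above, you could save the INSERT OVERWRITE statement in an .hql file (file name hypothetical) and run it non-interactively:
hive -f export_temp.hql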

Export records from SQL Server 2005 express edition

I have a little problem. My friend has a database with over 10 tables, and each table has around 90-100 records.
I can't find a way to export the records from his tables (to put in a SQL file something like INSERT INTO .... VALUES ... for each existing record) so I can import them into my database.
How can I do that?
I tried: right click on a table -> Script Table as -> INSERT TO -> File ...
but it only generates the INSERT statement itself, not one per existing record.
Is there a solution? Or is this feature only available in the commercial version?
UPDATE
You can use the bcp utility from the command prompt like this:
For export: bcp ADatabase.dbo.OneTable out d:\test\OneTable.bcp -c -Usa -Ppassword
For import: bcp ADatabase.dbo.OneTable in d:\test\OneTable.bcp -c -Usa -Ppassword
These commands create a BCP file containing the records of the specified table. You can then import the BCP file into another database.
If you use a remote database, then:
bcp ADatabaseRemote.dbo.OneTableRemote out d:\test\OneTableRemote.bcp -Slocalhost\SQLExpress -Usa -Ppassword
Instead of localhost\SQLExpress, you can use localhost or another server name...
Probably the simplest way to do this would be to run a SELECT statement that outputs to a file. Then you can import that data into your database.
For simple moves, I have also done a copy/paste manually. Sometimes it is better to use Excel as a staging platform before pasting into the new database. You may need to create a temporary table in your new database that matches up exactly with the data you are pasting over. For example, I usually don't put a PK on the temp table at first and make the PK field just an INT; that way the copy goes more smoothly.
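A sketch of such a staging table, with hypothetical names (loose NULLable types and no PK, so pasted rows aren't rejected mid-copy):
-- add the real PK and constraints only after the paste succeeds
CREATE TABLE staging_customers (
    customer_id   INT NULL,
    customer_name VARCHAR(255) NULL
);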
In the corporate world, you would use SSIS to move this data around.
A couple of ways you could do this. One: select everything from each table and save the results as a CSV or delimited file (you can do this from SQL Server Management Studio). You can also script the tables as CREATE statements and copy the scripts over to the new database, assuming it is also SQL Server. Then for import, use a load statement. You may have to google the syntax for SQL Server, but I know this works in MySQL and Oracle; I haven't tried it in SQL Server yet.
LOAD DATA INFILE 'myfile'
INTO TABLE stuff
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET id = NULL;
Or if you are going to another SQL Server, use the SQL Server Import and Export Wizard.
http://msdn.microsoft.com/en-us/library/ms141209.aspx

Insert large amount of data efficiently with SQL

Hi, I often have to insert a lot of data into a table. For example, I might have data from an Excel or text file in the form of:
1,a
3,bsdf
4,sdkfj
5,something
129,else
I then construct one insert statement per row and run the SQL script. I found this to be slow when I have to send thousands of small packets to the server, and it also causes extra network overhead.
What's your best way of doing this?
Update: I'm using Oracle 10g.
Use Oracle external tables.
See also e.g.
OraFaq about external tables
What Tom thinks about external tables
René Nyffenegger's notes about external tables
A simple example that should get you started
You need a file located in a server directory (get familiar with directory objects):
SQL> select directory_path from all_directories where directory_name = 'JTEST';
DIRECTORY_PATH
--------------------------------------------------------------------------------
c:\data\jtest
SQL> !cat ~/.gvfs/jtest\ on\ 192.168.xxx.xxx/exttable-1.csv
1,a
3,bsdf
4,sdkfj
5,something
129,else
Create an external table:
create table so13t (
  id   number(4),
  data varchar2(20)
)
organization external (
  type oracle_loader
  default directory jtest /* jtest is an existing directory object */
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('exttable-1.csv') /* the file located in the jtest directory */
)
reject limit unlimited;
Now you can use all the powers of SQL to access the data:
SQL> select * from so13t order by data;
ID DATA
---------- ------------------------------------------------------------
1 a
3 bsdf
129 else
4 sdkfj
5 something
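From there, moving the rows into a regular table (target table name here is hypothetical) is a plain INSERT ... SELECT; the append hint requests a direct-path load:
insert /*+ append */ into my_target_table (id, data)
select id, data from so13t;
commit;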
I'm not sure if this works in Oracle, but in SQL Server you can use the BULK INSERT statement to upload data from a txt or csv file.
BULK INSERT [TableName]
FROM 'c:\FileName.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
Just make sure that the table columns correctly match what's in the txt file. For a more complicated setup you may want to use a format file; see the following:
http://msdn.microsoft.com/en-us/library/ms178129.aspx
There are a lot of ways to speed this up.
1) Do it in a single transaction/batch. This speeds things up by avoiding per-statement commits and round trips.
2) Load directly from a CSV file. If you load the data as a CSV file, the individual SQL statements aren't required at all. In MySQL, the LOAD DATA INFILE operation accomplishes this very intuitively and simply.
3) You can also simply dump the whole file as text into a table called "raw". And then let the database parse the data on its own using triggers. This is a hack, but it will simplify your application code and reduce network usage.
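As a rough sketch of point 1 in Oracle (table and column names here are assumed, not from the question), the sample rows can go in as one multi-row statement followed by a single commit:
INSERT ALL
  INTO t (id, val) VALUES (1, 'a')
  INTO t (id, val) VALUES (3, 'bsdf')
  INTO t (id, val) VALUES (4, 'sdkfj')
  INTO t (id, val) VALUES (5, 'something')
  INTO t (id, val) VALUES (129, 'else')
SELECT * FROM dual;
COMMIT;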

What is the easiest way to add a bunch of content to a SQL database?

Nothing technical here. Suppose I have a lot of different categorized data and I would like to create a database out of it. Would someone literally hand-enter all that info with SQL code itself? Or do some people make a mock website just to input data? What are some of your strategies?
If there is no way to do it automatically, then a mock website would be the way to go: you can even have several people use it at once, actually multiplying the input speed (as long as you assign each of them a different part of the data).
What format is your data in? And how much of it is there? If it's Excel, then SQL Server has tools to import it. I'm not sure if MySQL has anything similar. Even if it doesn't, one other technique I have used with Excel data is a formula that concatenates as required to generate the INSERT statements. Then just paste those into a query window and run them.
I wouldn't do a website for it unless I was building an admin site for it already and wanted to test that with the initial load.
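For the Excel formula approach mentioned above, a sketch (table and column names hypothetical; note it does not escape quotes inside the data):
="INSERT INTO mytable (id, name) VALUES (" & A1 & ", '" & B1 & "');"
Drag the formula down alongside your rows, then paste the generated column into a query window and run it.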
Most databases have a way to do bulk inserts or have tools for data import.
My strategies normally involve such tools.
Here is an example of importing a CSV file to SQL Server.
Most database servers provide a way to import data from a variety of formats, you could look into that first.
If not, you could write a simple script or console application to parse your input data, and write out a SQL script to insert the data into appropriate tables.
For example, if you data was in a CSV file, you would parse each line in the file, and generate an insert statement to write out to a .sql file.
MyData.csv
1,2,3,'Test',4
2,3,4,'Test2',6
GeneratedInsert.sql
insert into table (col1,col2,col3,col4,col5) values (1,2,3,'Test',4)
insert into table (col1,col2,col3,col4,col5) values (2,3,4,'Test2',6)
MySQL has a statement LOAD DATA INFILE that is intended for loading bulk data from flat files. It's easy to use and much faster than alternative methods.
But first you do have to use SQL to design tables with fields that match the fields of your import data. That is, if you have a file with delimiter-separated data:
Titanic;1997;4 stars
Batman Begins;2005;5 stars
"Harry Potter and the Sorcerer's Stone";2001;3 stars
...
You would create a table:
CREATE TABLE Movies (
  title  VARCHAR(100) NOT NULL,
  year   YEAR NOT NULL,
  rating VARCHAR(10)
);
Then load data:
LOAD DATA INFILE 'movies.txt' INTO TABLE Movies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"';
Most web languages have some sort of auto-scaffolding that you can quickly set up. This is useful for admin work as well, if your site is hosted without direct access to the DB.
Otherwise, yeah - write the SQL statements. It's useful to be able to bring a database up as part of your build process.
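For instance, a hypothetical seed script kept in version control and run by the build (table and values made up for illustration):
CREATE TABLE category (id INT PRIMARY KEY, name VARCHAR(50));
INSERT INTO category (id, name) VALUES (1, 'Books');
INSERT INTO category (id, name) VALUES (2, 'Movies');
INSERT INTO category (id, name) VALUES (3, 'Music');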