I have a table with a CLOB column containing SQL code. Now I want to transfer the content from the development database into the production one. I use the following script to export the content of the dev table:
set long 100000
set lines 1000
spool d:\export.sql
select /*insert*/* from myTable;
spool off
However, the import into the prod table fails because of the ' characters in the SQL code. A generated insert statement looks like this:
insert into myTable (id, name, sql)
values (1, 'John', 'select * /* this is a hint */
from table1
where attr1 = 'hi,you' and attr2 = 'me, too')
How can I insert this CLOB, or how do I export it in a better way?
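For context: in Oracle SQL, a literal single quote inside a string literal has to be doubled (''). A minimal sketch of that escaping in Python, reusing the table and column names from the statement above (illustrative only; the export tools discussed below sidestep the problem entirely):

```python
def quote_literal(value: str) -> str:
    """Escape a value for use as an Oracle string literal:
    every embedded single quote must be doubled."""
    return "'" + value.replace("'", "''") + "'"

# Rebuilding the broken generated INSERT with proper escaping:
sql_text = ("select * /* this is a hint */\n"
            "from table1\n"
            "where attr1 = 'hi,you' and attr2 = 'me, too'")
stmt = "insert into myTable (id, name, sql) values (1, {}, {})".format(
    quote_literal("John"), quote_literal(sql_text))
```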
I'd use Data Pump if it's available.
If not, I'd use SQL*Loader.
What you can do is use SQL Developer to unload your table to a SQL*Loader setup: each CLOB will be written to its own file, and they can then be loaded without the quoting issues you're seeing.
I wrote up how to do this with BLOBs here; it would be the same process for CLOBs.
The output will be all the files you need to move your table over to the new system: the control file, the data stream, and all the LOBs.
Once you have your files, you will need to make sure you have an Oracle Client installed, or have the full Instant Client.
This will give you access to SQL*Loader.
It's a command-line utility, no GUI. It works much like SQL*Plus does. You'll want to make sure your Oracle environment is set up so you can start it and connect.
But.
Everything you need is in the ZIP that SQLDev put together for you, the biggest piece is the .ctl (control file).
Docs
sqlldr scott CONTROL=ulcase1.ctl LOG=ulcase1.log
'scott' is the database username; it'll prompt you for a password. You'll substitute the .ctl file you got from SQL Developer for ulcase1.ctl. The log bit is optional, but IMPORTANT.
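For orientation only, a control file for a table like the one in the question might look roughly like this. This is a sketch, not the real thing: the .ldr data file name, the field terminator, and the FILLER/LOBFILE column names are all assumptions, and you should use the .ctl that SQL Developer actually generated for you.

```
LOAD DATA
INFILE 'myTable.ldr'
INTO TABLE myTable
FIELDS TERMINATED BY X'09'
( id,
  name,
  sql_fname FILLER,
  sql LOBFILE(sql_fname) TERMINATED BY EOF
)
```

The key idea is the LOBFILE clause: each CLOB lives in its own file on disk, so quotes and commas inside the SQL text never interfere with the parsing of the data stream.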
By the way, this should run FAST.
If you're running this on your PC, your connect string will be more like
sqlldr hr@server:port/service
Hi, I have a table called Temp
with two columns: Name (varchar) and Image (varbinary(max)).
Insert into Temp
(
Name,
Image
)
Select
'Bob',
(
select BulkColumn from openrowset (Bulk 'http://pngimg.com/upload/apple_PNG2579.png',Single_Blob) as Apple
)
This query errors out saying 'Cannot bulk load because the file "http://pngimg.com/upload/apple_PNG2579.png" could not be opened. Operating system error code 123 (error not found).' However, when I click on the URL it is perfectly valid.
Anyone up for suggestions?
Please note there is a way to store the image on the local server and then access it. However, I am trying to find a way to do it directly within SQL.
Thanks in Advance
http://pngimg.com/upload/apple_PNG2579.png is a URL, not a file path.
OPENROWSET (BULK ...), like BULK INSERT, expects a file name such as c:\image\image.jpg.
Using a URL as the source is not supported.
Try to download the files locally first, then insert into the table.
SSIS, for example, has the functionality to download HTTP content.
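The download-then-insert approach can also be sketched in Python. This is an illustration under assumptions: the `Temp` table is the one from the question, and `cursor` is a DB-API cursor (e.g. from the pyodbc package) connected to the target database.

```python
import urllib.request

def fetch_image(url: str) -> bytes:
    # Download the image bytes over HTTP -- the step that
    # OPENROWSET (BULK ...) cannot do, since it only reads
    # local or UNC file paths.
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def insert_image(cursor, name: str, image: bytes) -> None:
    # A parameterized insert sends the bytes as varbinary(max)
    # directly; no intermediate file on the server is needed.
    cursor.execute(
        "INSERT INTO Temp (Name, Image) VALUES (?, ?)",
        (name, image),
    )
```

The parameterized insert is the important part: the binary payload never passes through SQL text, so there is nothing to escape.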
Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see that KRISHNA KUMAR ASHOKU,AR should be a single field, but BULK INSERT is treating KRISHNA KUMAR ASHOKU and AR as two different fields because of the comma, even though the value is enclosed in " quotes. Still no luck.
I tried
BULK
INSERT tbl
FROM 'd:\1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Is there any solution for this?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be successfully imported. You can do that in many ways, for example by creating the file with a different delimiter (such as TAB), or by importing your table with a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
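As a rough sketch of that delimiter-swap idea in Python (the file names are placeholders): the standard csv module understands quoted fields, so commas inside "..." survive the round trip, and the output can be loaded with FIELDTERMINATOR = '\t'.

```python
import csv

def csv_to_tsv(src_path: str, dst_path: str) -> None:
    # csv.reader parses quoted fields correctly, so a comma inside
    # "..." stays inside its field; we then re-emit the rows with
    # TAB as the delimiter, which BULK INSERT can consume.
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst, delimiter="\t")
        for row in csv.reader(src):
            writer.writerow(row)
```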
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curve ball, since there is a comma inside the quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW= 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FORMAT = 'CSV',
FIRSTROW= 2
);
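If you want to double-check how such a line should split, Python's csv module follows the same RFC 4180 quoting convention that FORMAT = 'CSV' expects; a quick check on the curve-ball line above:

```python
import csv

line = ('jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,'
        '"CVE-2019-10404,CVE-2019-10403",4,'
        'Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:')
fields = next(csv.reader([line]))
# The quoted pair of CVE ids stays together as one field,
# so the row splits into four fields, not five.
```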
Unfortunately, the SQL Server bulk-import methods (BCP and BULK INSERT) do not understand quoting with " ".
Source : http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I have encountered this problem recently and had to switch to tab-delimited format. If you do that and use the SQL Server Management Studio to do the import (Right-click on database, then select Tasks, then Import) tab-delimited works just fine. The bulk insert option with tab-delimited should also work.
I must admit to being very surprised when finding out that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue: you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, if you have SQL Server 2017 or above.
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you will have a lot more control over how you move it from one table to another. So, basically:
1) Load data into staging
2) Clean/Convert by copying to a second staging table defined using the desired datatypes. Good data copied over, bad data left behind
3) Copy data from the "clean" table to the "live" table
I am using SQL Server 2008 R2 on my local PC. My database has one table with approximately 98,000 rows. Now I want to transfer that data directly to the online server database. I have tried generating a script of that table, but when I run the script it gives me an insufficient-memory error. Please help me: how can I do this? Thanks.
There are a variety of strategies you can employ in this instance. Here's a few off the top of my head...
Got some .NET programming up your sleeve? Try the SqlBulkCopy class
Export the data to a transferable format, e.g. CSV file and then use BULK INSERT to insert the data.
Try using OPENROWSET to copy from the local to remote. Stackoverflow example
If you've got the full leverage of SSIS, there's an example of it just here
A bit Heath Robinson, but why not grab the data out into CSV and, with some Excel skills, build the individual statements yourself? Example here using INSERT INTO and UNION
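The build-the-statements-yourself route can also be scripted so that no single statement (and no single script run) has to hold all 98,000 rows at once, which is what was blowing up the generated script. A sketch in Python, where the table and column names are made up for illustration and the row values are assumed to already be formatted as SQL literals; batches are kept under SQL Server's 1000-row limit for a multi-row VALUES list:

```python
def batched_inserts(rows, table, columns, batch_size=500):
    """Yield multi-row INSERT statements, batch_size rows apiece.
    rows is a list of lists of pre-formatted SQL literals."""
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        values = ",\n".join(
            "({})".format(", ".join(r)) for r in chunk)
        yield "INSERT INTO {} ({})\nVALUES\n{};".format(
            table, ", ".join(columns), values)
```

Running each yielded statement separately keeps every batch small enough to execute without memory trouble.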
HTH
I am having a problem generating the 'insert into' script for a table. The table has a column Photo, which is of 'Image' type.
How can I generate the insert script for it?
(I am trying to generate the insert script with the MyGeneration tool, but it fails.)
Is there any way or tool to generate the insert script?
Thanks in advance
You need to convert the image to a byte[] to insert it into the database.
Here is the article Storing and Retrieving Images from SQL Server using Microsoft .NET, which will be helpful for your task.
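If all you need is a script, one workable trick is to emit the image bytes as a 0x... hex literal, which SQL Server accepts for image/varbinary columns. A minimal sketch in Python, where the table and column names are assumptions for illustration:

```python
def image_insert_script(table, name_col, image_col, name, image_bytes):
    # SQL Server accepts binary literals written as 0x<hex>, so the
    # image bytes can be embedded directly in a generated INSERT.
    hex_literal = "0x" + image_bytes.hex()
    return "INSERT INTO {} ({}, {}) VALUES ('{}', {});".format(
        table, name_col, image_col,
        name.replace("'", "''"),  # double quotes in the text column
        hex_literal)
```

Note that for large images this produces very large scripts; a parameterized insert from application code scales better.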