Inserting a value into a LONG RAW column - sql

I am required to test something with a LONG RAW column in an Oracle DB. For this I need to insert a value of length, say, 4000 into that column. The data can be something simple like "AAAA..." repeated to 4000 characters. I tried inserting a large value using sqlplus but got the following error (perhaps due to length limitations in sqlplus?):
ERROR:
ORA-00972: identifier is too long
Is it possible to insert a large value into the long raw column using sqlplus ?

This is a limitation of sqlplus, where the input cannot be more than 2499 characters. I would suggest that you do the insert as a two-step process:
1. Insert the data into the table column, but keep it to less than 2500 characters.
2. Update the same column, concatenating the rest of the data onto what was already inserted.
Not an ideal scenario, but as far as I can see this is the only way if you want to use sqlplus.
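The two-step workaround above amounts to generating one short INSERT followed by short UPDATEs. A minimal Python sketch, assuming a hypothetical table t_longdata(id, payload) with a plain LONG column (a LONG RAW column would additionally need hex/UTL_RAW handling, so verify the concatenation syntax for your exact column type):

```python
CHUNK = 2400  # stay under the ~2499-character sqlplus input limit

def two_step_statements(value: str, key: int) -> list:
    """Emit one INSERT for the first chunk, then UPDATEs appending the rest.

    Table/column names are hypothetical. Each chunk is escaped separately so
    a doubled quote ('') is never split across two statements.
    """
    chunks = [value[i:i + CHUNK] for i in range(0, len(value), CHUNK)]
    chunks = [c.replace("'", "''") for c in chunks]  # escape SQL quotes
    stmts = [f"INSERT INTO t_longdata (id, payload) VALUES ({key}, '{chunks[0]}');"]
    stmts += [
        f"UPDATE t_longdata SET payload = payload || '{c}' WHERE id = {key};"
        for c in chunks[1:]
    ]
    return stmts

# 4000 'A's -> one INSERT plus one UPDATE, each line short enough for sqlplus
for stmt in two_step_statements("A" * 4000, 1):
    print(len(stmt))
```

Running the generated script in sqlplus keeps every input line under the limit while still ending up with the full 4000-character value in the column.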

In Oracle, "AAA" (double quotes) is an identifier, while 'AAA' (single quotes) is a string value. Hence the ORA-00972 error: a long double-quoted value is parsed as an identifier, not as data.

Related

I want to insert 200k characters of data into a column using SQL Developer. I tried a basic insert but it says the string is too long. How can I insert it?

I am using Oracle SQL Developer and I need to insert 200k characters into one cell. The data type of the column is already set to CLOB. However, when I try to insert the data, it says:
SQL Error: ORA-01704: string literal too long
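No answer is recorded here, but a common workaround (a sketch, not verified against every Oracle version) is to break the literal into pieces under the 4000-character limit and concatenate TO_CLOB() calls; bind variables from a client driver avoid the limit entirely. The table/column names below are hypothetical:

```python
LIMIT = 3000  # keep each string literal well under Oracle's 4000-char cap

def clob_insert(table: str, column: str, key: int, text: str) -> str:
    """Build an INSERT whose CLOB value is a concatenation of short literals."""
    pieces = [text[i:i + LIMIT].replace("'", "''")
              for i in range(0, len(text), LIMIT)]
    literal = " || ".join(f"TO_CLOB('{p}')" for p in pieces)
    return f"INSERT INTO {table} (id, {column}) VALUES ({key}, {literal});"

stmt = clob_insert("t_docs", "body", 1, "X" * 200000)
print(stmt.count("TO_CLOB("))  # 200k chars split into 67 pieces
```

For very large values, a bind variable (e.g. from a client program) is cleaner than any generated literal, since bound values are not subject to the literal-length cap.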

Query remote oracle CLOB data from MSSQL

I read different posts about this problem but it didn't help me with my problem.
I am on a local db (Microsoft SQL Server) and query data on remote db (ORACLE).
In this data, there is a CLOB type.
The CLOB column shows only 7 values correctly; the others show <null>.
I tried to CAST(DEQ_COMMENTAIRE_REFUS_IMPORT AS VARCHAR(4000))
I tried to SUBSTRING(DEQ_COMMENTAIRE_REFUS_IMPORT, 4000, 1)
Can you help me, please ?
Thank you
Not MSSQL, but in my case we were pulling data into MariaDB from Oracle using the ODBC CONNECT engine.
For CLOBs, we did the following (in outline):
1. Create a PL/SQL function get_clob_chunk(clobin CLOB, chunkno NUMBER) RETURN VARCHAR2 that returns the specified nth chunk of 1,000 chars of the CLOB. We found 1,000 worked best with multibyte data; if the data is all plain-text single byte, then chunks of 4,000 are safe.
2. Create an Oracle view which calls the get_clob_chunk function to split the CLOB into 1,000-char chunk columns chunk1, chunk2, ..., chunkn, each CAST as VARCHAR2(1000). We found that Oracle did not like having more than 16 such columns, so we had to split the views into sets of 16 such columns. What this means is that you must check the maximum size of data in the CLOB so you know how many chunks/views you need. Doing this dynamically adds complexity, needless to say.
3. Create a view in MariaDB querying each Oracle view.
4. Create a table/view in MariaDB that joins the chunks up into a single TEXT column.
Apologies for the absence of actual code, as I'm a bit rushed for time.
Note, in our case, we found that copying Text type columns between MariaDB databases using the ODBC Connect engine was also problematic, and required a similar splitting method.
Frankly, I'd rather use Java/C# for this.
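The chunking arithmetic behind get_clob_chunk can be illustrated in Python (a sketch of the logic only; the real helper would be PL/SQL built on DBMS_LOB.SUBSTR):

```python
def get_clob_chunk(text: str, chunkno: int, size: int = 1000) -> str:
    """Return the nth chunk (1-based), mirroring DBMS_LOB.SUBSTR(clob, size, offset)."""
    start = (chunkno - 1) * size  # PL/SQL offsets are 1-based: offset = (n-1)*size + 1
    return text[start:start + size]

data = "x" * 2500
print([len(get_clob_chunk(data, n)) for n in (1, 2, 3)])  # [1000, 1000, 500]

# Reassembling every chunk restores the original value
assert "".join(get_clob_chunk(data, n) for n in (1, 2, 3)) == data
```

The reassembly step at the end is exactly what the final MariaDB view does when it concatenates chunk1 ... chunkn back into one TEXT column.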

INSERT Statement in SQL Server Strips Characters, but using nchar(xxx) works - why?

I have to store some strange characters in my SQL Server DB which are used by an Epson Receipt Printer code page.
Using an INSERT statement, all are stored correctly except one - [SCI] (nchar(154)). I realise that this is a control character that isn't representable in a string, but the character is replaced by a '?' in the stored DB string, suggesting that it is being parsed (unsuccessfully) somewhere.
The collation of the database is LATIN1_GENERAL_CI_AS so it should be able to cope with it.
So, for example, if I run this INSERT:
INSERT INTO Table(col1) VALUES ('abc[SCI]123')
Where [SCI] is the character, a resulting SELECT query will return 'abc?123'.
However, if I use NCHAR(154), by directly inserting or by using a REPLACE command such as:
UPDATE Table SET col1 = REPLACE(col1, '?', NCHAR(154))
The character is stored correctly.
My question is, why? And how can I store it directly from an INSERT statement? The latter is preferable as I am writing from an existing application that produces the INSERT statement that I don't really want to have to change.
Thank you in advance for any information that may be useful.
When you write a literal string in SQL, it is created as a VARCHAR unless you prefix it with N. This means any Unicode characters it contains will be lost (replaced with '?'). Instead, write your INSERT statement like this:
INSERT INTO Table(col1) VALUES (N'abc[SCI]123')
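If the statements are generated by an application, a small helper can emit N'...' literals consistently (a sketch; the helper name is made up, and parameterized queries, e.g. pyodbc's ? placeholders, sidestep literal typing altogether):

```python
def n_literal(s: str) -> str:
    """Wrap a string as an N'...' literal so it is parsed as NVARCHAR, not VARCHAR."""
    return "N'" + s.replace("'", "''") + "'"  # double embedded single quotes

# chr(154) stands in for the [SCI] control character from the question
stmt = f"INSERT INTO Table(col1) VALUES ({n_literal('abc' + chr(154) + '123')})"
print(n_literal("it's"))  # N'it''s'
```

With the N prefix the server never narrows the value through the VARCHAR code page, so the control character survives the round trip.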

Struggling with LOBs

I am struggling to figure out how to search in a LOB. I tried the following but got the error ORA-19011: Character string buffer too small:
select * from gtpintr_data.sagadata sa where SA.DATA like '4780471';
The SQL LIKE operator only works on character datatypes like VARCHAR2. Oracle has to convert the LOB to a string in order to run your query, so if it cannot fit within the maximum size for a string, the query fails.
You could use DBMS_LOB.INSTR in a PL/SQL program instead:
http://docs.oracle.com/database/121/ARPLS/d_lob.htm#ARPLS66715
But that will be slow, as it would need to be called for each row in the table.
A better alternative is to add an Oracle Text index on the column and use the CONTAINS operator.
http://docs.oracle.com/database/121/CCREF/toc.htm
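A sketch of building such a DBMS_LOB.INSTR query programmatically (table/column names taken from the question; DBMS_LOB.INSTR can generally be called straight from SQL as well as from PL/SQL, though performance caveats above still apply):

```python
def lob_search_sql(table: str, column: str, needle: str) -> str:
    """Build a CLOB search query using DBMS_LOB.INSTR instead of LIKE."""
    needle = needle.replace("'", "''")  # escape quotes for the SQL literal
    return f"SELECT * FROM {table} WHERE DBMS_LOB.INSTR({column}, '{needle}') > 0"

print(lob_search_sql("gtpintr_data.sagadata", "data", "4780471"))
```

Unlike LIKE, this never forces the whole LOB through a VARCHAR2 buffer, so ORA-19011 does not arise.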

SQL Server : truncation error with plenty of room in char destination data type

I am attempting to do an insert from a select statement in SQL Server 2008 R2. The destination column's data type is char(7), and I have verified with LEN() and DATALENGTH() that the source column values are no longer than 6.
I am getting truncation error:
Msg 8152, Level 16, State 14, Line 219
String or binary data would be truncated.
I have verified using temp tables that an insert into a char(9) column works, but unfortunately the destination database will not support the data type change.
UPDATE: I was able to do the insert as required by adding a DISTINCT clause to the select statement in question, however the number of rows remains the same. So, I guess the reformatted question is why does adding the distinct clause return no error message even if the data is the same? Thanks!
What character set are you using? Some character sets have characters that take up two bytes, but the LEN() function will still count them as 1.
I agree with Bill and David. But if most of the characters from your source are multi-byte (some even take up 3 bytes), then you may need to use something like varchar(50) just to make sure your field is big enough to avoid truncation errors.
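The character-count versus byte-count mismatch described above is easy to demonstrate (Python here purely for illustration; SQL Server's LEN() similarly counts characters while storage consumes bytes):

```python
def char_and_byte_len(s: str, encoding: str = "utf-8") -> tuple:
    """Character count vs encoded byte count for the same string."""
    return len(s), len(s.encode(encoding))

# 6 characters, but 9 bytes in UTF-8 ('ï' takes 2 bytes, the euro sign takes 3)
print(char_and_byte_len("naïve€"))  # (6, 9)
```

A char(7) column sized by character count can therefore still overflow once multi-byte characters are encoded for storage.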