Liquibase error: string literal too long (more than 4000 characters) - liquibase

I have a question regarding Liquibase. I am getting an error while inserting CLOB-type data: "string literal is too long". I have specified the column as follows:
column name="help_item_text" type="clob" value="String too long more than 4000 characters"
But no luck, still the same error.

Normally Liquibase uses standard SQL statements so that there is no difference between updateSql and update modes. For CLOB fields, that can run into problems when the total SQL length gets to be longer than the database's SQL parser can handle.
There is a valueClobFile attribute on column that allows you to save the long value to a file and then reference it from the changelog file. This gets the large value out of your changelog file and also tells Liquibase it needs to use a prepared statement.
<column name="help_item_text" type="clob" valueClobFile="help_item_text.txt"/>
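A minimal changeSet using this pattern might look like the following sketch; the file name `help_item_text.txt` and the changeSet id/author are illustrative placeholders, and the path is resolved relative to the changelog file:

```xml
<changeSet id="1" author="example">
    <insert tableName="help_item">
        <!-- the CLOB content lives in a separate file next to the changelog -->
        <column name="help_item_text" type="clob" valueClobFile="help_item_text.txt"/>
    </insert>
</changeSet>
```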

Related

ORA-01704: string literal too long when migrating database

I migrated some data from a database and got it into a data.xml file.
Example:
<insert tableName="ORG_CUSTOMER">
<column name="My-Column" value="a very long string.."/>
</insert>
I then execute it with a cmd command
call ..\liquibase\liquibase-bin\liquibase.bat --changeLogFile=%DATA% --defaultSchemaName=NAME_DBA --defaultsFile=liquibase-local.properties update
And because the value is sometimes longer than 4000 characters, I get the error:
Unexpected error running Liquibase: ORA-01704: string literal too long
The column is of type CLOB. The autogenerated XML file contains a lot of inserts, and I would prefer not to edit it manually. I read something about making it a prepared statement or using PL/SQL, but I don't know how to do that.
Oracle - by default - has a limit of 4000 bytes for string literals.
If you are on 12.1 or newer you could raise that limit to 32k (the MAX_STRING_SIZE = EXTENDED initialization parameter).
If you are using an older version or can't change the parameter, you can't use Liquibase's <insert> change.
You could try the <loadData> change and hope it uses a PreparedStatement, in which case the limit doesn't apply:
http://www.liquibase.org/documentation/changes/load_data.html
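A sketch of what that could look like; the changeSet metadata and CSV path are placeholders, and whether the value actually goes through a PreparedStatement depends on the Liquibase version:

```xml
<changeSet id="load-org-customer" author="example">
    <loadData tableName="ORG_CUSTOMER" file="data/org_customer.csv">
        <!-- STRING tells Liquibase to treat the CSV cell as character data -->
        <column name="My-Column" type="STRING"/>
    </loadData>
</changeSet>
```

Here data/org_customer.csv would have a My-Column header and the long value in the data row.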
Without the code of liquibase.bat it is difficult to say, but:
From the documentation on literals:
Text literals have properties of both the CHAR and VARCHAR2 datatypes:
Within expressions and conditions, Oracle treats text literals as though they have the datatype CHAR by comparing them using blank-padded comparison semantics.
A text literal can have a maximum length of 4000 bytes.
I would guess that the script is naively converting the XML from:
<insert tableName="ORG_CUSTOMER">
<column name="My-Column" value="a very long string.."/>
</insert>
To
INSERT INTO {tablename} ( "{column.name}" ) VALUES ( '{column.value}' );
and this will fail when the text literal has more than 4000 bytes.
To fix this you will need to edit the script to handle CLOB values with more than 4000 bytes (where you cannot use a text literal).
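One common PL/SQL workaround (a sketch; the table and column names are taken from the example above, and the chunk contents are placeholders): inside PL/SQL, each individual string literal may be up to 32,767 bytes, and chunks can be appended to a CLOB variable, so the 4000-byte SQL literal limit does not apply:

```sql
-- Build the long value in a CLOB variable from chunks, each chunk
-- staying under the PL/SQL literal limit, then insert via the variable.
DECLARE
  v_text CLOB;
BEGIN
  v_text := 'first chunk of the long value ...';
  v_text := v_text || 'second chunk of the long value ...';
  INSERT INTO ORG_CUSTOMER ("My-Column") VALUES (v_text);
  COMMIT;
END;
/
```

In a Liquibase changelog, a block like this could be wrapped in an <sql splitStatements="false"> change (dropping the trailing slash) so it is sent to the database as a single statement.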

Can't convert String to Numeric/Decimal in SSIS

I have five or six OLE DB Sources with a String[DT_STR], with a length of 500 and 1252 (Latin) as Code Page.
The format of the column is like 0,08 or 0,10 etc etc. As you can see, it is separated with a comma.
All of them are equal except one. In this one source, I have a POINT as separator. That one works when I set the Data Type in the advanced editor of the OLE DB Source. Another (comma-separated) one also works if I set the Data Type in the advanced editor of the OLE DB Source. BUT the weird thing is that it isn't working with the other sources, although they are the same (separated with a comma).
I tested Numeric(18,2) and decimal(2).
Another try to solve the problem with the conversion task and/or the derived column task, failed.
I'm using SQL Server 2008 R2
Slowly, I think SSIS is fooling me :)
Has anyone an idea?
/// EDIT
Here are two screenshots:
Is working:
click
Isn't working:
click
I would not set the Data Type in the Advanced Editor of the OLE DB Source. I would convert the data in the SQL code of the OLE DB Source, or in a Script Transformation, e.g. using Decimal.TryParse, which would populate a new column.
SSIS is unbelievably fussy over data types, and trying to mess with its internals is not productive.
Check whether there are any spaces in between the commas; SSIS may be throwing an error trying to convert a blank space to a number. A blank space is not the same as an empty string.
Redirect error rows and output the data to a file. Then you can examine the data being rejected by SSIS and determine why it's causing the error.
Reasons for the error
1) NULLs are not properly handled, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL data but the destination does not accept NULLs, leading to the above error.
2) Data types between source and destination do not match. For example, the source column has varchar data and the destination column has an int data type. This can easily generate the above error. Certain data types will automatically convert to another data type without error, but incompatible data types will generate the "The value could not be converted because of a potential loss of data." error.
The issue arises when there is an unhandled space or NULL. I worked around it using the conditional (ternary) operator, which checks the length:
LEN(TRIM([Column Name])) >= 1 ? (DT_NUMERIC,38,8)[Column Name] : 0
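Since the comma-separated sources were the ones failing, a variation of the expression above that also normalizes the decimal separator might help. This is a sketch: the column name is a placeholder, and LTRIM/RTRIM stand in for TRIM, which older SSIS expression versions (such as 2008 R2) lack:

```
LEN(LTRIM(RTRIM([Column Name]))) >= 1
  ? (DT_NUMERIC, 18, 2) REPLACE(LTRIM(RTRIM([Column Name])), ",", ".")
  : 0
```

This would go in a Derived Column transformation, replacing the comma with a point before the numeric cast.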

VB.NET and SQL INSERT INTO; data is truncated

I use VB.NET Studio Express 2012 to read a filestream into SQL Server Express. The database and table are created fine, most records load without error using .ExecuteNonQuery INSERT INTO, but some records I get the error:
String or binary data would be truncated.
Originally this was correct, because the column was only 20 characters and the data was between 22-25 on the failing records. I have changed the table so the column now is 30 char, but the error is still the same. I dropped the database and recreated it, but still the same problem.
Does VB keep info on field length somewhere?
Maybe some spaces are present before or after your string; you can use the Trim() function and then try the insert. Trim() removes extra spaces placed before and after your string.

Running into string constant size limits in insert statements with Oracle

I'm trying to copy a row from our production DB to my own little personal Oracle Express DB to repro a bug, since I can't really step into the code on production. Unfortunately, this row involves a column that serializes some sort of data structure into a blob column type, laughing in the face of the normalization gods. Here's the INSERT:
INSERT INTO TPM_VIEWS VALUES(
5,
'Test Repro View',
665,
1,
'0001000000ffffffff01000000000000000c020000003a44414c2c205... //About 7600 characters
);
I've tried running this in Aqua Data Studio 10 and I get:
ORA-01704: string literal too long
Next I tried pasting it into SQL*Plus, which gives me:
SP2-0027: Input is too long (> 2499 characters) - line ignored
Lastly, I tried pasting the whole thing into foo.sql and ran #foo.sql which gives me:
SQL> #c:\foo.sql
Input truncated to 7499 characters
SP2-0027: Input is too long (> 2499 characters) - line ignored
ERROR:
ORA-01756: quoted string not properly terminated
What's the super secret Oracle expert way to do this? And no, I don't have access to the Oracle server itself so I can't run any command line backup or export utilities. Thanks!
UPDATE:
I also tried splitting apart the string by sprinkling some ' || ''s around randomly, which gives me the error:
ORA-01489: result of string concatenation is too long
Since you can access the production database, the simplest solution is probably to create a database link from your local XE database to the production database.
CREATE DATABASE LINK link_to_prod
  CONNECT TO <<your user name in prod>>
  IDENTIFIED BY <<your password in prod>>
  USING '<<TNS alias for prod database>>';
Then, you can copy the data from prod to your local database
INSERT INTO tpm_views
SELECT <<list of columns including BLOB>>
  FROM tpm_views@link_to_prod
 WHERE <<some key>> = 5;
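If a database link is not an option, a PL/SQL sketch can assemble the BLOB in chunks instead (the column layout is taken from the INSERT in the question, and the hex chunks are placeholders for the real value): each PL/SQL literal may be up to 32,767 bytes, so the 4000-byte SQL literal limit does not apply.

```sql
-- Build the BLOB by appending RAW chunks, then insert the variable.
DECLARE
  v_blob BLOB;
  v_raw  RAW(2000);
BEGIN
  DBMS_LOB.CREATETEMPORARY(v_blob, TRUE);
  v_raw := HEXTORAW('0001000000FFFFFFFF...');  -- chunk 1 (placeholder)
  DBMS_LOB.WRITEAPPEND(v_blob, UTL_RAW.LENGTH(v_raw), v_raw);
  -- repeat the assignment and WRITEAPPEND for each further chunk
  INSERT INTO TPM_VIEWS VALUES (5, 'Test Repro View', 665, 1, v_blob);
  COMMIT;
END;
/
```

If this is run through SQL*Plus, keep each chunk on its own line and well under the 2499-character line limit so SP2-0027 does not reappear.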

Issues with Chr(0) in SQL INSERT script

We currently use the SQL Publishing Wizard to back up our database schemas and data, however we have some database tables with hashed passwords that contain the null character (chr(0)). When SQL Publishing Wizard generates the insert data scripts, the null character causes errors when we try and run the resulting SQL - it appears to ignore ALL TEXT after the first instance of this character in a script. We recently tried out RedGate SQL Compare, and found that it has the same issue with this character. I have confirmed it is ascii character code 0 by running the ascii() sql function against the offending record.
A sample of the error we are getting is:
Unclosed quotation mark after the character string '??`????{??0???
The fun part is, I can't really paste a sample Insert statement because of course everything that appears after the CHR(0) is being omitted when pasting!
Change the definition of the column to VARBINARY. The data you store in there doesn't seem to be appropriate VARCHAR to start with.
This will ripple through the code that uses the column, as you'll get a byte[] CLR type back in the client, and you should change your insert/update code accordingly. But after all, a password hash is a byte[], not a string.
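A sketch of that migration on SQL Server (the table and column names are illustrative assumptions): add a VARBINARY column, copy the existing data with an explicit convert, and insert new hashes as 0x binary literals, which scripting tools handle without the CHR(0) problem:

```sql
-- Hypothetical table: Users(UserName, PasswordHash varchar(...))
ALTER TABLE Users ADD PasswordHashBin VARBINARY(64);

-- Explicit conversion preserves the raw bytes, including CHR(0)
UPDATE Users
   SET PasswordHashBin = CONVERT(VARBINARY(64), PasswordHash);

-- New rows use a binary literal; embedded 0x00 bytes script cleanly
INSERT INTO Users (UserName, PasswordHashBin)
VALUES ('alice', 0x001122AABB00CC);
```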