I am trying to insert/update NCLOB data on Oracle using MyBatis, but I am having a problem with it.
The data type of the column is NCLOB, and below is the statement I used in SQL Developer to test it.
UPDATE CONTENT_TABLE SET HTML_DATA = N'通过信息通信系统的整合' WHERE SEQ = 1;
This works fine in SQL Developer (I put N in front of the string literal because the column is NCLOB).
But when I try it through MyBatis it does not work.
Here is the code I tried in MyBatis:
UPDATE CONTENT_TABLE SET HTML_DATA = N#{htmlData} WHERE SEQ = 1
That didn't work. Is there some other way to apply that N prefix to NCLOB data in MyBatis?
(By the way, the reason I used the NCLOB data type is to store Chinese data in the DB; when the data type was plain CLOB, the Chinese data I inserted came out broken.)
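One direction that might be worth trying (a sketch, not verified against Oracle): declare the JDBC type on the parameter so MyBatis binds it as national character data instead of a plain string, which should make the N prefix unnecessary:
UPDATE CONTENT_TABLE SET HTML_DATA = #{htmlData, jdbcType=NCLOB} WHERE SEQ = 1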
Using Microsoft SQL Server 2019.
I have two columns: one text column holding some XML that I need to compress, and one varbinary(max) column holding already-compressed XML.
Please assume I cannot change the source data, but conversions can be made as necessary in the code.
I'd like to compress the text column, and initially it works fine, but if I try to save the result into a temp table for use further along in the process, I get weird characters like ‹ or tŠÌK'À3û€Í‚;jw. Again, the first temp table I make stores it just fine; I can select from the initial table and it displays the compressed value correctly. But if I pull it into a secondary temp table or a variable from there, it turns into a mess.
I've tried converting to several different formats, converting later in the process, and bringing in the source column at the very last stage, but my end goal is to populate a variable that will be converted into JSON, and the value always ends up mangled there as well. I just need the compressed version of the columns to display properly when viewing the JSON variable I've made.
Any suggestions on how to tackle this?
Collation issue?
This smells like a collation issue. tempdb is actually its own database, with its own default collation and other settings.
In one database with default CollationA you call COMPRESS(NvarcharData) and that produces some VARBINARY.
In the other database (tempdb), with default CollationB, you call CONVERT(NVARCHAR(MAX), DECOMPRESS(CompressedData)). Now, what happens under the hood is:
CompressedData gets decompressed into VARBINARY representing NvarcharData in CollationA;
that VARBINARY is converted to NVARCHAR on the assumption that the binary data represents NVARCHAR data in CollationB, which is not true!
Try to be more explicit (collation, data type) with conversions between XML, VARBINARY and (N)VARCHAR.
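For example, here is a sketch of pinning both down when staging into a temp table (the collation name, #Decompressed, CompressedData and dbo.SourceTable are placeholder assumptions):
CREATE TABLE #Decompressed (
    -- explicit collation instead of relying on the tempdb default
    XmlText NVARCHAR(MAX) COLLATE Latin1_General_100_CI_AS NULL
);
INSERT INTO #Decompressed (XmlText)
SELECT CONVERT(NVARCHAR(MAX), DECOMPRESS(CompressedData)) -- assumes COMPRESS was fed NVARCHAR
FROM dbo.SourceTable;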
Double compression?
I have also noticed "representing already compressed xml, that I need to compress". If you are double-compressing, maybe you forgot to double-decompress?
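A quick check with hypothetical names: if the value really went through COMPRESS twice, it has to go through DECOMPRESS twice as well before the final conversion:
SELECT CONVERT(NVARCHAR(MAX), DECOMPRESS(DECOMPRESS(XmlCompressed))) AS XmlText
FROM dbo.SourceTable;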
Example?
You are sadly missing an example, but I have produced a minimal example of converting between XML and compressed data that works for me.
BEGIN TRANSACTION
GO
-- Base table stores only the compressed XML.
CREATE TABLE dbo.XmlData_Base (
PrimaryKey INTEGER NOT NULL IDENTITY(1, 1),
XmlCompressed VARBINARY(MAX) NULL
);
GO
-- View exposing the decompressed XML on top of the base table.
CREATE OR ALTER VIEW dbo.XmlData
WITH SCHEMABINDING
AS
SELECT
BASE.PrimaryKey,
CONVERT(XML, DECOMPRESS(BASE.XmlCompressed)) AS XmlData
FROM
dbo.XmlData_Base AS BASE;
GO
-- INSTEAD OF INSERT trigger: compress incoming XML before storing it.
CREATE OR ALTER TRIGGER dbo.TR_XmlData_instead_I
ON dbo.XmlData
INSTEAD OF INSERT
AS
BEGIN
INSERT INTO dbo.XmlData_Base
(XmlCompressed)
SELECT
COMPRESS(CONVERT(VARBINARY(MAX), I.XmlData))
FROM
Inserted AS I;
END;
GO
-- INSTEAD OF UPDATE trigger: re-compress the XML on update.
CREATE OR ALTER TRIGGER dbo.TR_XmlData_instead_U
ON dbo.XmlData
INSTEAD OF UPDATE
AS
BEGIN
UPDATE BASE
SET
BASE.XmlCompressed = COMPRESS(CONVERT(VARBINARY(MAX), I.XmlData))
FROM
dbo.XmlData_Base AS BASE
JOIN Inserted AS I ON I.PrimaryKey = BASE.PrimaryKey;
END;
GO
-- Round-trip test: insert through the view, then inspect both the view and the base table.
INSERT INTO dbo.XmlData
(XmlData)
VALUES
(CONVERT(XML, N'<this><I>I call upon thee!</I></this>'));
SELECT
*
FROM
dbo.XmlData;
SELECT
PrimaryKey,
XmlCompressed,
CONVERT(XML, DECOMPRESS(XmlCompressed))
FROM
dbo.XmlData_Base;
-- Update through the view and inspect again.
UPDATE dbo.XmlData
SET
XmlData = CONVERT(XML, N'<that><I>I call upon thee!</I></that>');
SELECT
*
FROM
dbo.XmlData;
SELECT
PrimaryKey,
XmlCompressed,
CONVERT(XML, DECOMPRESS(XmlCompressed))
FROM
dbo.XmlData_Base;
GO
ROLLBACK TRANSACTION;
I've got a front-end table that essentially matches our SQL Server database table t_myTable. The columns I'm having problems with are those with numeric data types in the db. They are set to allow NULL, but when the user deletes the numeric value on the front end and tries to send a blank value, it doesn't get posted to the database. I suspect that's because the value is sent back as an empty string "", which does not translate to the nullable numeric data type.
Is there a trigger I can create to convert these empty strings into NULL on insert and update to the database? Or would a trigger already happen too late in the process, so that I need to handle this on the front end or in the API instead?
We'll call my table t_myTable and the column myNumericColumn.
I could also be wrong, and perhaps this 'empty string' issue is not the source of my problem, but I suspect that it is.
As @DaleBurrell noted, the proper place to handle data validation is in the application layer. You can wrap each of the potentially problematic values in a NULLIF function, which converts the value to NULL when an empty string is passed to it.
The syntax would be along these lines:
SELECT
...
,NULLIF(ColumnName, '') AS ColumnName
select nullif(Column1, '') from tablename
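For a numeric target column, NULLIF can be combined with a safe conversion. Here is a sketch using the OP's names (@incomingValue and numeric(18, 2) are assumptions):
INSERT INTO t_myTable (myNumericColumn)
SELECT TRY_CAST(NULLIF(@incomingValue, '') AS numeric(18, 2));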
SQL Server doesn't allow converting an empty string to a numeric data type, so a trigger is useless in this case, even an INSTEAD OF one: SQL Server will check the conversion before inserting.
SELECT CAST('' AS numeric(18,2)) -- Error converting data type varchar to numeric
CREATE TABLE tab1 (col1 numeric(18,2) NULL);
INSERT INTO tab1 (col1) VALUES(''); -- Error converting data type varchar to numeric
Since you didn't mention this error, the client must be passing something other than ''. The problem can be tracked down with SQL Profiler: run it and see exactly which SQL statement is executed to insert data into the table.
I have a CLOB column which contains a large amount of XML. I want to add a new node to that XML, like this one:
<name>me</name>
I tried using UpdateXML but I'm not getting it right.
The CLOB is converted to XMLType using XMLType(), and the XMLType is converted back to a CLOB using to_clob. The following is an example.
create table table_with_clob (myclob clob);
insert into table_with_clob values ('<mytag><subtag>hello world</subtag></mytag>');
UPDATE table_with_clob SET myclob =
to_clob(INSERTCHILDXML(xmltype(myclob),
'/mytag', 'subtag',
XMLType('<subtag>another</subtag>')));
select * from table_with_clob;
Output
myclob
------
<mytag><subtag>hello world</subtag><subtag>another</subtag></mytag>
Though I think this is not very efficient, and you might be better off converting the column to XMLType and then operating on it directly.
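For comparison, a sketch of that XMLType-column variant using the same sample data (untested against a live Oracle instance):
create table table_with_xml (myxml xmltype);
insert into table_with_xml values (xmltype('<mytag><subtag>hello world</subtag></mytag>'));
-- no CLOB round trip needed: operate on the XMLType directly
update table_with_xml set myxml =
    insertchildxml(myxml, '/mytag', 'subtag', xmltype('<subtag>another</subtag>'));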
In connection with data replication from SQL Server to DB2 I have the following question:
On DB2 I have a table containing (for simplicity) two columns: COL1 and COL2.
COL1 is defined as CHAR(20). COL2 is defined as CHAR(10).
COL1 is replicated from SQL Server by converting a string into hex, e.g. "abcdefghij" to "6162636465666768696A" or "1111111111" to "31313131313131313131", using the following SQL Server expression:
CONVERT(char(20), cast(@InputString as binary), 2)
where @InputString would be "abcdefghij".
In other words, COL1 contains the hex value, but as a string (sorry if the wording is incorrect).
I need to convert the hex value back to a string and put this value into COL2.
What should the SQL query be on DB2 to do the conversion? I know how to do this on SQL Server, but not on DB2.
Note: the reason the hex value is not prefixed with "0x" is that style 2 is used in the CONVERT statement.
select hex('A') from sysibm.sysdummy1;
returns 41.
and
select x'41' from sysibm.sysdummy1;
gives you 'A'. So you can loop through each pair of hex characters to rebuild your original string, or you can write your own unhex function (a sketch follows below).
Taken from dbforums.com /db2/1627076-display-hex-columns.html (edit Nov 2020: original source link is now a spam site)
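A hand-rolled unhex along those lines might look like this (a sketch in DB2 SQL PL; untested, so treat the details as assumptions):
-- decode a hex string two characters at a time,
-- mapping each pair back to one character
CREATE FUNCTION unhex_str(hexstr VARCHAR(40))
RETURNS VARCHAR(20)
LANGUAGE SQL
CONTAINS SQL
DETERMINISTIC NO EXTERNAL ACTION
BEGIN ATOMIC
  DECLARE result VARCHAR(20) DEFAULT '';
  DECLARE i INTEGER DEFAULT 1;
  WHILE i < LENGTH(hexstr) DO
    SET result = result || CHR(
        (LOCATE(UPPER(SUBSTR(hexstr, i, 1)), '0123456789ABCDEF') - 1) * 16
      + (LOCATE(UPPER(SUBSTR(hexstr, i + 1, 1)), '0123456789ABCDEF') - 1));
    SET i = i + 2;
  END WHILE;
  RETURN result;
END
select unhex_str('6162636465666768696A') from sysibm.sysdummy1; -- expected: 'abcdefghij'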
DB2 has built-in encoding/decoding.
For the OP's question, use:
select CAST(ColumnName as char(20) CCSID 37) as ColumnName from TableName where SomeConditionExists
http://www-01.ibm.com/support/knowledgecenter/SSEPEK_10.0.0/com.ibm.db2z10.doc.intro/src/tpc/db2z_introcodepage.dita
This is one of the topics closest to the subject of my problem.
I lost two days figuring out how to migrate XML files stored in a DB2 BLOB field using SQL Developer.
(Yes, migrating to and running the queries from SQL Developer; we are migrating data from DB2 to Oracle, so we were using that tool!)
How do you show an XML file/string stored in a BLOB?
Let's start with what the problem was:
The data in the BLOB was an XML file.
When I selected the column in a plain query, the output was unreadable binary.
When I cast it, like this:
select CAST(BLOBCOLUMN as VARCHAR(1000)) from TABLE where id = 100;
the output was in hex.
Nothing worked, not even the solutions from the links in this topic. Nothing!
By accident, I found a solution:
Create this FUNCTION in DB2:
-- Identity function: takes VARCHAR FOR BIT DATA in and returns plain VARCHAR,
-- which makes DB2 reinterpret the same bytes as character data.
CREATE FUNCTION unhex(in VARCHAR(32000) FOR BIT DATA)
RETURNS VARCHAR(32000)
LANGUAGE SQL
CONTAINS SQL
DETERMINISTIC NO EXTERNAL ACTION
BEGIN ATOMIC
RETURN in;
END
Run this SELECT:
select UNHEX( CAST(BLOBCOLUMN as VARCHAR(32000) FOR BIT DATA)) from TABLE where id = 100;
Result: the XML content displayed as readable text.
I use this to convert FOR BIT DATA to characters:
cast (colvalue as varchar(2000) ccsid ebcdic for sbcs data)
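In context, that might be used like this (a sketch; mytable and colvalue are placeholder names):
select cast(colvalue as varchar(2000) ccsid ebcdic for sbcs data) as colvalue_text
from mytable;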
I have a MySQL database that holds content as a BLOB; why those developers chose to use a BLOB is out of my control. Is it possible to convert the data to text and the data type to TEXT?
Have you tried the ALTER TABLE command?
alter table mytable change mycolumn mycolumn text;
From http://forums.mysql.com/read.php?103,164923,167648#msg-167648 it looks like you can use CAST.
You could create a new TEXT column, then fill it in with an UPDATE command:
update mytable set myNewColumn = CAST(myOldColumn AS CHAR(10000) CHARACTER SET utf8)
Converting the field from BLOB to TEXT directly truncates all characters > 127. In my case we have lots of European characters, so this was not an option. Here's what I did (the full sequence is sketched in SQL below):
1. Create a temp column as TEXT.
2. Copy the BLOB field into the temp column: UPDATE tbl SET col_temp = CONVERT(col USING latin1); (in this case my BLOB held latin1-encoded characters).
3. Convert the actual field to the TEXT data type.
4. Copy the temp column back into the actual field.
5. Remove the temp column.
Not exactly straightforward, but it worked with no data loss. I'm using Version: '5.1.50-community'.
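The full sequence from the steps above might look like this (a sketch; tbl, col and col_temp are placeholder names):
-- 1. create the temp TEXT column
ALTER TABLE tbl ADD COLUMN col_temp TEXT;
-- 2. copy the blob into it, decoding as latin1
UPDATE tbl SET col_temp = CONVERT(col USING latin1);
-- 3. change the real column's type (any truncation here is overwritten in step 4)
ALTER TABLE tbl MODIFY col TEXT;
-- 4. copy the preserved text back
UPDATE tbl SET col = col_temp;
-- 5. drop the temp column
ALTER TABLE tbl DROP COLUMN col_temp;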