I have a CLOB column which contains a large amount of XML. I want to add a new element to that XML, like this:
<name>me</name>
I tried using UpdateXML but I'm not getting it right.
The CLOB is converted to XMLType using XMLType(), and the XMLType is converted back to a CLOB with getClobVal(). The following is an example.
create table table_with_clob (myclob clob);
insert into table_with_clob values ('<mytag><subtag>hello world</subtag></mytag>');
UPDATE table_with_clob SET myclob =
    INSERTCHILDXML(XMLType(myclob),
                   '/mytag', 'subtag',
                   XMLType('<subtag>another</subtag>')).getClobVal();
select * from table_with_clob;
Output
myclob
------
<mytag><subtag>hello world</subtag><subtag>another</subtag></mytag>
Though I think this is not very efficient; you might be better off converting the column to XMLType and operating on it directly, as in the sketch below.
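A minimal migration sketch, assuming the table can be taken offline briefly; the myxml column name is only illustrative:

-- Add an XMLType column, copy the data over, then swap the names.
ALTER TABLE table_with_clob ADD (myxml XMLTYPE);
UPDATE table_with_clob SET myxml = XMLType(myclob);
ALTER TABLE table_with_clob DROP COLUMN myclob;
ALTER TABLE table_with_clob RENAME COLUMN myxml TO myclob;

After that, INSERTCHILDXML can update the column directly, without the CLOB round trip.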
Using Microsoft SQL Server 2019.
I have two columns, one text, representing some XML, the other varbinary(max), representing already compressed XML, that I need to compress.
Please assume I cannot change the source data, but conversions can be made as necessary in the code.
I'd like to compress the text column, and initially it works fine, but if I try to save it into a temp table to be used further along in the process, I get weird characters like ‹ or tŠÌK'À3û€Í‚;jw. Again, the first temp table I make stores it just fine, and I can select from the initial table and it displays the compressed value correctly. But if I need to pull it into a secondary temp table or a variable from there, it turns into a mess.
I've tried converting into several different formats, converting later in the process, and bringing in the source data for the column at the very last stage, but my end goal is to populate a variable that will be converted into JSON, and it always ends up weird there as well. I just need the compressed version of the columns to display properly when viewing the JSON variable I've made.
Any suggestions on how to tackle this?
Collation issue?
This smells of a collation issue. tempdb is actually its own database, with its own default collation and other settings.
In one database with default CollationA you call COMPRESS(NvarcharData), and that produces some VARBINARY.
In the other database (tempdb) with default CollationB you call CONVERT(NVARCHAR(MAX), DECOMPRESS(CompressedData)). Now, what happens under the hood is:
- CompressedData gets decompressed into VARBINARY representing NvarcharData in CollationA;
- that VARBINARY is converted to NVARCHAR assuming the binary data represents NVARCHAR data in CollationB, which is not true!
Try to be more explicit (collation, data type) with conversions between XML, VARBINARY and (N)VARCHAR.
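For instance, a minimal round trip that stays explicit about the target type (the variable names are made up):

DECLARE @Source NVARCHAR(MAX) = N'<root><item>value</item></root>';
DECLARE @Compressed VARBINARY(MAX) = COMPRESS(@Source);

-- COMPRESS received NVARCHAR input, so the compressed payload holds UTF-16 bytes;
-- decompressing back into NVARCHAR(MAX) is a lossless round trip:
SELECT CONVERT(NVARCHAR(MAX), DECOMPRESS(@Compressed)) AS CorrectTarget;

-- Decompressing the same bytes into VARCHAR(MAX) misreads the UTF-16 bytes as
-- single-byte characters and produces exactly the kind of garbage described above:
SELECT CONVERT(VARCHAR(MAX), DECOMPRESS(@Compressed)) AS WrongTarget;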
Double compression?
I have also noticed "representing already compressed xml, that I need to compress". If you are double-compressing, maybe you forgot to double-decompress?
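Something along these lines, assuming the value really went through COMPRESS twice (the table and column names are hypothetical):

SELECT CONVERT(NVARCHAR(MAX), DECOMPRESS(DECOMPRESS(DoubleCompressed)))
FROM dbo.SomeTable; -- hypothetical table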
Example?
You are sadly not providing an example, but I have produced a minimal example of converting between XML and compressed data that works for me.
BEGIN TRANSACTION
GO

-- Base table: the XML is stored only in compressed form.
CREATE TABLE dbo.XmlData_Base (
    PrimaryKey INTEGER NOT NULL IDENTITY(1, 1),
    XmlCompressed VARBINARY(MAX) NULL
);
GO

-- View that exposes the decompressed XML.
CREATE OR ALTER VIEW dbo.XmlData
WITH SCHEMABINDING
AS
SELECT
    BASE.PrimaryKey,
    CONVERT(XML, DECOMPRESS(BASE.XmlCompressed)) AS XmlData
FROM
    dbo.XmlData_Base AS BASE;
GO

-- INSTEAD OF triggers make the view writable: they compress on the way in.
CREATE OR ALTER TRIGGER dbo.TR_XmlData_instead_I
ON dbo.XmlData
INSTEAD OF INSERT
AS
BEGIN
    INSERT INTO dbo.XmlData_Base
        (XmlCompressed)
    SELECT
        COMPRESS(CONVERT(VARBINARY(MAX), I.XmlData))
    FROM
        Inserted AS I;
END;
GO

CREATE OR ALTER TRIGGER dbo.TR_XmlData_instead_U
ON dbo.XmlData
INSTEAD OF UPDATE
AS
BEGIN
    UPDATE BASE
    SET
        BASE.XmlCompressed = COMPRESS(CONVERT(VARBINARY(MAX), I.XmlData))
    FROM
        dbo.XmlData_Base AS BASE
        JOIN Inserted AS I ON I.PrimaryKey = BASE.PrimaryKey;
END;
GO

-- Insert through the view, then check both the view and the base table.
INSERT INTO dbo.XmlData
    (XmlData)
VALUES
    (CONVERT(XML, N'<this><I>I call upon thee!</I></this>'));

SELECT
    *
FROM
    dbo.XmlData;

SELECT
    PrimaryKey,
    XmlCompressed,
    CONVERT(XML, DECOMPRESS(XmlCompressed))
FROM
    dbo.XmlData_Base;

-- Update through the view and check again.
UPDATE dbo.XmlData
SET
    XmlData = CONVERT(XML, N'<that><I>I call upon thee!</I></that>');

SELECT
    *
FROM
    dbo.XmlData;

SELECT
    PrimaryKey,
    XmlCompressed,
    CONVERT(XML, DECOMPRESS(XmlCompressed))
FROM
    dbo.XmlData_Base;
GO

ROLLBACK TRANSACTION;
I have an XML document with more than 4000 characters. The datatype in Oracle is XMLTYPE. The Insert and Execute Script components are not allowing me to insert this XML into Oracle. I cannot change the datatype in Oracle. Is there any way I can insert the XML into the XMLTYPE column? In Java I am able to achieve it by creating an SQLXML object from the Connection.
SQLXML xml = conn.createSQLXML(); // This allows saving the XML at any size
The error I am getting:
ORA-01461: can bind a LONG value only for insert into a LONG column
Insert statement:
insert into ABC(ID, RESPONSE_XML) values (123, :RESPONSE_XML)
Transformation:
'RESPONSE_XML' : write(payload, 'application/xml')
If I reduce the number of characters in the payload XML, it inserts successfully. What can we do here to get it inserted?
The JDBC driver I am using is ojdbc14-10g.jar.
It could be an Oracle limitation. It has a limit on the size of XML identifiers:
XML Identifier Length Limit – Oracle XML DB supports only XML identifiers that are 4000 characters long or shorter.
https://docs.oracle.com/cd/E18283_01/appdev.112/e16659/appjspec.htm
Having an XML identifier that long would be just insane.
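For what it's worth, ORA-01461 on string binds longer than 4000 characters is often worked around by wrapping the bind variable in the XMLTYPE() constructor, so the value is bound as a CLOB rather than sent as a LONG (a sketch, not verified against the asker's Mule components):

insert into ABC (ID, RESPONSE_XML)
values (123, XMLTYPE(:RESPONSE_XML));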
I am trying to insert/update NCLOB data on Oracle using MyBatis, but I am having a problem with this.
The data type of the column is NCLOB, and below is the code I used on SQL Developer to test it.
UPDATE CONTENT_TABLE SET HTML_DATA = N'通过信息通信系统的整合' WHERE SEQ = 1;
I tried this on SQL Developer and it works fine. (I put 'N' in front of the inserted string for NCLOB.)
But when I try it via MyBatis it does not work...
Here is the code I tried in MyBatis.
UPDATE CONTENT_TABLE SET HTML_DATA = N#{htmlData} WHERE SEQ = 1
That didn't work. Is there any other way to apply that 'N' prefix to NCLOB data in MyBatis?
(By the way, the reason I used the NCLOB data type is to store Chinese data in the DB; when the data type was just CLOB, the Chinese data I inserted was broken.)
How can I extract data from a LONG datatype field using only SQL (without using PL/SQL)?
I am getting an error while concatenating it with other columns:
ORA-00932: inconsistent datatypes
DB: Oracle 8i Enterprise Edition
There is a trick using XML:
SELECT
    long_column AS long_column_as_clob
FROM
    XMLTABLE(
        'ROWSET/ROW'
        PASSING
            XMLTYPE(
                -- DBMS_XMLGEN serializes the LONG value into XML text,
                -- which XMLTABLE can then project back out as a CLOB.
                DBMS_XMLGEN.GETXML(
                    Q'{SELECT long_column FROM your_table}'
                )
            )
        COLUMNS
            long_column CLOB PATH 'LONG_COLUMN'
    );
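To see why the row pattern is 'ROWSET/ROW' and the column path is 'LONG_COLUMN': DBMS_XMLGEN.GETXML wraps each row in a canonical envelope, roughly like this (a sketch of the default output shape):

SELECT DBMS_XMLGEN.GETXML('SELECT long_column FROM your_table') FROM dual;

-- <ROWSET>
--  <ROW>
--   <LONG_COLUMN>...the LONG value, now plain text inside the XML...</LONG_COLUMN>
--  </ROW>
-- </ROWSET>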
I have this table in Oracle 11g:
create table tmp_test_xml (
name_xml varchar2(4000),
file_xml xmltype
);
At this link, oracle binding xmltype, I have read that for a correct insert into an XMLType field I must use "XMLType binding", as in this insert:
insert into tmp_test_xml values (
'file.xml',
xmltype(
'<?xml version="1.0" encoding="UTF-8"?>
<list_book>
<book>1</book>
<book>2</book>
<book>3</book>
</list_book>'
)
);
But if I try to run this insert without the XMLType binding, it works fine:
insert into tmp_test_xml values (
'file.xml',
'<?xml version="1.0" encoding="UTF-8"?>
<list_book>
<book>1</book>
<book>2</book>
<book>3</book>
</list_book>'
);
Now, is the binding optional or not? Which is the correct insert?
Why does the second insert work?
Oracle tries to do the job for you, so if your datatype doesn't match the column datatype, it attempts to convert the data into the correct data type.
Since you have a correct XML string in your insert statement, the database did the conversion to XMLType for you.
And I'm not sure, but I doubt that link actually said you must use XMLType binding.
But you might want to use it if you need more control when creating the XML from a string.
Check the XMLType constructors to get an idea.
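For instance, the constructor accepts more than the raw string; here is a sketch based on the documented signature XMLType(xmlData, schema, validated, wellformed), which is worth checking against your Oracle version:

insert into tmp_test_xml values (
    'file.xml',
    -- Passing 1 as the fourth argument (wellformed) asserts the input is
    -- already well-formed XML, so Oracle skips the check on creation:
    xmltype('<list_book><book>1</book></list_book>', null, 0, 1)
);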