String to CLOB with PostgreSQL - sql

I'm trying to read a CLOB from a PostgreSQL DB, change it, and write it back.
I was able to read the clob successfully using the following code:
PreparedStatement statement = connection.prepareStatement("SELECT clob_column from data where id = 1");
ResultSet executeQuery = statement.executeQuery();
executeQuery.next();
Clob fetchedClob = executeQuery.getClob("clob_column");
But when I'm trying to create a new clob with the new data using:
Clob newClob = connection.createClob();
I'm getting the following error:
java.lang.AbstractMethodError: com.mchange.v2.c3p0.impl.NewProxyConnection.createClob()Ljava/sql/Clob;
Moreover, if I try just to edit the fetched clob, using:
fetchedClob.setString(0, "new string");
I'm getting the following error:
Method org.postgresql.jdbc4.Jdbc4Clob.setString(long,str) is not yet implemented.
Any idea?
Update: here is the table definition
CREATE TABLE data (
    id bigint NOT NULL,
    clob_column text
);
Thanks

There's no need to use getClob().
ResultSet.getString() and PreparedStatement.setString() work perfectly fine on text columns (PostgreSQL does not have a clob data type, so I assume you are using text).
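A minimal sketch of the read-modify-write cycle using plain strings, based on the table from the question (the connection setup and the replaced text are assumed):

```java
// Read the text column as a plain String -- no Clob needed.
PreparedStatement select = connection.prepareStatement(
        "SELECT clob_column FROM data WHERE id = ?");
select.setLong(1, 1L);
ResultSet rs = select.executeQuery();
rs.next();
String content = rs.getString("clob_column");

// Modify it like any other String.
String updated = content.replace("old string", "new string");

// Write it back with setString(); the driver maps String to text.
PreparedStatement update = connection.prepareStatement(
        "UPDATE data SET clob_column = ? WHERE id = ?");
update.setString(1, updated);
update.setLong(2, 1L);
update.executeUpdate();
```

This avoids both createClob() (unsupported by the c3p0 proxy) and Clob.setString() (unimplemented in the pgJDBC Clob).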

Related

Failed to execute query. Error: String or binary data would be truncated in table 'dbo.user_info', column 'uid'

I have problem inserting values in my SQL server database on Azure, I am getting the following error:
Failed to execute query. Error: String or binary data would be truncated in table 'dummy_app.dbo.user_info', column 'uid'. Truncated value: 'u'.
The statement has been terminated.
I don't understand where I am wrong. I just created the server, and I am trying to experiment, but I can't fix this.
if not exists (select * from sysobjects where name='user_info' and xtype='U')
create table user_info (
uid varchar unique,
name varchar,
email varchar
)
go
INSERT INTO dbo.user_info (uid, name, email) VALUES ('uids', 'name', 'email')
go
Creating the table works fine, the only thing that doesn't work is the second command INSERT
I suspect that the reason is that you haven't defined a length for varchar, so it defaults to a length of 1. Therefore your value gets truncated.
Set a varchar length to something like varchar(200) and you should be good to go.
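For example, the table definition could become (the lengths below are illustrative; size them to fit your data):

```sql
if not exists (select * from sysobjects where name='user_info' and xtype='U')
create table user_info (
    uid   varchar(50)  unique,
    name  varchar(100),
    email varchar(200)
)
```

With uid allowing 50 characters, the value 'uids' fits and the INSERT no longer truncates.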
This looks like a consequence of the CREATE portion of your procedure for the table not including a length for varchar, so you'd have to specify a length such as varchar(50), since the default is 1. Refer to the remarks section of the official MS docs:
docs.microsoft.com
Also, here is the syntax for the CREATE TABLE in Azure which might be helpful as well.
Syntax of Azure CREATE TABLE

What is the right way to get Avro files containing JSON into a table structure on Snowflake?

I've been struggling to get my data from Azure Event Hub into a SQL table on the Snowflake platform. I just can't wrap my head around how to do it properly if I have to transform the data multiple times. My data is in the body of the Avro file.
I've just started with Snowflake. So far I've tried to follow this tutorial on the subject, but it doesn't actually save the JSON-formatted body anywhere in the video. So far I've tried something like this:
CREATE DATABASE IF NOT EXISTS MY_DB;
USE DATABASE MY_DB;
CREATE OR REPLACE TABLE data_table(
"column1" STRING,
"column2" INTEGER,
"column3" STRING
);
create or replace file format av_avro_format
type = 'AVRO'
compression = 'NONE';
create or replace stage st_capture_avros
url='azure://xxxxxxx.blob.core.windows.net/xxxxxxxx/xxxxxxxxx/xxxxxxx/1/'
credentials=(azure_sas_token='?xxxxxxxxxxxxx')
file_format = av_avro_format;
copy into avro_as_json_table(body)
from(
select(HEX_DECODE_STRING($1:Body))
from @st_capture_avros
);
copy into data_table("column1", "column2", "column3")
from(
select $1:"jsonKeyValue1", $1:"jsonKeyValue2", $1:"jsonKeyValue3"
from avro_as_json_table
);
This doesn't work: it produces the error "SQL compilation error: COPY statement only supports simple SELECT from stage statements for import". I know I should use INSERT INTO instead of COPY in the last statement, but my question is more about how I would eliminate the redundant avro_as_json_table from the equation.
Rather than using
copy into avro_as_json_table(body)
from ...
try
INSERT INTO avro_as_json_table(body)
SELECT ...
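In full, the working statement might look like this; note that INSERT INTO takes a SELECT, not a FROM clause. And since Snowflake also lets a plain SELECT read staged files directly, the same pattern may let you drop the intermediate table entirely (stage, table, and key names as in the question; the direct variant is an untested sketch):

```sql
-- Load the decoded Avro body into the intermediate table.
INSERT INTO avro_as_json_table(body)
SELECT HEX_DECODE_STRING($1:Body)
FROM @st_capture_avros;

-- Or, skipping the intermediate table:
INSERT INTO data_table("column1", "column2", "column3")
SELECT PARSE_JSON(HEX_DECODE_STRING($1:Body)):"jsonKeyValue1",
       PARSE_JSON(HEX_DECODE_STRING($1:Body)):"jsonKeyValue2",
       PARSE_JSON(HEX_DECODE_STRING($1:Body)):"jsonKeyValue3"
FROM @st_capture_avros;
```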

How to update NCLOB data in Oracle from MyBatis

I am trying to insert/update NCLOB data in Oracle using MyBatis, but I'm having problems with this.
The data type of the column is NCLOB, and below is the code I used in SQL Developer to test it.
UPDATE CONTENT_TABLE SET HTML_DATA = N'通过信息通信系统的整合' WHERE SEQ = 1;
I tried this in SQL Developer and it works fine (I put 'N' in front of the string literal for NCLOB).
But when I try it through MyBatis it does not work.
Here is the code I tried in MyBatis:
UPDATE CONTENT_TABLE SET HTML_DATA = N#{htmlData} WHERE SEQ = 1
That didn't work. Is there another way to apply the 'N' prefix to NCLOB data in MyBatis?
(By the way, the reason I used the NCLOB data type is to store Chinese text in the DB; when the data type was just CLOB, the Chinese data I inserted came out garbled.)

Updating a CLOB XML in Oracle

I have a CLOB column which contains a large amount of XML. I want to add a new element to that XML, like this:
<name>me</name>
I tried using UpdateXML but I'm not getting it right.
The CLOB is converted to XMLType using XMLType(), and the XMLType is converted back to a CLOB using to_clob(). The following is an example.
create table table_with_clob (myclob clob);
insert into table_with_clob values ('<mytag><subtag>hello world</subtag></mytag>');
UPDATE table_with_clob SET myclob =
to_clob(INSERTCHILDXML(xmltype(myclob),
'/mytag', 'subtag',
XMLType('<subtag>another</subtag>')));
select * from table_with_clob;
Output
myclob
------
<mytag><subtag>hello world</subtag><subtag>another</subtag></mytag>
Though I think this is not very efficient, and you might be better off converting the column to XMLType and operating on it directly.
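A sketch of that alternative, reusing the sample data from above (the table name table_with_xml is hypothetical):

```sql
-- Store the document as XMLType instead of CLOB.
create table table_with_xml (myxml xmltype);

-- Migrate the existing CLOB data.
insert into table_with_xml
  select xmltype(myclob) from table_with_clob;

-- Now the XML functions operate on the column directly,
-- with no CLOB/XMLType round trip per update.
update table_with_xml
   set myxml = insertchildxml(myxml, '/mytag', 'subtag',
                              xmltype('<subtag>another</subtag>'));
```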

Passing a user-defined TABLE type to a stored Oracle function

I have an Oracle function defined as:
FUNCTION SELECTINBOX (FA_FROMUSERLIKE IN PKGSMSTYPES.MAXVARCHAR2_T DEFAULT NULL ,
FA_INBOXOWNER IN PKGSMSTYPES.MAXVARCHAR2_T,
FA_A_URGENCY IN G_INTARRAY_TBL DEFAULT NULL ,
FA_PAGENO IN NUMBER DEFAULT 1
) RETURN G_SMSNOTES_TBL;
where G_INTARRAY_TBL is defined as,
create or replace
TYPE G_INTARRAY_TBL AS TABLE OF NUMBER;
I am building the query using EclipseLink. The query works fine if I hardcode the G_INTARRAY_TBL argument as null in the query string, but if I try to pass a List of BigDecimals to it, I get an error:
Internal Exception: java.sql.SQLException: Invalid column type
Error Code: 17004
Include your code for your query.
You need to use a PLSQLStoredFunctionCall (@NamedPLSQLStoredFunctionQuery) for this. You also need to mirror the PL/SQL TABLE type with a VARRAY type.
See,
http://wiki.eclipse.org/EclipseLink/Examples/JPA/PLSQLStoredFunction
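A rough sketch of what the annotation-based mapping might look like, assuming EclipseLink's @NamedPLSQLStoredFunctionQuery/@PLSQLParameter annotations; the query name, the mirror VARRAY type G_INTARRAY_VARRAY (which you would have to create in the database alongside G_INTARRAY_TBL), and the entity are all illustrative, and the exact databaseType strings depend on your EclipseLink version, so check the linked wiki page:

```java
import javax.persistence.Entity;
import org.eclipse.persistence.annotations.NamedPLSQLStoredFunctionQuery;
import org.eclipse.persistence.annotations.PLSQLParameter;

// Hypothetical mapping: G_INTARRAY_VARRAY mirrors the nested TABLE
// type G_INTARRAY_TBL, since EclipseLink marshals VARRAYs to JDBC.
@NamedPLSQLStoredFunctionQuery(
    name = "selectInbox",
    functionName = "SELECTINBOX",
    parameters = {
        @PLSQLParameter(name = "FA_FROMUSERLIKE", databaseType = "VARCHAR_TYPE"),
        @PLSQLParameter(name = "FA_INBOXOWNER",   databaseType = "VARCHAR_TYPE"),
        @PLSQLParameter(name = "FA_A_URGENCY",    databaseType = "G_INTARRAY_VARRAY"),
        @PLSQLParameter(name = "FA_PAGENO",       databaseType = "NUMERIC_TYPE")
    },
    returnParameter = @PLSQLParameter(name = "RESULT",
                                      databaseType = "G_SMSNOTES_VARRAY")
)
@Entity
public class InboxNote {
    // entity fields mapped to the G_SMSNOTES_TBL row structure ...
}
```

The List of BigDecimals is then passed as the FA_A_URGENCY argument when executing the named query.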