I can insert text and integer data into an MS Access database (.mdb) using the pyodbc package, but now I want to insert large binary objects. I have a table that consists of ID (COUNTER), Name (VARCHAR), File (LONGBINARY), and Author (VARCHAR) columns. I use this code to insert some text and integer data:
cursor.execute("""INSERT INTO table(ID, Name) VALUES(1,'book')""")
After that I used the following code, but I always get an error:
with open('c:/tree.jpg', 'rb') as file:
    binData = file.read()
SQL = """INSERT INTO table VALUES(2,'threePicture', %s, 'Mike')""" % (binData)
cursor.execute(SQL)
The error is: ProgrammingError: ('42000', "[42000])
I found the solution: use ? placeholder characters.
cursor.execute("insert into table values(?, ?, ?, ?)", 2, 'treePicture', pyodbc.Binary(binData), 'Mike')
Use ? placeholders for the values in the expression and pass the actual values as parameters to cursor.execute().
I want to know how to manually insert a BLOB into my SQLite database. By manually I mean without using a driver feature that completes the command for me, such as setBytes:
Connection con = DriverManager.getConnection("jdbc:sqlite:database.db");
PreparedStatement stmt = con.prepareStatement("INSERT OR REPLACE INTO test (id, aBlobColumn) VALUES (0, ?)");
stmt.setBytes(1, new byte[] {0x37, (byte) 0xe7, (byte) 0x9f});
stmt.executeUpdate();
Is it possible to use a command like this:
INSERT OR REPLACE INTO test (id, aBlobColumn) VALUES (0, 37e79f);
or like this:
INSERT OR REPLACE INTO test (id, aBlobColumn) VALUES (0, BLOB(37, e7, 9f));
I don't mind if the command includes base64 data or raw data; I just don't specifically want to use hexadecimal.
You can use the following:
INSERT OR REPLACE INTO test (id, aBlobColumn) VALUES (0, x'37e79f');
However, the value has to be a hex string for it to be a BLOB.
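For completeness, here is a minimal JDBC sketch of that "manual" approach (assuming the sqlite-jdbc driver is on the classpath and the test table above exists); the BLOB goes into the SQL text as a hex literal, so no setBytes or other binding feature is involved:
try (Connection con = DriverManager.getConnection("jdbc:sqlite:database.db");
     Statement stmt = con.createStatement()) {
    // The BLOB value is written directly into the statement as a hex literal.
    stmt.executeUpdate("INSERT OR REPLACE INTO test (id, aBlobColumn) VALUES (0, x'37e79f')");
}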
I am working on an existing project and need to prepare DML for an Oracle database.
But I am unable to prepare an INSERT statement for BLOB data due to its huge size, which is greater than 4000 bytes. Can anyone help me?
N.B.: I got this error: ORA-06550: string literal too long.
Only an INSERT statement, no Java program. I just need to prepare DML, i.e. an INSERT statement.
My INSERT STATEMENT:
INSERT INTO APP_PROF('ID', 'IMAGE') VALUES('2', TO_BLOB('4654655665....'))
This image BLOB data is greater than 45000 bytes.
Thanks in advance.
Yes. Use a stream, not a String or byte array, to insert the BLOB. Something like this:
PreparedStatement ps = conn.prepareStatement("insert into blobs (blob_value) values (?)");
// Wrap the large value in a stream; the length comes from the byte array,
// since InputStream itself has no length() method.
byte[] bytes = aLargeStringValue.getBytes(StandardCharsets.UTF_8);
InputStream in = new ByteArrayInputStream(bytes);
ps.setBinaryStream(1, in, bytes.length);
ps.execute();
The problem I see here is that you are using a string value to insert into a BLOB column, and the error you are receiving is the limitation on the input string literal, not on the BLOB column of the table.
Please refer to the link below for further clarification:
How to convert VARCHAR2 to BLOB inside Oracle 11g PL/SQL after ORA-06502
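If a small JDBC program turns out to be acceptable after all, a minimal sketch for the APP_PROF table from the question (assuming an open Oracle Connection conn; the file path and ID value are only examples) would read the image and bind it as a stream instead of embedding a literal:
// Sketch only: the path and the ID value are placeholders.
byte[] imageBytes = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("c:/photo.jpg"));
PreparedStatement ps = conn.prepareStatement("INSERT INTO APP_PROF (ID, IMAGE) VALUES (?, ?)");
ps.setString(1, "2");
// Binding the bytes as a stream avoids the 4000-byte string-literal limit entirely.
ps.setBinaryStream(2, new java.io.ByteArrayInputStream(imageBytes), imageBytes.length);
ps.executeUpdate();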
I am inserting a large string into a CLOB column. The string is (in this instance) 3190 characters long, but it can be much larger.
The string consists of XML data. Sometimes the data commits and sometimes I get the error; it occurs roughly 50% of the time.
Even strings containing over 5000 characters will sometimes commit with no problem.
I'm unsure where to go next, as I am under the impression that CLOB is the best data type for this data.
I have tried LONG and LONG RAW.
Someone suggested using XMLTYPE; however, that does not exist in my version of Oracle (11g - 11.2.0.2.0).
My insert statement:
INSERT INTO MYTABLE(InterfaceId, SourceSystem, Description, Type, Status, StatusNotes, MessageData, CreatedDate, ChangedDate, Id) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
MessageData is the CLOB column where the error is occurring; I have tried committing without this data populated and it works.
Error
ORA-01461: can bind a LONG value only for insert into a LONG column
ALTER TABLE MYTABLE
  ADD (XML_COL XMLTYPE);
and then
SQL> INSERT INTO MYTABLE(..., XML_COL) VALUES (..., XMLTYPE('<root>example</root>'));
The key is to use XMLTYPE column and then use XMLTYPE() function to convert your string to XMLTYPE.
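From JDBC, a hedged sketch of that insert (assuming an open Oracle Connection conn, the XML in a String named xml, and only the relevant columns shown; interfaceId stands in for the question's InterfaceId value) might look like this:
PreparedStatement ps = conn.prepareStatement(
    "INSERT INTO MYTABLE (InterfaceId, XML_COL) VALUES (?, XMLTYPE(?))");
ps.setString(1, interfaceId);
// Bind the XML as a character stream so a large document is sent as a CLOB
// rather than as a length-limited string bind.
ps.setCharacterStream(2, new java.io.StringReader(xml), xml.length());
ps.executeUpdate();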
I have the following HSQLDB schema (as reported by SQLWorkbench):
DROP TABLE TEST CASCADE;
CREATE TABLE TEST
(
   NAME     VARCHAR(256),
   METADATA VARCHAR(2048),
   DATA     BLOB
);
GRANT TRIGGER, INSERT, REFERENCES, DELETE, SELECT, UPDATE ON TEST TO DBA;
Next, I am trying to insert a file into the DATA field using the following prepared statement:
MERGE INTO test USING (VALUES ?, ?, ?) I (name, metadata, data)
ON (test.name=I.name)
WHEN MATCHED THEN UPDATE SET test.data = I.data, test.metadata = I.metadata
WHEN NOT MATCHED THEN INSERT (name, metadata, data) VALUES (I.name, I.metadata, I.data)
Here is the code:
String name = ...;
String metadata = ...;
InputStream data = ...;
JDBCDataSource ds = new JDBCDataSource();
ds.setDatabase("jdbc:hsqldb:file:c:/tmp/file.db");
ds.setUser("sa");
ds.setPassword("");
PreparedStatement set = ds.getConnection().prepareStatement(m_setSql);
set.setString(1, name);
set.setString(2, metadata);
set.setBinaryStream(3, data);
set.executeUpdate();
The setBinaryStream fails, because the parameter type is deemed to be VARCHAR, rather than BLOB. Indeed, the function org.hsqldb.jdbc.JDBCPreparedStatement.setBinStream has the following statement:
if (parameterTypes[parameterIndex - 1].typeCode == Types.SQL_BLOB) {
    setBlobParameter(parameterIndex, x, length);
    return;
}
For parameterIndex 3 it should enter the if-statement and invoke setBlobParameter. But for some reason typeCode returns 12, which corresponds to VARCHAR, so the if-statement is skipped, and in the end an org.hsqldb.HsqlException is raised with the message "incompatible data type in conversion".
What am I doing wrong?
The types of the parameter values in the MERGE statement are unknown and default to VARCHAR. You need to cast the BLOB parameter to BLOB.
MERGE INTO test USING (VALUES ?, ?, CAST(? AS BLOB)) I (name, metadata, data)
ON (test.name=I.name)
WHEN MATCHED THEN UPDATE SET test.data = I.data, test.metadata = I.metadata
WHEN NOT MATCHED THEN INSERT (name, metadata, data) VALUES (I.name, I.metadata, I.data)
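Combining that cast with the code from the question, a corrected sketch (same variable names as above) looks roughly like this:
String mergeSql =
    "MERGE INTO test USING (VALUES ?, ?, CAST(? AS BLOB)) I (name, metadata, data) "
    + "ON (test.name = I.name) "
    + "WHEN MATCHED THEN UPDATE SET test.data = I.data, test.metadata = I.metadata "
    + "WHEN NOT MATCHED THEN INSERT (name, metadata, data) VALUES (I.name, I.metadata, I.data)";

try (Connection con = ds.getConnection();
     PreparedStatement set = con.prepareStatement(mergeSql)) {
    set.setString(1, name);
    set.setString(2, metadata);
    // Parameter 3 is now typed as BLOB, so the binary stream is accepted.
    set.setBinaryStream(3, data);
    set.executeUpdate();
}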
I wonder if there's some way to insert a Byte[] into my database column using an INSERT statement through my SQL editor.
For example
INSERT INTO Temp (id,name) VALUES(1,'rg_book');
I just want to test my data and I don't want to make a user interface (file uploader, etc.).
How do I write this statement?
The CLR Byte array type (Byte[]) maps to a VARBINARY type in Informix DB2. See typing information here.
If your name field is expecting character data, use the VARBINARY function to convert the data into a binary representation of the string. See here.
For example:
INSERT INTO Temp (id, name) VALUES (1, VARBINARY('rg_book'));
If I were you, I would do the following (if I've understood your question correctly):
Create a test console project.
Using a foreach or for loop over your Byte[] array, compose the required INSERT statements and (for example) write them to a file on disk.
Run this script in Management Studio to fill the table.
// Writes one INSERT statement per array element to d:\Inserts.txt.
FileInfo f = new FileInfo(@"d:\Inserts.txt");
Byte[] list = { 0, 1, 2 };
using (StreamWriter w = f.CreateText())
{
    for (int i = 0; i < list.Length; i++)
    {
        w.WriteLine("INSERT INTO [TEMP] ([id], [Name]) VALUES ({0}, 'rg_book')", list[i]);
    }
}