How to solve "Data type BLOB can not be converted to VARCHAR2" - sql

I have created a report with form in Oracle APEX 5.1 in which I have a BLOB column called 'LIEN'. When I insert data into the table and run the application I get this error:
Data type BLOB can not be converted to VARCHAR2!
How can this be solved?

A BLOB is used for binary data, such as images or other binary files.
For long textual fields a CLOB or NCLOB should be used.
To display a BLOB as a string, use a binary-to-text representation such as hex or Base64.
Oracle provides several functions for this purpose, such as rawtohex(COLUMN), utl_raw.cast_to_varchar2(utl_encode.base64_encode(COLUMN)), and others.
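For example, the report query could select a short preview of the column instead of the raw BLOB. A minimal sketch, assuming a table named MY_TABLE holding the 'LIEN' column from the question; dbms_lob.substr limits how many bytes are read so the result fits in a VARCHAR2:

-- Hex preview of the first 100 bytes of the BLOB
SELECT rawtohex(dbms_lob.substr(LIEN, 100, 1)) AS lien_hex
FROM my_table;

-- Base64 version of the first 2000 bytes (the encoded output still fits in 4000 chars)
SELECT utl_raw.cast_to_varchar2(
         utl_encode.base64_encode(dbms_lob.substr(LIEN, 2000, 1))
       ) AS lien_b64
FROM my_table;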

Related

Reading converted HEX data from SQL Server database

I am not sure I am approaching this project correctly. I have a SQL Server database that contains annotation data for images. The annotation data is stored as a blob data type in the database.
First I tried using a SELECT statement to convert the blob to text:
SELECT CONVERT(varchar(max), CONVERT(varbinary(max), blob_column))
FROM table
Then I used an online tool to convert it from HEX to ASCII. I was able to get more data from it. However, most of the converted text is junk when viewed in Notepad++.
Is it possible to convert that junk data into something useful? Is there another approach I should try?
Binary Code
0x08000000050000002B000000030000006100640061000A0100002E0200000000000001000000A047EDE73F000000009DE8E73F010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000FFFF0E00000041006300740069006E00690063002000440061006D0061006700650000FFFFFFFFFFFFFFFF00000000000000009858700BEFB01146B269007757A49399000000000000F03F000000000000F03F010000003F000000010000006E00EC000000320200000000FF0001000000A047EDE73F000000009DE8E73F010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000FFFF0B0000004E00650076007500730028004E00650076006900290000FFFFFFFFFFFFFFFF00000000000000002D2C2B3419D5E846AD84FA0BFC98B7A0000000000000F03F000000000000F03F01000000AC0000000400000062006E0065006F00890100002B010000FF00000001000000A047EDE73F000000009DE8E73F010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000FFFF2200000052002F004F0020004E0065006F0070006C00610073006D0020006F006600200075006E006300650072007400610069006E0020006200650068006100760069006F00720000FFFFFFFFFFFFFFFF00000000000000001BC3C234AA150C42A01F6B2086317E8C000000000000F03F000000000000F03F01000000AC0000000400000062006E0065006F009A0100004F010000FF00000001000000A047EDE73F000000009DE8E73F010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000FFFF2200000052002F004F0020004E0065006F0070006C00610073006D0020006F006600200075006E006300650072007400610069006E0020006200650068006100760069006F00720000FFFFFFFFFFFFFFFF0000000000000000308BB8069D17AF4FB539F319E8BE2B13000000000000F03F000000000000F03F01000000150000000200000061003100160100001E010000FF00FF0001000000A047EDE73F000000009DE8E73F010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000000100000000000000000000000000000001000000000000000000000000000000FFFF090000004D0069006C0064002000410063006E00650000FFFFFFFFFFFFFFFF000000000000000031873239F16ACD48A11A5C0C328705AA000000000000F03F000000000000F03F010000000000000000000000000000000000000000000000
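Looking at the dump itself, the readable labels ("Actinic Damage", "Nevus(Nevi)", "Mild Acne") are stored as UTF-16LE, i.e. two bytes per character, which is why a single-byte varchar conversion interleaves them with junk. A hedged sketch: converting through nvarchar(max) instead interprets the bytes as UTF-16, so the embedded strings become readable, although the surrounding length prefixes and GUID fields will still show up as binary noise that needs real parsing (blob_column and table as in the question):

-- Interpret the bytes as UTF-16LE text rather than single-byte characters
SELECT CONVERT(nvarchar(max), CONVERT(varbinary(max), blob_column))
FROM table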

Data is converted to binary format while loading data into MonetDB using Apache Pig

I am using the MonetDB-Pig layer to load CSV data into MonetDB. Internally it uses binary bulk-load commands to load the data, but after loading, the CSV file values do not match the MonetDB table values (int, double). The data seems to have been converted into a binary format.
How can we get back the actual values in MonetDB?
Table structure that I am using:
CREATE TABLE "test" (
"s_suppkey" INT,
"s_name" CLOB,
"s_address" CLOB,
"s_nationkey" INT,
"s_phone" CLOB,
"s_acctbal" DOUBLE,
"s_comment" CLOB
);
Load command that I am using:
COPY BINARY INTO "test" FROM (
'$PATH/part-1/col-0.bulkload',
'$PATH/part-1/col-1.bulkload',
'$PATH/part-1/col-2.bulkload',
'$PATH/part-1/col-3.bulkload',
'$PATH/part-1/col-4.bulkload',
'$PATH/part-1/col-5.bulkload',
'$PATH/part-1/col-6.bulkload'
);
Please convert the byte buffer from big-endian to little-endian, and check again.
The information provided is insufficient to isolate the issue. The most probable cause is a misalignment of the number of values in each of the binary columns.
Check the size of the elements in the 's_acctbal' input file to see whether it produced float instead of double binary values.
By the way, the MonetDB-Pig project is not actively maintained, but we welcome patches.
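As a quick sanity check on that last point (a sketch, assuming the load completed), compare the table's row count against the size of the s_acctbal bulkload file: a DOUBLE column should take exactly 8 bytes per row.

-- Row count to compare against the binary file size
SELECT COUNT(*) AS row_count FROM "test";
-- col-5.bulkload should be row_count * 8 bytes for DOUBLE values;
-- row_count * 4 bytes would mean 4-byte floats were written instead.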

How can I read a very long BLOB column in Oracle?

I want to connect a Node Express API to an Oracle 11g database which has a table with a BLOB column. I want to read it using a SQL query, but the problem is that the BLOB column can hold very long text, more than 100k characters. How can I do this?
I tried using: select utl_raw.cast_to_varchar2(dbms_lob.substr(COLUMN_NAME)) from TABLE_NAME.
But it returns 'raw variable length too long'.
I could make multiple queries in a loop and then join the results if necessary, but I haven't found how to bring back just a part of the BLOB.
Use the node-oracledb module to access Oracle Database (which you are probably already doing, but don't mention).
By default, node-oracledb will return LOBs as Lob instances that you can stream from. Alternatively you can fetch the data directly as a String or Buffer, which is useful for 'small' LOBs. For 100K, I would just get the data as a Buffer, which you can do by setting:
oracledb.fetchAsBuffer = [ oracledb.BLOB ];
Review the Working with CLOB, NCLOB and BLOB Data documentation, and examples like blobhttp.js and the other lob*.js files in the examples directory.
You may also want to look at https://jsao.io/2018/03/creating-a-rest-api-with-node-js-and-oracle-database/ which shows Express and node-oracledb.
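If you do want to fetch the BLOB a piece at a time in SQL instead, note that the 'raw variable length too long' error happens because dbms_lob.substr defaults to reading 32767 bytes, which exceeds the RAW size limit in a SQL context. Passing an explicit amount and offset reads one chunk per query (a sketch; COLUMN_NAME and TABLE_NAME as in the question):

-- Read one 2000-byte chunk starting at :offset (use 1, 2001, 4001, ... in a loop)
SELECT utl_raw.cast_to_varchar2(
         dbms_lob.substr(COLUMN_NAME, 2000, :offset)
       ) AS chunk
FROM TABLE_NAME;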

Insert JSON file into SQL Server database

Hi, how can I insert a JSON file into a cell in a database? I don't want to store the file path; I want to store the whole content of the JSON file in the field.
What can I do?
JSON data is stored and transferred as a string. You can store it in a normal NVARCHAR column.
How large is the JSON text? Depending on that, you should use a VARCHAR field if the content is not large, or a CLOB if there is a lot of JSON text.
JSON is just text, so you just need something to read the content of the file, maybe some Transact-SQL script, and insert it into your table.
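If the file sits on a path the SQL Server service account can read, one Transact-SQL way to do the whole thing is OPENROWSET with the SINGLE_CLOB option. A minimal sketch; the table, column, and file path below are assumptions:

-- Load the entire file as one text value and insert it (names/path hypothetical)
INSERT INTO dbo.json_store (doc)
SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\data\payload.json', SINGLE_CLOB) AS src;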

Saving text file in db and retrieving

I have to store a text file in the form of a byte array, read it back from the database, and write it out to a text file again. What can I do? I am using SQL Server 2008 R2 and VB.NET.
You could use varbinary(max) for the data type in SQL. If saving space, change max to a smaller number, as noted here: http://msdn.microsoft.com/en-us/library/ms188362.aspx
You can convert the text file to a byte[] and store it in a column of the image data type. When you retrieve this data from the database, you will need to cast it to byte[]; then, using a FileStream, you can write it back out to a file.
Following are some helpful links
http://www.aspdotnet-suresh.com/2011/01/how-to-insert-images-into-database-and.html
http://social.msdn.microsoft.com/Forums/en/netfxbcl/thread/42cec0cb-5761-4aaa-93dc-861b29ee5ea6
Hope this is what you are looking for.
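For reference, a minimal T-SQL sketch of the storage side (the table and column names are hypothetical); the byte array produced in VB.NET is simply bound as the varbinary parameter:

-- Table to hold the file bytes (names are hypothetical)
CREATE TABLE dbo.text_files (
    id INT IDENTITY PRIMARY KEY,
    file_name NVARCHAR(260),
    contents VARBINARY(MAX)
);

-- From VB.NET, run a parameterized insert such as:
--   INSERT INTO dbo.text_files (file_name, contents) VALUES (@name, @bytes)
-- where @bytes is the Byte() array read with File.ReadAllBytes.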