How can I read a very long BLOB column in Oracle? - sql

I want to connect a Node Express API to an Oracle 11g database which has a table with a BLOB column. I want to read it using a SQL query, but the problem is that the BLOB column can hold very long text, more than 100k characters. How can I do this?
I tried using: select utl_raw.cast_to_varchar2(dbms_lob.substr(COLUMN_NAME)) from TABLE_NAME.
But it returns 'raw variable length too long'.
I could make multiple queries in a loop and then join the results if necessary, but I haven't found out how to bring back just a part of the BLOB.
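For what it's worth, dbms_lob.substr accepts an amount and an offset, so individual pieces of the BLOB can be pulled back in a plain query; a minimal sketch, assuming a 2000-byte chunk size (column and table names as in the question):
-- dbms_lob.substr(lob, amount, offset): amount is in bytes for a BLOB, offset is 1-based.
-- Each expression returns one chunk; repeating with increasing offsets rebuilds the full value.
select utl_raw.cast_to_varchar2(dbms_lob.substr(COLUMN_NAME, 2000, 1))    as chunk_1,
       utl_raw.cast_to_varchar2(dbms_lob.substr(COLUMN_NAME, 2000, 2001)) as chunk_2
from TABLE_NAME;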

Use the node-oracledb module to access Oracle Database (which you are probably already doing, but didn't mention).
By default, node-oracledb will return LOBs as Lob instances that you can stream from. Alternatively you can fetch the data directly as a String or Buffer, which is useful for 'small' LOBs. For 100K, I would just get the data as a Buffer, which you can do by setting:
oracledb.fetchAsBuffer = [ oracledb.BLOB ];
Review the Working with CLOB, NCLOB and BLOB Data documentation, and examples like blobhttp.js and the other lob*.js files in the examples directory.
You may also want to look at https://jsao.io/2018/03/creating-a-rest-api-with-node-js-and-oracle-database/ which shows Express and node-oracledb.

Related

Get a clob output of a stored procedure over a dblink

I have a procedure that runs queries on a few tables and manipulates the output into a clob that it returns. I need to call this procedure in a remote database over a dblink and get the clob value that the procedure returns. I know that we cannot access non-scalar data like clobs over a dblink. I know that if the clob were in a table on the remote side, I could just create a global temp table on the local side and do an insert into my local temp table with a select over the remote table. But in my case, the clob is a manipulated output of the procedure.
Any suggestions on how I can do this?
On the remote database, create a function to wrap around the procedure and return the CLOB as its return value. Then create a view that selects from this function and exposes the CLOB as a column. You should be able to query that CLOB column through the view remotely over a database link. I know this can work as I pull CLOB data over dblinks thousands of times a day in utilities I wrote, though I do remember it taking a bit of trial-and-error to make it happy.
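A minimal sketch of that arrangement, assuming a hypothetical procedure with an OUT CLOB parameter (the object and link names are placeholders):
-- Remote database: wrap the existing procedure in a function that returns the CLOB.
CREATE OR REPLACE FUNCTION get_report_clob RETURN CLOB IS
  v_result CLOB;
BEGIN
  my_report_procedure(v_result);   -- hypothetical procedure with an OUT CLOB parameter
  RETURN v_result;
END;
/

-- Remote database: expose the function result as a column.
CREATE OR REPLACE VIEW report_clob_v AS
  SELECT get_report_clob() AS report_clob FROM dual;

-- Local database: query the CLOB column through the view over the database link.
SELECT report_clob FROM report_clob_v@remote_db_link;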
If you cannot get that to work, there are a number of other workarounds available. One involves a remote package that declares collection types, which a remote function in that package can use to disassemble the CLOB into a collection of varchar2(32767) records and return to the calling database; the local side, referencing the remote package's types via @dblink, can then reassemble a local CLOB from the collection contents. But this kind of heavy-handed workaround really shouldn't be necessary.
Lastly, I should at least mention that using CLOBs for structured data is not a good design choice. CLOBs should hold only unstructured data, the kind that is meaningful only to humans (like log files, free-form notes, user-entered descriptions, etc.). They should never be used to combine multiple pieces of meaningful structured data that a program is meant to interpret and work with. There are many other constructs that would handle that better than a CLOB.
I think that CLOB should be split into chunks of varchar2(4000) and stored in a global temporary table with ON COMMIT PRESERVE ROWS, so that over the DB link you only select from the table that contains the chunks of the CLOB and a column that indicates their order. That means creating a procedure in the remote DB which calls the procedure generating the CLOB, splits that CLOB into chunks, and inserts them into the global temporary table, as sketched below.
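A rough sketch of that approach (the object and link names, and the 4000-character chunk size, are placeholders):
-- Remote database: a temporary table that keeps its rows for the session, plus a staging procedure.
CREATE GLOBAL TEMPORARY TABLE clob_chunks (
  chunk_no   NUMBER,
  chunk_text VARCHAR2(4000)
) ON COMMIT PRESERVE ROWS;

CREATE OR REPLACE PROCEDURE stage_clob_chunks IS
  v_clob   CLOB;
  v_len    INTEGER;
  v_offset INTEGER := 1;
  v_no     INTEGER := 0;
BEGIN
  my_report_procedure(v_clob);        -- hypothetical procedure that builds the CLOB
  v_len := DBMS_LOB.GETLENGTH(v_clob);
  DELETE FROM clob_chunks;            -- clear leftovers from a previous call in this session
  WHILE v_offset <= v_len LOOP
    v_no := v_no + 1;
    INSERT INTO clob_chunks (chunk_no, chunk_text)
    VALUES (v_no, DBMS_LOB.SUBSTR(v_clob, 4000, v_offset));  -- 4000 characters; use fewer for multibyte data
    v_offset := v_offset + 4000;
  END LOOP;
END;
/

-- Local database: populate the chunks remotely, then pull them back over the link in order.
BEGIN
  stage_clob_chunks@remote_db_link;
END;
/
SELECT chunk_text FROM clob_chunks@remote_db_link ORDER BY chunk_no;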

Exporting SQL Server table containing a large text column

I have to export a table from a SQL Server, the table contains a column that has a large text content with the maximum length of the text going up to 100,000 characters.
When I use Excel as an export destination, I find that the length of this text is capped and truncated to 32,765 characters.
Is there an export format that preserves the length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As is documented in Excel specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767; hence why your data is being truncated.
You might be better off exporting to a CSV; however, note that quote-identified CSV files aren't supported by bcp/BULK INSERT until SQL Server 2019 (currently in preview). You can use characters like || to denote a field delimiter; however, if you have any line breaks you'll need to choose a different row delimiter too. SSIS and other ETL tools, however, do support quote-identified CSV files, so you can use something like that.
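For example, a load on the destination server using an alternative field delimiter might look something like this (the table name, file path, and delimiters are placeholders):
-- Destination SQL Server: load a delimited export of the table with the long text column.
BULK INSERT dbo.TargetTable
FROM 'C:\export\data.txt'
WITH (
    FIELDTERMINATOR = '||',   -- a sequence unlikely to appear in the long text column
    ROWTERMINATOR   = '0x0a'  -- pick a custom terminator if the text itself contains line breaks
);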
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I actually personally don't recommend due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, and naming each file with the value of your Primary Key included. Then, when you import the data back you can retrieve the (n)varchar(MAX) value again from each individual file.
A .sql script is the best format for a SQL table. It is the native format, so you don't have to convert the export.

Compress Oracle Query Column

I have read access to an Oracle database. I am using cx_Oracle to make queries. One of the table columns is a CLOB with XML strings. To speed up network access, I figured I would try to ask the database to compress the XML data before it sends it, because the network link is very slow. Is there any way to do this? I would uncompress the data on my end. I'm looking for something like:
SELECT COMPRESS(clob_column) AS comp_data
FROM table1
WHERE id=1
Thanks

How to read data stored in blobs

My application's database is PostgreSQL, and from the documentation I understand that it stores some data in blobs; from the table I can only get their OIDs.
Is there any possibility to read the content of these blobs? If yes, could someone share the know-how?
From the OID, a file with the contents of the large object can be exported.
Either client-side (psql):
\lo_export oid-to-export /path/to/a/file
Or server-side in SQL (this creates the file on the server; beware that postgres must have permission to write into the destination directory):
SELECT lo_export(oid-to-export, '/path/to/a/file');

How to determine content type of binary data in the image field of SQL Server 2008?

I need to determine file type (i.e., MimeType) of stored data in the SQL Server 2008.
Is there any way, if possible using a SQL query, to identify the content type or MIME type of the binary data stored in the image column?
I think that, if you need that information, it would probably be better to store it in a separate column. Once it's in the DB, your only options really are guessing it from the file name (if you happen to store that) or by detecting the signature from the first few bytes of data.
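If you do want to guess from the data itself, a signature check along these lines is possible (the table and column names are placeholders, and the list of signatures is far from complete):
-- Compare the leading bytes of the image column against a few well-known file signatures.
SELECT id,
       CASE
         WHEN SUBSTRING(CAST(img_data AS varbinary(max)), 1, 3) = 0xFFD8FF           THEN 'image/jpeg'
         WHEN SUBSTRING(CAST(img_data AS varbinary(max)), 1, 8) = 0x89504E470D0A1A0A THEN 'image/png'
         WHEN SUBSTRING(CAST(img_data AS varbinary(max)), 1, 3) = 0x474946           THEN 'image/gif'
         WHEN SUBSTRING(CAST(img_data AS varbinary(max)), 1, 4) = 0x25504446         THEN 'application/pdf'
         ELSE 'application/octet-stream'
       END AS guessed_mime_type
FROM dbo.MyTable;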
There is no direct way in SQL Server to do that - there's no metadata on binary columns stored inside SQL Server, unless you've done it yourself.
For SQL Server, a blob is a blob is a blob - it's just a bunch of bytes, and SQL Server knows nothing about it, really. You need to have that information available from other sources, e.g. by storing a file name, file extension, mime type or something else in a separate column.
Marc