I need to select several clobs as a nested table.
create table t (vc_val varchar2(100), clob_val clob);
create type varchar_t as table of varchar2(100);
create type clob_t as table of clob;
The following query works fine:
select cast(collect(vc_val) as varchar_t) from t;
And the following fails. Why?
select cast(collect(clob_val) as clob_t) from t;
Link to this example http://sqlfiddle.com/#!4/b01e7/3
Can someone explain why the second query fails?
It doesn't work because CAST doesn't support LOB types.
You can read about this in Oracle's Documentation: CAST Function In Oracle
Using your test data from SQLFiddle, CAST can convert a CLOB to a VARCHAR2:
SELECT CAST(clob_val AS VARCHAR2(100)) FROM t;
Result:
CAST(CLOB_VALASVARCHAR2(100))
-----------------------------
clob1
clob2
But we can't do it the other way around, the CLOBs are just not supported:
SELECT CAST(vc_val AS CLOB) FROM t;
> 00932. 00000 - "inconsistent datatypes: expected %s got %s"
However, CLOBs that already exist can be collected into a nested table using CAST + MULTISET:
CREATE OR REPLACE TYPE t_clob_tab as table of clob;
declare
l_clob_tab t_clob_tab;
begin
-- collect some data as clobs into a nested table
select
cast(multiset(
select to_clob(object_name)
from dba_objects
where rownum <= 10)
as t_clob_tab)
into l_clob_tab
from dual;
-- show the data
for i in 1 .. l_clob_tab.count
loop
dbms_output.put_line('Clob' || i || ' Value is: ' || l_clob_tab(i));
end loop;
end;
/
Output:
Clob1 Value is: C_OBJ#
Clob2 Value is: I_OBJ#
Clob3 Value is: TAB$
Clob4 Value is: CLU$
Clob5 Value is: C_TS#
Clob6 Value is: I_TS#
Clob7 Value is: C_FILE#_BLOCK#
Clob8 Value is: I_FILE#_BLOCK#
Clob9 Value is: C_USER#
Clob10 Value is: I_USER#
As for the CAST function support for LOB types:
CAST does not directly support any of the LOB data types. When you use
CAST to convert a CLOB value into a character data type or a BLOB
value into the RAW data type, the database implicitly converts the LOB
value to character or raw data and then explicitly casts the resulting
value into the target data type. If the resulting value is larger than
the target type, then the database returns an error.
This seems to refer to converting from a CLOB to a VARCHAR2. But if you already have CLOBs, you should be able to put them into a collection (a nested table in this case).
I typically use CAST + MULTISET instead of COLLECT; I think it's easier and less fussy. I think your problem is with COLLECT + CAST here, not CAST itself (a similar issue exists with NUMBER precisions).
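For reference, against the table and type from your question, that CAST + MULTISET pattern in plain SQL would look like this (untested sketch):
select cast(multiset(select clob_val from t) as clob_t) from dual;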
EDIT:
I removed any suggestion of using the COLLECT function; although I could use it without error in a simple SELECT, I could not use it in PL/SQL. Also, in addition to the CAST + MULTISET option above (SQL or PL/SQL), you can (in PL/SQL anyway) simply do:
select clob_val
bulk collect into l_clob_tab
from t;
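As a complete anonymous block (a sketch reusing the t_clob_tab type declared above and the table t from the question):
declare
  l_clob_tab t_clob_tab;
begin
  -- fetch every clob in the table straight into the nested table
  select clob_val
  bulk collect into l_clob_tab
  from t;
  -- show how many rows were fetched
  dbms_output.put_line('Fetched ' || l_clob_tab.count || ' clob(s)');
end;
/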
Hope that helps.
Related
I have a table with an XML column and am trying to sum the values in an XML tag.
Table created:
CREATE TABLE XML_TABLE6
(
XML_COL VARCHAR2(2000 BYTE)
);
Insert into XML_TABLE6
(XML_COL)
Values
('<a><b>1</b><b>2</b></a>');
COMMIT;
I am using the below select statement to return the expression as data type "double", but I am getting the error "ORA-00905: missing keyword".
SQL query:
select XMLCast(XMLQuery('sum(a/b)' RETURNING CONTENT)
as double) from xml_table6;
Expected output: 3.0
There are some issues in the query:
- You didn't specify the column you want to take as input (the XML_passing_clause).
- You need to explicitly cast the column to an XMLType instance to process it with XML functions.
- Oracle doesn't have a double data type. See the numeric data types in the documentation. From the XMLCAST documentation:
The datatype argument can be of data type NUMBER, VARCHAR2, CHAR, CLOB, BLOB, REF XMLTYPE, and any of the datetime data types.
After you've fixed these issues, it works fine:
with XML_TABLE6(XML_COL) as (
select '<a><b>1</b><b>2</b></a>'
from dual
)
select xmlcast(
XMLQuery('sum(a/b)' passing xmltype(XML_COL) RETURNING CONTENT)
as binary_double
) as res
from XML_TABLE6
|RES|
|:--|
|3.0E+000|
db<>fiddle
In Oracle, the ANSI data type is double precision, not just double.
You also need to pass in the actual column value, and as that's a string, convert it to XMLType:
select
XMLCast(
XMLQuery('sum(a/b)' PASSING XMLType(xml_col) RETURNING CONTENT)
as double precision)
from xml_table6;
Or use a normal number data type:
select
XMLCast(
XMLQuery('sum(a/b)' PASSING XMLType(xml_col) RETURNING CONTENT)
as number
)
from xml_table6;
Or binary_double:
select
XMLCast(
XMLQuery('sum(a/b)' PASSING XMLType(xml_col) RETURNING CONTENT)
as binary_double
)
from xml_table6;
db<>fiddle
I have SQL stored in a column, where the date format is given in square brackets:
sql_column
------------------------------------------------------------------------------
select col1 from table1 where col2 = to_date('[DD-MON-YYYY]', 'DD-MON-YYYY');
select col1 from table2 where col2 between [YYYYMMDD]00000 and [YYYYMMDD]99999 and col3 = to_date('[DD-MON-YYYY]', 'DD-MON-YYYY');
....
I don't know which format can be there beforehand, but it is always a date format.
Is there a way to use regexp_replace (or regexp_substr, or other regexp_* functions) to find the pattern and replace it with the result of TO_CHAR of my given date, using the format mask taken from the db column?
Perhaps something like this (which obviously doesn't work):
select sql_column,
regexp_replace(sql_column, '\[(.+?)\]', to_char(some_date, '\1'))
from my_table;
Could you please help?
There is no way to parse ALL the possible date formats. For example, if you see '01/11/2017' in a varchar field you cannot say if it refers to November 1st or January 11th.
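For instance, the same string parses to two different dates depending on the mask you assume:
select to_date('01/11/2017', 'DD/MM/YYYY') as november_1st,
       to_date('01/11/2017', 'MM/DD/YYYY') as january_11th
from dual;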
That said, you can use a common table expression to choose the best pattern and then use it to convert the string value to a date. For example:
select * from
(select
case
when REGEXP_LIKE(col2, '^[0-9]{2}/[0-9]{2}/[0-9]{4}$') then 'DD/MM/YYYY'
when REGEXP_LIKE(col2, '^[0-9]{14}$') then 'YYYYMMDDHH24MISS'
end pattern, table1.*
from table1) x
where to_date(x.col2, x.pattern) = to_date('01/11/2017','DD/MM/YYYY')
This approach would probably lead to a full table scan, so it is far from efficient. If the original table is large, I advise creating a materialized view with the converted date to improve performance.
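A minimal sketch of that materialized view, reusing the two example patterns from the query above (the view name and the assumption that only these two formats occur are illustrative):
create materialized view table1_dates_mv as
select t.*,
       case
         when regexp_like(col2, '^[0-9]{2}/[0-9]{2}/[0-9]{4}$') then to_date(col2, 'DD/MM/YYYY')
         when regexp_like(col2, '^[0-9]{14}$')                  then to_date(col2, 'YYYYMMDDHH24MISS')
       end as col2_as_date
from table1 t;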
You'll need a bit of dynamic SQL to solve this problem.
I do not present a complete solution, but this should give you a hint on how to approach it.
Let's approach it bottom-up.
What you actually need is a REPLACE statement that transforms your SQL text into the required form with the parameter date.
For your first example (parameter date 30-NOV-2017), this could be the statement:
select replace(sql_column, '[DD-MON-YYYY]','30-NOV-2017') from my_table;
If you run the resulting statement on your first row, you get the expected result:
select col1 from table1 where col2 = to_date('30-NOV-2017', 'DD-MON-YYYY');
So how do you get those REPLACE statements? One possibility is to write a PL/SQL function.
The function has two parameters: the original SQL string and the parameter date.
Using a regexp you extract the date format mask.
With dynamic SQL (EXECUTE IMMEDIATE) you format your parameter DATE as a string with the proper format.
Finally, the REPLACE statement is returned.
create or replace function format_date(i_txt IN VARCHAR2, i_date DATE) return VARCHAR2 is
v_date_format VARCHAR2(4000);
v_form_date VARCHAR2(4000);
v_param VARCHAR2(4000);
v_form VARCHAR2(4000);
v_sql VARCHAR2(4000);
BEGIN
v_param := regexp_substr(i_txt, '\[(.+?)\]');
v_date_format := replace(replace(v_param,'[',null),']',null);
v_sql := 'select to_char(:d,'''||v_date_format||''') as my_dt from dual';
execute immediate v_sql into v_form_date using i_date;
v_form := 'select replace(sql_column, '''||v_param||''','''||v_form_date||''') from my_table';
return (v_form);
END;
/
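A quick way to try it, assuming the function above compiles and the session's date language is English (so the month comes out as NOV):
select format_date(
         'select col1 from table1 where col2 = to_date(''[DD-MON-YYYY]'', ''DD-MON-YYYY'');',
         date '2017-11-30'
       ) as replace_stmt
from dual;
-- expected result:
-- select replace(sql_column, '[DD-MON-YYYY]','30-NOV-2017') from my_table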
NOTE that I handle only the first date mask in the string; you'll need to loop over all occurrences to handle the second example correctly!
I have a query which gives the following output:
Could not determine polymorphic type because input has type "unknown"
Query:
select ( array_to_string(array_agg(name), ', '))::text as name,path
from(select 'fullpath' as Path,null as id,'' as name
from tblabc where key = 'key1' and value = '1'
) as e
group by path;
I have a Postgres database.
The issue here is that '' as name doesn't actually specify a type for the value. It's the unknown type, and PostgreSQL usually infers the real type from things like what column you're inserting it into or what function you pass it to.
In this case, you pass it to array_agg, which is a polymorphic function. It can take inputs of the pseudo-type anyelement, which really just means "figure it out at runtime".
PostgreSQL would still figure it out except that array_to_string doesn't actually take a text[] as input. It takes anyarray - another polymorphic type, like anyelement for arrays.
So there's nothing in the query to tell PostgreSQL what type that '' is. It could guess you meant text, but it's a bit too fussy for that. So it complains. The issue simplifies down to:
regress=> SELECT array_to_string(array_agg(''), ',');
ERROR: could not determine polymorphic type because input has type "unknown"
To solve this, write a typed literal:
TEXT '' AS name
or use a cast:
CAST('' AS text) AS name
or the PostgreSQL shorthand:
''::text
Examples:
regress=> SELECT array_to_string(array_agg(TEXT ''), ',');
array_to_string
-----------------
(1 row)
regress=> SELECT array_to_string(array_agg(''::text), ',');
array_to_string
-----------------
(1 row)
regress=> SELECT array_to_string(array_agg(CAST('' AS text)), ',');
array_to_string
-----------------
(1 row)
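Applied to the original query, the only change needed is the cast on the empty string (a sketch against the tblabc table from the question):
select (array_to_string(array_agg(name), ', '))::text as name, path
from (select 'fullpath' as path, null as id, ''::text as name
      from tblabc where key = 'key1' and value = '1'
     ) as e
group by path;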
I want to reverse a CLOB value in Oracle in the same way as we do for string fields with the help of the 'reverse' function. Is there any inbuilt method for that? Google was not much help. Being a newbie in SQL, I don't know whether it is even possible. I initially thought
that the 'reverse' function could be used for CLOB fields as well, but it's not working. Here is the example I have tried:
drop table test;
create table test
(
name varchar2(4000),
description clob
);
insert into test values ('aadinath', 'I have to reverse a clob data type value');
select reverse(name) from test;
output= htanidaa
select reverse(name), reverse(description) from test;
output= ORA-00932: inconsistent datatypes: expected CHAR got CLOB
00932. 00000 - "inconsistent datatypes: expected %s got %s"
You need to convert clob to varchar2 first. Then perform the reverse.
Reference 1:
The package used to translate the CLOB data type into VARCHAR2 is DBMS_LOB. The DBMS_LOB package provides subprograms to operate on BLOBs, CLOBs, NCLOBs, BFILEs, and temporary LOBs. You can use DBMS_LOB to access and manipulate specific parts of a LOB or complete LOBs. DBMS_LOB can read and modify BLOBs, CLOBs, and NCLOBs; it provides read-only operations for BFILEs.
Syntax:
DBMS_LOB.SUBSTR (lob_loc, amount, offset)
dbms_lob.substr( clob_column, for_how_many_bytes, from_which_byte );
Parameter Description:
lob_loc: Locator for the LOB to be read i.e CLOB column name.
amount: Number of bytes (for BLOBs) or characters (for CLOBs) to be read.
offset: Offset in bytes (for BLOBs) or characters (for CLOBs) from the start of the LOB.
Example:
CREATE OR REPLACE VIEW temp_view
AS
SELECT
column1, -- datatype numeric
column2, -- datatype varchar()
DBMS_LOB.SUBSTR(column3, 2000,1) as column3, -- datatype CLOB
column4 -- datatype numeric
FROM temp_table;
Note: In this example I am reading the first 2000 characters.
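Putting that together with the test table from the question (a sketch: REVERSE is undocumented, and this only reverses as much of the CLOB as fits in the 4000-character SQL VARCHAR2 limit):
select name,
       reverse(dbms_lob.substr(description, 4000, 1)) as reversed_description
from test;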
I am trying to see from an SQL console what is inside an Oracle BLOB.
I know it contains a somewhat large body of text and I want to just see the text, but the following query only indicates that there is a BLOB in that field:
select BLOB_FIELD from TABLE_WITH_BLOB where ID = '<row id>';
The result I'm getting is not quite what I expected:
BLOB_FIELD
-----------------------
oracle.sql.BLOB#1c4ada9
So what kind of magic incantations can I do to turn the BLOB into its textual representation?
PS: I am just trying to look at the content of the BLOB from an SQL console (Eclipse Data Tools), not use it in code.
First of all, you may want to store text in CLOB/NCLOB columns instead of BLOB, which is designed for binary data (your query would work with a CLOB, by the way).
The following query will let you see the first 32767 characters (at most) of the text inside the blob, provided all the character sets are compatible (the original character set of the text stored in the BLOB and the character set of the database used for VARCHAR2):
select utl_raw.cast_to_varchar2(dbms_lob.substr(BLOB_FIELD)) from TABLE_WITH_BLOB where ID = '<row id>';
SQL Developer provides this functionality too:
Double-click the results grid cell, and click edit:
Then on the top-right part of the pop-up, choose "View As Text" (you can even see images).
And that's it!
You can use the SQL below to read BLOB fields from a table.
SELECT DBMS_LOB.SUBSTR(BLOB_FIELD_NAME) FROM TABLE_NAME;
Use this SQL to get the first 2000 chars of the BLOB.
SELECT utl_raw.cast_to_varchar2(dbms_lob.substr(<YOUR_BLOB_FIELD>,2000,1)) FROM <YOUR_TABLE>;
Note: this is because DBMS_LOB.SUBSTR on a BLOB returns RAW, and a RAW value in SQL is limited to 2000 bytes.
If you want to search inside the text, rather than view it, this works:
with unzipped_text as (
select
my_id
,utl_compress.lz_uncompress(my_compressed_blob) as my_blob
from my_table
where my_id='MY_ID'
)
select * from unzipped_text
where dbms_lob.instr(my_blob, utl_raw.cast_to_raw('MY_SEARCH_STRING'))>0;
I can get this to work using TO_CLOB (docs):
select
to_clob(BLOB_FIELD)
from
TABLE_WITH_BLOB
where
ID = '<row id>';
This works for me in Oracle 19c, with a BLOB field larger than the VARCHAR2 limit. I get readable text (from a JSON-holding BLOB).
Barn's answer worked for me with modification because my column is not compressed. The quick and dirty solution:
select * from my_table
where dbms_lob.instr(my_UNcompressed_blob, utl_raw.cast_to_raw('MY_SEARCH_STRING'))>0;
I struggled with this for a while and implemented the PL/SQL solution, but later realized that in Toad you can simply double-click on the results grid cell, and it brings up an editor with the contents as text. (I'm on Toad v11.)
In case your text is compressed inside the BLOB using the DEFLATE algorithm and it's quite large, you can use this function to read it:
CREATE OR REPLACE PACKAGE read_gzipped_entity_package AS
FUNCTION read_entity(entity_id IN VARCHAR2)
RETURN VARCHAR2;
END read_gzipped_entity_package;
/
CREATE OR REPLACE PACKAGE BODY read_gzipped_entity_package IS
FUNCTION read_entity(entity_id IN VARCHAR2) RETURN VARCHAR2
IS
l_blob BLOB;
l_blob_length NUMBER;
l_amount BINARY_INTEGER := 10000; -- must be <= ~32765.
l_offset INTEGER := 1;
l_buffer RAW(20000);
l_text_buffer VARCHAR2(32767);
BEGIN
-- Get uncompressed BLOB
SELECT UTL_COMPRESS.LZ_UNCOMPRESS(COMPRESSED_BLOB_COLUMN_NAME)
INTO l_blob
FROM TABLE_NAME
WHERE ID = entity_id;
-- Figure out how long the BLOB is.
l_blob_length := DBMS_LOB.GETLENGTH(l_blob);
-- We'll loop through the BLOB as many times as necessary to
-- get all its data.
FOR i IN 1..CEIL(l_blob_length/l_amount) LOOP
-- Read in the given chunk of the BLOB.
DBMS_LOB.READ(l_blob
, l_amount
, l_offset
, l_buffer);
-- The DBMS_LOB.READ procedure dictates that its output be RAW.
-- This next call converts that RAW data to character data and appends it to the result.
l_text_buffer := l_text_buffer || UTL_RAW.CAST_TO_VARCHAR2(l_buffer);
-- For the next iteration through the BLOB, bump up your offset
-- location (i.e., where you start reading from).
l_offset := l_offset + l_amount;
END LOOP;
RETURN l_text_buffer;
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('!ERROR: ' || SUBSTR(SQLERRM,1,247));
END;
END read_gzipped_entity_package;
/
Then run a select to get the text:
SELECT read_gzipped_entity_package.read_entity('entity_id') FROM DUAL;
Hope this will help someone.
You can try this:
SELECT TO_CHAR(dbms_lob.substr(BLOB_FIELD, 3900)) FROM TABLE_WITH_BLOB;
However, it would be limited to 4000 bytes.
Worked for me,
select lcase((insert(
insert(
insert(
insert(hex(BLOB_FIELD),9,0,'-'),
14,0,'-'),
19,0,'-'),
24,0,'-'))) as FIELD_ID
from TABLE_WITH_BLOB
where ID = 'row id';
Use the TO_CHAR function.
select TO_CHAR(BLOB_FIELD) from TABLE_WITH_BLOB where ID = '<row id>'
Converts NCHAR, NVARCHAR2, CLOB, or NCLOB data to the database character set. The value returned is always VARCHAR2.