Oracle SQL - Read data from a file for IN clause

Is it possible to read data from a file to supply the data for IN clause?
SQL> SELECT a,b from TABLE123 where type=10 and values IN('file.txt');
The file.txt has list of values.
I cannot use a subquery because the table on which the subquery would be applied is on a different database.
EDIT: I would prefer not to create a temporary table

Assuming that you have copied the file.txt file to the Oracle server (under the ext_tab_data directory):
CREATE TABLE countries_ext (
country_code VARCHAR2(5),
country_name VARCHAR2(50),
country_language VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY ext_tab_data
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(
country_code CHAR(5),
country_name CHAR(50),
country_language CHAR(50)
)
)
LOCATION ('Countries1.txt','Countries2.txt')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
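Note that the ext_tab_data directory object must already exist and be readable by your user before the CREATE TABLE above will work; a minimal sketch (the file-system path and grantee here are assumptions, not from the question):

```sql
-- Hypothetical setup: create the directory object the external table refers to.
-- The OS path and the user being granted access are placeholders.
CREATE OR REPLACE DIRECTORY ext_tab_data AS '/u01/app/oracle/ext_tab_data';
GRANT READ, WRITE ON DIRECTORY ext_tab_data TO scott;
```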
Full details are in the Oracle external tables documentation.
Here is your SQL:
SELECT a,b from TABLE123
where type=10
and values IN(select country_code from countries_ext);
PS: of course you can replace your files, which would replace the contents of your external table...

Directly as stated, no. Somewhere a table-like entity must be defined.

If you don't mind copying and editing your text file, you can copy file.txt to file.sql, add SELECT a,b from TABLE123 where type=10 and values IN( to the beginning of the file, add ); to the end of the file, and add commas and quotes as needed to each line.
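After that editing, file.sql would look something like this (the values are hypothetical placeholders for the lines of file.txt):

```sql
SELECT a, b
FROM TABLE123
WHERE type = 10
AND values IN (
'value1',
'value2',
'value3'
);
```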
Then from SQL*Plus you can just run the file:
SQL> @file.sql
Otherwise no, there's no way to do it without temporarily getting the file data into a table of some sort. @MaxU referenced the method I would choose to use.

Related

Cloned Snowflake table, why is every column in quotes?

I have a table of varchar values; when I copy this table by cloning it, the entire table has quotation marks around every varchar value.
For example, 12/8/2017 becomes "12/8/2017" and Finance becomes "Finance".
Wondering: (A) why did this happen, and (B) is there any way to fix it?
So I tried to think of a scenario where this might happen and I found this:
CREATE OR REPLACE TABLE demo_db.public.employees
(emp_id number,
first_name varchar,
last_name varchar
);
-- Populate the table with some seed records.
Insert into demo_db.public.employees
values(100,'"John"','"Smith"'),
(200,'Sam','White'),
(300,'Bob','Jones'),
(400,'Linda','Carter');
SELECT * FROM demo_db.public.employees;
CREATE OR REPLACE TABLE demo_db.public.employees_clone
CLONE employees;
From demo: https://community.snowflake.com/s/article/cloning-in-snowflake
You may notice that I had to wrap the double quotes in single quotes for the INSERT statement to accept the data. I tried the same INSERT against the cloned table below and received an error.
INSERT INTO demo_db.public.employees_clone VALUES(500,""'Mike'"",'Jones');
However this worked:
INSERT INTO demo_db.public.employees_clone VALUES(500,'"Mike"','Jones');
The SELECT * of the clone showed the same quoted values. Checking the column types:
desc table demo_db.public.employees_clone;
So the type was still varchar; the value simply contained a literal double-quote character in the string.
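If the quotes really are stored in the data, one way to fix the clone is to strip them in place; a sketch, assuming every double-quote character should be removed (Snowflake's TRIM takes the characters to remove as a second argument):

```sql
-- Remove literal double-quote characters from the cloned values.
UPDATE demo_db.public.employees_clone
SET first_name = TRIM(first_name, '"'),
    last_name  = TRIM(last_name, '"');
```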
Try DESC to see what happened. My guess is that the original table was loaded with strings containing embedded double quotes, or whatever you are reading from is putting them in quotes. Either way, please share the original data, or a sample of it, with support. If you are in the community portal, please see: https://support.snowflake.net/s/article/How-to-Get-Access-to-the-Case-Console-in-the-Lodge-Community

Insert various value

I have a table like this.
create table help(
id number primary key,
number_s integer NOT NULL);
I had to set the value 0 from id 1 to id 915. I solved this one in a simple way by doing:
update help set number_s=0 where id<=915;
This one was easy.
Now I have to set numbers (that change every row) from id 915 to the last row.
I was doing
update help set number_s=51 where id=916;
update help set number_s=3 where id=917;
There are more than 1,000 rows to be updated; how can I do it quickly?
When I had this problem before, I used a sequence to auto-increment a value such as the id, for example:
insert into help(id,number_s) values (id_sequence.nextval,16);
insert into help(id,number_s) values (id_sequence.nextval,48);
and so on. But that cannot be used here, because the ids start from 915 and not 1. How can I do this quickly? I hope the problem is clear.
Since you have your ids and numbers in a file with a simple structure, it's a fairly small number, and assuming this is something you're going to do once, honestly what I would do would be to pull the file into Excel, use the text functions to build 1000 insert statements and cut and paste them wherever.
If those assumptions are incorrect, you could (1) use sqlldr to load this file into a temporary table and (2) run an update on your help table based on the rows in that temporary table.
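Option (2) could be sketched like this, assuming the id/number pairs have been loaded into a staging table (the name help_stage is an assumption):

```sql
-- Staging table holding the id/number pairs loaded by sqlldr
CREATE TABLE help_stage (id NUMBER, number_s NUMBER);

-- One set-based update from the staged rows
UPDATE help h
SET h.number_s = (SELECT s.number_s FROM help_stage s WHERE s.id = h.id)
WHERE EXISTS (SELECT 1 FROM help_stage s WHERE s.id = h.id);
```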
As mentioned in previous answers, and per your comment that the data is in a file stored on your system, you can use an external table / SQL*Loader to achieve this.
Here is a demo:
-- Create an external table pointing to your file
CREATE TABLE "EXT_SEQUENCES" (
"ID" number ,
"number_s" number
)
ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER
DEFAULT DIRECTORY "<directory name>" ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
BADFILE 'bad_file.txt'
LOGFILE 'log_file.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES ARE NULL
) LOCATION ( '<file name>' )
) REJECT LIMIT UNLIMITED;
-- Now update your help table
MERGE INTO help H
USING EXT_SEQUENCES E
ON ( H.ID = E.ID )
WHEN MATCHED THEN
UPDATE SET H.number_s = E.number_s;
Note: You need to change the access parameters of the external table according to your actual data in the file.
Hope you will get proper direction now.
Cheers!!

Importing CSV file to Oracle when only DML are allowed

My database is hosted on a server to which I can only issue DML statements.
Is there an SQL command (for Oracle) that I could use to fill a table with the entries from a CSV file? The columns of the CSV file and the table are the same, but if there is a version of the command where I could decide which field from the file goes to which column it would be even better.
Also, I cannot install anything besides the Oracle SQL Developer so what I need is an SQL code that I can run from there. I believe that SQL*Loader and external tables don't help in this situation.
Use an Oracle external table:
-- Create an Oracle directory object pointing to the directory where your
-- file resides; the external table fetches the CSV data through it.
create directory ext_data_files as 'C:\';
create table teachers_ext (
first_name varchar2(15),
last_name varchar2(15),
phone_number varchar2(12)
)
organization external (
type oracle_loader
default directory ext_data_files
access parameters (fields terminated by ',' )
location ('teacher.csv')
)
reject limit unlimited
/
Your CSV would look like:
John,Smith,8737493
Foo, Bar, 829823832
Directly copied from Oracle 9i documentation:
CREATE TYPE student_type AS object (
student_no CHAR(5),
name CHAR(20))
/
CREATE TABLE roster (
student student_type,
grade CHAR(2));
Also assume there is an external table defined as follows:
CREATE TABLE roster_data (
student_no CHAR(5),
name CHAR(20),
grade CHAR(2))
ORGANIZATION EXTERNAL (TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_tab_dir
ACCESS PARAMETERS (FIELDS TERMINATED BY ',')
LOCATION ('foo.dat'));
To load table roster from roster_data, you would specify something
similar to the following:
INSERT INTO roster (student, grade)
(SELECT student_type(student_no, name), grade FROM roster_data);
The external table access driver (aka ORACLE_LOADER) accepts many options to handle different cases: fixed width, CSV, endianness (binary data), separators... Once again, see the documentation for the details.
... if there is a version of the command where I could decide which field from the file goes to which column it would be even better.
As you now understand, external tables are handled like any other table. So you can, as usual, re-order columns and/or perform calculations on the fly in your INSERT ... SELECT ... statement.
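For example, reusing the teachers_ext table from the first answer, a mapping that re-orders the fields and transforms one on the fly might look like this (the target table and its column names are assumptions):

```sql
-- Hypothetical target table whose columns are in a different order than the
-- CSV; the phone number is also normalized while loading.
INSERT INTO teachers (phone, surname, forename)
SELECT REPLACE(phone_number, '-', ''), last_name, first_name
FROM teachers_ext;
```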

Invalid number error in reading data from external table in Oracle 11gr2

I have the DDL below for an external table.
CREATE TABLE emp_load
(
employee_number VARCHAR2(50),
employee_last_name VARCHAR2(50),
employee_first_name VARCHAR2(50),
employee_middle_name VARCHAR2(50),
employee_hire_date VARCHAR2(50)
)
organization external (
TYPE oracle_loader
DEFAULT directory abc_dir
ACCESS parameters (
records delimited BY newline
fields terminated BY '|'
missing field VALUES are NULL
(employee_number, employee_last_name, employee_first_name,
employee_middle_name, employee_hire_date)
)
location ('info.dat')
)
reject limit UNLIMITED
and my .dat file looks like this:
010|ABC|DEF|XYZ|03-DEC-2011
020|CCC|123|SSS|04-DEC-2011
I also have a table called test_emp_load_1:
CREATE TABLE test_emp_load_1
(
mployee_number VARCHAR2(50),
employee_last_name VARCHAR2(50),
employee_first_name NUMBER(38),
employee_middle_name VARCHAR2(50),
employee_hire_date VARCHAR2(50)
)
Now I am using the merge statement below (even though I filter on e.EMPLOYEE_NUMBER = '020', I think it first runs a scan on the entire external table),
which gives this error:
SQL Error: ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-01722: invalid number
But when I use:
MERGE INTO test_emp_load_1 te
USING (select * from emp_load where EMPLOYEE_NUMBER = '020') e
on ( e.EMPLOYEE_FIRST_NAME = te.employee_first_name )
WHEN MATCHED THEN
UPDATE SET
te.employee_last_name = e.EMPLOYEE_LAST_NAME
WHEN NOT MATCHED THEN
INSERT
(te.employee_last_name)
VALUES
( e.EMPLOYEE_LAST_NAME)
where e.EMPLOYEE_NUMBER = '020';
I get the output "1 row merged". It looks like a bug in Oracle 11g R2.
I am using Oracle 11g R2 on Windows. I also tried Red Hat Linux with Oracle 11g R2 and I get the same issue.
Any suggestions?
In your external table you have:
employee_first_name VARCHAR2(50),
In your other table you have:
employee_first_name number(38),
In your merge you have:
on ( e.EMPLOYEE_FIRST_NAME = te.employee_first_name )
So you're comparing a string with a number. They have to be compared as the same type; it could go either way, but Oracle is choosing to convert the string into a number to do the comparison, so it's effectively doing:
on ( to_number(e.EMPLOYEE_FIRST_NAME) = te.employee_first_name )
If your data is actually numeric then that is OK, though it would be better to have the data types right in the first place. But your data is not numeric, and probably isn't really meant to be. Look at your sample data again:
010|ABC|DEF|XYZ|03-DEC-2011
020|CCC|123|SSS|04-DEC-2011
The 'first name' is the third field in the file. The second row is OK as '123' can be converted to a number. The first row is not OK, 'DEF' cannot be converted to a number. That row is therefore rejected. This probably isn't the field you meant to be numeric in the first place though, given its name.
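If you cannot fix the column types right away, you can force the comparison to happen as strings instead, which avoids the failed conversion (a workaround sketch only, not a substitute for correct types):

```sql
ON ( e.EMPLOYEE_FIRST_NAME = TO_CHAR(te.employee_first_name) )
```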
As Ben mentioned, you have the mployee_number field named incorrectly (missing the leading "e") in your normal table, so that will also error at some point. Just to avoid those errors, your table would need to be defined like this:
create table test_emp_load_1 (employee_number NUMBER,
employee_last_name VARCHAR2(50),
employee_first_name VARCHAR2(50),
employee_middle_name VARCHAR2(50),
employee_hire_date DATE)
This assumes all the records actually have a numeric first field and a valid date as the last field. Your external table definition should also define the columns with the correct types, and specify the expected date format so that it doesn't error. You should always use the correct data types: never store numbers or dates as strings, even in an external table definition (though they are obviously strings in the actual external file).
The merge also seems odd as you're only setting the last name for inserted records.

External table in Oracle Database

EMPDET is an external table containing the columns EMPNO and ENAME. What is external table in oracle database?
Why can/cannot we update/delete from an external table?
A. UPDATE empdet
SET ename = 'Amit'
WHERE empno = 1234;
B. DELETE FROM empdet
WHERE ename LIKE 'J%';
An external table in an Oracle database is a way of accessing data residing in a .txt or .csv file via SQL commands. The table data is not kept in the database tablespace; rather, the table is a kind of view over the sequential dataset. There is no way for the database to index or update the data, since it is outside its scope; it can only run selects against it.
"External Table" means you have a (typically) CSV file stored on your file system and Oracle reads this CSV file defined by settings in CREATE TABLE statement. The data is not saved in Oracle Tablespace but you can select them like a normal table. However, you can only select them (or logically create a view from it) but you cannot modify anything.
Here a simple example of an external table:
CREATE TABLE ADHOC_CSV_EXT (
C1 VARCHAR2(4000),
C2 VARCHAR2(4000),
C3 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY SOME_FOLDER
ACCESS PARAMETERS (
records delimited BY newline
fields terminated BY ',' optionally enclosed BY '"'
missing field VALUES are NULL)
LOCATION ('foo.csv')
);
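So for the two statements in the question: both fail, because DML is not supported on an external table. Against a table like the one above you would see something like:

```sql
UPDATE empdet SET ename = 'Amit' WHERE empno = 1234;
-- ORA-30657: operation not supported on external organized table

DELETE FROM empdet WHERE ename LIKE 'J%';
-- ORA-30657: operation not supported on external organized table
```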