External table in Oracle? - SQL

Hi friends, I have tried to load data from a flat file into an external table using the oracle_loader access driver. My code is:
create table test_ext (
id_1 varchar(35),
emp_name varchar(25),
e_mail varchar(25))
organization external (
type oracle_loader default directory new_dir access parameters
( records delimited by newline fields(
id_1 char(30),
e_name char(25),
mail char(25)))
location('test.csv')) reject limit unlimited;
and my data file:
"E.FIRST_NAME||','||E.EMAIL||','||MANAGER_ID"
-----------------------------------------------
"Jennifer,JWHALEN,101"
"Michael,MHARTSTE,100"
"Susan,SMAVRIS,101"
"Hermann,HBAER,101"
"Shelley,SHIGGINS,101"
"William,WGIETZ,205"
"Steven,SKING,"
"Neena,NKOCHHAR,100"
"Lex,LDEHAAN,100"
"Alexander,AHUNOLD,102"
"Bruce,BERNST,103"
"David,DAUSTIN,103"
"Valli,VPATABAL,103"
"Diana,DLORENTZ,103"
"Nancy,NGREENBE,101"
"Daniel,DFAVIET,108"
"John,JCHEN,108"
When I run the above query I get:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04043: table column not found in external source: EMP_NAME
29913. 00000 - "error in executing %s callout"
*Cause: The execution of the specified callout caused an error.
*Action: Examine the error messages take appropriate action.
I have tried many things but can't get it to work.

Firstly, your CSV file looks wrong:
"Alexander,AHUNOLD,102"
Remove all the quotes, otherwise each line will be read as a single field.
Secondly, you're using the "fields" syntax without a terminator, which suggests a fixed-length file, but your data file is comma separated. (The KUP-04043 error is also because the field list names e_name and mail, which don't match the table columns emp_name and e_mail; dropping the field list avoids that too.) So I think you want to fix your CSV by removing the quotes and the two spurious header lines, and change your table DDL to:
create table test_ext (
id_1 varchar(35),
emp_name varchar(25),
e_mail varchar(25))
organization external (
type oracle_loader default directory new_dir access parameters
(
records delimited by newline
fields terminated by ',' optionally enclosed by '"')
location('test.csv')) reject limit unlimited;
eg:
SQL> host cat test.csv
Jennifer,JWHALEN,101
Michael,MHARTSTE,100
Susan,SMAVRIS,101
Hermann,HBAER,101
Shelley,SHIGGINS,101
William,WGIETZ,205
Steven,SKING,
Neena,NKOCHHAR,100
Lex,LDEHAAN,100
Alexander,AHUNOLD,102
Bruce,BERNST,103
David,DAUSTIN,103
Valli,VPATABAL,103
Diana,DLORENTZ,103
Nancy,NGREENBE,101
Daniel,DFAVIET,108
John,JCHEN,108
SQL> create table test_ext (
2 id_1 varchar(35),
3 emp_name varchar(25),
4 e_mail varchar(25))
5 organization external (
6 type oracle_loader default directory new_dir access parameters
7 (
8 records delimited by newline
9 fields terminated by ',' optionally enclosed by '"')
10 location('test.csv')) reject limit unlimited;
Table created.
SQL> select * from test_ext;
ID_1 EMP_NAME E_MAIL
----------------------------------- ------------------------- -------------------------
Jennifer JWHALEN 101
Michael MHARTSTE 100
Susan SMAVRIS 101
Hermann HBAER 101
Shelley SHIGGINS 101
William WGIETZ 205
Neena NKOCHHAR 100
Lex LDEHAAN 100
...etc...
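As a side note: if you would rather not edit the data file by hand, the ORACLE_LOADER driver can also skip leading lines for you with the SKIP clause. A minimal sketch, assuming the two header lines are left in test.csv (check the access_parameters documentation for your version before relying on this):
create table test_ext (
id_1 varchar(35),
emp_name varchar(25),
e_mail varchar(25))
organization external (
type oracle_loader default directory new_dir access parameters
(
records delimited by newline
skip 2
fields terminated by ',' optionally enclosed by '"')
location('test.csv')) reject limit unlimited;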

Related

Extract data from file name and store it in a table using stored procedure

I have different types of files (pdf, csv, doc, txt) in a directory.
The names of the files are something like this:
John.Doe.19900101.TX.pdf //FirstName.LastName.DOB.StateOfResidence
Bruce.Banner.19700101.PA.doc
Steve.Rodgers.19760101.AR.csv
Tony.Stark.19901210.CA.txt
How do I write a stored procedure in Oracle to read the files in a directory, extract FirstName, LastName, DOB, and State, and store them in the appropriate columns of a table?
For example, for the file John.Doe.19900101.TX.pdf, the data should be extracted like this:
John in FirstName column
Doe in LastName column
19900101 in DOB column
TX in State column
whole file in CLOB column
You will have to work at the OS level to gather the file names from the OS directory. Assuming you are on a Unix flavour, the URL referenced below will help you grab the file listing into a table and even a view.
The code that you need is:
--drop directory SCRIPT_TEMP_DIR;
CREATE DIRECTORY SCRIPT_TEMP_DIR AS '/home/oracle/oracle_scripts'
;
GRANT EXECUTE ON DIRECTORY SCRIPT_TEMP_DIR TO USER_NAME
; -- Here USER_NAME will be your SCHEMA/USER NAME
drop table USER_NAME.home_directory purge;
CREATE TABLE USER_NAME.home_directory
(
filerecord VARCHAR2(15),
flink VARCHAR2(2),
fowner VARCHAR2(6),
fgroup VARCHAR2(8),
fsize VARCHAR2(32),
fdate_part1 VARCHAR2(16),
fdate_part2 VARCHAR2(16),
fdate_year_or_time VARCHAR2(16),
fname VARCHAR2(255)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY SCRIPT_TEMP_DIR
ACCESS PARAMETERS
(
records delimited by newline
preprocessor SCRIPT_TEMP_DIR:'listing.sh'
fields terminated by whitespace
(
filerecord ,
flink ,
fowner ,
fgroup ,
fsize ,
fdate_part1 ,
fdate_part2 ,
fdate_year_or_time ,
fname
)
)
LOCATION ('listing.sh')
)
REJECT LIMIT UNLIMITED;
Once this is done, you just need to select from above table created.
SELECT *
FROM USER_NAME.home_directory;
Later you can apply SUBSTR/INSTR functions to split the information, or use a regular expression function to get the required pieces:
SELECT fname,
regexp_substr(fname, '[^.]+', 1, 1) part1,
regexp_substr(fname, '[^.]+', 1, 2) part2,
regexp_substr(fname, '[^.]+', 1, 3) part3
FROM USER_NAME.home_directory;
And this gives you the file name split into its dot-separated parts.
The required URL to follow is here.
The code pasted above was modified from it; you also need to change USER_NAME when granting permissions on the directory.
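For completeness, a hedged sketch of splitting out all four parts and loading them into a target table; the table name person_files, and the assumption that fname always follows FirstName.LastName.DOB.State.extension, are mine rather than from the original post:
CREATE TABLE person_files (
first_name VARCHAR2(50),
last_name  VARCHAR2(50),
dob        DATE,
state      VARCHAR2(2),
file_name  VARCHAR2(255)
);
INSERT INTO person_files (first_name, last_name, dob, state, file_name)
SELECT regexp_substr(fname, '[^.]+', 1, 1),
       regexp_substr(fname, '[^.]+', 1, 2),
       to_date(regexp_substr(fname, '[^.]+', 1, 3), 'YYYYMMDD'),
       regexp_substr(fname, '[^.]+', 1, 4),
       fname
FROM   USER_NAME.home_directory
WHERE  fname LIKE '%.%.%.%.%'; -- only rows that look like FirstName.LastName.DOB.State.ext
Loading the whole file into a CLOB column, as the question also asks, would need an extra BFILE/DBMS_LOB step that is beyond this sketch.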

ORA-29913: error in executing ODCIEXTTABLEOPEN callout

I am creating an external table using the HR schema but I get these errors:
"ORA-29913: error in executing ODCIEXTTABLEOPEN callout ORA-29400:
data cartridge error KUP-00554: error encountered while parsing access
parameters KUP-01005: syntax error: found "missing": expecting one of:
"column, (" KUP-01007: at line 4 column 3
29913. 00000 - "error in executing %s callout"
*Cause: The execution of the specified callout caused an error.
*Action: Examine the error messages take appropriate action."
----------------My Code-------------------
create directory ex_tab as 'C:\My Works\External Table';
create table strecords (
st_id number(4),
st_name varchar(10),
schl_name varchar(5),
st_city varchar(15),
st_year number(4)
)
ORGANIZATION EXTERNAL
(TYPE oracle_loader
DEFAULT DIRECTORY ex_tab
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
FIELDS TERMINATED BY ','
REJECT ROWS WITH ALL NULL FIELDS
MISSING FIELDS VALUES ARE NULL
(
st_id number(4),
st_name char(10),
schl_name char(5),
st_city char(15),
st_year number(4)
)
)
LOCATION ('strecords.txt')
);
desc strecords;
select * from strecords;
This is my code; please check and review it.
You have several issues here. The immediate one causing your problem is that you have the clauses in the wrong order, but you also have MISSING FIELDS instead of MISSING FIELD:
...
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
...
Then your field list contents have invalid data types for that part of the statement; you can just omit that entirely in this case as those match the table column definition.
If no field list is specified, then the fields in the data file are assumed to be in the same order as the fields in the external table.
So you can simplify it to:
create table strecords (
st_id number(4),
st_name varchar(10),
schl_name varchar(5),
st_city varchar(15),
st_year number(4)
)
ORGANIZATION EXTERNAL
(TYPE oracle_loader
DEFAULT DIRECTORY ex_tab
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
)
LOCATION ('strecords.txt')
);
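If you do want to keep an explicit field list (for example to add a date mask later), remember that the access driver uses its own data types (CHAR, INTEGER EXTERNAL and so on) rather than SQL types. A hedged sketch of what that part could look like for the same column layout:
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
st_id INTEGER EXTERNAL(4),
st_name CHAR(10),
schl_name CHAR(5),
st_city CHAR(15),
st_year INTEGER EXTERNAL(4)
)
)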
Some of this behaviour is also down to a known Oracle issue, documented as
Select From External Table Returns Errors ORA-29913 ORA-29400 KUP-554 KUP-1005 (Doc ID 302672.1)
When creating an external table, the access parameters should be specified in the following order:
Comments
Record Format Info
Field definitions
Specify Comments, Record Format Info and Field definitions in the correct order. Even inside the Record format Info, 'Records delimited by ...' clause should come before any other clause.
For more information, refer to the access_parameters clause.

SQL Server Management Studio - Query using text file

Let's say I have a text file that has the following lines
102
333
534
Then, in my SQL table, I have a few different columns:
AutoID | Name | Description
--------------------------------------
102  | Jackson  | [Description Here]
241  | Edward   | [Description Here]
333  | Timothy  | [Description Here]
437  | Nikky    | [Description Here]
534  | Jeremy   | [Description Here]
Is there any way I can use the text file from SQL Server Management Studio to query the table and pull out every row whose AutoID column matches a line in the text file? (Note: I only want rows from a table that I specify.)
That way I could edit or update only the rows whose IDs appear in the text file.
The rows displayed in Management Studio would look like this:
 AutoID | NAME  | Description
--------------------------------------
102  | Jackson  | [Description Here]
333  | Timothy  | [Description Here]
534  | Jeremy   | [Description Here]
What you need to do is import the text file into a table in your SQL database, and then compare its values to the table you want to query (which I have called AutoIDTest in my example).
Using your example data, I've put together the following code that accomplishes this process.
1. I created a destination table for the text file's values called TextImport. I've called the test text file E:\TestData.txt. Additionally, I'm assuming this text file only has one column, AutoID.
2. Then, I imported the data into the destination table using the BULK INSERT statement.
3. Finally, I compared the data in TextImport to the table from which you are seeking values using an INNER JOIN.
CREATE TABLE AutoIDTest ---Create test table. Since your first column doesn't have a name, I'm calling it ID. I'm assuming AutoID and ID are both of type int.
(
ID int,
AutoID int,
Name varchar(25),
Description varchar(50)
)
INSERT INTO AutoIDTest -- Populate test table
VALUES
( 1, 102, 'Jackson', 'Description1'),
( 2, 241, 'Edward', 'Description2'),
( 3, 333, 'Timothy', 'Description3'),
( 4, 437, 'Nikky', 'Description4'),
( 5, 534, 'Jeremy', 'Description5')
CREATE TABLE TextImport --Create destination table for the text file.
(
AutoID varchar(20)
)
BULK INSERT TextImport --Load Data from text file into TextImport table
FROM 'E:\TestData.txt' ---The name and location of my test text file.
WITH
(
ROWTERMINATOR ='\n'
);
SELECT ---Produce Output Data
ID,
t1.AutoID,
Name,
Description
FROM
AutoIDTest AS t1
INNER JOIN
TextImport AS t2
ON t1.autoID = cast(t2.autoID AS int) --convert varchar to int
OUTPUT
-- ID AutoID Name Description
-- --------- ----------- ------------------------- -------------------
-- 1 102 Jackson Description1
-- 3 333 Timothy Description3
-- 5 534 Jeremy Description5
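Since the goal was to edit or update the matching rows, a hedged follow-up sketch; the new Description value is just a placeholder:
UPDATE t1
SET    t1.Description = 'Updated from text file' -- placeholder value
FROM   AutoIDTest AS t1
INNER JOIN TextImport AS t2
       ON t1.AutoID = CAST(t2.AutoID AS int);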

import csv file to oracle database

I am trying to import data from a txt file into a table. The txt file has 5 records.
'ext.txt' is my file and 'IMPORT' is a directory.
The records are:
7499,ALLEN,SALESMAN,30
7521,WARD,SALESMAN,30
7566,JONES,MANAGER,20
7654,MARTIN,SALESMAN,30
I tried the query below but it only inserts the 3rd record into the external table.
Can anyone tell me the reason for this, and a solution that inserts all rows?
create table ext_tab (
empno CHAR(4),
ename CHAR(20),
job1 CHAR(20),
deptno CHAR(2)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY IMPORT
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
BADFILE IMPORT:'test.bad'
LOGFILE IMPORT:'test.log'
FIELDS TERMINATED BY ',' (
empno char(4) ,
ename char(4),
job1 CHAR(20),
deptno CHAR(2)
)
)
LOCATION (import:'ext.txt')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
This will work for your given test data
CREATE TABLE ext_tab (
empno VARCHAR(4),
ename VARCHAR(20),
job1 VARCHAR(20),
deptno VARCHAR(2)
)
ORGANIZATION EXTERNAL (
DEFAULT DIRECTORY import
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
)
LOCATION ('ext.txt')
);
providing that you have set up the import directory correctly.
Tested on 11.2.0.4.
As for the reason: it was given in the comments above. Always check the .bad and .log files first (created in the directory you put the data file in); they are very helpful at telling you why rows were rejected.
I expect you have errors like this in the log:
KUP-04021: field formatting error for field ENAME
KUP-04026: field too long for datatype
KUP-04101: record 1 rejected in file /import_dir/ext.txt
KUP-04021: field formatting error for field ENAME
KUP-04026: field too long for datatype
KUP-04101: record 3 rejected in file /import_dir/ext.txt
KUP-04021: field formatting error for field ENAME
KUP-04026: field too long for datatype
KUP-04101: record 4 rejected in file /import_dir/ext.txt
and that only WARD was imported because only his name fits into CHAR(4).
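Alternatively, if you want to keep your original definition (including the BADFILE/LOGFILE and PARALLEL clauses), the minimal change suggested by that log is simply to widen the ename field in the access parameters; a sketch of just that part:
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
BADFILE IMPORT:'test.bad'
LOGFILE IMPORT:'test.log'
FIELDS TERMINATED BY ',' (
empno CHAR(4),
ename CHAR(20),
job1 CHAR(20),
deptno CHAR(2)
)
)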

sql loader truncate always the first column

I am using Oracle 10g and I have this file.ctl:
OPTIONS (SKIP=1)
LOAD DATA
INFILE '/home/gxs/segmentation/sqlloader/datos.csv'
APPEND INTO TABLE test
(id "s_test.nextval",
name char(10) TERMINATED BY ',' ,
tel char(20) TERMINATED BY ',' ,
apellido char(10) TERMINATED BY ',' )
My CSV file is:
name,tel,apellido
daniel,12345,buitrago
cesar,98765,san
alex,4556,ova
but when I look at the table, the names are missing their first character:
id name apellido tel
1 aniel buitrago 12345
2 esar san 98765
3 lex ova 4556
What should I do?
As per the documentation, you need to use the EXPRESSION keyword to show that the value is coming purely from the specified expression and is not dependent on anything in the data file:
OPTIONS (SKIP=1)
LOAD DATA
INFILE '/home/gxs/segmentation/sqlloader/datos.csv'
APPEND INTO TABLE test
(id EXPRESSION "s_test.nextval",
name char(10) TERMINATED BY ',' ,
tel char(20) TERMINATED BY ',' ,
apellido char(10) TERMINATED BY ',' )
... which inserts:
ID NAME TEL APELLIDO
---------- ---------- ---------- ----------
1 daniel 12345 buitrago
2 cesar 98765 san
3 alex 4556 ova
At the moment it's assuming ID is a field in your data file, and since you haven't specified a data type it's defaulting to char with size 1, which is consuming the first character of your real first field.
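For reference, a hedged sketch of the sequence and table this control file assumes already exist; the exact column types are a guess from the data shown:
CREATE SEQUENCE s_test START WITH 1 INCREMENT BY 1;
CREATE TABLE test (
id NUMBER,
name VARCHAR2(10),
tel VARCHAR2(20),
apellido VARCHAR2(10)
);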