My task is to create some external tables using Hive Beeline, but I encounter a relative-path error that says:
Relative path in absolute URI: hdfs://localhost:8020./user/bigdata (state=08S01,code=1)
Aborting command set because "force" is false and command failed.
I am using an HQL script (as required) to create the external tables; the script is this:
create external table ecustomer(
customer_id DECIMAL(3),
customer_code VARCHAR(5),
company_name VARCHAR(100),
contact_name VARCHAR(50),
contact_title VARCHAR(30),
city VARCHAR(30),
region VARCHAR(2),
postal_code VARCHAR(30),
country VARCHAR(30),
phone VARCHAR(30),
fax VARCHAR(30))
row format delimited fields terminated by '|'
stored as textfile location 'user/bigdata/ecustomer';
create external table eorder_detail(
order_id DECIMAL(5),
product_id DECIMAL(2),
customer_id DECIMAL(3),
salesperson_id DECIMAL(1),
unit_price DECIMAL(2,2),
quantity DECIMAL(2),
discount DECIMAL(1,1))
row format delimited fields terminated by '|'
stored as textfile location 'user/bigdata/eorder_detail';
create external table eproduct(
product_id DECIMAL(2),
product_name VARCHAR(50),
unit_price DECIMAL(2,2),
unit_in_stock DECIMAL(4),
unit_on_order DECIMAL(3),
discontinued VARCHAR(1))
row format delimited fields terminated by '|'
stored as textfile location 'user/bigdata/eproduct';
create external table esalesperson(
employee_id DECIMAL(1),
lastname VARCHAR(30),
firstname VARCHAR(30),
title VARCHAR(50),
birthdate VARCHAR(30),
hiredate VARCHAR(30),
notes VARCHAR(100))
row format delimited fields terminated by '|'
stored as textfile location 'user/bigdata/esalesperson';
create external table eorder(
order_id DECIMAL(5),
order_date VARCHAR(30),
ship_via DECIMAL(1),
ship_city VARCHAR(30),
ship_region VARCHAR(30),
ship_postal_code VARCHAR(30),
ship_country VARCHAR(30))
row format delimited fields terminated by '|'
stored as textfile location 'user/bigdata/eorder';
Then I execute this script through Beeline, and I hit the error shown above. I have already created a folder on my Hadoop server for each table (ecustomer, eorder_detail, eproduct, esalesperson and eorder), and the data files have also been uploaded to the Hadoop server. Please help me resolve the error.
Try using an absolute path instead of a relative one, e.g. 'hdfs://localhost:8020/user/bigdata/ecustomer'. The LOCATION clause expects an absolute URI; a relative value such as 'user/bigdata/ecustomer' cannot be resolved against the default filesystem URI, which is what triggers the "Relative path in absolute URI" error.
create external table ecustomer(
customer_id DECIMAL(3),
customer_code VARCHAR(5),
company_name VARCHAR(100),
contact_name VARCHAR(50),
contact_title VARCHAR(30),
city VARCHAR(30),
region VARCHAR(2),
postal_code VARCHAR(30),
country VARCHAR(30),
phone VARCHAR(30),
fax VARCHAR(30))
row format delimited fields terminated by '|'
stored as textfile location 'hdfs://localhost:8020/user/bigdata/ecustomer';
...
[same for other DDLs]
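An absolute path without the scheme and authority should also work, because Hive resolves it against the default filesystem (fs.defaultFS). A minimal sketch, assuming the default filesystem is the same hdfs://localhost:8020 instance, with a hypothetical table name and the column list shortened for brevity:
create external table ecustomer_alt(
customer_id DECIMAL(3),
customer_code VARCHAR(5))
row format delimited fields terminated by '|'
stored as textfile location '/user/bigdata/ecustomer';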
I have a CSV file with 19 fields; some fields contain text in Arabic.
The encoding of the file is UTF-8.
I created a table in Postgres with the same field names as the file.
The goal is to import the data from the CSV file into the created table.
CREATE TABLE Dubai_201606 (
Case_Type_Arabic VARCHAR(100),
Case_Type VARCHAR(100),
Case_Number VARCHAR(20),
Abbreviation VARCHAR(50),
Notice_Date Date,
Notifier VARCHAR(100),
English VARCHAR(100),
Notifier_N_Ref VARCHAR(50),
Notifier_Licence_No int,
Notifier_Company_No VARCHAR(50),
Notifier_City VARCHAR(50),
Party VARCHAR(100),
Enlish_Party VARCHAR(100),
Party_N_Ref VARCHAR(50),
Party_Licence_No int,
Party_Company_No VARCHAR(50),
Party_City VARCHAR(50),
Subject_Arabic TEXT,
Subject_English TEXT
);
Then I used COPY to import the file into the created table.
COPY Dubai_201606 FROM 'C:\Users\king-\OneDrive\Bureau\201606.csv' WITH CSV HEADER;
After executing it, I got the following error:
ERROR: ERROR: sequence of bytes invalid for "UTF8" encoding: 0xff
CONTEXT: COPY dubai_201606, line 1
SQL state: 22021
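For reference, COPY can also name the source file's encoding explicitly when it differs from the database encoding. A minimal sketch; the encoding name used here (WIN1256, a Windows Arabic code page) is only an assumption and must match what the file is actually saved in:
COPY Dubai_201606 FROM 'C:\Users\king-\OneDrive\Bureau\201606.csv'
WITH (FORMAT csv, HEADER true, ENCODING 'WIN1256');
Note that a 0xff byte at the very start of a file often indicates a UTF-16 byte-order mark, in which case the file would need to be re-saved as UTF-8 (or another encoding COPY supports) before importing.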
CREATE EXTERNAL TABLE schema_vtvs_ai_ext.fire(
fire_number VARCHAR(50),
fire_year DATE,
assessment_datetime INTEGER,
size_class CHAR,
fire_location_latitude REAL,
fire_location_longitude REAL,
fire_origin VARCHAR(50),
general_cause_desc VARCHAR(50),
activity_class VARCHAR(50),
true_cause VARCHAR(50),
fire_start_date DATE,
det_agent_type VARCHAR(50),
det_agent VARCHAR(50),
discovered_date DATE,
reported_date DATE,
start_for_fire_date DATE,
fire_fighting_start_date DATE,
initial_action_by VARCHAR(50),
fire_type VARCHAR(50),
fire_position_on_slope VARCHAR(50),
weather_conditions_over_fire VARCHAR(50),
fuel_type VARCHAR(50),
bh_fs_date DATE,
uc_fs_date DATE,
ex_fs_date DATE
);
This is the SQL code I have written to add an external table to a Redshift schema, but I get the error below. I can't seem to see where the error is.
[Amazon](500310) Invalid operation: syntax error at end of input Position: 684;
If your data is in Amazon S3, then you need to specify the file format (via STORED AS) and the path to data files in S3 (via LOCATION).
Here is an example query for CSV files (with a one-line header):
create external table <external_schema>.<table_name> (...)
row format delimited
fields terminated by ','
stored as textfile
location 's3://mybucket/myfolder/'
table properties ('numRows'='100', 'skip.header.line.count'='1');
See official doc for details.
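Applied to the fire table in the question, the DDL would look roughly like this, assuming comma-separated files with a one-line header as in the template; the column list is the same one shown in the question (abbreviated here as (...)) and the S3 path is a hypothetical placeholder:
create external table schema_vtvs_ai_ext.fire (...)
row format delimited
fields terminated by ','
stored as textfile
location 's3://your-bucket/fire-data/'
table properties ('skip.header.line.count'='1');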
I tried deleting a row from a table, but the result was
DELETE 0
When I created a copy of the original table and performed the delete on the copy, the result was DELETE 1, as expected.
I've created a trigger function that updates a column on a different table upon inserting or deleting; is it possible that this is the problem? Does anyone know what might be wrong?
P.S. I'm posting code similar to the original table's creation code (it has been altered for later parts of the university project), together with the trigger function's code.
P.S.2 A friend who has the exact same database, table, trigger function, etc. doesn't have this problem.
create table Listings(
id int,
listing_url varchar(40),
scrape_id bigint,
last_scraped date,
name varchar(140),
summary varchar(1780),
space varchar(1840),
description varchar(2320),
experiences_offered varchar(10),
neighborhood_overview varchar(1830),
notes varchar(1790),
transit varchar(1810),
access varchar(1830),
interaction varchar(1340),
house_rules varchar(1820),
thumbnail_url varchar(10),
medium_url varchar(10),
picture_url varchar(150),
xl_picture_url varchar(10),
street varchar(90),
neighbourhood varchar(20),
neighbourhood_cleansed varchar(70),
neighbourhood_group_cleansed varchar(10),
city varchar(40),
state varchar(60),
zipcode varchar(20),
market varchar(30),
smart_location varchar(40),
country_code varchar(10),
country varchar(10),
latitude varchar(10),
longitude varchar(10),
is_location_exact boolean,
property_type varchar(30),
room_type varchar(20),
accommodates int,
bathrooms varchar(10),
bedrooms int,
beds int,
bed_type varchar(20),
amenities varchar(1660)
);
CREATE FUNCTION Vs()
RETURNS "trigger" AS
$Body$
BEGIN
IF(TG_OP='INSERT')THEN
UPDATE "Host"
SET listings_count = listings_count + 1
WHERE id = NEW.host_id;
RETURN NEW;
ELSIF(TG_OP='DELETE')THEN
UPDATE "Host"
SET listings_count = listings_count -1
WHERE id = OLD.host_id;
RETURN OLD;
END IF;
END;
$Body$
LANGUAGE 'plpgsql' VOLATILE;
CREATE TRIGGER InsertTrigger
before INSERT
ON "Listing"
FOR EACH ROW
EXECUTE PROCEDURE Vs();
CREATE TRIGGER DeleteTrigger
before DELETE
ON "Listing"
FOR EACH ROW
EXECUTE PROCEDURE Vs();
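For context on why the question focuses on the trigger: in PostgreSQL, a BEFORE trigger function that returns NULL skips the operation for that row, and the command then reports DELETE 0 even though a row matched. A minimal sketch with hypothetical names:
create table demo(id int);
insert into demo values (1);
create function skip_delete() returns trigger as
$$
begin
return null; -- returning NULL from a BEFORE trigger cancels the delete for this row
end;
$$ language plpgsql;
create trigger demo_delete before delete on demo
for each row execute procedure skip_delete();
delete from demo where id = 1; -- reports DELETE 0 and the row remains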
I am trying to create a basic table using subtypes and insert some data into it in Oracle Express 11g.
My table is created successfully, but I am having issues with inserting data.
My insert statement always throws the error 'SQL Error: ORA-00904: "BRANCH_PHONE": invalid identifier'.
The column named in the error message is always the column at the end of the insert statement, despite the column existing in the table. I have tried the following code:
create type addressType as object(
street varchar2(20),
city varchar2(20),
postCode varchar2(8))
not final
/
create type branchType as object(
branchID int,
branch_address addressType,
branch_phone int(11))
not final
/
create table Branch of branchType(
constraint branch_pk primary key(branchID));
/
insert into Branch values (
branchID('2364'),
addressType('12 Rooster','Atlantis','A13 4UG'),
branch_phone('01316521311'));
I would really appreciate any ideas.
I made some changes, including changing branch_phone to VARCHAR2. A phone number, although it is made up of digits, is not numeric data; it is a string of characters. You were also passing branchID as a string even though you declared it as a number, so I changed that as well. branchID and branch_phone are primitive (scalar) types, so no per-column constructor is needed; the whole row is built with the branchType constructor.
create type addressType as object(
street varchar2(20),
city varchar2(20),
postCode varchar2(8))
not final
/
create type branchType as object(
branchID int,
branch_address addressType,
branch_phone varchar2(11))
not final
/
create table Branch of branchType(
constraint branch_pk primary key(branchID));
/
insert into Branch values (
branchtype(2364,
addressType('12 Rooster','Atlantis','A13 4UG'),
'01316521311') );
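Once the row is inserted, the object attributes can be read back through a table alias, for example (assuming the DDL above has been run):
select b.branchID, b.branch_address.city, b.branch_phone
from Branch b;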
I receive the error below every time I select from an external table that I have created.
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "minussign": expecting one of: "badfile, byteordermark, characterset, column, data, delimited, discardfile, dnfs_enable, dnfs_disable, disable_directory_link_check, field, fields, fixed, io_options, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, dnfs_readbuffers, preprocessor, readsize, string, skip, territory, variable, xmltag"
KUP-01007: at line 4 column 23
29913. 00000 - "error in executing %s callout"
The external table itself is created successfully. Here is the script which creates the external table:
CREATE TABLE TB_CNEI_01C
(
NEW_OMC_ID VARCHAR(2),
NEW_OMC_NM VARCHAR(8),
NEW_BSS_ID VARCHAR(6),
NEW_BSS_NM VARCHAR(20),
OMC_ID VARCHAR(2),
OMC_NM VARCHAR(8),
OLD_BSS_ID VARCHAR(6),
OLD_BSS_NM VARCHAR(20),
DEPTH_NO INTEGER,
NE_TP_NO INTEGER,
OP_YN INTEGER,
FAC_ALIAS_NM VARCHAR(20),
FAC_GRP_ALIAS_NM VARCHAR(20),
SPC_VAL VARCHAR(4),
INMS_FAC_LCLS_CD VARCHAR(2),
INMS_FAC_MCLS_CD VARCHAR(3),
INMS_FAC_SCLS_CD VARCHAR(3),
INMS_FAC_SCLS_DTL_CD VARCHAR(2),
LDEPT_ID VARCHAR(3),
FAC_ID VARCHAR(15),
MME_IP_ADDR VARCHAR(20),
MDEPT_ID VARCHAR(4),
HW_TP_NM VARCHAR(20),
MME_POOL_NM VARCHAR(20),
BORD_CNT INTEGER,
FAC_DTL_CLSFN_NM VARCHAR(50),
INSTL_FLOOR_NM VARCHAR(20),
INSTL_LOC_NM VARCHAR(30)
)
ORGANIZATION EXTERNAL
(
TYPE oracle_loader
DEFAULT DIRECTORY EXTERNAL_DATA
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
badfile EXTERNAL_DATA:'testTable.bad'
logfile EXTERNAL_DATA:'testTable.log'
CHARACTERSET x-IBM949
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(
NEW_OMC_ID VARCHAR(2),
NEW_OMC_NM VARCHAR(8),
NEW_BSS_ID VARCHAR(6),
NEW_BSS_NM VARCHAR(20),
OMC_ID VARCHAR(2),
OMC_NM VARCHAR(8),
OLD_BSS_ID VARCHAR(6),
OLD_BSS_NM VARCHAR(20),
DEPTH_NO INTEGER,
NE_TP_NO INTEGER,
OP_YN INTEGER,
FAC_ALIAS_NM VARCHAR(20),
FAC_GRP_ALIAS_NM VARCHAR(20),
SPC_VAL VARCHAR(4),
INMS_FAC_LCLS_CD VARCHAR(2),
INMS_FAC_MCLS_CD VARCHAR(3),
INMS_FAC_SCLS_CD VARCHAR(3),
INMS_FAC_SCLS_DTL_CD VARCHAR(2),
LDEPT_ID VARCHAR(3),
FAC_ID VARCHAR(15),
MME_IP_ADDR VARCHAR(20),
MDEPT_ID VARCHAR(4),
HW_TP_NM VARCHAR(20),
MME_POOL_NM VARCHAR(20),
BORD_CNT INTEGER,
FAC_DTL_CLSFN_NM VARCHAR(50),
INSTL_FLOOR_NM VARCHAR(20),
INSTL_LOC_NM VARCHAR(30)
)
)
LOCATION ('TB_CNEI_01C.csv')
);
I have checked all permissions for the data directory and the data files.
I had a few commented lines in my CREATE TABLE script. I removed those commented lines and the error disappeared.
I got this suggestion from: http://www.orafaq.com/forum/t/182288/
It seems your CHARACTERSET (x-IBM949), which contains a '-' character, is not valid.
You may try one of the alternatives without that sign, such as AL32UTF8, US7ASCII, WE8MSWIN1252, etc.
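For illustration, here is the relevant part of the access parameters with one of the suggested character set names substituted in; whichever name you choose must match the actual encoding of TB_CNEI_01C.csv:
RECORDS DELIMITED BY NEWLINE
badfile EXTERNAL_DATA:'testTable.bad'
logfile EXTERNAL_DATA:'testTable.log'
CHARACTERSET AL32UTF8
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL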