How to insert a CSV file into PostgreSQL using JMeter - sql

I'm trying to insert a CSV data file into PostgreSQL from JMeter using a "LOAD DATA" statement, but the LOAD DATA statement isn't working. In examples online the LOAD keyword shows up highlighted and the statement runs, but on my machine it isn't highlighted and the query fails with an error.
Here is my SQL query:
"LOAD DATA INFILE '/Desktop/Summary/oca_sum.csv' INTO TABLE Load
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n' "
And the error is:
org.postgresql.util.PSQLException: ERROR: syntax error at or near "DATA" Position: 6
Thanks.

You are trying to run a MySQL command against a PostgreSQL database.
PostgreSQL uses a different syntax for accomplishing the same task.
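For reference, here is a minimal sketch of the PostgreSQL equivalent using COPY, assuming the target table really is named "Load" and the file is readable on the database server (server-side COPY FROM reads from the server's filesystem, not the client's):

-- PostgreSQL's COPY replaces MySQL's LOAD DATA INFILE
COPY "Load" FROM '/Desktop/Summary/oca_sum.csv'
WITH (FORMAT csv, DELIMITER ',', QUOTE '"');

If the file lives on the JMeter machine instead, the usual workaround is psql's client-side \copy variant of the same command.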

Related

Fixing error in a SHOW TABLES IN DATABASE name query

I am trying to list all the tables in a database in Amazon AWS Athena via a Python script.
Here is my script:
import pandas as pd

data = {'name': ['database1', 'database-name', 'database2']}
# Create DataFrame of database names
df = pd.DataFrame(data)
for index, schema in df.iterrows():
    # conn is an existing connection to Athena
    tables_in_schema = pd.read_sql("SHOW TABLES IN " + schema['name'], conn)
There is an error running this. When I run the same query in the Athena query editor, I also get an error:
SHOW TABLES IN database-name
Here is the error
DatabaseError: Execution failed on sql: SHOW TABLES IN database-name
An error occurred (InvalidRequestException) when calling the StartQueryExecution operation: line
1:19: mismatched input '-'. Expecting: '.', 'LIKE', <EOF>
unable to rollback
I think the issue is with the hyphen "-" in the database name.
How do I escape this in the query?
You can use the Glue client instead. It provides a function get_tables(), which returns a list of all the tables in a specific database.
Database, table, and column names cannot contain any special character other than an underscore "_". Any other special character will cause an issue when querying; Athena does not stop you from creating an object whose name contains one, but it will cause an issue when you use that object.
The only way around this is to re-create the database without the special character, the hyphen "-" in this case.
https://docs.aws.amazon.com/athena/latest/ug/tables-databases-columns-names.html
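A minimal illustration, assuming the database is re-created as database_name (underscore instead of hyphen) as described above:

CREATE DATABASE database_name;  -- hypothetical replacement name, no hyphen
SHOW TABLES IN database_name;   -- now parses without the mismatched '-' error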

\copy command is not working to import csv table into my AWS database

I'm trying to import a CSV table into my cloud database on AWS. I'm using DBeaver to connect to my database. The COPY command is not allowed by AWS, so I must use \copy:
\copy orders from 'C:\tmp\orders.csv' with DELIMITER ',';
DBeaver returns an error:
Reason:
SQL Error [42601]: ERROR: syntax error at or near "\"
Position: 1
Double the backslashes to escape them.
\copy orders from 'C:\\tmp\\orders.csv' with DELIMITER ',';
\copy is a psql command, not a SQL statement; you need to use DBeaver's native import command or connect to the database with the psql client.
https://www.postgresql.org/docs/9.6/app-psql.html#APP-PSQL-META-COMMANDS-COPY
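For example, from a command prompt with the psql client installed (host, user, and database names below are placeholders), the same statement runs client-side; forward slashes in the path also sidestep the backslash-escaping problem:

psql -h your-host.rds.amazonaws.com -U your_user -d your_db -c "\copy orders from 'C:/tmp/orders.csv' with DELIMITER ','"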

Derby SQL commands to export to CSV file

I am very new to Apache Derby, and I need to export all of the rows from a table into a CSV file.
I am using RazorSQL to connect to the database.
What I need to know is whether there is a Derby SQL command equivalent to what MySQL does to export data below.
NOTE: VERY IMPORTANT. One of the columns in this table is defined as an XML datatype, so no export wizard will work here: the select statement needs to wrap that column in 'xmlserialize'.
Here is what I want to do, but Derby does not understand the "INTO OUTFILE" clause I am used to using with MySQL.
select
message_id,
msg_schema_id,
user_id,
req_send_time,
destination,
xmlserialize(msg_content as clob) as message,
facility_id
from Audit
into outfile 'd:\csv\audit.csv'
fields terminated by ','
enclosed by '"'
lines terminated by '\n';
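Derby does ship a system procedure for exactly this: SYSCS_UTIL.SYSCS_EXPORT_QUERY, documented as taking the query text, the output file name, the column delimiter, the character delimiter, and a codeset. A sketch of the equivalent export under that signature:

CALL SYSCS_UTIL.SYSCS_EXPORT_QUERY(
    'select message_id, msg_schema_id, user_id, req_send_time, destination,
     xmlserialize(msg_content as clob) as message, facility_id from Audit',
    'd:/csv/audit.csv', ',', '"', 'UTF-8');

Because the first argument is an ordinary query string, the xmlserialize call poses no problem for the export.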

Aginity for Netezza Create Temp Table from external xlsx file using ODBC

In Aginity Workbench for Netezza, I am trying to create a temp table from a .XLSX file containing 13 columns, of which I only need columns 1 and 5. I can export to a tab-delimited .TXT with only the two columns needed and that works fine, but I would like to avoid converting from the original file, since it is regularly updated and others may be running this query.
It must be a TEMP TABLE and the source must be XLSX. The temp table will be JOINed in a subsequent query.
I have the following query:
CREATE TEMP TABLE office AS
(SELECT zip_code, DISPATCH_LEVEL
FROM EXTERNAL 'file.xlsx'
(zip_code VARCHAR(10), DISPATCH_LEVEL VARCHAR(100))
USING (REMOTESOURCE 'ODBC' DELIMITER '\t'));
I get the following error block:
ERROR [HY008] Operation canceled
ERROR [01000] Unable to write nzlog/bad files
ERROR [01000] Unable to write nzlog/bad files
ERROR [HY000] ERROR: External Table : count of bad input rows reached maxerrors limit
Netezza external tables simply do not support XLSX files directly. They require character-delimited files, fixed-length files, or internal/native-format files.
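That makes the tab-delimited .TXT export already described the supported route; a minimal sketch, assuming the sheet is saved as file.txt with just the two needed columns:

CREATE TEMP TABLE office AS
(SELECT zip_code, DISPATCH_LEVEL
FROM EXTERNAL 'file.txt'
(zip_code VARCHAR(10), DISPATCH_LEVEL VARCHAR(100))
USING (REMOTESOURCE 'ODBC' DELIMITER '\t'));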

Importing CSV to MySQL with Load Data Infile

I'm trying to load a CSV file into MySQL, and I keep getting syntax errors:
load data infile 'c:/worldminwage.csv' fields terminated by ','
enclosed by '"' lines terminated by '\n'
(YearlyWage, PercentGDP, Date effective);
Can anyone help me get this working? Thanks.
Valid syntax is LOAD DATA INFILE 'c:/worldminwage.csv' INTO TABLE tablename.
You forgot to mention which table the data should go in. See LOAD DATA INFILE Syntax
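Putting it together, a hedged sketch of the corrected statement, assuming a hypothetical table name worldminwage and that the last column is really named Date effective (a name with a space needs backquotes in MySQL):

LOAD DATA INFILE 'c:/worldminwage.csv'
INTO TABLE worldminwage
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(`YearlyWage`, `PercentGDP`, `Date effective`);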
I think this would be better:
LOAD DATA INFILE 'c:/worldminwage.csv' INTO TABLE tablename FIELDS TERMINATED BY ',';
because without an explicit column list you get rid of errors where the number of columns in the .csv file does not match the MySQL table.