I need to write the results of executing a Hive query to a file. How do I do it? Currently, it's printing to the console.
beeline -u db_url -n user_name -p password -f query.sql
I tried:
beeline -u db_url -n user_name -p password -f query.sql 2> output.txt
but output.txt just contains the messages about when the connection started and closed, not the results of the query, which are still being printed to the console.
I assume beeline -u db_url -n user_name -p password -f query.sql > output.txt should be OK, without the 2.
"2" in your command is errlog, not the stdout
so "...query.sql 2> output.txt" would put the errlog results into your text file, while "...query.sql > output.txt" would put the actual output into the text file.
In addition to #dds's answer, you can try adding the --silent option to get rid of all the other stuff, like the connection started and closed messages, being printed in the output file.
beeline -u db_url -n user_name -p password --silent=true -f query.sql > output.txt
I think you meant to type "csv2" instead of "csv 2". Here's the fixed command line:
beeline -u db_url -n user_name -p password --outputformat=csv2 -f query.sql > output.txt
Related
I have 2 SQL queries which I execute to get the size of a table and the number of records in a table:
[~] mysql -u <username> -h <hostname> -p <db_name> -e "SQL_Query 1" > out.txt
[~] mysql -u <username> -h <hostname> -p <db_name> -e "SQL_Query 2" > out1.txt
How can I write a shell script to execute these queries?
This is a shell script, supported by bash / sh, and probably others:
#!/bin/sh
mysql -u <username> -h <hostname> -p > output.log <<EOF
SELECT query 1 ...;
SELECT query 2 ...;
EOF
Note: You'll need to address the password entry issue, which can be done in several ways.
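For example, one way is a client option file, so the script never prompts for a password (a minimal sketch; adjust the values for your setup):

# ~/.my.cnf (chmod 600 so only you can read it)
[client]
user=<username>
password=<your_password>

With that in place you can drop -u and -p from the mysql command above, since the client reads these settings automatically.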
You can also enter your SQL in a file (file.sql) and redirect input from that file:
mysql -u <username> -h <hostname> -p < file.sql > output.log
I have this bash script which connects to a PostgreSQL db and performs a query. I would like to be able to read lines from a .txt file into the query as parameters. What is the best way to do that? Your assistance is greatly appreciated! I have my example code below; however, it is not working.
#!/bin/sh
query="SELECT ci.NAME_VALUE NAME_VALUE FROM certificate_identity ci WHERE ci.NAME_TYPE = 'dNSName' AND reverse(lower(ci.NAME_VALUE)) LIKE reverse(lower('%.$1'));"
(echo $1; echo $query | \
psql -t -h crt.sh -p 5432 -U guest certwatch | \
sed -e 's:^ *::g' -e 's:^*\.::g' -e '/^$/d' | \
sed -e 's:*.::g';) | sort -u
Considering that the file has only one sql query per line:
while read -r line; do echo "${line}" | "your code to run psql here"; done < file_with_query.sql
That means: while read the content of file_with_query.sql line by line, do something with each line.
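For example, filling in the psql part from the question, the loop could look like this (a sketch assuming file_with_query.sql holds one complete query per line):

#!/bin/sh
# run each line of the file as its own query against the crt.sh database
while read -r line; do
  echo "${line}" | psql -t -h crt.sh -p 5432 -U guest certwatch
done < file_with_query.sql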
I want to run a query stored in a file in beeline. This code works OK in PuTTY:
beeline -u "hiveserver" -n "username" -p "password" --outputformat=csv2 --silent=true -e "select * from table;" >output1.txt
When I save the SQL command to query.hql or query.sql and upload it to the server where Hadoop is, the command does not export anything. I get no error.
beeline -u "hiveserver" -n "username" -p "password" --outputformat=csv2 --silent=true -f query.hql >output1.txt
The query in the file works when I run it as !run query.hql directly in beeline.
What is wrong with my query in file approach?
Make sure you have a newline character at the end of the file. Otherwise, beeline will not execute that command; it will just print it onto the beeline terminal. Please let me know if that works.
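If you want to check or fix that from the shell, something like this should do (a rough sketch):

# the last byte of the file should be a newline (\n)
tail -c 1 query.hql | od -c
# append a newline; an extra blank line at the end is harmless to beeline
echo >> query.hql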
Please check if the below is the case.
I need to execute SQL from a batch file.
I am executing the following to connect to Postgres and select data from a table:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
select * from test;
I am able to connect to the database; however, I'm getting the error
'select' is not recognized as an internal or external command,
operable program or batch file.
Has anyone faced such issue?
This is one of the queries I am trying; something similar works in a shell script (please ignore syntax errors in the query if there are any):
copy testdata (col1,col2,col3) from '%filepath%/%csv_file%' with csv;
You could pipe it into psql
(
echo select * from test;
) | C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
When closing parentheses are part of the SQL query, they have to be escaped with three carets.
(
echo insert into testconfig(testid,scenarioid,testname ^^^) values( 1,1,'asdf'^^^);
) | psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
Use the -f parameter to pass the SQL file name:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -f 'sql_batch_file.sql'
http://www.postgresql.org/docs/current/static/app-psql.html
-f filename
--file=filename
Use the file filename as the source of commands instead of reading commands interactively. After the file is processed, psql terminates. This is in many ways equivalent to the meta-command \i.
If filename is - (hyphen), then standard input is read until an EOF indication or \q meta-command. Note however that Readline is not used in this case (much as if -n had been specified).
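For instance, from a batch file you could pipe the SQL in and use - as the file name; something like this should work:

type sql_batch_file.sql | C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -f -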
If running on Linux, this is what worked for me (you need to update the values below with your user, db name, etc.):
psql "host=YOUR_HOST port=YOUR_PORT dbname=YOUR_DB_NAME user=YOUR_USER_NAME password=YOUR_PASSWORD" -f "fully_qualified_path_to_your_script.sql"
You cannot put the query on a separate line; the batch interpreter will assume it's another command instead of a query for psql. I believe you will need to quote it as well.
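In other words, keep the query on the same line and pass it quoted with -c, for example:

C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -c "select * from test;"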
I agree with Spidey:
1] If you are passing a file with the SQL, use the -f or --file parameter
When you want to execute several commands, the best way to do that is to add the -f parameter, and after that just type the path to your file without any " or ' marks (relative paths also work):
psql -h %host% -p 5432 -U %user% -d %dbname% -f ..\..\folder\Data.txt
It also works in .NET Core. I need it to add basic data to my database after migrations.
Kindly refer to the documentation
1] If you are passing a file with the SQL, use the -f or --file parameter
2] If you are passing an individual command, use the -c or --command parameter
If you are trying it from a shell script:
psql postgresql://$username:$password@$host/$database < /app/sql_script/script.sql
I have a query written in a file located at /path/to/query. How can I save the output result to a csv file, without using COPY in the query? I tried the following command, but the output file's fields are separated by " | ".
psql -U username -d dbname -f /path/to/query -o /path/to/output/file -F ','
It is not explained in the documentation, but the -F option requires the -A option (unaligned table output) to work:
psql -U username -d dbname -f /path/to/query -o /path/to/output/file -F ',' -A
If you don't want the headers in your CSV, that is, without extra rows at the top and at the bottom, use the -t option too.
psql -U username -d dbname -f /path/to/query -o /path/to/output/file -F ',' -A -t
From the help:
-A, --no-align unaligned table output mode
-F, --field-separator=STRING
set field separator (default: "|")
-t, --tuples-only print rows only