Execute SQL queries from shell script - sql

I need to execute the following SQL queries from a bash/expect script.
What is the preferred approach to run these queries from a bash script?
# psql ambari -U ambari
Password for user ambari:
psql (9.2.24)
Type "help" for help.
ambari=>
ambari=> select
ambari-> sum(case when ulo = 1 then 1 else 0 end) as ulo_1,
ambari-> sum(case when ulo = 2 then 1 else 0 end) as ulo_2,
...
To access PostgreSQL we do
psql ambari -U ambari
Password for user ambari:bigdata
and when we run this (/tmp/file contains the batch of queries)
psql -U ambari -f /tmp/file ambari
we get
psql: FATAL: no pg_hba.conf entry for host "[local]", user "ambari", database "ambari", SSL off
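For the expect half of the question, one minimal sketch (assuming expect is installed; the prompt text and the bigdata password are taken from the session above) drives the interactive prompt like this:
#!/bin/bash
# Sketch: let expect answer psql's password prompt, then run the query file.
expect <<'END_EXPECT'
set timeout 30
spawn psql -U ambari -f /tmp/file ambari
expect "Password for user ambari:"
send "bigdata\r"
expect eof
END_EXPECT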

I'm using this (note that -h makes psql connect over TCP rather than the Unix socket, which is the "[local]" host your pg_hba.conf error complains about):
dbhost=localhost
dbport=5432
dbuser=user
dbpass=pass
dbname=test
export PGPASSWORD="$dbpass"
dbopts="-h $dbhost -p $dbport -U $dbuser -d $dbname"
Then run sql script from file
psql $dbopts < "$path_to_sql_script"
Or from query var
query="
SELECT 1;
...
"
psql $dbopts <<< "$query"
Also, the password can be set in the special file ~/.pgpass (fields are host:port:database:username:password) like this:
echo "$dbhost:$dbport:$dbname:$dbuser:$dbpass" > ~/.pgpass
chmod 600 ~/.pgpass
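Putting those pieces together, a complete sketch might look like this (-X skips ~/.psqlrc and -v ON_ERROR_STOP=1 aborts on the first error; both are standard psql flags, and the table name is a placeholder since the original FROM clause was elided):
#!/bin/bash
dbhost=localhost
dbport=5432
dbuser=user
dbpass=pass
dbname=test
export PGPASSWORD="$dbpass"
# Feed the batch of queries to psql via a here-document and save the result.
psql -h "$dbhost" -p "$dbport" -U "$dbuser" -d "$dbname" \
     -X -v ON_ERROR_STOP=1 <<'SQL' > result.file
select
  sum(case when ulo = 1 then 1 else 0 end) as ulo_1,
  sum(case when ulo = 2 then 1 else 0 end) as ulo_2
from your_table;  -- placeholder: the original FROM clause was not shown
SQL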

Use the switches -c command or -f filename, i.e.:
$ psql -U ambari -c "SELECT ... ;" ambari # > result.file
or:
$ cat file.sql
SELECT
... ;
$ psql -U ambari -f file.sql ambari # > result.file
Probably -f, as your query seems lengthy. Use > result.file to store the query result in a file.
As for the password, store the following kind of entry in the .pgpass file in the user's home dir:
$ cat >> ~/.pgpass
#hostname:port:database:username:password
localhost:5432:ambari:ambari:t00M4NY53CR3t5
and set its permissions to the user's eyes only:
$ chmod 600 ~/.pgpass
Also, consider psql -h hostname if the database is not running on localhost (this needs to be reflected in the .pgpass entry as well).
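For single values in a script, the -t (tuples only) and -A (unaligned output) switches, both standard in psql, make the result easy to capture in a shell variable; a small sketch:
#!/bin/bash
# Sketch: assumes the ~/.pgpass entry above, so no password prompt appears.
count=$(psql -U ambari -d ambari -X -t -A -c "SELECT count(*) FROM pg_stat_activity;")
echo "active connections: $count"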

Related

Shell Script to execute SQL Queries

I have 2 SQL queries which I execute to get the size of a table and the number of records in it:
[~] mysql -u <username> -h <hostname> -p <db_name> -e "SQL_Query 1" > out.txt
[~] mysql -u <username> -h <hostname> -p <db_name> -e "SQL_Query 2" > out1.txt
How can I write a shell script to execute these queries?
This is a shell script, supported by bash / sh, and probably others:
#!/bin/sh
mysql -u <username> -h <hostname> -p > output.log <<EOF
SELECT query 1 ...;
SELECT query 2 ...;
EOF
Note: You'll need to address the password entry issue, which can be done in several ways.
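One common way to handle it is a ~/.my.cnf option file, which the mysql client reads automatically (a sketch; the values are placeholders):
# ~/.my.cnf -- keep this readable only by you: chmod 600 ~/.my.cnf
[client]
user=<username>
password=<password>
With that in place, the script above no longer needs -p and won't prompt.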
You can also enter your SQL in a file (file.sql) and redirect input from that file:
mysql -u <username> -h <hostname> -p < file.sql > output.log
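So for the original two commands, the whole script can be as small as this sketch (with the ~/.my.cnf above in place, -p is dropped and nothing prompts):
#!/bin/sh
# Sketch: run each query separately so each result lands in its own file.
mysql -u <username> -h <hostname> <db_name> -e "SQL_Query 1" > out.txt
mysql -u <username> -h <hostname> <db_name> -e "SQL_Query 2" > out1.txt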

How to execute sql query from command line tool for SQL Server?

I am new to SQL Server; can somebody help me execute a SQL query from the command-line tool in SQL Server?
Use the command below to execute filename.sql using SQLCMD:
SQLCMD -d <database-name> -U <User-name> -P <Password> -i filename.sql -o output.txt
-d = Database Name
-U = User Name
-P = Password
-i = Filename.sql
-o = output.txt
-E = Windows Authentication mode; if you specify this, skip the -U & -P parameters.
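For a quick one-off statement instead of a script file, the -Q switch (also standard in SQLCMD; -S names the server instance) runs a query and exits, e.g. with Windows Authentication:
SQLCMD -S localhost -E -d <database-name> -Q "SELECT COUNT(*) FROM sys.tables" -o output.txt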

Write beeline query results to a text file

I need to write the results of executing a Hive query to a file. How do I do it? Currently, they're printed to the console.
beeline -u db_url -n user_name -p password -f query.sql
I tried:
beeline -u db_url -n user_name -p password -f query.sql 2> output.txt
but output.txt just contains the connection open/close messages, not the results of the query, which are still printed to the console.
I assume beeline -u db_url -n user_name -p password -f query.sql > output.txt should work. Without the 2.
"2" in your command is errlog, not the stdout
so "...query.sql 2> output.txt" would put the errlog results into your text file, while "...query.sql > output.txt" would put the actual output into the text file.
In addition to @dds's answer, you can try adding the silent option to get rid of all the extra output, like the connection start and close status, being printed in the output file:
beeline -u db_url -n user_name -p password --silent=true -f query.sql > output.txt
I think you meant to type "query.sql2" instead of "query.sql 2". Here's the fixed command line:
beeline -u db_url -n user_name -p password -f query.sql2 > output.txt
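Putting the options from this thread together, a full invocation might look like this (--outputformat=csv2 is a standard beeline option for machine-readable output; the connection values are the thread's placeholders):
beeline -u db_url -n user_name -p password --silent=true --outputformat=csv2 -f query.sql > output.txt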

Import dump/sql file into my postgresql database on Linode

I recently moved my Ruby on Rails 4 app from Heroku to Linode. Everything has been set up correctly, but I need to populate my database with a file; let's call it movies.sql.
I am not very familiar with PostgreSQL commands or VPSes, so I'm having trouble getting this done. I uploaded the file to Dropbox, since I saw many SO posts saying you can use S3/Dropbox.
I saw different commands like this (unsure how to go about it in my situation):
psql -U postgres -d testdb -f /home/you/file.sql
psql -f file.sql dbname
psql -U username -d myDataBase -a -f myInsertFile
So which is the correct one in my situation, and how do I run it when I SSH into my Linode? Thanks.
You'll need to get the file onto your server or you'll need to use a different command from your terminal.
If you have the file locally, you can restore without sshing in, using the psql command:
psql -h <ip_address_of_server> -U <database_username> -d <name_of_the_database> -f local/path/to/your/file.sql
Otherwise, the command is:
psql -U <database_username> -d <name_of_the_database> < remote/path/to/your/file.sql
-U sets the db username, -h sets the host, -d sets the name of the database, and -f tells the command you're restoring from a file.
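If the dump is still on your local machine, a sketch of the copy-then-restore route (hostname, user, and paths are placeholders):
# Copy the dump up to the Linode, then run it there over SSH.
scp movies.sql you@your_linode_ip:/tmp/movies.sql
ssh you@your_linode_ip 'psql -U <database_username> -d <name_of_the_database> -f /tmp/movies.sql'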

MySQL dump structure of all tables and data of some

I'm trying to dump the structure of all the tables in our database, and then only the data of the ones I specifically want, but I seem to be doing something wrong, as the empty tables aren't being created for the ones I exclude from the data dump.
I have a text file which specifies which tables I want to dump the data for (called showtables.txt):
SHOW TABLES FROM mydb
WHERE Tables_in_mydb NOT LIKE '%_history'
AND Tables_in_mydb NOT LIKE '%_log';
I am then running this command to dump the structure of all tables, and then the data of the tables returned by that query in the text file:
mysqldump -u root -pmypassword mydb --no-data > mydump.sql; mysql -u root -pmypassword < showtables.txt -N | xargs mysqldump mydb -u root -pmypassword > mydump.sql -v
I am getting the dump of all the tables included in the results of the showtables query, but I am not getting the structures of the rest of the tables.
If I run just the structure part as a single command, that works fine and I get the structures dumped for all tables. But combining it with the data dump seems to not work.
Can you point me to where I'm going wrong with this?
Thanks.
I think you've got the order of your command-line arguments wrong (the redirection to a file should come at the end), and you need an extra parameter for xargs so we can specify the database name to mysqldump.
Additionally, you need to append (>>) the dump data; otherwise you'd overwrite the mydump.sql file for each table:
mysqldump -u root -pmypassword mydb --no-data > mydump.sql
mysql -u root -pmypassword -N < showtables.txt | xargs -I {} mysqldump -v -u root -pmypassword mydb {} >> mydump.sql
Sources: http://www.cyberciti.biz/faq/linux-unix-bsd-xargs-construct-argument-lists-utility/
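A quick sanity check that the combined dump really contains every table's structure (a sketch; compare the two counts by eye):
grep -c 'CREATE TABLE' mydump.sql
mysql -u root -pmypassword -N -e 'SELECT count(*) FROM information_schema.tables WHERE table_schema = "mydb"'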
Working off of Jon's answer: the -I in xargs will run a separate mysqldump command for each table. It's easier to just use the xargs default, which appends the output of the previous command as arguments to the next command; mysqldump accepts a list of all the tables you'd like to dump as its last arguments.
My solution also shows connecting through a bastion host. gzip'ing before streaming over the SSH connection is vastly faster than sending the uncompressed SQL over the wire.
FILE=~/production.sql.gz
HOST=ext-db-read-0.cdzvblmx0n9h.us-west-1.rds.amazonaws.com
USER=username
PASS="s3cret"
DB=myapp_prod
EXCLUDE="'activities', 'changelogs'"
ssh bastion.mycompany.com <<EOF > $FILE
mysqldump -h $HOST -u $USER -p$PASS $DB --no-data | gzip
mysql -h $HOST -u $USER -p$PASS -N -e "SHOW TABLES WHERE Tables_in_$DB NOT IN ($EXCLUDE)" $DB | xargs mysqldump -v -h $HOST -u $USER -p$PASS $DB | gzip
EOF
If you don't want to save the .gz, just pipe it through gzip -d (writing to a plain .sql file instead of $FILE):
ssh bastion.mycompany.com <<EOF | gzip -d > ~/production.sql
etc
or directly to your local db:
ssh bastion.mycompany.com <<EOF | gzip -d | mysql -uroot myapp_development
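Note that the heredoc delimiter is deliberately unquoted, so the local shell expands $HOST, $USER, $PASS, and $DB before the lines reach the bastion. To load a saved dump into a local database later, a one-line sketch (the database name is the placeholder from above):
gunzip -c ~/production.sql.gz | mysql -u root myapp_development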