How to use "%" character in sql query on linux shell? - sql

I am trying to pull all the jdk packages installed on a set of hosts by sending a SQL SELECT statement to osquery on the Linux shell via pssh.
Here is the query:
pssh -h myhosts -i 'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
but usage of "%" is giving me below error.
Error: near line 1: near "%": syntax error
I tried escaping the %, but the error remains the same. Any ideas how to overcome this error?

You aren't getting this error from your shell but from the query parser, and it's not actually caused by the % character but by the ' that immediately precedes it. Look at where you have quotes:
'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
^----------------------------------------------------^ ^-------------------^
These quotes are consumed by the shell when it parses the argument. Single quotes tell the shell to ignore any otherwise-special characters inside and treat what is within the quotes as part of the argument -- but not the quotes themselves.
After shell parsing finishes, the actual, verbatim argument that gets sent to pssh looks like this:
echo "SELECT name FROM rpm_packages where name like %jdk%;"| osqueryi --json
Note that all of the single quotes have been erased. The result is that your query tool sees the % (presumably modulus) operator in a place it doesn't expect -- right after another operator (like), which makes about as much sense to the parser as name like * jdk. The parser doesn't understand what it means to have two consecutive binary operators, so it complains about the second one: %.
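You can check this yourself: printf prints the argument exactly as the shell delivers it after quote removal (a quick sanity check, not part of the fix):
printf '%s\n' 'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
This prints the de-quoted line shown above.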
In order to get a literal ' there, you need to jump through this hoop:
'\''
^^^^- start quoting again
|||
|\+-- literal '
|
\---- stop quoting
So, to fix this, replace all ' instances inside the string with '\'':
pssh -h myhosts -i 'echo "SELECT name FROM rpm_packages where name like '\''%jdk%'\'';"| osqueryi --json'

osqueryi accepts a single statement on the command line. Eliminating the echo can make quoting a bit simpler:
osqueryi --json "SELECT * FROM users where username like '%jdk%'"
You will, however, need the quotes to pass through your pssh command line.
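Putting this together, a sketch of the full pssh line without the echo (the same '\'' dance still applies to the single quotes inside the SQL):
pssh -h myhosts -i 'osqueryi --json "SELECT name FROM rpm_packages WHERE name LIKE '\''%jdk%'\''"'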
While osqueryi is great for short simple things, if you're building a frequent polling service, osqueryd with scheduled queries is generally simpler.

Related

Running command in perl6, commands that work in shell produce failure when run inside perl6

I'm trying to run a series of shell commands with Perl 6, assigned to the variable $cmd, which look like:
databricks jobs run-now --job-id 35 --notebook-params '{"directory": "s3://bucket", "output": "s3://bucket/extension", "sampleID_to_canonical_id_map": "s3://somefile.csv"}'
Splitting the command by everything after notebook-params
my $cmd0 = 'databricks jobs run-now --job-id 35 --notebook-params ';
my $args = "'{\"directory\": \"$in-dir\", \"output\": \"$out-dir\",
\"sampleID_to_canonical_id_map\": \"$map\"}'"; my $run = run $cmd0,
$args, :err, :out;
Fails. No answer given either by Databricks or the shell. Stdout and stderr are empty.
Splitting the entire command by white space
my @cmd = $cmd.split(/\s+/);
my $run = run $cmd, :err, :out
Error: Got unexpected extra arguments ("s3://bucket", "output": "s3://bucket/extension", "sampleID_to_canonical_id_map": "s3://somefile.csv"}'
Submitting the command as a string
my $cmd = "$cmd0\"$in-dir\", \"output\": \"$out-dir\", \"sampleID_to_canonical_id_map\": \"$map\"}'";
again, stdout and stderr are empty. Exit code 1.
this is something about how run can only accept arrays, and not strings (I'm curious why)
If I copy and paste the command that was given to Perl6's run, it works when given from the shell. It doesn't work when given through perl6. This isn't good, because I have to execute this command hundreds of times.
Perhaps Perl6's shell https://docs.perl6.org/routine/shell would be better? I didn't use that, because the manual suggests that run is safer. I want to capture both stdout and stderr inside a Proc class.
EDIT: I've gotten this running with shell but have encountered other problems not related to what I originally posted. I'm not sure if this qualifies as being answered then. I just decided to use backticks with perl5. Yes, backticks are deprecated, but they get the job done.
I'm trying to run a series of shell commands
To run shell commands, call the shell routine. It passes the positional argument you provide it, coerced to a single string, to the shell of the system you're running the P6 program on.
For running commands without involving a shell, call the run routine. The first positional argument is coerced to a string and passed to the operating system as the filename of the program you want run. The remaining arguments are concatenated together with a space in between each argument to form a single string that is passed as a command line to the program being run.
my $cmd0 = 'databricks jobs run-now --job-id 35 --notebook-params ';
That's wrong for both shell and run:
shell only accepts one argument and $cmd0 is incomplete.
The first argument for run is a string interpreted by the OS as the filename of a program to be run and $cmd0 isn't a filename.
So in both cases you'll get either no result or nonsense results.
Your other two experiments are also invalid in their own ways as you discovered.
this is something about how run can only accept arrays, and not strings (I'm curious why)
run can accept a single argument. It would be passed to the OS as the name of the program to be run.
It can accept two arguments. The first would be the program name, the second the command line passed to the program.
It can accept three or more arguments. The first would be the program name, the rest would be concatenated to form the command line passed to the program. (There are cases where this is more convenient coding wise than the two argument form.)
run can also accept a single array. The first element would be the program name and the rest the command line passed to it. (There are cases where this is more convenient.)
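A minimal sketch of those calling forms, using /bin/ls as a stand-in program:
run '/bin/ls';                     # one argument: the program name
run '/bin/ls', '-l';               # program name plus a command line
run '/bin/ls', '-l', '/tmp';       # program name plus several arguments
my @cmd = '/bin/ls', '-l', '/tmp';
run @cmd;                          # one array: first element is the program name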
I just decided to use backticks with perl5. Yes, backticks are deprecated, but they get the job done.
Backticks are subject to code injection and shell interpolation attacks and errors. But yes, if they work, they work.
P6 has direct equivalents of most P5 features. This includes backticks. P6 has two variants:
The safer P6 alternative to backticks is qx. The qx quoting construct calls the shell but does not interpolate P6 variables so it has the same sort of level of danger as using shell with a single quoted string.
The qqx variant is the direct equivalent of P5 backticks or using shell with a double quoted string so it suffers from the same security dangers.
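A minimal sketch of both variants (assuming a POSIX shell on the system):
my $dir = '/tmp';
my $listing1 = qx{ls /tmp};     # shell is called; no P6 interpolation, like shell with a single-quoted string
my $listing2 = qqx{ls $dir};    # shell is called; $dir interpolates, like P5 backticks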
Two mistakes:
the simplistic split cuts up the last, single parameter into multiple arguments
you are passing $cmd to run, not @cmd
use strict;
my @cmd = ('/tmp/dummy.sh', '--param1', 'param2 with spaces');
my $run = run @cmd, :err, :out;
print(@cmd ~ "\n");
print("EXIT_CODE:\t" ~ $run.exitcode ~ "\n");
print("STDOUT:\t" ~ $run.out.slurp ~ "\n");
print("STDERR:\t" ~ $run.err.slurp ~ "\n");
output:
$ cat /tmp/dummy.sh
#!/bin/bash
echo "prog: '$0'"
echo "arg1: '$1'"
echo "arg2: '$2'"
exit 0
$ perl6 dummy.pl
/tmp/dummy.sh --param1 param2 with spaces
EXIT_CODE: 0
STDOUT: prog: '/tmp/dummy.sh'
arg1: '--param1'
arg2: 'param2 with spaces'
STDERR:
If you can avoid generating $cmd as a single string, I would generate it into @cmd directly. Otherwise you'll have to implement a complex split operation that handles quoting.
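For the databricks command from the question, that could look like the sketch below ($in-dir, $out-dir and $map as in the question; since no shell is involved, the JSON needs no surrounding single quotes):
my @cmd = 'databricks', 'jobs', 'run-now', '--job-id', '35', '--notebook-params',
    '{"directory": "' ~ $in-dir ~ '", "output": "' ~ $out-dir ~ '", "sampleID_to_canonical_id_map": "' ~ $map ~ '"}';
my $run = run @cmd, :err, :out;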

Using SQL LIKE predicate in Db2 command line processor (CLP)

I am trying to use the CLP to call an SQL query that uses LIKE:
SELECT NAME, PLACE, ANIMAL from ZOOTABLE where NAME like 'TIG%' or NAME like 'LIO%';
With the Db2 CLP, I run, per the IBM documentation:
db2 "SELECT NAME, PLACE, ANIMAL from ZOOTABLE where NAME like 'TIG\%' or NAME like 'LIO\%'";
I get this error:
SQL0104N An unexpected token "%" was found following "where NAME like TIG". Expected tokens may include: "". SQLSTATE=
Any suggestions would be greatly appreciated. Thank you!
Why not just use the statement without any escaping? Also remove the semicolon after the closing quote, or put it before the ending quote. The Db2 error comes from the attempted escaping (\%).
db2 "SELECT NAME, PLACE, ANIMAL from ZOOTABLE where NAME like 'TIG%' or BNAME like 'LIO%'"
I was never able to make this work, with ESCAPE or anything else. However, I did manage to export the output to a CSV (Excel) file and that gave me the output I needed:
Logged in as DB2 user:
#!/bin/bash -xv
set -vx
export Host=$1
export sid=$2
db2 "EXPORT TO /tmp/db2select.csv OF DEL MODIFIED BY NOCHARDEL SELECT NAME, PLACE, ANIMAL from ZOOTABLE"
sed -n '/TIG/p' /tmp/db2select.csv | tee /tmp/zooselect.csv
sed -n '/LIO/p' /tmp/db2select.csv | tee -a /tmp/zooselect.csv
This gave me the base output I needed, and I could then manipulate the Excel file as needed.
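For reference, the unescaped LIKE suggested in the other answer can produce the same filtered file in one step; a sketch (-x suppresses the column headings):
db2 -x "SELECT NAME, PLACE, ANIMAL FROM ZOOTABLE WHERE NAME LIKE 'TIG%' OR NAME LIKE 'LIO%'" | tee /tmp/zooselect.csv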

Sqoop: double quotes query

I have a problem with the double quotes on this sqoop query:
select i.Number, i.Date,i.Station, i.lStation,
count(*) ax, “1- Pd” St , b.Type
from Leg jl, yLeg i, senger b,
where jl.LegID = i.LegID and jl.rID = b.erID and b.gID = b.ID
and b.tus not in (1,4) group by Number, Date, tion, b.Type
How can I fix it? With some escape parameter?
First, debug the query with the command below:
sqoop eval -libjars /var/lib/sqoop/ojdbc6.jar --connect jdbc:oracle:thin:@hostname:portnumber/servicename --username user -password password --query "select * from schemaname.tablename where rownum <= 10"
Write your query in --query and check whether the actual query generates the output you expect; you can see the output in the terminal itself.
If the query gives the results you expect, use the sqoop command below to import the table:
sqoop import -libjars /var/lib/sqoop/ojdbc6.jar --connect 'jdbc:oracle:thin:@hostname/service_name' --username user -password password -m 1 --hive-overwrite --hive-import --hive-database database_name --hive-table table_name --target-dir '/user/hive/warehouse/databasename.db/tablename' --query "select * from source_database.source_tablename WHERE 1=1 AND \$CONDITIONS"
The exact problem you are facing with the double quotes can be resolved with an escape character. Please use the WHERE 1=1 AND \$CONDITIONS as is, and paste your query before the WHERE in the sqoop command.
If you still face an error, please paste it; you most likely need another escape character to escape the double quotes.
There are two parts to this question.
The first is what is a valid query for your source database? Most databases have some kind of client or shell that let you enter and execute queries. Your query should be valid as far as the shell or client is concerned.
The second part of your question is how do you take that query (as a String) and pass it to the database via sqoop. The answer to that lies in the way you're running sqoop.
If you're running sqoop via the command line, then you need to identify those characters (usually double quotes) that give your OS fits when embedded in a command-line argument. Use a backslash before those characters to help the OS parse the command correctly. You usually have to put the entire query string inside unescaped double quotes so that the OS treats your query as a single string argument.
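Applied to the query from the question, that looks roughly like this sketch (connection details are placeholders; the Word-style quotes around “1- Pd” must become straight double quotes, escaped with backslashes, and a --query import needs \$CONDITIONS):
sqoop import --connect jdbc:oracle:thin:@hostname:1521/servicename --username user -P \
  --query "select i.Number, i.Date, i.Station, i.lStation, count(*) ax, \"1- Pd\" St, b.Type \
    from Leg jl, yLeg i, senger b \
    where jl.LegID = i.LegID and jl.rID = b.erID and b.gID = b.ID \
    and b.tus not in (1,4) and \$CONDITIONS \
    group by Number, Date, tion, b.Type" \
  --target-dir /tmp/sqoop_out -m 1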
If you're running sqoop via Oozie then I strongly recommend you break the Sqoop command into arguments in the Sqoop action:
<arg>--query</arg>
<arg>select ... count(*) ax, “1- Pd” St , b.Type ... WHERE $CONDITIONS</arg>
So that you can generally paste your query as is into the action.
Of course, nothing is that simple. You still have to remember that the query is sitting inside an XML document, so any character that will mess up the XML parse becomes problematic. The only such characters I've encountered so far are the angle brackets, and I use property substitution (a bit of a kludge, I admit) to solve that problem:
In the Oozie workflow properties file I put:
lessThan=<
and I change my arg from
<arg>SELECT * from MyTable where $CONDITIONS AND (SOME_COL < 1000)</arg>
to
<arg>SELECT * from MyTable where $CONDITIONS AND (SOME_COL ${lessThan} 1000)</arg>
EDIT:
For those of you who don't like my kludge, you could try using a CDATA element to "escape" anything in the query (except, of course, ']]>'):
<arg><![CDATA[SELECT * from MyTable where $CONDITIONS AND (SOME_COL < 1000)]]></arg>

SQLCMD use LIKE '%@%'

I'm trying to run a query using SQLCMD.EXE and have trouble with the LIKE portion.
WHERE email LIKE '%%@%%'
I think it is an error with the cmd prompt rather than SQLCMD.EXE, since I get the error:
Syntax error "@%'"
I am running this via Notepad++ (NppExec) pointing to the bat file like so:
H:\scripts\SQL.bat "$(CURRENT_WORD)"
This causes the query to be wrapped in double quotes before being used by the SQLCMD.EXE call. The SQLCMD.EXE call then runs in the bat file like so:
SQLCMD.EXE -U user -P %pass% -S %server% -Q %sql% -d %table%
It works perfectly on any query I use aside from this LIKE '%%@%%' part.
UPDATE
I've done a few more tests and think I have narrowed it down to a problem with the % and the @.
So queries like these work fine:
SELECT name FROM table WHERE name LIKE 'test'
SELECT name FROM table WHERE name LIKE 'test%'
SELECT name FROM table WHERE name LIKE '%%test'
But these will cause errors:
SELECT name FROM table WHERE name LIKE '%test'
SELECT name FROM table WHERE name LIKE '%test%'
This is fine since I am OK with doubling the % in my queries, but I've tried %%@% and %%@%% and they throw errors: Syntax error "@'"" or Syntax error "@%'"", respectively.
Also, the reason for the variables is that I included some logic so it can detect table names and run for different servers and databases.
Here is the bat file
set sql=%1
iff %@index[%sql%,sur_] GT -1 THEN
SET SERVER=server1
SET table=tablename
SET pass=password
else
SET SERVER=server2
SET table=tablename
SET pass=password
endiff
SQLCMD.EXE -U usr -P %pass% -S %server% -Q %sql% -d %table%
The reason for the weird syntax is due to the command being run through TCC/LE (see here)
I'm not quite sure what your reasoning is for doubling up the %s, but it looks like your intent is to find values in the email column that contain @. If so, you can try rewriting the clause as such:
WHERE CHARINDEX('#', email) > 0
If it's the @ symbol that is tripping things up, use CHAR(64) instead.
WHERE CHARINDEX(CHAR(64), email) > 0
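Dropped into the bat file from the question, that could look like this sketch (the addresses table and email column are hypothetical names):
SQLCMD.EXE -U usr -P %pass% -S %server% -d %table% -Q "SELECT email FROM addresses WHERE CHARINDEX(CHAR(64), email) > 0"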
When running a query with sqlcmd, I found that the % symbol gets removed. Let's say your query is:
SELECT name FROM table WHERE name LIKE 'test%'
The sqlcmd will read your query as
SELECT name FROM table WHERE name LIKE 'test'
So sqlcmd will not filter your results. Please use %% in the query:
SELECT name FROM table WHERE name LIKE 'test%%'
and you will get the result of
SELECT name FROM table WHERE name LIKE 'test%'
I have tested this on SQL Server 2005 & 2008.

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute the loaded SQL. The SQL file needs to stay versatile, meaning it cannot be altered just to make it easier to run from bash (e.g., by escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error because $ab is unquoted: the shell glob-expands the *, replacing it with all the filenames in the current directory.
An easy solution would be to replace "*" with "\*". But I cannot do this, because the file needs to stay executable in programs like DB Visualizer etc.
Could someone give me hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
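For example, a sketch combining those flags (the log path is arbitrary):
db2 -t -v -x -z /tmp/clp.log -f sample.sql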
Redirect stdin from the file.
db2 < sample.sql
In case you have a variable in your script and want it replaced by the shell before the SQL is executed in DB2, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In command prompt do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
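To preview what DB2 will receive, you can run the file through the shell on its own (a sketch, using the exports above):
$ sh File.sql
insert into STAR.DIMENSION values(1,2);
select * from STAR.DIMENSION;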
The shell will replace the exported variables and then DB2 will execute the statements.
Hope it helps.