I have a BigQuery table with a column where some values are '\N' (without the quotes). I want to write a query with a WHERE clause on that field.
This is my command: "SELECT barcode FROM [mydataset1.mytab1] where barcode = '\N' and length(barcode) < 5"
The command works perfectly on Windows and returns the records whose barcode is \N, but the same command returns an error on Linux. I think the special character needs to be written differently there.
I tried "SELECT barcode FROM [mydataset1.mytab1] where barcode = '/\N' and length(barcode) < 5" and that does not work either. Could you let me know how to modify the query so it works in a Linux environment?
I have attached screenshots of the working and non-working runs.
http://goo.gl/9p6cwD (Windows works)
http://goo.gl/DeAHij (Linux gives error)
Try using \\\. For instance, this query works:
$ bq query "SELECT '\\\N';"
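Why three backslashes: inside double quotes, a POSIX shell collapses the first two into one backslash and keeps the third (a backslash before an ordinary character like N is preserved), so bq receives SELECT '\\N'; and BigQuery's string parser then unescapes \\ to a single backslash. A quick way to check what the shell actually passes along (assuming bash):
$ echo "SELECT '\\\N';"
SELECT '\\N';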
I am trying to pull all the JDK packages installed on a set of hosts by sending an SQL SELECT statement to osquery on a Linux shell via pssh.
Here is the query:
pssh -h myhosts -i 'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
but the use of "%" is giving me the error below.
Error: near line 1: near "%": syntax error
I tried to escape the %, but the error remains the same. Any ideas how to overcome this error?
You aren't getting this error from your shell but from the query parser, and it isn't actually caused by the % character but by the ' that immediately precedes it. Look at where you have quotes:
'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
^----------------------------------------------------^ ^-------------------^
These quotes are consumed by the shell when it parses the argument. Single quotes tell the shell to ignore any otherwise-special characters inside and treat what is within the quotes as part of the argument -- but not the quotes themselves.
After shell parsing finishes, the actual, verbatim argument that gets sent to pssh looks like this:
echo "SELECT name FROM rpm_packages where name like %jdk%;"| osqueryi --json
Note that all of the single quotes have been erased. The result is that your query tool sees the % (presumably modulus) operator in a place that it doesn't expect -- right after another operator (like) which makes about as much sense to the parser as name like * jdk. The parser doesn't understand what it means to have two consecutive binary operators, so it complains about the second one: %.
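You can see exactly what the shell produces by echoing the same argument yourself (a quick sanity check, assuming a POSIX shell):
$ echo 'echo "SELECT name FROM rpm_packages where name like '%jdk%';"| osqueryi --json'
echo "SELECT name FROM rpm_packages where name like %jdk%;"| osqueryi --json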
In order to get a literal ' there, you need to jump through this hoop:
'\''
^^^^- start quoting again
|||
|\+-- literal '
|
\---- stop quoting
So, to fix this, replace all ' instances inside the string with '\'':
pssh -h myhosts -i 'echo "SELECT name FROM rpm_packages where name like '\''%jdk%'\'';"| osqueryi --json'
osqueryi accepts a single statement on the command line. Eliminating the echo can make quoting a bit simpler:
osqueryi --json "SELECT name FROM rpm_packages where name like '%jdk%'"
You will, however, need the quotes to pass through your pssh command line.
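For instance, a sketch that wraps the remote command in double quotes so the single quotes around the LIKE pattern survive shell parsing (assuming pssh hands the string to a remote POSIX shell):
pssh -h myhosts -i "osqueryi --json \"SELECT name FROM rpm_packages where name like '%jdk%';\""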
While osqueryi is great for short simple things, if you're building a frequent polling service, osqueryd with scheduled queries is generally simpler.
I want to execute an SQL file with sqlplus, but when I try it in PowerShell ISE the output just shows the sqlplus usage text (see the attached screenshot).
The command I used in ISE is:
sqlplus "username/password@database @C:\Path\To\file.sql"
When I run the same command in CMD or regular PowerShell it works without problems; the result is just a dummy SELECT 1 FROM dual.
I have tried putting the path in single quotes (') with and without the @ (inside and outside of the quotes), but nothing works. I also didn't find much when googling the issue.
I also tried just connecting, and that works without problems, although I can't type anything after it connects (see the second screenshot).
Because you are doing it wrong. The real syntax is:
sqlplus username/password@TnsAlias '@c:\path\to\DBscript.sql' | out-file 'c:\temp\sql-output.txt'
I think you placed the single quote (') too early.
Or try this without out-file and store the output in a variable:
$output = sqlplus username/password@TnsAlias '@c:\path\to\DBscript.sql'
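If ISE still misparses it, a sketch using PowerShell's call operator with every piece quoted, so @ is never read as PowerShell's splatting operator (the alias, credentials, and paths are placeholders):
& sqlplus 'username/password@TnsAlias' '@C:\Path\To\file.sql' | Out-File 'C:\temp\sql-output.txt'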
I'm trying to run a BigQuery query from the command line, but because my query is very long I've written it in a text file. The query works from the GUI, and I'm overwriting a table that already exists.
bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable '`cat query.txt`'
However, I'm getting error results:
Error in query string: Error processing job
'dev:bqjob_r_00000123456789456123_1': Encountered "
"\'cat query.txt\' "" at line 1, column 1.
Was expecting: EOF
Do I need to put the entire file path in the .txt filename? (this doesn't seem to make a difference)
Are there any characters I need to be careful with in the text file (e.g. "\" or quotation marks) ?
I'm using where clauses and group by clauses - is that an issue?
Instead of cat, just redirect the input from the file. The command would be:
bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable < query.txt
This will send the contents of query.txt to the bq tool.
Elliot is right; and if you want to cat, sed, or preprocess anything first, pipe it:
cat query.txt | bq query
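The original command failed because single quotes keep the shell from performing backtick command substitution, so bq received the backtick expression as literal text (exactly what the error message shows). If you want substitution rather than redirection, drop the single quotes; a sketch, assuming a POSIX shell:
bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable "$(cat query.txt)"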
This is equivalent to my earlier question here, but for sqlite.
As before, I am trying to do the following using the sqlite3 command line client.
UPDATE my_table set my_column=CONTENT_FROM_FILE where id=1;
I have looked at the documentation on .import, but that seems to be a little heavyweight for what I am trying to do.
What is the correct way to set the value of one field from a file?
The method I seek should not impose constraints on the contents of the file.
Assuming the file content is all UTF-8 text and doesn't contain any quote characters that would be misinterpreted, you could do this (assuming a POSIX shell; on Windows try Cygwin). Note the first redirection uses > so temp.sql starts fresh:
$ echo "UPDATE my_table set my_column='" > temp.sql
$ cat YourContentFile >> temp.sql
$ echo "' where id=1;" >> temp.sql
$ sqlite3
SQLite version 3.7.13 2012-07-17 17:46:21
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> .read temp.sql
If the content does have single quotes, escape them first with a simple find-and-replace (you'd need to do that anyway).
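The escaping itself can be a one-liner, since SQL escapes a single quote by doubling it; a sketch, assuming sed is available:
$ sed "s/'/''/g" YourContentFile > escaped.txt
Then cat escaped.txt into temp.sql instead of the original file.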
hth!
See: http://www.sqlite.org/cli.html#fileio
sqlite> INSERT INTO images(name,type,img)
...> VALUES('icon','jpeg',readfile('icon.jpg'));
In your case:
UPDATE my_table set my_column=readfile('yourfile') where id=1;
If you don't have readfile, you need to .load the module first.
Note
I found that the provided fileio module (http://www.sqlite.org/src/artifact?ci=trunk&filename=ext/misc/fileio.c) uses sqlite3_result_blob. When I used it in my project with text columns, it resulted in Chinese characters being inserted into the table rather than the text read from the file. This can be fixed by changing it to sqlite3_result_text. See http://www.sqlite.org/loadext.html for instructions on building and loading run-time extensions.
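A sketch of building and loading the extension on Linux, assuming fileio.c and the SQLite headers sit in the current directory (flags and paths vary by platform; see the loadext page above):
$ gcc -g -fPIC -shared fileio.c -o fileio.so
sqlite> .load ./fileio
sqlite> UPDATE my_table set my_column = readfile('yourfile') where id = 1;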
I had a look at the BigQuery command line tool documentation and I saw that you are able to use timestamp literals in a WHERE clause. The documentation shows the following example:
$ bq query "SELECT name, birthday FROM dataset.table WHERE birthday <= '1959-01-01 01:02:05'"
Waiting on job_6262ac3ea9f34a2e9382840ee11538ef ... (0s) Current status: DONE
+------+---------------------+
| name | birthday |
+------+---------------------+
| kim | 1958-06-24 12:18:35 |
+------+---------------------+
As dataset.table is not a public dataset, I built an example using the wikipedia dataset.
SELECT title, timestamp, SEC_TO_TIMESTAMP(timestamp) AS human_timestamp
FROM publicdata:samples.wikipedia
HAVING human_timestamp>'2008-01-01 01:02:03' LIMIT 5
The example works in the BigQuery browser but not in the bq tool. Why? I tried escape characters and several combinations of single and double quotes without success. Is it a Windows issue? Here is a screenshot:
EDIT: This is BigQuery CLI 2.0.18
I know that "It works on my machine" isn't a satisfying answer, but I've tried this on my Mac and on a Windows machine, and it appears to work fine on both. Here is the output from my Windows machine for the same query you've specified:
C:\Users\Jordan Tigani>bq query "SELECT title, timestamp, SEC_TO_TIMESTAMP(timestamp) AS human_timestamp FROM publicdata:samples.wikipedia HAVING human_timestamp>'2008-01-01 01:02:03' LIMIT 5"
Waiting on bqjob_r607b7a74_00000144b71ddb9b_1 ... (0s) Current status: DONE
Can you make sure that the quotes you're using aren't pasted smart quotes and there aren't any stray unicode characters that might confuse the parsing?
One other hint is to use the --apilog=- option, which tells BigQuery to print out all interaction with the server to stdout. You can then see exactly what is getting sent to the BigQuery backend, and verify that the quotes are as expected.
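For example (the global flag goes before the command; the dash means log to stdout):
$ bq --apilog=- query "SELECT 1"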
I found out that the problem is due to the greater-than sign > in the Windows command line. It does not have anything to do with the google-cloud-sdk, sorry.
It seems that you have to escape the sign with a caret in the command line: ^>
I found this in a Google Groups post (by Todd and Margo Chester), and the official reference on the Microsoft site.
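A quick demonstration of the caret escape in cmd.exe, where an unescaped > outside of double quotes starts a redirection:
C:\> echo human_timestamp^>'2008-01-01 01:02:03'
human_timestamp>'2008-01-01 01:02:03'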