dbt CLI string variables get inserted as integers from YAML dictionary - dbt

I'm trying to run `dbt run` with some custom variable values:
dbt run --vars "{start_date: '2022-08-01', end_date: '2022-08-02'}"
and then use these variables in contexts like:
WHERE session_date BETWEEN {{ var('start_date') }} AND {{ var('end_date') }}
The values need to be inserted as strings (with the single quotes kept), but the query compiles as:
WHERE session_date BETWEEN 2022-08-01 AND 2022-08-02
which is invalid SQL ("No matching signature for operator BETWEEN for argument types: DATE, INT64, INT64"). I have tried switching the single and double quotes, but the problem remains.
How can I make it respect the quotes passed in the YAML dict?

I solved this by simply adding single quotes around the var() calls:
WHERE session_date BETWEEN '{{ var("start_date") }}' AND '{{ var("end_date") }}'
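With the quotes in place, the compiled SQL should now contain proper string literals, roughly:
WHERE session_date BETWEEN '2022-08-01' AND '2022-08-02'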

Try the PostgreSQL TO_DATE function to convert your input string variable to a date for easier comparison, e.g.:
WHERE session_date BETWEEN to_date('{{ var("start_date") }}', 'YYYY-MM-DD') AND to_date('{{ var("end_date") }}', 'YYYY-MM-DD')

Related

Parse string with `T` to timestamp PostgreSQL

I have this string 2019-02-14T17:49:20.987 which I want to parse into a timestamp. So I am playing with the to_timestamp function and it seems to work fine except... The problem is with this T letter there. How do I make PostgreSQL skip it?
What pattern should I use in to_timestamp?
Of course I can replace the T with a space and then parse it but I find this approach too clumsy.
Quote from the manual:
If there are characters in the template string that are not template patterns, the corresponding characters in the input data string are simply skipped over (whether or not they are equal to the template string characters).
So just put any non-template character there (e.g. X):
select to_timestamp('2019-02-14T17:49:20.987', 'YYYY-MM-DDXHH24:MI:SS.MS')
Online example: https://rextester.com/OHYD18205
Alternatively, you can simply cast the value:
select '2019-02-14T17:49:20.987'::timestamp
The string with T is a valid input literal for timestamp or timestamptz:
select '2019-02-14T17:49:20.987'::timestamp;
timestamp
-------------------------
2019-02-14 17:49:20.987
(1 row)

Why doesn't the PostgreSQL COPY command allow NULL values inside arrays?

I have the following table definition:
create table null_test (some_array character varying[]);
And the following SQL file containing data.
copy null_test from stdin;
{A,\N,B}
\.
When unnesting the data (with select unnest(some_array) from null_test), the second value is "N", when I am expecting NULL.
I have tried changing the data to look as follows (quoting the individual array elements):
copy null_test from stdin;
{"A",\N,"B"}
\.
The same non-NULL value "N" is still inserted.
Why is this not working and is there a workaround for this?
EDIT
As per the accepted answer, the following worked. However, having two different representations of NULL in a COPY command, depending on whether you are dealing with a plain value or an array element, is inconsistent.
copy null_test from stdin;
{"A",NULL,"B"}
\.
\N represents NULL to COPY only when it is the whole value, not when it is part of another value, and \N isn't anything special to PostgreSQL itself. Inside an array, \N is just \N: COPY passes the array literal to the database rather than trying to interpret it using COPY's rules.
You simply need to know how to build an array literal that contains a NULL. From the fine manual:
To set an element of an array constant to NULL, write NULL for the element value. (Any upper- or lower-case variant of NULL will do.) If you want an actual string value "NULL", you must put double quotes around it.
So you could use these:
{A,null,B}
{"A",NULL,"B"}
...
to get NULLs in your arrays.
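A quick way to verify the fix (a hypothetical check, not part of the original answer) is to unnest the array and test each element:
select elem, elem is null as is_null
from null_test, unnest(some_array) as elem;
With the corrected input, the second row comes back with elem NULL and is_null true.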

how to escape a single quote in a pig script

Can Pig scripts use double quotes? If not, how do I escape a single quote? I'm trying to parse a date-time and I'm getting errors:
Unexpected character '"'
And here is the script
logOutput = FOREACH parsedLog GENERATE uid, ToDate(timestamp,"YYYY-MM-DD'T'hh:mm ss:'00'") as theTime:datetime
You can escape a single quote using \\ (double backslash).
%declare CURRENT_TIME_ISO_FORMAT ToString($CURRENT_TIME,'yyyy-MM-dd\\'T\\'HH:mm:ss.SSSZ')
Just be aware that when you use this escaping, you should not reuse the resulting string elsewhere in the script; do everything in a single call.
For example, if you want to send the string to the ISOToDay function, this script will fail:
%declare CURRENT_TIME_ISO_FORMAT ToString($CURRENT_TIME,'yyyy-MM-dd\\'T\\'HH:mm:ss.SSSZ')
%declare TODAY_BEGINNING_OF_DAY_ISO_FORMAT ISOToDay($CURRENT_TIME_ISO_FORMAT)
Instead, you should do:
%declare TODAY_BEGINNING_OF_DAY_ISO_FORMAT ISOToDay(ToString($CURRENT_TIME,'yyyy-MM-dd\\'T\\'HH:mm:ss.SSSZ'))
Try escaping them with \ and using single quotes:
logOutput = FOREACH parsedLog GENERATE uid, ToDate(timestamp,'YYYY-MM-DD\'T\'hh:mm ss:00') as theTime:datetime
Not sure what you mean by '00'.

Redis command line SET value containing double quotes

I want to use the Redis command line (redis-cli) to store JSON values. This is what I do:
redis 127.0.0.1:6379> set test '{"a":"b"}'
This command fails with the message:
Invalid argument(s)
I don't have a problem setting values that don't contain double quotes. What is the correct way to escape the double quotes?
Add backslashes before the quotes:
set test "{\"a\":\"b\"}"
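For reference, a minimal redis-cli session with the escaped value looks roughly like this (output formatting can vary by version):
redis 127.0.0.1:6379> set test "{\"a\":\"b\"}"
OK
redis 127.0.0.1:6379> get test
"{\"a\":\"b\"}"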
We can use single quotes for storing the JSON value:
set name '{"a":"b"}'
Then run a get, e.g. get name
Output: {"a":"b"}
Later Redis releases have fixed this problem; single quotes work fine.

Quoting YAML (for Travis CI)

How would I escape a whole line in YAML? I want to have json='{"title": "travis_saulshanabrook_site","key": "'$(cat ~/.ssh/id_rsa.pub)'"}'
in a list, but I can't get it to parse into a string. I can put single quotes around the whole line, but then I would have to escape every single quote in my string, making it very hard to read. The string will be run as a bash command in Travis CI.
The most elegant solution is to use the literal style | indicator, with the - modifier to strip the final newline. That way no extra quotes are necessary.
If this scalar happens to be the only thing in a YAML file use:
|-
  json='{"title": "travis_saulshanabrook_site","key": "'$(cat ~/.ssh/id_rsa.pub)'"}'
if it is a mapping value for key abc:
abc: |-
  json='{"title": "travis_saulshanabrook_site","key": "'$(cat ~/.ssh/id_rsa.pub)'"}'
or if it is part of a list:
- |-
  json='{"title": "travis_saulshanabrook_site","key": "'$(cat ~/.ssh/id_rsa.pub)'"}'
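Putting that into a .travis.yml sketch (your_cmd is just a placeholder for whatever command consumes the string):
script:
  - |-
    your_cmd json='{"title": "travis_saulshanabrook_site","key": "'$(cat ~/.ssh/id_rsa.pub)'"}'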
I'm not sure there's a solution that escapes that string and keeps it easy to read.
FYI, this is what that string looks like escaped:
script: ! 'your_cmd json=''{"title": "travis_saulshanabrook_site","key": "''$(cat ~/.ssh/id_rsa.pub)''"}'''