Adding today's date to the table name when using CREATE TABLE in standard SQL (GBQ) - google-bigquery

I am quite new to GBQ and any help is appreciated.
I have the query below:
#Standard SQL
create or replace table `xxx.xxx.applications`
as select * from `yyy.yyy.applications`
What I need to do is to add today's date at the end of the table name, so it is something like xxx.xxx.applications_<todays date>;
basically, create the applications table but with the date appended to the end of the name.
I am writing a procedure that creates the table every time it runs, but I need to add the date each time for audit purposes (as a backup).
I have searched everywhere and can't find an exact answer. Is this possible in the Query Editor, as I need to store this as a procedure?
Thanks in advance

BigQuery doesn't support dynamic SQL at the moment, which means that this kind of construction is not possible.
Currently BigQuery supports parameterized queries, but it's not possible to use parameters to dynamically change a table's name, as you can see in the provided link.
BigQuery supports query parameters to help prevent SQL injection when queries are constructed using user input. This feature is only available with standard SQL syntax. Query parameters can be used as substitutes for arbitrary expressions. Parameters cannot be used as substitutes for identifiers, column names, table names, or other parts of the query.
If you need to build a query based on some variable's value, I suggest that you use a script in shell, Python, or any other programming language to create the SQL statement and then execute it using the bq command.
Another approach is to use the BigQuery client library in one of the supported languages instead of the bq command.
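As a minimal sketch of that idea with the Python client library (assuming application default credentials are configured, and reusing the project/dataset/table names from the question as placeholders):
# Build today's suffix and run CREATE TABLE ... AS SELECT through
# google-cloud-bigquery (a sketch, not a full stored procedure).
from datetime import date
from google.cloud import bigquery

client = bigquery.Client()
suffix = date.today().strftime("%Y%m%d")
destination = f"xxx.xxx.applications_{suffix}"
sql = f"CREATE OR REPLACE TABLE `{destination}` AS SELECT * FROM `yyy.yyy.applications`"
client.query(sql).result()  # wait for the copy job to finish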


How do I generate a table name that contains today's date?

It may seem a little strange, but there are already tables named for each date.
In my project, I have a table for each date to make statistics easier to handle.
Of course, I don't think this is always the best way, but this is the table structure for my project.
(It's a common technique in Google BigQuery and Amazon Athena. This question is about Google BigQuery.)
So to get the data, I want to generate today's date in the table name. If I use today's date, I can get the latest day's data without rewriting the code, even on the next day.
I tried, but the code didn't work.
Attempt 1 (did not work):
CONCAT in FROM
SELECT
*
FROM
CONCAT('foo_', FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP(), 'Asia/Tokyo'))
Error:
Table-valued function not found: CONCAT at [4:3]
Attempt 2 (did not work):
create temporary function:
create temporary function getTableName() as (CONCAT('foo_', FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP(), 'Asia/Tokyo')));
Error:
CREATE TEMPORARY FUNCTION statements must be followed by an actual query.
Question
How do I generate a table name that contains TODAY's date?
In this case, I would recommend using wildcard tables in BigQuery, which are supported in Standard SQL.
With wildcard tables you can use _TABLE_SUFFIX, which lets you filter/scan only the tables matching the suffix. The syntax would be as follows:
SELECT *
FROM `test-proj-261014.sample.test_*`
where _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', CURRENT_DATE)
I hope it helps.
Your first query should go like this:
select CONCAT('foo_', FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP(), 'Asia/Tokyo'))
To create a temporary function, use the code below:
create temp function getTableName() as
((select CONCAT('foo_', FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP(), 'Asia/Tokyo'))
));
select getTableName()
The error "CREATE TEMPORARY FUNCTION statements must be followed by an actual query." is because once the temporary functions are defined then you have to use the actual query to use that function and then the validity of function dies out. To define persistent UDFs and use them in multiple queries please go through the link to define permanent functions.You can reuse persistent UDFs across multiple queries, whereas you can only use temporary UDFs in a single query.

BigQuery create Table differences between standard and legacy sql

I have a few questions about the CREATE TABLE syntax in standard and legacy SQL.
The new BigQuery UI doesn't show standard SQL types and shows only legacy types. I understand they map one to one with the legacy types, but the examples for creating partitioned tables show options which are not available in the UI.
If I create a table using a JSON field schema, can I still use standard SQL?
The BigQuery UI only offers partitioning the table by ingestion time, but I want to partition by a date column and I don't see an option for it. If I have to write the DDL manually, I did not find examples of how to use a JSON field schema to construct a CREATE TABLE statement.
The new BigQuery UI doesn't show standard SQL types
BigQuery Standard SQL and Legacy SQL are two options for writing SQL syntax (see this link for more detail) and have nothing to do with the column types in BigQuery. Details on table types can be found in this link; I also find this link helpful.
If I create a table using a JSON field schema, can I still use standard SQL?
To create a table using a JSON schema file you need to use the bq command line; if you need help writing this syntax, let us know.
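The bq command is along the lines of bq mk --table dataset.tableId ./schema.json. If it helps, a hedged sketch of the same idea with the Python client library (the schema.json path and the table ID below are placeholders, not from the question):
# Create a table from a JSON schema file using google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client()
schema = client.schema_from_json("schema.json")  # list of SchemaField objects
table = bigquery.Table("project.dataset.tableId", schema=schema)
client.create_table(table)  # the resulting table can be queried with standard SQL like any other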
but I want to partition by a date column and I don't see an option for it
You can use this Standard SQL syntax to do this:
#standardSQL
CREATE OR REPLACE TABLE `project.dataset.tableId`
PARTITION BY myDate
CLUSTER BY cluster_col AS
SELECT * from sourceTable
Note: myDate is a column in the source table.

BigQuery standard SQL - run query with table decorators ($ signs)

I'm using BigQuery Standard SQL.
I'm trying to use a "$" decorator on a table in order to refer to a specific partition:
SELECT user_id
FROM `raw.events$20161109`
And I'm getting the following error:
Table "raw.events$20161109" cannot include decorator Dismiss
The query validates (bq validation is OK), and the error pops up right after I click the "Run Query" button.
When I use Legacy SQL, I have no problem doing it:
SELECT uid
FROM [raw.events$20161109]
Is there any way to run a query using the decorators with Standard SQL?
I have to do it this way, as a lot of other procedures are based on this decorator format (using Legacy SQL).
At this time, table decorators in BigQuery are only available when using Legacy SQL. There is an open feature request that can be followed to track the progress of bringing this functionality to Standard SQL.

Using a Teradata UDF in SAS Implicit Sql Pass Thru

I am trying to use a Teradata UDF (user-defined function) in SAS implicit SQL, which establishes the connection to Teradata using a LIBNAME statement. Assume that the function is called PTY_DECRYPT and is defined in a database called TEST in Teradata. The purpose of this function is to decrypt values in a column of a view in Teradata.
What works is using the UDF in explicit SQL. Below I am using the function on a column called SSN_NBR in a view called V_TEST_PERS in the database called SAMPLE.
Explicit SQL:
Options debug=DBMS_TIMERS sastrace=',,,d'
sastraceloc=saslog no$stsuffix fullstimer;
Proc Sql;
Connect to TERADATA(User=XXXXX pwd=XXXXX server=XXXXX);
Create Table Final as
select * from connection to teradata
(
Select
sub_id,
SSN_NBR,
TEST.PTY_DECRYPT(SSN_NBR,'T_ssn_test',400,0,0 ) as SSN_NBR_Decrypt
from SAMPLE.V_TEST_PERS
);
disconnect from teradata;
Quit;
But I would like to use the same function in implicit SQL, and it does not work. Any ideas on how to make it work with minimal changes to the implicit SQL?
Implicit SQL:
Options debug=DBMS_TIMERS sastrace=',,,d'
sastraceloc=saslog no$stsuffix fullstimer;
Libname Td Teradata User=XXXXX pwd=XXXXX server=XXXXX database=SAMPLE ;
Proc sql;
Create table Final as
select
sub_id,
SSN_NBR,
TEST.PTY_DECRYPT(SSN_NBR,'T_ssn_test',400,0,0 ) as SSN_NBR_Decrypt
from Td.V_TEST_PERS;
Quit;
In your implicit SQL you reference the view with the LIBNAME alias TD; however, when you reference the UDF you are not aliasing the TEST database containing the UDF with a LIBNAME alias. Syntactically, you may not be able to do that in SAS (e.g. TD.TEST.PTY_DECRYPT() - in fact I wouldn't expect that to work).
The UDF may need to be placed in SYSLIB or TD_SYSFNLIB so that it is in a default search path where the database optimizer can find it without it being fully qualified (e.g. TD_WEEK_BEGIN()). Alternatively, the UDF could be placed in the SAMPLE database, but that likely violates how UDFs are maintained in your environment, as it would in mine.
Otherwise, the UDF call could be embedded in a view on the database, but then you have other issues to consider around the security of that column if your environment does not grant security on a column-level basis to views containing encrypted data elements (e.g. PHI, PII). Without a row/column-level security mechanism in place to dynamically filter a user's ability to see the column you are decrypting, putting the UDF into the view isn't going to work.
I asked the same question on the SAS Communities Forum and I am glad to say that I did find a solution to this problem.
Please see the link below:
https://communities.sas.com/t5/Base-SAS-Programming/Using-a-Teradata-UDF-in-SAS-Implicit-Sql-Pass-Thru/m-p/266850/highlight/false#M52685

Updating an Oracle table with data from a SAS data set using SAS code

I am rather new to SAS and I have run into a problem that I think probably has a better solution than what I've found so far.
I need to update an Oracle db table that has around 1 million rows with data from a SAS data set that has about 10,000 records.
I used an update statement within proc sql, but it takes hours to update the Oracle table. Right now, I am loading the data from the SAS data set into a temporary table in the Oracle db and doing a proc sql pass through execute statement to update the main table from the temporary table. This takes only a couple of minutes at most.
However, this is rather cumbersome to program, and I need to update the Oracle table from multiple functions within my SAS code.
Is there an analog to JDBC batch update in SAS (I used to do Java programming before getting involved in SAS)? Something that is faster than using an update statement in proc sql, but easier to code than a temp table plus an update using pass-through?
Are you using SAS/Access to connect your SAS sessions to Oracle?
In my situation, I use SAS/Connect JDBC.
SAS/Connect is a very simple but effective strategy for interfacing the SAS substrate system to JEE. Essentially, SAS/Connect is yet another telnet implementation by SAS to execute sas -dmr.
I draw the SAS data out through SAS/Connect JDBC into my JSP and then push the data into Oracle or SQL Server using Java programming techniques we are all familiar with.
Read my ancient paper on using SAS/Connect to connect SAS to JEE:
http://www.nesug.org/proceedings/nesug04/ap/ap02.pdf.
By the way, do not try to contact me with the contact details listed in the paper - they are ancient.
In response to your further statement:
I thought you wanted a way to use JDBC to insert the data into Oracle?
My paper shows you how to embed a whole block of SAS macro or SQL or any text in a JSP and then submit that block of text to be run through SAS/Connect.
String datasetname = request.getParameter("datasetname");
String where = request.getParameter("where");
<t:text id="macHello">
%macro hello(datasetname);
data &datasetname;
/* code to create your data */
run;
%mend;
%hello(<%=datasetname%>);
</t:text>
sasConnect.submit(macHello);
<t:text id="SQLgetRecs">
SELECT *
FROM <%=datasetname%>
WHERE <%=where%>
</t:text>
ResultSet mydata = sasConnJDBC.executeQuery(SQLgetRecs);
Then do whatever you need to do with Java, either by interweaving insertion into Oracle per iteration of the ResultSet, or by iterating over the ResultSet to produce a text block of SQL INSERT VALUES statements which you then submit to Oracle over JDBC.
It would just be a single JSP, provided you know how to work with a JSP and are willing to understand how the text-block tag library I wrote works. You see, I use this technique to let a JSP run SAS macros that have been running in production batch mode for ages, without any change to the macros. Not only that, the tag lib allows me to embed Java and JSP variable resolution into the macros or SAS/SQL blocks.
I wrote this block-text tag lib because I used to do such operations in Perl (prior to 2003), where Perl (and other scripting languages) lets you assign a continuous block of text within the script to a variable.
Instructions on using the tag lib:
http://h2g2java.blessedgeek.com/2009/07/jsp-text-custom-tag.html
http://h2g2java.blessedgeek.com/2009/07/referencing-text-jsp-custom-tag-defined.html