I am going to run a very simple query using a Python script. To make sure my script is working, I am going to create a very simple .sql file with a simple query that does not depend on any other table.
something like: select currentDate
or something similar
You don't need a from clause in BigQuery. So something like this:
select 1 as x
or:
select current_date as curdate
I would suggest that you give the column a name so you can verify that you can access the column in your code.
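For example, the whole test .sql file could be a single line combining both suggestions (a minimal sketch; the aliases are only placeholder names):
-- No FROM clause needed; the aliases let your Python code access each column by name.
select 1 as x, current_date as curdate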
I am trying to build a fully parametrised pipeline template in ADF. With the work I have done so far, I can do a full load without any issues, but when it comes to delta load, my queries do not seem to work. I believe the reason is that my WHERE clause looks somewhat like this:
SELECT #{item().source_columns} FROM #{item().source_schema}.#{item().source_table}
WHERE #{item().source_watermarkcolumn} > #{item().max_watermarkcolumn_loaded} AND #{item().source_watermarkcolumn} <= #{activity('Watermarkvalue').output.firstRow.max_watermarkcolumn_loaded}
where 'max_watermarkcolumn_loaded' is a datetime and the activity output is, of course, a string.
Please correct me if my assumption is wrong and let me know what I can do to fix it.
EDIT:
screenshot of the error
ADF is picking up the date from the SQL column 'max_watermarkcolumn_loaded' in this format '"2021-09-29T06:11:16.333Z"', and I think that's where the problem is.
I tried to reproduce this error by giving the parameter to a sample query without single quotes.
Wrap the date parameters in single quotes.
Corrected query:
SELECT #{item().source_columns}
FROM #{item().source_schema}.#{item().source_table}
WHERE #{item().source_watermarkcolumn} > '#{item().max_watermarkcolumn_loaded}'
  AND #{item().source_watermarkcolumn} <= '#{activity('Watermarkvalue').output.firstRow.max_watermarkcolumn_loaded}'
With this query, the pipeline runs successfully.
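For illustration, with hypothetical table and column names, the quoted expressions should resolve at runtime to something like this:
SELECT col_a, col_b FROM dbo.source_table
WHERE modified_date > '2021-09-01T00:00:00.000Z'
  AND modified_date <= '2021-09-29T06:11:16.333Z'
With the quotes, the timestamps arrive at the source database as string literals it can interpret as dates; without them, the raw value breaks the SQL syntax.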
I am trying to parameterize certain WHERE clauses to standardize my Postgres SQL scripts for DB monitoring, but I have not found a solution that will allow the following script to run successfully:
variablename = "2021-04-08 00:00:00"
select * from table1
where log_date > variablename;
select * from table2
where log_date > variablename;
Ideally, I would be able to run each script separately, but being able to find/replace the variable line would go a long way for productivity.
Edit: I am currently using DBeaver to run my scripts
To do this in DBeaver, I figured out you can use the Dynamic Parameter Bindings feature:
https://github.com/dbeaver/dbeaver/wiki/SQL-Execution#dynamic-parameter-bindings
Here is a quick visual demo of how to use it:
https://twitter.com/dbeaver_news/status/1085222860841512960?lang=en
select * from table1
where log_date > :variablename;
When executing the query, DBeaver will prompt you for the desired value and remember it when running another query.
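Applied to the scripts from the question, both statements can share the same named parameter, so DBeaver prompts once and reuses the value:
select * from table1
where log_date > :variablename;

select * from table2
where log_date > :variablename;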
I have a simple query that I'm sure anyone who's not a novice will be able to solve easily. I have a table which has a bunch of 'tags' for Instagram, and I am trying to extract TAGS where DATE = today.
A rough copy of what I want is shown below. I know the syntax is incorrect, which is why I'm struggling, but I think you can see what I'm trying to achieve. Any advice welcome :)
ExecuteSQL
(
SELECT IGs::tag
FROM IGs::
WHERE IGs::creation date = 22/10/2048
;)
However, I would prefer that creation date = $$todaysdate (I will set the variable before the ExecuteSQL in the script).
You need to put quotes around the query, and it is better to pass the date as a query parameter (the ? placeholder) than to build it into the string. Something like this:
ExecuteSQL (
    "SELECT tag
     FROM IGs
     WHERE \"creation date\" = ?" ;
    "" ; "" ; $$todaysdate
)
I am new to Informatica Data Quality Analyst (version 9.5.1 HotFix 3) and I am having trouble generating a basic SQL statement.
The SQL statement is being written against a mapping specification of a table that was originally imported as a flat file. The statement looks like:
Select ColumnA, ColumnB FROM Table1
WHERE Table1.ColumnA = 'S'
The SELECT ... FROM portion of the statement works fine, but I encounter errors when I throw in the WHERE clause. I think my statement looks like standard SQL, so I'm not sure why this will not work. Does Informatica Analyst accept SQL written only in a specific form? Are the inverted commas causing problems?
The query you are trying to execute should work as written. If it is not fetching the results you expect, try the following steps:
1) Load the data from the source flat file into any database (e.g. Oracle). You can import the data from the flat file into a table directly via SQL Developer.
2) Execute the query with the filter condition there. If it doesn't fetch any rows either, your query in IDQ is returning correct results. If it does, there is something missing in your IDQ code.
Is there a way to query the database in such a way as to search for a particular value in every table across the whole database?
Something like a file search in Eclipse, which searches across the whole workspace and project?
Sorry about that ... it's MS SQL 2005.
SQL Workbench/J has a built-in tool and command to do that.
It's JDBC-based and should also work with SQL Server.
You will need to use the LIKE operator and search through each field separately, for example:
SELECT * FROM <table name>
WHERE (<field name1> LIKE '%<search value>%') OR
(<field name2> LIKE '%<search value>%') OR
... etc.
This isn't a quick way though.
I think the best way would be to either
1) programmatically generate the query and run it, or
2) use a GUI tool for the SQL server you are using which provides this functionality.
In MySQL you can use the UNION operator, like:
(SELECT * FROM tableA WHERE name = 'abc')
UNION
(SELECT * FROM tableB WHERE middlename = 'pqr')
and so on.
Use full-text search for efficiency:
http://dev.mysql.com/doc/refman/5.0/en/fulltext-search.html
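As a rough sketch (this assumes you have added a FULLTEXT index on the column you want to scan; the table and column names are only placeholders):
-- assumes e.g. ALTER TABLE tableA ADD FULLTEXT idx_name (name);
SELECT * FROM tableA WHERE MATCH(name) AGAINST('abc');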
Well, your best bet is to write a procedure to do this. To give you some pointers: you can use INFORMATION_SCHEMA.TABLES to get a list of all the tables in a given database and INFORMATION_SCHEMA.COLUMNS to get a list of all columns. These views also give you the data type of each column, so you will need a few loops over them to do the magic, as sketched below.
It should be mentioned that most RDBMSs nowadays support these information schema views.
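A minimal T-SQL sketch of that approach for SQL Server; the @SearchValue and the restriction to character columns are assumptions, and it only prints the generated statements so you can review or run them one by one:
DECLARE @SearchValue NVARCHAR(100);
SET @SearchValue = N'abc';

DECLARE @schema SYSNAME, @table SYSNAME, @column SYSNAME, @sql NVARCHAR(MAX);

-- Loop over every character-typed column in the current database.
DECLARE col_cursor CURSOR FOR
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar');

OPEN col_cursor;
FETCH NEXT FROM col_cursor INTO @schema, @table, @column;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one LIKE query per column; PRINT it, or EXEC sp_executesql @sql to run it.
    SET @sql = N'SELECT ''' + @schema + N'.' + @table + N'.' + @column + N''' AS found_in, * '
             + N'FROM ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table)
             + N' WHERE ' + QUOTENAME(@column) + N' LIKE ''%'
             + REPLACE(@SearchValue, '''', '''''') + N'%''';
    PRINT @sql;
    FETCH NEXT FROM col_cursor INTO @schema, @table, @column;
END;

CLOSE col_cursor;
DEALLOCATE col_cursor;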
In phpMyAdmin, go to your database and open the Search tab.
There you will be able to select all of your tables and search through your entire DB in one go.