Efficient way to query more than 50 BigQuery projects data - google-bigquery

I'm trying to write a query that fetches data from almost 50 different BigQuery projects.
I'm using UNION ALL, but writing 'UNION ALL' 49 times for 50 different projects is not efficient. Is there a more efficient way to fetch data from all the projects?
This is the query I used:
SELECT '1' AS id
FROM `d-1.daataa.table`
UNION ALL
SELECT '2' AS id
FROM `d-2.daataa.table`
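If the per-project statements really are as uniform as the example, one way to avoid hand-writing 49 UNION ALLs is to generate the statement from a list of project IDs. A minimal Python sketch, where the project names and the `daataa.table` path are just the placeholders from the question:

```python
# Sketch: build the 50-way UNION ALL instead of typing it by hand.
# Project IDs and the dataset/table path are hypothetical placeholders.
projects = [f"d-{i}" for i in range(1, 51)]

query = "\nUNION ALL\n".join(
    f"SELECT '{i}' AS id FROM `{p}.daataa.table`"
    for i, p in enumerate(projects, start=1)
)
print(query)
```

The generated string can then be submitted as a single query; it joins 50 SELECTs with 49 UNION ALLs.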

Related

Oracle SQL: single SELECT request for multiple owners

I would like your advice on the best method to use.
From a single Oracle server, we have 3 different owners that contain the exact same tables/data structures. Essentially, the owners allow us to separate the data by administrative regions.
Currently, when I want to do a SQL query on the entire data set (regions), I have to do 3 separate queries:
select * from owner1.Table1 union all
select * from owner2.Table1 union all
select * from owner3.Table1
In this simple example, there are no issues, but if the query is complex, it quickly becomes very difficult to maintain.
So, would there be a more efficient way to make only one global query instead of 3? I guess it's possible via a PL/SQL script or dynamic SQL, but I don't know...
Basically, I would like to be able to do (where owners would contain the names of my 3 owners):
select * from owners.Table1
It is not possible to build views that would contain the data of all 3 owners (there would be too many).
Thanks
Use a subquery factoring clause (a.k.a. a CTE) to combine the tables with the simple UNION ALL queries, then run the complex query against the combined result rather than against each individual table:
WITH subquery_name AS (
select * from owner1.Table1 union all
select * from owner2.Table1 union all
select * from owner3.Table1
)
SELECT <your complex query>
FROM subquery_name;
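To see the pattern end to end, here is a small runnable sketch using SQLite in place of Oracle (table and column names are invented): the CTE unions the per-owner tables once, and the aggregate query runs against the combined result.

```python
import sqlite3

# Illustration of the CTE-over-UNION-ALL pattern with invented tables;
# SQLite stands in for Oracle, and "owner" prefixes become table names.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE owner1_Table1 (region TEXT, amount INTEGER);
CREATE TABLE owner2_Table1 (region TEXT, amount INTEGER);
INSERT INTO owner1_Table1 VALUES ('north', 10);
INSERT INTO owner2_Table1 VALUES ('south', 20);
""")

rows = conn.execute("""
WITH subquery_name AS (
    SELECT * FROM owner1_Table1
    UNION ALL
    SELECT * FROM owner2_Table1
)
SELECT region, SUM(amount) FROM subquery_name
GROUP BY region ORDER BY region
""").fetchall()
print(rows)  # [('north', 10), ('south', 20)]
```

The complex part of the query appears only once, after the WITH clause, which is what makes this easier to maintain than three copies.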

Do a select query on all tables of PostgreSQL database using pgAdmin 4

I have a database with around 90 schemas. In each of these schemas, under "Materialized Views", there is a view called "product_visitor_view". I create a SELECT script for it and run the following to see the results:
SELECT priority, count(*)
FROM ag_modem_01.product_visitor_view
group by priority;
However, I cannot do this manually for all ~90 schemas. Is there a way to do this for all schemas so that the results are shown per schema, and how can I do this?
Thank you in advance.
You can generate the SQL for each schema using the query below.
The idea is that instead of writing every query manually, you use the system catalog pg_matviews, which contains information about materialized views.
Once you have the list, you just need to UNION ALL all the rows.
select
  string_agg(
    'select '''||schemaname||''' as schema, priority, count(*) from '||schemaname||'.'||matviewname
    ||' group by priority '||chr(10),
    ' union all '||chr(10)
  )
from pg_matviews
where matviewname = 'product_visitor_view'
  and ispopulated = true; -- filter out materialized views that have not been populated
Take the output of this query and run it in the Query Tool.
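For illustration, the statement that string_agg assembles has the following shape; this Python sketch builds the same kind of query from an assumed list of schema names:

```python
# Sketch of the SQL that the pg_matviews query emits: one SELECT per
# schema, stitched together with UNION ALL. Schema names are examples.
schemas = ["ag_modem_01", "ag_modem_02", "ag_modem_03"]

sql = "\nUNION ALL\n".join(
    f"SELECT '{s}' AS schema, priority, count(*) "
    f"FROM {s}.product_visitor_view GROUP BY priority"
    for s in schemas
)
print(sql)
```

Running the combined statement returns one result set with a schema column, so each schema's counts are distinguishable in a single output.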

Snowflake sql table name wildcard

What is a good way to "select" from multiple tables at once when the list of tables is not known in advance in snowflake sql?
Something that simulates
Select * from mytable*
which would fetch same results as
Select * from mytable_1
union
Select * from mytable_2
...
I tried doing this in multiple steps.
show tables like 'mytable%';
set mytablevar =
(select listagg("name", ' union ') table_
from table(result_scan(last_query_id())))
The idea was to use the variable mytablevar to store the union of all tables in a subsequent query, but the variable size exceeded the size limit of 256 as the list of tables is quite large.
Even if you did not hit the 256-character limit, this would not help you query all those tables. How would you use that session variable?
If you have multiple tables with the same structure that hold similar data you need to query together, why is the data not in one big table? You can use Snowflake's clustering feature to distribute data across micro-partitions based on a specific column.
https://docs.snowflake.com/en/user-guide/tables-clustering-micropartitions.html
Anyway, you may create a stored procedure which will create/replace a view.
https://docs.snowflake.com/en/sql-reference/stored-procedures-usage.html#dynamically-creating-a-sql-statement
And then you can query that view:
CALL UPDATE_MY_VIEW( 'myview_name', 'table%' );
SELECT * FROM myview_name;
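The DDL such a stored procedure would assemble might look like the following; this Python sketch, with invented table names, just shows the shape of the generated view:

```python
# Hypothetical sketch of the statement a view-building procedure could
# emit: one view that unions every table matching the name pattern.
tables = ["mytable_1", "mytable_2", "mytable_3"]

ddl = (
    "CREATE OR REPLACE VIEW myview_name AS\n"
    + "\nUNION ALL\n".join(f"SELECT * FROM {t}" for t in tables)
)
print(ddl)
```

Because the view is created server-side, its definition is not constrained by the 256-character session-variable limit the question ran into.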

How can I UNION ALL on all columns of a table in Access

I have two select queries with the same number of columns (c.150) and I am trying to UNION ALL the two with:
SELECT *
FROM query1
UNION ALL
SELECT *
FROM query2
I am getting the error "Too many fields defined", but I understood that Access can handle up to 255 fields. Given that I don't want to write out every field name in each of my select queries, is there a practical way to achieve this union?
As Parfait mentions in his comment, this error occurs because Access counts the columns of both underlying tables toward the limit: 150 + 150 > 255, hence "Too many fields defined". See a similar question here.
Provided you don't have too much data, an alternative is to write one query's results into a table and append the other's into the same table.
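The make-table-then-append workaround can be sketched like this, using SQLite in place of Access (table names invented); `combined` plays the role of the table you write one query into and append the other into:

```python
import sqlite3

# Runnable sketch of the append workaround: materialize query1 into a
# table, then append query2's rows, so no single statement sees 300 fields.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE query1 (a INTEGER, b INTEGER);
CREATE TABLE query2 (a INTEGER, b INTEGER);
INSERT INTO query1 VALUES (1, 2);
INSERT INTO query2 VALUES (3, 4);
CREATE TABLE combined AS SELECT * FROM query1;  -- like a make-table query
INSERT INTO combined SELECT * FROM query2;      -- like an append query
""")
rows = conn.execute("SELECT * FROM combined ORDER BY a").fetchall()
print(rows)  # [(1, 2), (3, 4)]
```

In Access terms these two steps correspond to a make-table query followed by an append query.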

Rails - Get latest entries from multiple tables

I am using RoR for app development and want to implement the following:
For the app's RSS feed I need to pull the 10 latest entries from the database. The part I cannot figure out is how to get the last 10 entries across multiple tables.
Pulling the last n records from a single table is easy, but how do I pull the last n records across multiple tables?
Any suggestions?
There is no built-in method in ActiveRecord for this, but doing it in SQL is very simple: you just use UNION ALL.
SELECT * FROM (
  SELECT table1.*, 'table_1_type' AS table_type FROM table1
  UNION ALL SELECT table2.*, 'table_2_type' AS table_type FROM table2
  UNION ALL SELECT table3.*, 'table_3_type' AS table_type FROM table3
) AS my_union ORDER BY created_at DESC LIMIT 10;
This query will return 10 rows drawn from across those tables. I suggest you create a plain Ruby class that executes this raw query and rebuilds the objects.
Or, even better, fetch only the values you need and represent them with plain Ruby classes (not your models).
You can access the raw Postgres connection through any AR class, like this: MyModelClass.connection.raw_connection
Example of using the raw connection:
conn = MyModelClass.connection.raw_connection
res = conn.exec('SELECT tablename, tableowner FROM pg_tables')
res.each do |row|
  row.each do |_column_name, value|
    puts value
  end
end
In the same fashion, you can execute your query and populate Ruby objects.
UPD
There is also a third option you can use. Create a new model, RssItem, and back it with an SQL view instead of a table. The approach is explained in this post:
http://hashrocket.com/blog/posts/sql-views-and-activerecord
Is there a particular reason you want to combine queries for multiple tables into one SQL query? I don't think it's possible to do a union query through ActiveRecord.
You can try something like this; it's not exactly what you want, though:
@all_items = [@record_videos.all, @musics.all, @new_topics.all].flatten.sort_by { |thing| thing.created_at }
taken from:
rails 3 merge multiple queries from different tables
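The in-application merge above can be sketched language-neutrally: assuming each record exposes a created_at timestamp, combine the collections, sort descending, and keep the 10 newest. The records here are invented:

```python
from datetime import datetime, timedelta

# Sketch of the in-app merge: pool three collections, sort by timestamp,
# keep the 10 most recent. Record shapes and timestamps are made up.
base = datetime(2024, 1, 1)
videos = [{"type": "video", "created_at": base + timedelta(days=d)} for d in range(5)]
musics = [{"type": "music", "created_at": base + timedelta(days=d, hours=1)} for d in range(5)]
topics = [{"type": "topic", "created_at": base + timedelta(days=d, hours=2)} for d in range(5)]

latest = sorted(videos + musics + topics,
                key=lambda r: r["created_at"], reverse=True)[:10]
print(len(latest))        # 10
print(latest[0]["type"])  # 'topic' (the newest record in this data)
```

Note this loads every candidate row into memory, so unlike the SQL UNION with LIMIT 10, it only scales to collections small enough to fetch wholesale.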
If it is important to you, then the true way to accomplish this in Rails is to use STI (single table inheritance). But that would require restructuring your DB and models.