How do we display the query plan executed by Google Cloud BigQuery if it is available?
You can explore the Execution details tab of the query results. The tab shows the query plan, along with some additional information such as query execution time, slots consumed, etc.
For more details you can refer to https://cloud.google.com/bigquery/query-plan-explanation.
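If you want the plan programmatically rather than in the UI, the jobs.get REST response for a finished job includes the plan under statistics.query.queryPlan. A minimal sketch of summarizing those stage entries; the sample_plan data below is made up for illustration, not real job output:

```python
# Sketch: summarize BigQuery query-plan stage entries as returned by the
# REST API (statistics.query.queryPlan). sample_plan is illustrative data.

def summarize_plan(stages):
    """Return one line per plan stage: name, records in/out, and status."""
    lines = []
    for s in stages:
        lines.append(
            f"{s.get('name', '?')}: "
            f"read={s.get('recordsRead', '?')} "
            f"written={s.get('recordsWritten', '?')} "
            f"status={s.get('status', '?')}"
        )
    return lines

sample_plan = [
    {"name": "S00: Input", "recordsRead": "1000",
     "recordsWritten": "100", "status": "COMPLETE"},
    {"name": "S01: Output", "recordsRead": "100",
     "recordsWritten": "10", "status": "COMPLETE"},
]

for line in summarize_plan(sample_plan):
    print(line)
```

With the google-cloud-bigquery Python client, the same information is exposed on a finished job as `job.query_plan`.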
I find myself using the Kusto query language (KQL) via Azure Log Analytics, and I'm struggling to find a way to get any sort of detailed execution report or query plan.
In PostgreSQL I'd use EXPLAIN to produce a report on how the DBMS intends to execute the query, or EXPLAIN ANALYZE for a report on how a query actually got executed. Is there anything akin to that in KQL?
Searches for "kql query plan", "kusto explain query" etc have been largely fruitless, but this probably just means I don't know the right terms.
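For readers who haven't used EXPLAIN: here is a minimal illustration of what such a plan report looks like, using SQLite's analogous EXPLAIN QUERY PLAN through Python's built-in sqlite3 module. PostgreSQL's EXPLAIN / EXPLAIN ANALYZE output is much richer, but the idea is the same; the table and index below are made up.

```python
# Illustration: an EXPLAIN-style plan report, via SQLite's EXPLAIN QUERY PLAN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (ts INTEGER, msg TEXT)")
conn.execute("CREATE INDEX idx_ts ON events (ts)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT msg FROM events WHERE ts > 100"
).fetchall()
for row in plan:
    # Each row describes one plan step; the last column is a human-readable
    # detail, e.g. whether the index idx_ts is used for the search.
    print(row)
```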
Supporting the suggestion provided by David דודו Markovitz, and posting it as an answer to help other community members having related discussions.
Yes, I agree that Log Analytics has many limitations, and we can't link to or query Log Analytics from the ADX web UI.
Here are a few documents related to Log Analytics and its limitations.
More of a curiosity question really. I load data into Power BI report from Google BigQuery (using native Google BigQuery connector in Power BI). All works fine, but for some reason I don't see this query in BigQuery's query history.
Did anyone experience something similar and knows the reason why this happens or how to change that (if at all possible)?
If I do exactly the same thing using the Simba ODBC connector, I see the query in BigQuery's query history as expected.
I've never seen that before; I am always able to find the query history no matter what third-party connection I use. Could you confirm the GCP service account (or auth account) and the GCP project for the BQ query job that you used with the native Google BigQuery connector in Power BI?
Please make sure you have access to the query history of that GCP account in that BQ job project.
I've been having a very hard time using SPL to query data in Splunk... I wish to replace all of that with some simple SQL. Is that possible? If so, how?
I don't want to pay a lot just to get Splunk training... would rather use my SQL skills :)
Hope you all agree and can help me find a solution!
Cheers!
It is not possible to use SQL to query data in Splunk. Introductory training in Splunk's query language is free. Go to https://www.splunk.com/en_us/training.html, click on "Free Courses", and select "Free Splunk Fundamentals 1".
Splunk has a manual to help SQL users transition to SPL. See https://docs.splunk.com/Documentation/Splunk/7.3.0/SearchReference/SQLtoSplunk.
For general help with searching in Splunk, see https://docs.splunk.com/Documentation/Splunk/7.3.0/Search/GetstartedwithSearch.
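To give a flavor of the SQL-to-SPL mapping that manual covers, here are a few rough correspondences. The index name `web` and the field names are made up, and the SPL strings are illustrative sketches rather than tested searches:

```python
# Rough SQL-to-SPL correspondences, in the spirit of Splunk's SQLtoSplunk
# reference. Index and field names are invented for illustration.
sql_to_spl = {
    "SELECT host, count(*) FROM web GROUP BY host":
        "search index=web | stats count by host",
    "SELECT * FROM web WHERE status = 404":
        "search index=web status=404",
    "SELECT * FROM web ORDER BY bytes DESC":
        "search index=web | sort - bytes",
}

for sql, spl in sql_to_spl.items():
    print(f"SQL: {sql}\nSPL: {spl}\n")
```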
I'm actually working on a connector between Apache Drill and Splunk that will enable you to execute SQL queries against a Splunk installation. The code hasn't been committed to Drill yet, but you can view the pull request here [1]. Here is a link to the documentation [2].
I am currently developing some BigQuery queries for data from Google Analytics (GA), and need a method to assess the validity of the results obtained by them, i.e., that the results returned are correct.
What I am doing so far for validation is to use a combination of the following:
Running the equivalent queries in GA Query Explorer and comparing the results;
For the queries for which I cannot directly check in GA Query Explorer, I have a Python script which calculates the expected results.
What are the good practices for this kind of validation? Do you have any further suggestions on how to improve this?
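The comparison step in the Python script mentioned above can be sketched roughly like this; all names, dates, and values are illustrative:

```python
# Minimal sketch of comparing query output against independently computed
# expected values, keyed by some dimension (here, a date string). A tolerance
# handles floating-point aggregates; the data below is made up.

def compare_results(actual, expected, tol=1e-6):
    """Return a list of mismatch descriptions; an empty list means agreement."""
    mismatches = []
    for key in sorted(set(actual) | set(expected)):
        a, e = actual.get(key), expected.get(key)
        if a is None or e is None:
            mismatches.append(
                f"{key}: missing on one side (actual={a}, expected={e})"
            )
        elif abs(a - e) > tol:
            mismatches.append(f"{key}: actual={a}, expected={e}")
    return mismatches

bq_result = {"2024-01-01": 120, "2024-01-02": 95}
expected = {"2024-01-01": 120, "2024-01-02": 97}
print(compare_results(bq_result, expected))
```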
I need to build a Kibana-like dashboard over a SQL database. Is this possible? Or is there an alternative as easy as Kibana (in terms of integration) for SQL?
Siren ( http://siren.io ) is an extended Kibana which supports connecting directly to SQL (or other APIs) and creating filters and analytics.
Check it out.
There is a blog about how to index SQL databases in Elasticsearch here:
http://blog.comperiosearch.com/blog/2014/01/30/elasticsearch-indexing-sql-databases-the-easy-way/
Once you get it indexed, you can set up your Kibana to view your data.
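As a rough sketch of that indexing step: turn rows from a SQL query into bulk-index actions for the elasticsearch-py client's helpers.bulk() call. The table, columns, and index name "metrics" are made up, and the in-memory SQLite database stands in for a real SQL source; the commented call at the end assumes a running Elasticsearch cluster:

```python
# Sketch: convert SQL rows into Elasticsearch bulk-index actions.
import sqlite3

def rows_to_actions(rows, columns, index="metrics"):
    """Yield one bulk-index action dict per SQL row."""
    for row in rows:
        yield {"_index": index, "_source": dict(zip(columns, row))}

# Illustrative in-memory database standing in for a real SQL source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("eu", 10.0), ("us", 25.5)])
rows = conn.execute("SELECT region, amount FROM sales").fetchall()

actions = list(rows_to_actions(rows, ["region", "amount"]))
print(actions)

# With a live cluster you would then run something like:
#   from elasticsearch import Elasticsearch, helpers
#   helpers.bulk(Elasticsearch("http://localhost:9200"), actions)
```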
You can also find more options suggested here:
http://community.spiceworks.com/topic/377151-generate-a-dashboard-from-sql-database
This could be an alternative for you: https://github.com/KPIWatchdog/DBconnector
It creates an API to the tool where you can create queries, visualise data, prepare dashboards, and share them with others. Depending on the number of metrics, you can opt for the free plan or a small monthly fee.