Billing Charges when Querying a Data Studio Report Connected to a View in BigQuery - google-bigquery

I have created a report in DataStudio using data pulled from BigQuery and saved as a view. After playing around with the report for a while I have noticed that I have been billed 100+ times for the query (all the exact same size of data), but I only ran it once to build the view. Am I getting charged every time I interact with the report e.g. apply a filter? If not, what is causing these costs?

Your report will run a query against the view for each element on the page, so 4 graphs = 4 queries.
If you then change a filter, for example, that will run a further 4 queries (assuming the filter affects them all).
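If you want to confirm this yourself, one option (a minimal sketch, assuming your data lives in the US multi-region and you can read job metadata) is to list the recent query jobs and their billed bytes:

SELECT
  creation_time,
  total_bytes_billed,
  query
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY creation_time DESC;

Each chart and each filter change should then show up as separate query jobs, each billed for the bytes the view scans.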

Related

Why doesn't BigQuery BI Engine use all of the reservation?

I have a dashboard connected to a BigQuery table. BI Engine works as expected, since I am using a calendar filter and my table is partitioned by date.
When I select a longer date range, BI Engine stops working with the message "The table or data volume was larger than BI Engine supports at this time", which is fair.
Note that I am already filtering by partition, but sometimes I need to see the whole data set.
To solve that, I created a BI Engine reservation, but I notice that regardless of the reservation size (1, 2, or 4 GB) the memory used is always around 600 MB, and I get the same message (screenshot attached). Is this by design?
Bug Report here: https://issuetracker.google.com/issues/150633500
It turns out the error is not related to the reservation, but to the fact that BI Engine supports only 500 partitions, and my table has more:
https://cloud.google.com/bi-engine/docs/overview#limitations
The solution is, instead of partitioning per day, to partition by something coarser such as week or month.
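For reference, a minimal sketch of the coarser partitioning (this assumes a DATE column named event_date and that monthly time-unit partitioning is available; the table names are placeholders):

CREATE TABLE my_dataset.events_monthly
PARTITION BY DATE_TRUNC(event_date, MONTH)
AS
SELECT * FROM my_dataset.events_daily;

With monthly partitions, the same date range touches far fewer partitions, which keeps the table under BI Engine's partition limit.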

Charged way too much for BigQuery Analysis

We are using Google Data Studio to create mobile analytics reports from Google Firebase data linked to BigQuery. The report has a data source that uses a query to pull data from BigQuery - SELECT * FROM table.events_* - which processes 69.4 GB of data (verified in the validator). The problem is that when we create a report using this query as the data source, each report is charged for 'BigQuery Analysis' in tebibytes, which is way too much. But when we calculate the price for the query itself, it is not even $1 for the data that we use.
Not sure why the data is processed in tebibytes. Here are some details about the table in BigQuery -
Table size: 246.69 MB
Number of columns: 57
I tried the query with some filters as well, but a single report still processes tebibytes of data. Is reducing or filtering the number of columns the only way to restrict the data processed? What transactions come under BigQuery Analysis (other than query processing)?
Your help is greatly appreciated. Thanks in advance
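For context, BigQuery bills for every column read in every daily table the wildcard matches, and each report element reruns the query. A hedged sketch of a narrower query (the project, dataset, and dates are placeholders; the columns are standard Firebase export fields):

SELECT
  event_date,
  event_name,
  user_pseudo_id
FROM `my_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131';

Selecting only the needed columns and restricting _TABLE_SUFFIX to the dates the report actually shows reduces the bytes billed per query.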

Value only showing the first item in SSRS report

So my problem here is that I have a Part number which lives in two warehouses, hence it has two bin locations. If I just use =Fields!PrimBin.Value it only ever returns the first location. I need to display the PrimBin only if the location is from a specific warehouse. To get the warehouse I use =Fields!WarehouseCode.Value.
What I need to do is show the PrimBin.Value only for MAINWHSE and not CELLWHSE.
Thanks in advance.
OK, so the database is quite vast. However, for the information required I am using two tables: Part and PrimWhse.
Part shares the Product ID with PrimWhse. In PrimWhse, each part ID has two locations, "MAINWHSE" and "CELLWHSE", and one bin to pick in each warehouse, giving two possible locations.
So WarehouseCode.Value has the information for which warehouse the part is located in, and PrimBin.Value has the bin position within that warehouse stored in it.
This is all set up via a report style within the Epicor system. When I create a query in Business Activity to look in MAINWHSE, it shows the correct information.
However, in the report data builder I'm not able to set this query, so I assume SSRS will be able to see both of these possible values for PrimBin.Value!? If not, I guess I need to work out how to add a query to the report data builder, which at the moment does not seem possible.
Thanks again.
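A hedged sketch of a dataset query that returns only the MAINWHSE bin (the join column name PartNum is an assumption; the real Epicor column may differ):

SELECT p.PartNum,
       pw.WarehouseCode,
       pw.PrimBin
FROM   Part AS p
       JOIN PrimWhse AS pw
         ON pw.PartNum = p.PartNum
WHERE  pw.WarehouseCode = 'MAINWHSE';

If the dataset can't be changed in the report data builder, an expression along the lines of =IIF(Fields!WarehouseCode.Value = "MAINWHSE", Fields!PrimBin.Value, Nothing) is another way to pick out the MAINWHSE value at render time, assuming both warehouse rows reach the report.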

SSRS - Report doesn't load via Report Builder, while it does via SQL

This is my first question here.
I have struggled for days and days trying to find a solution everywhere, with no success.
Basically I have a standard stored procedure pulling out a report dataset in a few seconds (5-6 seconds).
It aggregates (grouping by and summing) 23,000 rows.
Indeed, my final dataset comes out with 4 rows and 33 columns, executing, as said, in 5-6 seconds.
Unfortunately, when I try to load it via Report Builder, it loads endlessly (querying SQL Server, the stored procedure remains stuck in a RUNNING status forever).
Everything in Report Builder (DB access, dataset, parameters, matrix...) is configured correctly: I was indeed able to load it until I added a few (4) additional fields.
The SQL dataset is basically something like:
PARAMETERS DECLARATION
SELECT
FIELDS
FROM
(SELECT
FIELD A
SUMS
FROM
TABLE
JOIN TABLES
WHERE
PARAMETERS MATCHING
GROUP BY A
) AS B
ORDER BY FIELD
An "external layer" SELECT was needed to make some calculations on some FIELDS, also in some cases using some PARAMETERS.
That's it.
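To make the shape concrete, a hedged sketch of that structure (every table, column, and parameter name below is a placeholder, not the real schema):

SELECT b.FieldA,
       b.SumAmount,
       b.SumAmount * @SomeRate AS CalcField   -- outer-layer calculation using a parameter
FROM (
    SELECT t.FieldA,
           SUM(t.Amount) AS SumAmount
    FROM   dbo.MainTable AS t
           JOIN dbo.OtherTable AS o ON o.Id = t.OtherId
    WHERE  t.SomeDate BETWEEN @StartDate AND @EndDate   -- parameters matching
    GROUP BY t.FieldA
) AS b
ORDER BY b.FieldA;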
I'm used to working with huge datasets, sometimes pulling out 30,000 rows with 110 fields, and if something loads via SQL it has always also loaded via Report Builder: this is the very first time it has behaved differently.
So I'm asking whether there are some strange SSRS/Report Builder limitations I have never come across before.
Any help would be really really appreciated!
Thanks in advance to everyone who'll spend time :)

Bigquery console does not show all tables

We now have 1144 tables in one dataset, but many of them are not listed in the left-hand list of the BigQuery console. I wonder if this is due to a set limit.
The BigQuery web UI will only show 1000 tables in a dataset (likewise, it will show only 1000 projects, and 1000 datasets in each project). I've filed a bug to either show a longer list or provide a way to load more in the UI.
In the meantime, however, you can use bq to list your tables:
$ bq ls --max_results=10000 your_dataset_name
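On newer BigQuery versions, a hedged alternative (standard SQL, assuming the dataset is in your default project) is to query the dataset's INFORMATION_SCHEMA:

SELECT table_name
FROM your_dataset_name.INFORMATION_SCHEMA.TABLES
ORDER BY table_name;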