Not found: Project fh-bigquery

I had a query that joined against one of the BigQuery public tables in fh-bigquery, and now, out of the blue, the project is not available. Is anyone experiencing the same?
I tried joining my usual table with
fh-bigquery.geocode.201806_geolite2_city_ipv4_locs
but I get 404 Not found: Project fh-bigquery
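For reference, this is the shape of the join that now returns the 404; the left-hand table and the join key are stand-ins for the real schema, and only the fh-bigquery table name is taken from the question:

SELECT
  t.*,
  g.city_name  -- illustrative column, not the table's confirmed schema
FROM `my_project.my_dataset.my_table` AS t
JOIN `fh-bigquery.geocode.201806_geolite2_city_ipv4_locs` AS g
  ON t.geoname_id = g.geoname_id;  -- hypothetical join key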

Related

Unable to save query results to a bigquery table
I have been facing this issue for a couple of days now and it's annoying. I have already tried multiple query locations in the EU (due to data privacy regulations, I am limited to the EU), and I could never save the results. I am not getting any errors either.
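A minimal sketch of the operation being attempted, written as DDL with placeholder names; note that BigQuery requires the destination dataset to be in the same location as the data being queried:

-- my_eu_dataset must have been created in the EU location,
-- matching the location of the tables being queried
CREATE TABLE `my_project.my_eu_dataset.saved_results` AS
SELECT *
FROM `my_project.my_eu_dataset.source_table`;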

How to reference the latest table from a manually partitioned BigQuery table

We have a manually partitioned "video metadata" table being fed fresh data each day. In our system, old data is only kept for historical reasons since the latest data is the most up to date.
What we can't figure out is how to reference only the latest partition of this table using LookML.
So far we have attempted to store views in BigQuery. We have tried and failed to store a simple "fetch the newest partition" query as a view, in both standard and legacy SQL, and upon some searching, this seems to be by design, even though the error message states "Dataset not found" instead of something more relevant.
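A sketch of the "fetch the newest partition" view that fails, with placeholder names:

CREATE VIEW `dataset.latest_video_metadata` AS
SELECT *
FROM `dataset.table_*`
WHERE _TABLE_SUFFIX = (
  SELECT MAX(_TABLE_SUFFIX) FROM `dataset.table_*`
);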
We've also tried to build the filtering into Looker, but we're having trouble with getting things to actually work and only having the latest data returned to us through it.
Any help would be appreciated.
We've managed to find a solution: derived tables.
We figured that since we couldn't define a view on BigQuery's side, we could do it on Looker's side instead, so we defined the table in a derived table block inside a view.
derived_table: {
  # Select only the rows belonging to the newest partition
  sql: SELECT *
       FROM `dataset.table_*`
       WHERE _TABLE_SUFFIX = (
         SELECT MAX(_TABLE_SUFFIX) FROM `dataset.table_*`
       ) ;;
  # Rebuild the derived table whenever a newer partition appears
  sql_trigger_value: SELECT MAX(_TABLE_SUFFIX) FROM `dataset.table_*` ;;
}
This gave us a view with just the newest data in it.

BigQuery dataset deletion and name reuse

When I delete a dataset in BigQuery and then create another one with the same name, in the same project but in a different region, an error is thrown. It simply says 'Not found: Dataset projectId:datasetName'.
This is an important problem, as GA360 imports rely on having the dataset named with the view ID. Now that we have BigQuery in Australia, we would like to be able to use it.
How can I fix this problem?
False alarm. It turns out that BigQuery just needs some more time to complete this operation. I tried again after a few minutes and it now works.
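For reference, the sequence sketched as BigQuery DDL, with a placeholder dataset name; the CREATE only succeeds once the earlier deletion has fully propagated:

DROP SCHEMA `my_project.analytics_123456`;

-- Re-creating the same name in a different region fails with
-- "Not found: Dataset ..." until the deletion has propagated;
-- retrying after a few minutes resolves it
CREATE SCHEMA `my_project.analytics_123456`
  OPTIONS (location = 'australia-southeast1');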

Cannot run query: project does not have the reservation in the data region

Since today, I am suddenly and constantly receiving the following error when trying to execute query jobs in Google BigQuery:
Cannot run query: project does not have the reservation in the data region
I tried with several projects and the error still persists.
Has anyone ever encountered this error?
"Reservation" here refers to computing slots. You have slots for computation in one region (or none available at all), but data lies in another region.
Your reservation has been configured on Feb 13th. Now the problem should have been fixed.
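If you hit this, one way to confirm where the data actually lives (and therefore where the slot reservation must sit) is INFORMATION_SCHEMA; adjust the region qualifier to the region you expect:

-- Lists each dataset in the given region together with its location
SELECT schema_name, location
FROM `region-us`.INFORMATION_SCHEMA.SCHEMATA;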

SSAS dimensions hierarchies: A duplicate attribute key has been found when processing

I am completely new to SSAS and I am trying to deploy a simple cube with only one dimension comprising multiple attributes. What I did already was to create a DSV from my data source, and then I created a dimension from my fact table. It seems that no matter what I do, I get the following error message:
Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: 'dbo_Fact_Statistics', Column: 'Team', value: 'ANA'. The attribute is 'Team'.
This is my hierarchy: Id (SK) -> Player id -> Team -> Player Name -> Salary
I don't understand this. Obviously the problem is not that the value is null; other threads suggest setting NullProcessing under KeyColumns to something other than Automatic, but that is not the issue in this context.
Any help would be greatly appreciated.
You probably have Team 'ANA' listed under multiple Player Names and/or Salary values.
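A quick way to confirm this is a grouping query against the fact table named in the error message; the Player Name column is assumed from the hierarchy above:

-- Teams mapped to more than one Player Name break the
-- Team -> Player Name attribute relationship
SELECT Team, COUNT(DISTINCT [Player Name]) AS name_count
FROM dbo.Fact_Statistics
GROUP BY Team
HAVING COUNT(DISTINCT [Player Name]) > 1;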
This is a really tricky area of SSAS. The quickest way forward is probably to install BIDS Helper and use the "Dimension Health Check" function:
http://bidshelper.codeplex.com/wikipage?title=Dimension%20Health%20Check&referringTitle=Documentation
It will show you all the issues in your data (not just the first one which you have discovered so far) and give you some info on how to proceed.
Personally I've gone off building attribute relationships due to the difficulty of debugging and fixing these issues. I tend to build dimensions now where every attribute relates directly to the key attribute. You never see these errors and performance seems very similar. You can still present the users with hierarchies.
If that is not an option for you, then you could try adding the columns for the higher-level attributes to the Key property of all the lower levels. Technically this will work but it is awkward to set up and maintain.
This approach solved my problem:
Instead of having the attributes follow chained relationships, I simply left the relationships as they were by default.
Player id (SK) -> Conference
Player id (SK) -> Division
Player id (SK) -> Team
Player id (SK) -> Player Name
Player id (SK) -> Salary
Run this in SQL to find your duplicates; for example, id was used as my dimension key:
SELECT id, COUNT(*) AS how_many
FROM [RC_Dailer_WH].[dbo].[RC_call_logs]
GROUP BY id
HAVING COUNT(*) > 1
This returned 3647 rows, out of more than 50k records in my DB. Once I removed the duplicates, my cube processed properly.
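For the removal step, one common T-SQL pattern is a DELETE through a ranked CTE; this keeps an arbitrary row per id, so it is only safe if the duplicated rows are genuinely interchangeable:

-- Keep one row per id and delete the rest
WITH ranked AS (
  SELECT id,
         ROW_NUMBER() OVER (PARTITION BY id ORDER BY (SELECT NULL)) AS rn
  FROM [RC_Dailer_WH].[dbo].[RC_call_logs]
)
DELETE FROM ranked
WHERE rn > 1;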
Go to the dimension that is showing the error, right-click it, and choose View Code. Search the code for the following line:
ReportAndStop
Delete that XML tag, save, and reprocess. Processing will then no longer stop on duplicate keys, and it will work.