Google BigQuery says "Billing has not been enabled for this project." - billing

I'm trying to load data in a new BigQuery table, but when I run the following in the 'bq shell'
load ct.ads /tmp/data.csv id:integer,source:string,clicks:integer
I get
Waiting on job_7e1d39b261d041da8674a769e8275b91 ... (0s) Current status: DONE
BigQuery error in load operation: Billing has not been enabled for this project.
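For reference, the same load can be run directly from the command line, outside the interactive shell (same table, file, and schema):
bq load ct.ads /tmp/data.csv id:integer,source:string,clicks:integer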
I've enabled billing, and the billing tab in the Google API Console says:
Your billing information may take a few minutes to update. Please refresh this page for updates.
Authorized by: martin@foo.com - you
Unbilled usage (estimate, updated daily)
Start date May 2, 2012
Total (before taxes) 0.00 USD
Statements
None
Any hints?
Ok, it seems when I entered the CC information it was refused; I tried with another CC and it seems to be fine...
Now it says:
Billing is enabled for all active, billable services
tnx

BigQuery caches billing state, so it may take a few hours to be updated. When did you fix the CC number? If it was today, the cache may simply not have been refreshed yet. If it takes more than 8 hours or so, please let us know and we'll investigate.

Related

How to manually test a data retention requirement in a search functionality?

Say data needs to be kept for 2 years. Then any data created 2 years + 1 day ago should no longer be displayed and should be deleted from the server. How do you manually test that?
I'm new to testing and I can't think of any other way. Also, we cannot do automation due to time constraints.
You can create data in the database backdated by more than two years and check whether it is deleted automatically. Alternatively, you can change the current business date in the database and test it that way.
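As a rough illustration (the table and column names below are hypothetical, and a daily purge job is assumed), you could seed a record just past the two-year boundary and confirm the purge removes it:
-- Hypothetical schema: search_history(search_keyword, search_on_date).
-- Insert a row backdated to 2 years + 1 day ago, i.e. just past retention.
INSERT INTO search_history (search_keyword, search_on_date)
VALUES ('old query',
        DATE_SUB(DATE_SUB(CURRENT_DATE(), INTERVAL 2 YEAR), INTERVAL 1 DAY));
-- After the next purge run, this row should no longer be returned by search.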
For the data retention functionality, a manual tester needs to keep track of the search data so that the test cases for the search retention feature can be executed against it.
Taking a social networking app as an example, as a manual tester you need to remember all the users you searched for recently.
To check the retention period, you can ask the backend developer to shorten it (say, from one year to 10 minutes) for testing purposes.
Even if you delete the search history and then start typing a previously entered query, the related result should appear at the top of the suggestions. Data retention policies define what data should be stored or archived, where that should happen, and for exactly how long. Once the retention period for a particular data set expires, the data can be deleted or moved to secondary or tertiary storage as historical data, depending on the requirement.
Let's understand with an example. Suppose our database table contains the rows below, based on past searches made by users. With this table you can perform the testing with minimum effort and optimum results. The current date is 2022-03-10, and the Status column states whether the data is still available in the database: Visible means available, while Expired means deleted from the table. (A verification query sketch follows the table.)

Search Keyword | Search On Date | Search Expiry Date | Status
---------------|----------------|--------------------|------------------
sport          | 2022-03-05     | 2024-03-04         | Visible
cricket news   | 2020-03-10     | 2022-03-09         | Expired - Deleted
holy books     | 2020-03-11     | 2022-03-10         | Visible
dance          | 2020-03-12     | 2022-03-11         | Visible
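A minimal verification query against the example table above (assuming it is stored as search_history with the columns shown) could be:
-- Rows whose expiry date is before the current date (2022-03-10) should be gone.
SELECT search_keyword, search_on_date, search_expiry_date
FROM search_history
WHERE search_expiry_date < CURRENT_DATE();
-- Expected result: zero rows if expired entries (e.g. 'cricket news') are purged.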

BigQuery - How to increase the expiration time of tables in the free sandbox?

I am using the free BigQuery sandbox to generate some custom metrics based on my analytics data. I have read in the documentation that the expiration time of tables in a free account is 60 days. What does this expiration time mean? What exactly will happen after 60 days? Will all my data be lost? How can I increase the expiration time in this case? Do I need to pay for it? If yes, what will the cost be?
According to the documentation:
The BigQuery sandbox gives you free access to the power of BigQuery
subject to the sandbox's limits. The sandbox allows you to use the web
UI in the Cloud Console without providing a credit card. You can use
the sandbox without creating a billing account or enabling billing for
your project.
In addition, according to the limits:
All datasets have the default table expiration time and the default
partition expiration set to 60 days. Any tables, views, or partitions
in partitioned tables automatically expire after 60 days.
You can edit this expiration date, but in order to do that you have to upgrade the project's plan (if needed). You would then be billed by the amount of bytes processed; you can check the billing options here.
Thus, within BigQuery you can edit the expiration date: go to Project > Dataset > Table > Details, click the pencil next to the table's name, and set the expiration date to never or select a date.
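If you prefer SQL over the UI, a minimal sketch (assuming a dataset named mydataset and a table named mytable, and that the project has already been upgraded out of the sandbox) would be:
-- Remove the 60-day expiration entirely ...
ALTER TABLE mydataset.mytable
SET OPTIONS (expiration_timestamp = NULL);
-- ... or push it out to a specific date instead.
ALTER TABLE mydataset.mytable
SET OPTIONS (expiration_timestamp = TIMESTAMP '2030-01-01 00:00:00 UTC');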

Data Transfer V2.0 Campaign Manager: No Costs and Revenues inside Storage CSVs

We are trying to solve an issue with costs and revenues from DoubleClick Campaign Manager / Campaign Manager.
The goal is to create a daily dashboard of media costs (paid search, display, video and social) using Google Cloud.
We currently have access to Facebook data, Google Analytics and Campaign Manager. The issue is with the last one.
For Campaign Manager, a bucket outside our organization has been shared with our organization via Data Transfer V2.0. We have access to the impressions, clicks, activity and match-table CSVs in Storage, and to the same data in BigQuery.
We have date, clicks, impressions, cities, ad name, etc., but the cost metrics only contain 0.
By costs I mean how much we paid per impression. In the revenue, DBM cost and total media cost fields (Advertiser, Partner or Account Currency) we only get 0.
We asked Google for help: they told us to check a checkbox to confirm that Campaign Manager and DV360 are linked in Data Transfer.
They told us that it should then work, but we still get 0 for revenues and costs.
We should see 32.00, for instance, instead of 0. Do you have any idea how to solve this issue?
Best,
Theo
If after this you still do not get any cost information, I recommend sending your offline reports to BigQuery directly.
You can do it with the following steps:
1. Create a dataset in BigQuery and copy its ID.
2. Go to Settings in CM360 and activate the BigQuery export.
3. Copy the IAM email account that the CM settings return to you.
4. Go to BigQuery and grant that account editor permissions on your dataset only (a hedged SQL sketch follows).
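For step 4, a sketch using BigQuery DCL (the dataset name and service account address are placeholders for whatever CM360 gives you):
-- Grant the CM360 export service account editor access to the export dataset.
GRANT `roles/bigquery.dataEditor`
ON SCHEMA cm360_export
TO "serviceAccount:cm-export@example.iam.gserviceaccount.com";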
After all this process, the option will be available to activate in the delivery options of CM360's offline reports.
Have you tried enabling "Report Display & Video 360 cost to Campaign Manager" under Basic Details at the Partner level in DV360?

Cannot run query: project does not have the reservation in the data region

Since today, I have suddenly and constantly been receiving the following error when trying to execute query jobs in Google BigQuery:
Cannot run query: project does not have the reservation in the data region
I tried with several projects and the error persists.
Has anyone ever encountered this error?
"Reservation" here refers to computing slots. You have slots for computation in one region (or none available at all), but data lies in another region.
Your reservation was configured on Feb 13th, so the problem should now be fixed.

Error: Not Found: Project <project-id>

I've recently signed up to Google BigQuery out of curiosity and saw that it allows one to play with the sample data sets without enabling billing. I followed the installation steps, first creating a project named "Test Cloud Project" and then enabling BigQuery in the Services tab of the Google API Console.
I have tried running the following:
SELECT repository.url FROM [publicdata:samples.github_nested] LIMIT 1000
and receive the error Error: Not Found: Project p-testcloud-bren
Did I miss a setup step somewhere or do you have to enable billing to actually query the sample datasets?
You don't need billing enabled to run a query on a publicdata:samples table (the first 100 GB of data processed per month is free of charge).
If you are making your own API calls, double check that you have the "project id" correct. You should be able to use either the project number (a unique integer value) or the project id (an alphanumeric value you can choose) for your requests to the BigQuery API.
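For example, with the bq tool you can pass the project explicitly (placeholder shown; substitute your own project id or number):
bq --project_id=<your-project-id-or-number> query "SELECT repository.url FROM [publicdata:samples.github_nested] LIMIT 1000"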