How frequently is the Kaggle Stack Overflow BigQuery dataset updated? - google-bigquery

I am looking for a dataset with continuous updates of the Stack Overflow data. I was checking the Kaggle BigQuery Stack Overflow dataset and noticed that it is described as being updated quarterly.
However, I can see from the metadata that it was last updated in March 2019.
Is there any source where I can get continuous updates of Stack Overflow data, or at least updates within a one-month period?

Though the metadata says otherwise, I found from the table data that the dataset is being updated quarterly.
The most up-to-date data can be found in the Stack Exchange Data Explorer.
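One way to verify the freshness yourself is to query the newest timestamps in the public dataset directly. A minimal sketch using the Python client (the table and column names are from the public bigquery-public-data.stackoverflow dataset):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Find the newest question in the public Stack Overflow dataset to
    # see how far behind the export currently is.
    query = """
        SELECT MAX(creation_date) AS newest_question
        FROM `bigquery-public-data.stackoverflow.posts_questions`
    """
    for row in client.query(query).result():
        print("Newest question:", row.newest_question)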

Related

GA4 streaming export intraday tables exist but not daily tables

I've been running the GA4 to BigQuery streaming export for over a month now because the number of daily events exceeds the daily export limit (currently around 1.5 million events per day).
From the Google docs (https://support.google.com/analytics/answer/7029846#tables): If the Streaming export option is enabled, a table named events_intraday_YYYYMMDD is created. This table is populated continuously as events are recorded throughout the day. This table is deleted at the end of each day once events_YYYYMMDD is complete.
According to the docs, I should have events_YYYYMMDD tables for previous days and an events_intraday_YYYYMMDD table for the current day. But that's not the case: all I'm left with are events_intraday_YYYYMMDD tables for previous days.
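For reference, this is roughly how I list the export tables to confirm which shards actually exist (a minimal sketch; analytics_123456 stands in for my actual dataset):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder dataset name; GA4 exports land in analytics_<property_id>.
    dataset_id = "my-project.analytics_123456"

    # Print every export shard so the daily vs. intraday split is visible.
    for table in client.list_tables(dataset_id):
        if table.table_id.startswith("events"):
            print(table.table_id)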
This is the same issue reported in the following posts (I actually copied and pasted from the first post):
BigQuery events_intraday_ tables are being generated daily but no daily events_ table is created
Firebase Analytics doesn't export events table to Bigquery, even though streaming export is enabled
GA4 exports only intraday tables to BigQuery
Unfortunately, none of these posts have a solution, and I don't yet have enough reputation here on SO to comment on them. I'm not currently paying for Google support because I'm still evaluating GA4, so I'm hoping someone here can provide an answer (and maybe then I can share it with the others who had the same problem).

Why does the Google Analytics dataset delete my old data in BigQuery?

I have configured a GA4 BigQuery link, but the dataset only retains 60 days of data and I can't find a way to prevent it from deleting the older days.
In the dataset configuration I have the table expiration set to never, as well as the partition expiration; see the screenshot below.
I would appreciate it if someone could help me solve the problem.
[Screenshot: dataset table and partition expiration settings, both set to never]
Once you set the expiration to never, only tables created after that change are affected.
You need to update the flag on the older tables yourself. See here.
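A minimal sketch of clearing the expiration on the existing shards with the Python client (the project and dataset names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder dataset name; use your GA4 export dataset.
    dataset_id = "my-project.analytics_123456"

    # Clear the per-table expiration on every existing events_ shard so
    # the new "never expire" default applies to the old tables too.
    for item in client.list_tables(dataset_id):
        if item.table_id.startswith("events_"):
            table = client.get_table(item.reference)
            table.expires = None  # remove the scheduled deletion time
            client.update_table(table, ["expires"])
            print("Cleared expiration on", table.table_id)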

Exporting old logs from Stackdriver to BigQuery

I have Stackdriver logs dating back to Sept. 3rd, and a sink I created on Sept. 14th pulling those logs into a BigQuery dataset. Currently, the data in BigQuery starts only from when I created the sink. Can I export the previous logs to a giant .csv and then re-upload them? I found a similar question here, but with no answer.
Thanks, and sorry for not being more technical with my question -- I am new to Stackdriver logging!
As of late 2021, an alpha feature known as copy logs is available, which lets you dump older logs into a Cloud Storage bucket; from there, it's a short trip back into BigQuery.
As a caveat, this must be done via the shell, and as an alpha feature, no guarantees or SLAs are made.
Prior to this feature, you would have been out of luck: downloading old logs only allowed 10,000 entries per request.
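For the last leg, a minimal sketch of loading the copied log files from the bucket back into BigQuery with the Python client (the bucket, dataset, and table names are placeholders, and this assumes the copied logs are newline-delimited JSON):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder URI and table; point these at the bucket the copy wrote to.
    uri = "gs://my-copied-logs/*.json"
    table_id = "my-project.logs_dataset.old_stackdriver_logs"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # let BigQuery infer the log schema
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # wait for the load to complete
    print("Loaded", client.get_table(table_id).num_rows, "rows.")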
Since exporting happens for new log entries only, you cannot export log entries that Logging received before your sink was created. Please refer to this documentation.

BigQuery python client library dropping data on insert_rows

I'm using the Python API to write to BigQuery -- I've had success previously, but I'm still pretty new to the BigQuery platform.
I recently updated a table schema to include some new nested records. After creating this new table, I'm seeing significant portions of data not making it to BigQuery.
However, some of the data is coming through. In a single write statement, my code will try to send a handful of rows; some of the rows make it and some do not, but no errors are thrown from the BigQuery endpoint.
I have access to the Stackdriver logs for this project, and there are no errors or warnings indicating that a write failed. I'm not streaming the data -- I'm using the BigQuery client library to call the API endpoint (I saw other answers describing issues with streaming data to a newly created table).
Has anyone else had issues with the BigQuery API? I haven't found any documentation mentioning a delay in data availability (I found the opposite -- it's supposed to be near real-time, right?), and I'm not sure what's causing the issue at this point.
Any help or reference would be greatly appreciated.
Edit: Apparently the API is the streaming API -- missed that on my part.
Edit 2: This issue is related. However, I've been writing to the table every 5 minutes for about 24 hours and I'm still seeing missing data. I'm curious whether writing to a BigQuery table within 10 minutes of its creation puts you in a permanent state of losing data, or whether it would be expected to catch everything after the initial 10 minutes from creation.
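For reference, here's a simplified version of the check I've since added. insert_rows returns a list of per-row errors rather than raising, so a silent call isn't proof the rows landed (the table and row contents below are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder table; the real schema includes nested records.
    table = client.get_table("my-project.my_dataset.my_table")
    rows = [
        {"id": 1, "payload": {"name": "a", "value": 0.5}},
        {"id": 2, "payload": {"name": "b", "value": 1.5}},
    ]

    # insert_rows returns per-row error mappings instead of raising,
    # so an empty list is the only confirmation that every row landed.
    errors = client.insert_rows(table, rows)
    if errors:
        print("Some rows failed:", errors)
    else:
        print("All rows inserted.")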

How can I retrieve data from SAP EWM from the query 0WM_MP17_Q0001?

I wanted to know how one can retrieve data from the various query tools available in SAP EWM.
I found the queries in the following link: Extended Warehouse Management - SAP Library
The description of the query 0WM_MP17_Q0001 says:
0WM_MP17_Q0001
You can use this query to see the number and duration of confirmed warehouse orders by day, week, or month. This allows you to see when typical warehouse trends are changing, and thus take actions such as:
Adjusting work schedules to meet demands
Hiring new workers, or letting existing workers go
Requesting budget for expenses such as extra equipment
And I need to retrieve the data for the reasons above.
However, is there a transaction code that I can run to get this report? How can I retrieve this data?
I think you already asked this question on SDN and got a response; see your message and the response there.
This is BI content.