In Firebase I have data up to the current date, i.e. the 17th of Aug 2022, but when I query in the SQL workspace in BigQuery I only get data up to the 8th of Aug. My billing was reactivated on the 17th, and now I am getting all the data except the 9th to the 11th. Why is there data loss, is there a specific interval involved, and what could be the possible reason?
I want to get the raw time series data for all the charts for Profit and Loss and Balance Sheet. I went through the API documentation and could only find report/summary data for P&L and BS. Is there a way I can fetch all the data for all the charts/categories?
For example: if I request the ProfitAndLoss data, then I should get the data shown below in JSON format.
https://api.xero.com/api.xro/2.0/Reports/ProfitAndLoss?fromDate=2020-08-01&toDate=2021-07-31&periods=11&timeframe=MONTH
should return JSON containing what you want.
From Xero ProfitAndLoss endpoint docs
Edit to modify the API query. It should be:
https://api.xero.com/api.xro/2.0/Reports/ProfitAndLoss?fromDate=2021-08-01&toDate=2021-08-31&periods=11&timeframe=MONTH
Xero has a quirky way of retrieving data. The modified query basically says: get Aug 2021 and compare it to the previous 11 monthly periods, i.e. all the way back to Sep 2020. This is the way the Xero UI reporting works too. The result is a table with columns for Aug 2021, Jul 2021, Jun 2021, May 2021, Apr 2021, Mar 2021, Feb 2021, Jan 2021, Dec 2020, Nov 2020, Oct 2020, Sep 2020, and rows of per-account values by month.
Also note that you should choose a month with 31 days as the first month to compare to; otherwise another Xero quirk will truncate all compared periods to however many days your start month has, e.g. 30 days if you start with Sep.
The first query retrieves a not-quite-cumulative, read-forward result, which to my mind is not useful at all.
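For reference, here is a minimal sketch of issuing the corrected request in Python with the requests library. The access token and tenant id are placeholders you would obtain through Xero's standard OAuth 2.0 flow, and the Reports[0].Rows nesting follows the structure of Xero's Reports API responses.

```python
import requests

# Placeholder credentials -- obtain real values via Xero's OAuth 2.0 flow.
ACCESS_TOKEN = "your-access-token"
TENANT_ID = "your-tenant-id"

resp = requests.get(
    "https://api.xero.com/api.xro/2.0/Reports/ProfitAndLoss",
    params={
        "fromDate": "2021-08-01",
        "toDate": "2021-08-31",
        "periods": 11,
        "timeframe": "MONTH",
    },
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Xero-tenant-id": TENANT_ID,
        "Accept": "application/json",  # ask for JSON rather than XML
    },
)
resp.raise_for_status()

# The report sits under Reports[0]; its Rows hold one line per account,
# with one cell per monthly column (Aug 2021 back to Sep 2020).
report = resp.json()["Reports"][0]
for row in report["Rows"]:
    print(row)
```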
In June, I ran a YouTube BigQuery transfer backfill for the data of 2017-04-10 and got 120,000+ records in total for "asset estimated revenues".
In September, I ran the same YouTube BigQuery transfer backfill for the same date and got about 98,000 records in total for "asset estimated revenues".
Should the YouTube BigQuery transfer backfill data for 2017-04-10 be the same, no matter whether I run the backfill in June or September?
When I run a YouTube BigQuery transfer backfill for the date 2017-04-10, does Google BigQuery return the data it cached for 2017-04-10? In that case it would not matter when I request the backfill; the data for 2017-04-10 should be the same.
Or does Google BigQuery recalculate the data it uses to backfill 2017-04-10 each time we run the transfer? Since I had different assets under my content owner ID in June and in September, each "recalculation" could then be different.
OK. I figured it out.
The YouTube BigQuery transfer is a work in progress, and its behavior changed between runs.
In June, when we ran the YouTube BigQuery transfer backfill for "asset estimated revenue", the backfill included all the records with revenue = 0.
In September, the same backfill excluded the records with revenue = 0.
The data with revenue != 0 is the same!
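To check this yourself, you can count zero-revenue versus non-zero-revenue rows for the backfilled date. A minimal sketch with the google-cloud-bigquery client; the project, dataset, table, and column names below are hypothetical and depend on your transfer configuration and report schema.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names -- substitute the dataset/table your transfer writes
# to and the actual revenue column from your report schema.
query = """
SELECT
  COUNTIF(estimated_partner_revenue = 0)  AS zero_revenue_rows,
  COUNTIF(estimated_partner_revenue != 0) AS nonzero_revenue_rows
FROM `my_project.my_dataset.asset_estimated_revenue`
WHERE date = '2017-04-10'
"""
row = next(iter(client.query(query).result()))
print(row.zero_revenue_rows, row.nonzero_revenue_rows)
```

If the June total minus the September total matches the zero-revenue count, the two backfills agree on everything else.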
I just need to know how to create rolling months in OBIEE. If I click on Jan 2017, it should show data going back to Feb 2016, i.e. the previous 12 months.
You will need a properly configured time dimension. As soon as you have that, all the time series functionalities are at your disposal and will work immediately.
https://gerardnico.com/wiki/dat/obiee/obis/time_dimension
https://gerardnico.com/wiki/dat/obiee/obis/logical_sql/function_time
I'm using Firebase to register some events from an iOS/Android app and log them into BigQuery. As I understand from the documentation, BigQuery creates a different table each day in order to store the events of that single day.
Each day, Firebase Analytics creates a new table in the BigQuery dataset corresponding to the app. The tables are named using the pattern app_events_YYYYMMDD and contain the events recorded for the specified day.
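As an aside, those per-day tables can be queried together with a table wildcard, which is handy for checking where events landed. A sketch assuming standard SQL and hypothetical project/dataset names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# `app_events_*` matches every daily table; _TABLE_SUFFIX limits the scan
# to a range of days. Project and dataset names here are hypothetical.
query = """
SELECT _TABLE_SUFFIX AS table_day, COUNT(*) AS events
FROM `my_project.my_app.app_events_*`
WHERE _TABLE_SUFFIX BETWEEN '20160726' AND '20160728'
GROUP BY table_day
ORDER BY table_day
"""
for row in client.query(query).result():
    print(row.table_day, row.events)
```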
However, I'm getting some events from a certain day registered in the table of the following day. For example, the table app_events_20160727 contains some events from July 26th, and the table app_events_20160728 contains some events from July 27th.
Am I missing something?
Thanks for your support
Sep 14 update
I'll try to better explain the issue through an example: the events recorded in the first part of the day (let's say until 3 PM/4 PM, but I don't see any consistent pattern) are collected in the table for that day, while the events from the last part of the day are collected in the table for the following day.
So, let's take the events of Sep 12: below are screenshots of the first and last entries of the tables for Sep 12 and Sep 13.
First entries of Sep 13
Last entries of Sep 13
First entries of Sep 12
Last entries of Sep 12
As you can see, the events from Sep 12 are split into two tables.
Thanks for your support.
Firebase registers the timestamp of when the event was tracked client-side.
This is likely what happens in your scenario:
You trigger an event while offline on day N.
Your user reconnects to the internet only the following day, day N+1 (or the day after).
Thus Firebase receives the event from day N on day N+1.
During day N, Firebase will export all the events it received (server-side) on day N. On day N+1 it will export all the events it received on day N+1, including the ones actually tracked client-side on day N but not sent to the server until day N+1.
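If you want to see this in the export, look inside a given daily table for events whose client-side timestamp falls on the previous day. A minimal sketch, assuming the app_events_YYYYMMDD schema quoted above with a repeated event_dim record carrying timestamp_micros; adjust the names to your dataset.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical project/dataset. Each row of a daily table carries a
# repeated event_dim record; timestamp_micros is the client-side time.
query = """
SELECT event.name, TIMESTAMP_MICROS(event.timestamp_micros) AS tracked_at
FROM `my_project.my_app.app_events_20160728`,
     UNNEST(event_dim) AS event
WHERE TIMESTAMP_MICROS(event.timestamp_micros) < TIMESTAMP('2016-07-28')
ORDER BY tracked_at
"""
for row in client.query(query).result():
    # Tracked client-side on July 27 but exported in the July 28 table.
    print(row.name, row.tracked_at)
```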
I'm not sure the explanation is clear; can you tell me if it was?
I am querying the Jawbone UP API with the following HTTP request:
https://jawbone.com/nudge/api/v.1.0/users/#me/moves?start_time=1368392836&end_time=1399928836
I started using UP24 in March 2014, and with the above request I get data for March and April (up to April 18, 2014) but nothing after that. The data is continuously synced with the app by my UP24 device, which I can also see in the app. I am not sure why I am not getting data after April 18, 2014.
The epoch start timestamp corresponds to Sun, 12 May 2013 21:07:16 GMT.
The epoch end timestamp corresponds to Mon, 12 May 2014 21:07:16 GMT.
The results are paginated, and a next link is supplied at the end of each response to fetch the next page of data.
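That means the rest of the year is most likely sitting on later pages. A minimal sketch of following those next links in Python; the bearer token is a placeholder, the @me placeholder is Jawbone's documented way to address the current user, and the data.items / data.links.next envelope is assumed from the UP API responses.

```python
import requests

BASE = "https://jawbone.com"
HEADERS = {"Authorization": "Bearer your-access-token"}  # placeholder token

url = "/nudge/api/v.1.0/users/@me/moves"
params = {"start_time": 1368392836, "end_time": 1399928836}

moves = []
while url:
    resp = requests.get(BASE + url, params=params, headers=HEADERS)
    resp.raise_for_status()
    data = resp.json()["data"]
    moves.extend(data.get("items", []))
    # The next link (a relative URL that already embeds the query string)
    # is supplied at the end of each page; None ends the loop.
    url = data.get("links", {}).get("next")
    params = None

print(len(moves), "moves retrieved")
```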