YouTube Analytics API, specifying two dimensions

I'm trying to get a report from the YouTube Analytics API.
I need this report specifying the country and the dates for a specific video.
This code works:
dimensions=country&metrics=views,estimatedMinutesWatched,averageViewDuration,averageViewPercentage,subscribersGained
sort=-estimatedMinutesWatched&filters=video==VIDEO_ID
If I specify just the country or day dimension, it works.
If I specify both the day and country dimensions, it throws a 400 Bad Request error: "The query is not supported. Check the documentation at https://developers.google.com/youtube/analytics/v1/available_reports for a list of supported queries."
This doesn't work:
dimensions=country,day&
metrics=views,estimatedMinutesWatched,averageViewDuration,averageViewPercentage,subscribersGained
sort=-estimatedMinutesWatched&filters=video==VIDEO_ID
Is there another way to get the data in the format I'm looking for, since it seems this query is not supported by the API?

This is not allowed. Check the docs:
https://developers.google.com/youtube/analytics/v1/channel_reports
So you could either
query the country dimension and filter by day, or
query the day dimension and filter by country.
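For example, here is a minimal sketch of those two supported alternatives against the v1 reports endpoint, using Python's requests library. The access token, channel ID, video ID, dates and country code are placeholders you would supply yourself:

import requests

BASE = "https://www.googleapis.com/youtube/analytics/v1/reports"
HEADERS = {"Authorization": "Bearer ACCESS_TOKEN"}  # placeholder OAuth token
METRICS = "views,estimatedMinutesWatched,averageViewDuration,averageViewPercentage,subscribersGained"

# Option 1: country dimension, restricted to a single day via start-date/end-date.
per_country = requests.get(BASE, headers=HEADERS, params={
    "ids": "channel==CHANNEL_ID",
    "start-date": "2018-01-01",
    "end-date": "2018-01-01",
    "dimensions": "country",
    "metrics": METRICS,
    "filters": "video==VIDEO_ID",
    "sort": "-estimatedMinutesWatched",
}).json()

# Option 2: day dimension, filtered to a single country.
per_day = requests.get(BASE, headers=HEADERS, params={
    "ids": "channel==CHANNEL_ID",
    "start-date": "2018-01-01",
    "end-date": "2018-03-31",
    "dimensions": "day",
    "metrics": METRICS,
    "filters": "video==VIDEO_ID;country==US",
    "sort": "day",
}).json()

Looping option 1 over each day (or option 2 over each country) and merging the rows client-side is then the closest you can get to a country-by-day breakdown for a single video.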

Related

Podio API query

I have Podio data with a large number of columns, but we only need to fetch data for 5-6 columns through the API. I have attached a screenshot of the column names. If we only need, for example, order id, city, and country, how do we write the API query?
/item/app/{app_id}/filter/
If that is the right endpoint, how do we write the query for selected column names with GET/POST?
The filter endpoint uses a POST body to filter which records are returned, not which fields/columns are returned. According to an SO thread from a former Podio support person, it is not possible to specify which fields/columns to return with an API call.
If you are looking to remove fields from the query to reduce your datasource size within Klipfolio, I would recommend returning the API call in CSV format instead of JSON. Klipfolio's support documentation covers how to do this: perform a GET operation and add /csv to the end of the URL.
https://api.podio.com/item/app/Your-APP-ID/csv/
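As a rough sketch, pulling that CSV from Python with the requests library might look like this (the app ID and OAuth2 access token are placeholders; Podio expects the token in an "Authorization: OAuth2 ..." header):

import requests

APP_ID = "Your-APP-ID"        # placeholder, as in the URL above
ACCESS_TOKEN = "YOUR_TOKEN"   # placeholder OAuth2 access token

resp = requests.get(
    f"https://api.podio.com/item/app/{APP_ID}/csv/",
    headers={"Authorization": f"OAuth2 {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Save the export; unwanted columns can then be dropped locally
# (or simply ignored when the CSV is loaded into Klipfolio).
with open("podio_items.csv", "w", encoding="utf-8") as f:
    f.write(resp.text)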

Google Analytics dimensions different from web to API

I have a strange issue that I am not sure how to handle.
We have GA monitoring a non-English website (we are a global brand), and some of the campaign values in GA are in the native language (in this case, Korean characters). When we do custom channel grouping in GA, any campaign that matches the criteria goes into this custom grouping (the channel grouping is set up using something like campaign matches 'KOREAN LANGUAGE HERE').
However, when I fetch the data for custom channel groupings via the API, the value for this row/campaign is not the Korean value but a simplified English value. I've matched the rows by their transactions/users/sessions counts; it's the same row.
What on earth is going on? Does Google provide some kind of name/title framework for campaigns? Is it translating the custom channel grouping somehow?
Any help appreciated.

Coinbase API v2 Getting Historic Price for Multiple Days

I'm having some trouble with a Coinbase.com API call for historical data.
Previously, I was requesting a variable number of days (enough to match the space available on a terminal screen) with a request URL that looked like this:
https://api.coinbase.com/v2/prices/historic?currency=USD&days=76
This would pull the previous 76 days of price history. An example of the old output is here:
https://gist.github.com/KenDB3/f071a06ab3ef1a899d3cd8df8b40a049#file-coinbase-historic-days-example-2017-12-23-json
This stopped working a few days ago. The closest I can get now is with this request URL (though it doesn't return the data I want):
https://api.coinbase.com/v2/prices/BTC-USD/historic?days=76
The output from this can be seen here:
https://gist.github.com/KenDB3/f071a06ab3ef1a899d3cd8df8b40a049#file-coinbase-historic-days-example-2018-07-19-json
In the second example, it just displays prices from the day of the query at different times of that day. What I really want is the first example's output, where it gives a single price per day going back as many days as the request specifies.
The project this is connected to is here:
https://github.com/KenDB3/SyncBTC
Links that do not work:
https://api.coinbase.com/v2/prices/historic?currency=BTC-USD&days=76
(No Results)
https://api.coinbase.com/v2/prices/BTC-USD/historic?2018-07-15T00:00:00-04:00
(Does not pull data from 7/15/2018)
Any reason you aren't using Coinbase Pro?
The new API is very easy to use. Simply append the GET endpoint you want, followed by the parameters after a question mark. Here is the new historic rates API documentation:
https://docs.cloud.coinbase.com/exchange/reference/exchangerestapi_getproductcandles
The GET endpoint in the new API most similar to prices is "candles". It requires three parameters: start and end time in ISO format, and granularity in seconds. Here is an example:
https://api.pro.coinbase.com/products/BTC-USD/candles?start=2018-07-10T12:00:00&end=2018-07-15T12:00:00&granularity=900
EDIT: also, note that the timestamps are not in your local time zone; I believe it's GMT.
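As a rough sketch, the same request from Python with the requests library, using daily granularity (86400 seconds) to get one candle per day; the date range is just an example, and the response format assumed here is the documented [time, low, high, open, close, volume] list:

import requests

resp = requests.get(
    "https://api.pro.coinbase.com/products/BTC-USD/candles",
    params={
        "start": "2018-05-01T00:00:00",
        "end": "2018-07-15T00:00:00",
        "granularity": 86400,  # one candle per day
    },
)
resp.raise_for_status()

# Each candle is [time, low, high, open, close, volume];
# time is the bucket start as a Unix timestamp.
for ts, low, high, open_, close, volume in resp.json():
    print(ts, close)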
Here is a wrapper for the Coinbase API for exporting historical data: https://pypi.org/project/Historic-Crypto/
It should provide the required outcome when invoked as follows:
pip install Historic-Crypto

from Historic_Crypto import HistoricalData
# arguments: pair, granularity in seconds, start date as 'YYYY-MM-DD-HH-MM'
new = HistoricalData('ETH-USD', 300, '2020-06-01-00-00').retrieve_data()
For a full list of available cryptocurrency pairs:
from Historic_Crypto import Cryptocurrencies
data = Cryptocurrencies(extended_output=False).find_crypto_pairs()

BigQuery Active User count not accurate (Google Analytics)

I have Google Analytics integrated with BigQuery, and I'm trying to write a query to fetch Active Users that should match the number on the GA portal.
Here's the query I've written:
SELECT
date(date) as date,
EXACT_COUNT_DISTINCT(fullVisitorId) as daily_active_users,
FROM TABLE_DATE_RANGE([<project_id>:<dataset>.ga_sessions_],
TIMESTAMP('2018-01-01'),
TIMESTAMP(CURRENT_DATE()))
group by date
order by date desc
The numbers I get in response are close to the ones Google Analytics shows me, but they aren't 100% accurate.
The numbers I get back are slightly higher than the ones on the portal, and I assume I need to add a WHERE clause to filter on some property GA might be filtering on the portal.
Your query looks fine to me. Assuming that you're looking at the same GA view as the one linked to BigQuery, I think that the problem could be sampling.
Even if the GA UI says that "This report is based on 100% of sessions.", try to export it as an Unsampled Report and check the numbers (in my experience, the users metric sometimes doesn't match between unsampled reports and default reports without sampling).
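If it helps to compare against the unsampled export, here is a rough Standard SQL version of the same daily count, run through the google-cloud-bigquery client (the project and dataset names are the question's placeholders, and authentication is assumed to be already set up):

from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  PARSE_DATE('%Y%m%d', date) AS date,
  COUNT(DISTINCT fullVisitorId) AS daily_active_users
FROM `<project_id>.<dataset>.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20180101'
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY 1
ORDER BY 1 DESC
"""

# Print one row per day: date and distinct visitor count.
for row in client.query(query).result():
    print(row.date, row.daily_active_users)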

Historical aggregate Twitter data

I want to graph the number of tweets and the number of followers over the last three months, but I haven't been able to find a way to do that either through the API or any ready-made tool.
I tried TwitterCounter, but the data they provided was basically the result of some sort of interpolation function, not based on actual historical data.
Is there a way to get historical aggregate data from Twitter (not the actual tweets, but the sums, averages, etc.)?
There are no such numbers, or at least none that I am aware of. Before Twitter updated its tweet ID algorithm, it was possible to estimate the number of tweets per day via a simple difference of IDs, but now that they use a different algorithm to create the IDs, it is no longer possible.
You could check whether Google's Twitter search gives you some stats.
What do you mean by 'number of followers'? Whose followers?