Mixpanel: Data Export API gives different results and ranges to the web dashboard

I'm getting different numbers in Mixpanel's dashboard and Data Export API.
In the dashboard on the segmentation tab, I've selected an event ("login"), start and end dates (Sept 17th and 24th), and the week unit. It shows a graph and a table with four values: this week, Sept 17th, Sept 10th, and Sept 3rd.
I make this call to the Mixpanel API using the Python library:
import json

api = Mixpanel(
    api_key='----',
    api_secret='----'
)
data = api.request(['segmentation'], {
    'event': 'login',
    'unit': 'week',
    'from_date': '2012-09-17',
    'to_date': '2012-09-24'
})
print json.dumps(data)
Here's the data it returns:
{
    "legend_size": 1,
    "data": {
        "series": [
            "2012-09-17",
            "2012-09-24"
        ],
        "values": {
            "login": {
                "2012-09-17": XXXXX,
                "2012-09-24": YYYYY
            }
        }
    }
}
The value XXXXX is different from the value shown in the web dashboard. Why is this?
The API also returns a value for Sept 24th which isn't in the dashboard view, while the dashboard shows values for Sept 3rd and 10th which aren't in the API response. Why is this happening?
How can I ensure the results are consistent between the two interfaces for the same date range?

Geddes from Mixpanel's Solutions Team here. The Mixpanel website actually uses the same APIs that we document publicly for your use, so one tip is to use Firebug / Chrome Inspector to view all the XHR requests on the Mixpanel page. You'll see the exact API query Mixpanel uses to get its numbers; compare that to your own API query and it will become clear where the difference is.
Of course, we'd be more than happy to look at your case. If you can provide details like account name, event name, etc. to support#mixpanel.com, we can give you a more specific answer.
Best,
Geddes


NetSuite Rest Integration - Get Sales Records By Date

I need direction on how to get the sales transactions from NetSuite for a given date range and then access the details (such as customer information) and sale amount. Is there a set of APIs best suited for this, or a RESTlet approach/sample I can follow?
Queries are done through the search API. You would perform a search and then export its results. You can create a RESTlet that accepts some search parameters as input and returns search results as output. The RESTlet would be implemented in JavaScript (NetSuite's SuiteScript).
The following is quick pseudocode with some SuiteScript 2.0, just to help you get started. It is not something you can copy and paste; it is off the top of my head, and you will still need to do some research.
// Assumes the N/search module is loaded, e.g. define(['N/search'], function (search) { ... })
var orderSearch = search.create({
    type: 'salesorder',
    filters: [
        ['mainline', 'is', true],
        'AND',
        ['trandate', 'on', '9/25/2020']
    ],
    columns: [
        search.createColumn({ name: 'entity' }),
        search.createColumn({ name: 'trandate' }),
        search.createColumn({ name: 'total' })
    ]
});

var firstPageOfResults = [];
var pages = orderSearch.runPaged({ pageSize: 100 });
if (pages && pages.count > 0) {
    // fetch() returns a Page object; its .data property holds the result rows
    firstPageOfResults = pages.fetch({ index: 0 }).data;
}
// here is where you do something like return the search results
// to the output of the restlet function
return firstPageOfResults;
So, again, this is just quickly drafted rough code to get you started.
You will need to learn more about the search operators; you might want "within" instead of "on", so you can express a start date and an end date.
You can of course return other fields of sales orders; look them up in the "record browser" help section.
You can learn more about the search API in the NetSuite docs on writing SuiteScript.
You will also need to learn how to set up a RESTlet. If you plan to call your RESTlet from outside NetSuite, you will need to set up token-based authentication and an "integration record" in NetSuite, so that you get what is essentially a password-like credential to pass in the Authorization header when calling the RESTlet securely.
There are more advanced ways of doing all of this. You can also learn about and use SuiteQL, which lets you write a query in SQL and run it; you might want to do that instead. I will not go into detail here.
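To give a rough feel for the token-based authentication step, here is a minimal Python sketch that builds an OAuth 1.0 Authorization header of the kind a RESTlet call expects. This is an illustration only, not NetSuite's official client: the URL, account ID, and all keys below are placeholders, and you should verify the signing details against NetSuite's TBA documentation.

```python
import base64
import hashlib
import hmac
import secrets
import time
import urllib.parse

def tba_auth_header(url, method, consumer_key, consumer_secret,
                    token, token_secret, account):
    """Build an OAuth 1.0 (HMAC-SHA256) Authorization header for a RESTlet call."""
    params = {
        'oauth_consumer_key': consumer_key,
        'oauth_token': token,
        'oauth_nonce': secrets.token_hex(16),
        'oauth_timestamp': str(int(time.time())),
        'oauth_signature_method': 'HMAC-SHA256',
        'oauth_version': '1.0',
    }
    # The signature base string includes any query-string parameters on the URL
    parsed = urllib.parse.urlparse(url)
    all_params = dict(urllib.parse.parse_qsl(parsed.query))
    all_params.update(params)
    param_str = '&'.join(
        '%s=%s' % (urllib.parse.quote(k, safe=''), urllib.parse.quote(v, safe=''))
        for k, v in sorted(all_params.items())
    )
    base_url = '%s://%s%s' % (parsed.scheme, parsed.netloc, parsed.path)
    base_string = '&'.join(
        urllib.parse.quote(s, safe='') for s in (method.upper(), base_url, param_str)
    )
    signing_key = ('%s&%s' % (consumer_secret, token_secret)).encode()
    digest = hmac.new(signing_key, base_string.encode(), hashlib.sha256).digest()
    params['oauth_signature'] = urllib.parse.quote(
        base64.b64encode(digest).decode(), safe='')
    header_params = ', '.join('%s="%s"' % (k, v) for k, v in sorted(params.items()))
    return 'OAuth realm="%s", %s' % (account, header_params)
```

You would pass the returned string as the `Authorization` header when POSTing to the RESTlet's external URL.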

Amadeus Self-Service API currency bug. Response always in origin country currency

I originally reported this to Self-Service support in December, but I never got a response. I recently realized that, even in the production environment, selecting a currency parameter for an Inspiration or Cheapest-Date endpoint always returns the origin country's currency despite selecting another currency. (In the Low-Fare endpoint it seems to work as designed.) I tested this in both my web application and in Amadeus' own explorer tool. Here is a snip from the JSON response in the Explorer:
"meta": {
"currency": "EUR",
"links": {
"self": "https://test.api.amadeus.com/v1/shopping/flight-dates?origin=MAD&destination=MUC&departureDate=2019-04-14,2019-10-10&oneWay=false&duration=4,7&nonStop=true&currency=USD&viewBy=DATE"
},
"defaults": {
"departureDate": "2019-04-14,2019-10-10"
}
}
Notice that the meta.currency value is EUR, but meta.links.self (the query I ran) shows a GET parameter of currency=USD. This is the same problem I reported in December.
I am posting this for suggestions on how to get some action from Amadeus (I hope they read this), or for a suggested workaround (the obvious one is hiding the currency field from the Inspiration and Cheapest-Date form).
The currency parameter in Flight Inspiration and Cheapest Date Search works only in combination with maxPrice. Prices in the response are computed in a currency determined by the origin/destination pair; they cannot be converted to a requested currency.
Since this is a bit confusing, we are going to update the currency parameter's naming and documentation. Point taken, and sorry for the delay in the response.

Use BigCommerce API to get a list of prices and name only

I have a BigCommerce headless e-commerce site. I keep data synced to my own database so I don't have to rely on BigCommerce API calls for every user.
The problem now is that certain data, like prices, changes often. How can I use the BigCommerce API to get a list of only price/name/id? The list would look like the below.
[
    {
        name: xxx,
        id: xxx,
        calculated_price: xxx,
    },
    {
        name: xxx,
        id: xxx,
        calculated_price: xxx,
    },
]
You can use the ?include_fields parameter to control the fields in the response when using the V3 Catalog API.
For example:
GET /v3/catalog/products?include_fields=calculated_price,name
IDs will always be returned.
From there you could apply other filters to control which items are returned in the collection.
If you also need variant prices, try including the variants with ?include=variants
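As a minimal sketch of that request in Python, here is one way to build the V3 Catalog URL and trim the response down to the desired shape. The store hash, token, and helper names (`products_url`, `slim_products`) are placeholders of mine, not BigCommerce's SDK:

```python
import json
import urllib.parse
import urllib.request

STORE_HASH = 'abc123'          # placeholder: your store hash
ACCESS_TOKEN = 'your-token'    # placeholder: your API token

def products_url(page=1, limit=250):
    """Build a V3 Catalog request for just name + calculated_price (IDs always come back)."""
    query = urllib.parse.urlencode({
        'include_fields': 'calculated_price,name',
        'limit': limit,
        'page': page,
    })
    return ('https://api.bigcommerce.com/stores/%s/v3/catalog/products?%s'
            % (STORE_HASH, query))

def slim_products(response_body):
    """Reduce a V3 Catalog response body to [{name, id, calculated_price}, ...]."""
    return [
        {'name': p['name'], 'id': p['id'], 'calculated_price': p['calculated_price']}
        for p in json.loads(response_body)['data']
    ]

# Example call (requires network and a real token):
# req = urllib.request.Request(products_url(), headers={
#     'X-Auth-Token': ACCESS_TOKEN,
#     'Accept': 'application/json',
# })
# print(slim_products(urllib.request.urlopen(req).read()))
```

You would then page through the collection and upsert the slimmed rows into your own database.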

Can an object have 2 active snapshots?

According to this page in the docs, only one snapshot can be active for a given object. However, I seem to have a Defect with 2 active snapshots. All snapshots are shown in the screenshot below:
As you can see, I have connected the snapshots with arrows, and they do not all link together. Is this a bug in Rally, or is it in fact possible to have two snapshots with _ValidTo dates in the year 9999?
My query is taken from the example in the docs:
URI: https://rally1.rallydev.com/analytics/v2.0/service/rally/workspace/12345/artifact/snapshot/query.js
POST data:
{
    "find": {
        "ObjectID": my funky object
    },
    "fields": ["State", "_ValidFrom", "_ValidTo", "ObjectID", "FormattedID"],
    "hydrate": ["State"],
    "compress": true
}
The object should not have two current snapshots with _ValidTo set to 9999-01-01. Please contact CA Agile Central (Rally) support; they will raise the issue with the Lookback API team, which I believe has a way of fixing the data on their end.

How to use the Wikipedia API to get the pageview statistics of a particular page on Wikipedia?

The stats.grok.se tool provides the pageview statistics of a particular page on Wikipedia. Is there a way to use the Wikipedia API to get the same information? And what does the page views counter property actually mean?
The Pageview API was released a few days ago: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}
https://wikimedia.org/api/rest_v1/?doc#/
https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageview_API
For example https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/all-agents/Foo/daily/20151010/20151012 will give you
{
    "items": [
        {
            "project": "en.wikipedia",
            "article": "Foo",
            "granularity": "daily",
            "timestamp": "2015101000",
            "access": "all-access",
            "agent": "all-agents",
            "views": 79
        },
        {
            "project": "en.wikipedia",
            "article": "Foo",
            "granularity": "daily",
            "timestamp": "2015101100",
            "access": "all-access",
            "agent": "all-agents",
            "views": 81
        }
    ]
}
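A small Python sketch of the same call, building the per-article URL shown above and summing the daily counts from a response body; the helper names here are my own, not part of any Wikimedia client library:

```python
import json
import urllib.parse
import urllib.request

def pageviews_url(article, start, end, project='en.wikipedia'):
    """Build a per-article Pageview API URL (all-access, all-agents, daily)."""
    title = urllib.parse.quote(article.replace(' ', '_'), safe='')
    return ('https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/'
            '%s/all-access/all-agents/%s/daily/%s/%s'
            % (project, title, start, end))

def total_views(response_body):
    """Sum the daily view counts in a Pageview API response body."""
    return sum(item['views'] for item in json.loads(response_body)['items'])

# Example (requires network):
# with urllib.request.urlopen(pageviews_url('Foo', '20151010', '20151012')) as r:
#     print(total_views(r.read()))
```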
No, there is not.
The counter property returned from prop=info would tell you how many times the page was viewed on the server. It is disabled on Wikipedia and other Wikimedia wikis because aggressive Squid/Varnish caching means only a tiny fraction of page views would reach the actual server to affect that counter, and even then the increased database write load from updating it would probably be prohibitive.
The stats.grok.se tool uses anonymized logs from the cache servers to calculate page views; the raw log files are available from http://dammit.lt/wikistats. If you need an API to access the data from stats.grok.se, you should contact the operator of stats.grok.se to request that one be created.
Note this was written 4 years ago, and an API has since been created (see this answer). There's not yet a way to access it via api.php, though.
You can get the daily JSON for the last 30 days like this:
http://stats.grok.se/json/en/latest30/Britney_Spears
You can look into the stats here.
Has anyone found an API to get the pageview stats?
I have also looked into the available raw data but could not find a way to extract the pageview count.
There doesn't seem to be any API; however, you can make HTTP requests to stats.grok.se and parse the HTML or JSON result to extract the page view counts.
I created a website, http://wikipediaviews.org, that does exactly that, to make it easier to compare multiple pages across multiple months and years. To speed things up and minimize the number of requests to stats.grok.se, I keep all past query results stored locally.
The code I used is available at http://github.com/vipulnaik/wikipediaviews.
The file with the actual retrieval code is in https://github.com/vipulnaik/wikipediaviews/blob/master/backend/pageviewqueries.inc
function getpageviewsonline($page, $month, $language)
{
    $url = getpageviewsurl($page, $month, $language);
    $html = file_get_contents($url);
    preg_match('/(?<=\bhas been viewed)\s+\K[^\s]+/', $html, $numberofpageviews);
    return $numberofpageviews[0];
}
The code for getpageviewsurl is in https://github.com/vipulnaik/wikipediaviews/blob/master/backend/stringfunctions.inc:
function getpageviewsurl($page, $month, $language)
{
    $page = str_replace(" ", "_", $page);
    $page = str_replace("'", "%27", $page);
    return "http://stats.grok.se/" . $language . "/" . $month . "/" . $page;
}
PS: In case the link to wikipediaviews.org doesn't work, it's because I registered the domain quite recently. Try http://wikipediaviews.subwiki.org instead in the interim.
This question was asked six years ago, and at the time there was no such API on the official site. That has changed.
A simple example:
https://en.wikipedia.org/w/api.php?action=query&format=json&prop=pageviews&titles=Buckingham+Palace%7CBank+of+England%7CBritish+Museum
See the documentation:
prop=pageviews
Shows per-page pageview data (the number of daily pageviews for each of the last pvipdays days). The result format is page title (with underscores) => date (Ymd) => count.
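As a small Python sketch of that query, here is one way to build the api.php request and collapse the title => date => count result into totals per page. The helper names are mine, not MediaWiki's:

```python
import json
import urllib.parse
import urllib.request

def mw_pageviews_url(titles, site='https://en.wikipedia.org'):
    """Build an api.php query for prop=pageviews on one or more page titles."""
    query = urllib.parse.urlencode({
        'action': 'query',
        'format': 'json',
        'prop': 'pageviews',
        'titles': '|'.join(titles),
    })
    return '%s/w/api.php?%s' % (site, query)

def views_by_page(response_body):
    """Collapse the title => date => count result into title => total views."""
    pages = json.loads(response_body)['query']['pages'].values()
    return {
        # days with no data come back as null, so filter them out
        p['title']: sum(v for v in (p.get('pageviews') or {}).values() if v)
        for p in pages
    }

# Example (requires network):
# with urllib.request.urlopen(
#         mw_pageviews_url(['Buckingham Palace', 'British Museum'])) as r:
#     print(views_by_page(r.read()))
```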