API to Twitter Analytics

I am currently exporting CSV files from Twitter Analytics covering Impressions, Engagement, and Cost for a dashboarding and analysis tool.
Does Twitter have an API for that? Do they provide it only to high-paying clients? I was told that some companies are able to download the insights automatically.
Thank you!

They currently don't offer a public API for that (frustrating), but I wrote a Python script that logs in and downloads the CSV programmatically: https://github.com/ashleycoxley/twitter-analytics-export

A bit late to the party. There is a hackish Python package that does the work.
$ pip install twitter-analytics
More here: https://github.com/philippe2803/twitter-analytics-wrapper
Disclaimer: I'm the author.
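For reference, a typical run of such a wrapper looks roughly like the sketch below; the class name, argument names, and date format are assumptions based on the project README, so check the repo for the actual interface before relying on it.

```python
# Rough sketch only: the class and argument names below are assumptions,
# check github.com/philippe2803/twitter-analytics-wrapper for the real API.
from twitter_analytics import ReportDownloader

downloader = ReportDownloader(
    username="your_twitter_handle",   # account whose analytics you want
    password="your_password",
    # optional date range; without it the wrapper grabs the default period
    from_date="01/01/2021",
    to_date="01/31/2021",
)

csv_paths = downloader.run()   # logs in and downloads the tweet activity CSV(s)
print(csv_paths)
```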

Twitter doesn't offer a public API for that, but there are paid services that have free trials if this isn't a regular task. Crowdbabble creates data visualizations using the Twitter API and can export to CSV, PDF, or PNG (or scheduled reports that go to your email automatically). They download the insights via the API.

Related

Google Local Service Ads to Big Query

I need help writing a simple script or example for "Code.gs" in Google Sheets that uses BigQuery to import data from the Google Local Services Ads API (not Google Ads) into Google Sheets. The information needed would be everything the LSA API has to offer. I have done a lot of research on the back end and can't seem to find any examples of how to write function scripts for importing from Google LSA. Thank you in advance for your help.
What I have set up:
I have created my OAuth credentials and tokens.
Tested the tokens in the Google OAuth Playground to ensure that everything works correctly.
What I need done:
Need a function script (for Google Sheets - Code.gs) that can pull my Local Services Ads API data into Google Sheets using the BigQuery API.
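For reference, a rough sketch of the equivalent flow in Python (rather than Code.gs), calling the Local Services reporting endpoint with the OAuth token and streaming the rows into BigQuery, might look like this; the endpoint path, query parameter, and table name are assumptions to verify against the Local Services API reference.

```python
# Sketch of the general flow in Python (not the Code.gs the question asks for).
# The endpoint path, query parameter, and response shape are assumptions;
# confirm them against the Local Services API reference before using.
import requests
from google.cloud import bigquery

ACCESS_TOKEN = "ya29...."      # OAuth access token tested in the playground
MANAGER_ID = "1234567890"      # hypothetical manager customer id

# Hypothetical report endpoint; check the Local Services API docs.
url = "https://localservices.googleapis.com/v1/accountReports:search"
resp = requests.get(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"query": f"manager_customer_id:{MANAGER_ID}"},
)
resp.raise_for_status()
rows = resp.json().get("accountReportResults", [])

# Stream the rows into a BigQuery table (the table must already exist
# with columns matching the report fields).
client = bigquery.Client()
errors = client.insert_rows_json("my_project.my_dataset.lsa_report", rows)
if errors:
    raise RuntimeError(errors)
```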

How to move data from Adjust, Funnel and Facebook Ads to Google BigQuery?

Can someone provide me the steps to push data from Funnel, Adjust and Facebook Ads to Google BigQuery?
We are not interested in third-party software like Stitch.
Thanks
Welcome to StackOverflow!
Assuming you mean Funnel.io here? If so, isn't this what Funnel already does for you? If you are looking to migrate away from that product and do this yourself, you're going to need to find importers for each of the data sources that Funnel connects to individually. It may be, actually, that you just want to stick with Funnel to make life easier?
BigQuery offers several connectors to external data sources via its Data Transfer Service. While Facebook Ads is not one of the official connectors, Google recently opened the platform up to third-party connections via the Google Cloud Platform Marketplace, which does contain a connector for Facebook Ads.
However, this means you are once again using a third-party connection, so it may not be what you are looking for.
If you don't want to use third-party connections, and there isn't a Google-built solution, then your remaining option is to build your own data importer, usually against each data source's API.
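As a point of reference, a minimal "build your own importer" sketch in Python might pull Facebook Ads insights from the Graph API and load them into BigQuery like this; the ad account id, field list, Graph API version, and table names are placeholders, and the BigQuery table is assumed to exist with matching columns.

```python
# Minimal importer sketch: pull Facebook Ads insights from the Graph API
# and load them into BigQuery. Account id, fields, API version, and table
# names are placeholders.
import requests
from google.cloud import bigquery

FB_TOKEN = "EAAB..."              # Facebook access token
ACCOUNT_ID = "act_1234567890"     # placeholder ad account id

resp = requests.get(
    f"https://graph.facebook.com/v19.0/{ACCOUNT_ID}/insights",
    params={
        "access_token": FB_TOKEN,
        "fields": "campaign_name,impressions,clicks,spend",
        "date_preset": "yesterday",
        "level": "campaign",
    },
)
resp.raise_for_status()
rows = resp.json().get("data", [])

# Stream into a BigQuery table whose schema matches the selected fields.
client = bigquery.Client()
errors = client.insert_rows_json("my_project.marketing.fb_ads_daily", rows)
if errors:
    raise RuntimeError(errors)
```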

How to build Google Analytics 'collect' like api using Google Cloud services

I'm trying to build a data collection web endpoint. The use case is similar to the Google Analytics collect API. I want to add this endpoint (GET method) to all pages on the website and, on page load, collect page info through this API.
I'm thinking of doing this using Google Cloud services like Endpoints and BigQuery (for storing the data). I don't want to host it on any dedicated servers; otherwise I will end up doing a lot of work managing/monitoring the service.
Please suggest how I can achieve this with Google Cloud services, or point me in the right direction if my idea is wrong.
I suggest focusing on deciding where you want the code to run. There are several GCP options that don't require dedicated servers:
Google App Engine
Cloud Functions/Firebase Functions
Cloud Run (new!)
Look here to see which support Cloud Endpoints.
All of these products can support running code that takes the data from the request and sends it to the BigQuery API.
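As a rough illustration, a collect-style endpoint as an HTTP Cloud Function in Python could stream each hit straight into BigQuery along these lines; the dataset/table name and column names are placeholders, and the table is assumed to already exist.

```python
# Minimal sketch of a "collect"-style endpoint as an HTTP Cloud Function.
# Dataset/table name is a placeholder; the table must already exist with
# matching columns (page, referrer, user_agent, ts).
import datetime
from google.cloud import bigquery

client = bigquery.Client()
TABLE = "my_project.analytics.page_hits"   # placeholder table id

def collect(request):
    """HTTP entry point: called on every page load via a GET request."""
    row = {
        "page": request.args.get("page", ""),
        "referrer": request.args.get("ref", ""),
        "user_agent": request.headers.get("User-Agent", ""),
        "ts": datetime.datetime.utcnow().isoformat(),
    }
    errors = client.insert_rows_json(TABLE, [row])
    if errors:
        return ("insert failed", 500)
    # An empty 204 keeps the response tiny for tracking-pixel style use.
    return ("", 204)
```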
There are various ways of achieving what you want. David's answer is absolutely valid, but I would like to introduce Stackdriver Custom Metrics to the discussion.
Custom metrics are similar to regular Stackdriver Monitoring metrics, but you create your own time series (Stackdriver lingo described here) to keep track of whatever you want, and clients can send in their data through an API.
You could achieve the same thing with a compute solution (Google Cloud Functions, for example) and a database (Google Bigtable, for example) and write your own logic, but Custom Metrics is an already-built solution that includes dashboards and alerting policies while being more managed.
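For a feel of what writing to a custom metric looks like with the google-cloud-monitoring client, here is a small sketch that records one data point; the project id, metric type, and label are made up for illustration.

```python
# Sketch: write one data point to a Cloud Monitoring (Stackdriver) custom
# metric. The project id, metric type, and label are made up for illustration.
import time
from google.cloud import monitoring_v3

project_id = "my-project"                      # placeholder project
client = monitoring_v3.MetricServiceClient()
project_name = f"projects/{project_id}"

series = monitoring_v3.TimeSeries()
series.metric.type = "custom.googleapis.com/page_views"   # custom metric type
series.metric.labels["page"] = "/home"
series.resource.type = "global"

# Build a single point timestamped "now".
now = time.time()
seconds = int(now)
nanos = int((now - seconds) * 10**9)
interval = monitoring_v3.TimeInterval({"end_time": {"seconds": seconds, "nanos": nanos}})
point = monitoring_v3.Point({"interval": interval, "value": {"int64_value": 1}})
series.points = [point]

client.create_time_series(name=project_name, time_series=[series])
```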

Is there any quick and easy way to upload a Google Doc from SAP?

We're creating a custom table in SAP comprising all of the information we need, and the customer needs the report from this table uploaded to Google Docs. We do not use Business ByDesign. Is there any other quick and easy way to upload our report?
I don't know much about SAP but the Documents List API has methods to programmatically upload a document to Google Docs: https://developers.google.com/google-apps/documents-list/.
For instance, if you can export the SAP table as a csv file, that can be automatically converted into a Google Spreadsheet during the upload process.
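(The Documents List API has since been retired; the same upload-and-convert idea with the current Drive v3 API and its Python client is sketched below. Filenames are placeholders and default application credentials are assumed.)

```python
# Sketch of the same idea with the current Drive v3 API, which replaced the
# Documents List API. Filenames are placeholders; default credentials assumed.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

drive = build("drive", "v3")  # uses application default credentials

file_metadata = {
    "name": "SAP custom table report",
    # Requesting this target MIME type asks Drive to convert the uploaded
    # CSV into a Google Sheet during the upload.
    "mimeType": "application/vnd.google-apps.spreadsheet",
}
media = MediaFileUpload("sap_report.csv", mimetype="text/csv")

created = drive.files().create(
    body=file_metadata, media_body=media, fields="id"
).execute()
print("Uploaded as", created["id"])
```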
You could also go with a no-programming required solution and install the Google Drive app on a machine with access to the files for automatic sync up to Google Drive:
http://support.google.com/drive/bin/answer.py?hl=en&answer=2374989
I suggest you take a look at the SAP Developer Network (SDN) / SAP Community Network (SCN), where there is a project called ABAP2GAPPS that has done this.
Note the ABAP2GAPPS example is a bit difficult to figure out (but you can learn a lot from it), and it uses the OAuth2 "authorization code" flow, which requires end-user consent in a browser pop-up... so if you want to push up a file from ABAP automatically from a background job without end-user interaction, then ABAP2GAPPS is not the full answer (but again, it is a great example and worth studying).
We recently were able to achieve an interface from SAP ABAP to the Google Fusion Tables API using OAuth2, with only about 100 lines of ABAP, and the techniques we employed could be used with any of the Google APIs. Here's a link to the video:
Link to YouTube video interface ABAP to Google API
hope you find this helpful

Bloomberg Open API

Bloomberg Open API announced recently - is it just the Bloomberg SDK which had been (limitedly) exposed to public for quite a while?
My understanding is that Bloomberg SDK is possible to use only on the machine with a Bloomberg Terminal installed, but the recently announced Open API (which is syntactically the same) will be possible to use from any machine.
Is that correct? Are there any restrictions on the new API (say, delayed responses etc.)? I just cannot believe they're giving away for free something that used to cost money - any clarifications are welcome!
EDIT: The above was probably not clear, so to rephrase:
I wonder if the newly announced Open API is syntactically the same as the Bloomberg SDK API (or whatever they call it), which has been available for years already
assuming there are restrictions on using the Open API on any machine (compared to using the SDK from a machine with a Bloomberg Terminal installed), I wonder if those restrictions are specified in detail in some official Bloomberg doc.
I can myself guess on both questions, but I thought I'd rather ask :)
Since the data is not free, you can use this Bloomberg API Emulator (disclaimer: it's my project) to learn how to send requests and make subscriptions. This emulator looks and acts just like the real Bloomberg API, although it doesn't return real data. In my time developing applications that use the Bloomberg API, I rarely care about the actual data that I'm handling; I care about how to retrieve data.
If you want to learn how to use the Bloomberg API give it a try. If you want to test out your code without an account, use this. A Bloomberg account costs about $2,000 a month, so you can save a lot with this project.
The emulator now supports Java and C++ in addition to C#.
C#, C++, and Java:
Intraday Tick Requests
Intraday Bar Requests
Reference Data Requests
Historical Data Requests
Market Data Subscriptions
Edit: Updated Project link, moved to github
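For a feel of what one of those request types looks like in code, here is a sketch of a Reference Data Request using the official blpapi Python bindings (the emulator itself targets C#, C++ and Java); the ticker and field are examples, and either a Terminal or the emulator is assumed to be listening on localhost:8194.

```python
# Sketch of a Reference Data Request with the official blpapi Python bindings.
# Ticker and field are examples; a Terminal or the emulator must be listening
# on localhost:8194 for this to connect.
import blpapi

options = blpapi.SessionOptions()
options.setServerHost("localhost")
options.setServerPort(8194)

session = blpapi.Session(options)
session.start()
session.openService("//blp/refdata")
service = session.getService("//blp/refdata")

request = service.createRequest("ReferenceDataRequest")
request.getElement("securities").appendValue("IBM US Equity")
request.getElement("fields").appendValue("PX_LAST")
session.sendRequest(request)

# Drain events until the final RESPONSE message arrives.
while True:
    event = session.nextEvent(500)   # timeout in milliseconds
    for msg in event:
        print(msg)
    if event.eventType() == blpapi.Event.RESPONSE:
        break

session.stop()
```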
The APIs will provide full access to LIVE data, and developers can thus provide applications and develop against the API without paying licensing fees. Consumers will pay for any data received from the apps provided by third-party developers, and so Bloomberg will grow their audience and revenue in that way.
NOTE: Bloomberg is offering this programming interface (BLPAPI) under a free-use license. This license does not include nor provide access to any Bloomberg data or content.
Source: http://www.openbloomberg.com/open-api/
This API has been available for a long time and gives access to market data (including live data) if you are running a Bloomberg Terminal or have access to a Bloomberg Server, which is chargeable.
The only difference is that the API (not its code) has been open-sourced, so it can now be used as a dependency in an open-source project, for example, without any copyright issues, which was not the case before.
I don't think so. The APIs will provide access to delayed quotes; there is no way that real-time data or tick data will be provided for free.