How to get data from a specific ID from The Guardian API?

I want to get data from a specific id from The Guardian API.
The documentation states that you just have to fetch this:
https://content.guardianapis.com/business/2014/feb/18/uk-inflation-falls-below-bank-england-target
You also have to include your API key, but when I fetch it I only get the status and some basic information back, not the actual article data.
[response screenshot]
I even added
show-fields=body
and
show-section=true&show-blocks=all
But nothing seems to work. Any help would be appreciated.
Documentation: https://open-platform.theguardian.com/documentation/item
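For reference, the request I'm making looks roughly like this (the API key is a placeholder, and I'm just logging whatever comes back so I can inspect it):

// Rough sketch of the single-item request described above (TypeScript, Node 18+ fetch).
// GUARDIAN_API_KEY is a placeholder; substitute a real key.
const GUARDIAN_API_KEY = "your-api-key";

async function fetchGuardianItem(): Promise<void> {
  const url =
    "https://content.guardianapis.com/business/2014/feb/18/uk-inflation-falls-below-bank-england-target" +
    `?api-key=${GUARDIAN_API_KEY}&show-fields=body`;
  const res = await fetch(url);
  const json = await res.json();
  // Log the full parsed response to see what actually comes back.
  console.log(JSON.stringify(json, null, 2));
}

fetchGuardianItem();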

Related

Vimeo API get multiple videos in one GET request

I'm in a situation where I need to make one GET request to Vimeo and get back info for multiple specific videos. Here is what I have for the query string currently:
https://api.vimeo.com/users/XXXXXXXX/videos?fields=uri,duration,pictures.sizes.link,download&containing_uri=/videos/ID1,/videos/ID2&per_page=2
Unfortunately, this only returns the information for ID2 and the video ID before it in its channel, instead of both IDs specified. I've also tried appending multiple containing_uri fields, to no avail. Is there any way to make this happen? I'm using axios in React Native, if that helps.
Instead of "containing_uri", use "uris" as documented here:
https://developer.vimeo.com/api/common-formats#batch-requests
https://developer.vimeo.com/api/reference/videos#GET/videos
The "containing_uri" parameter will only return the page of the specified uri. The "uris" parameter will return the specified videos/objects. Your request should look like this:
https://api.vimeo.com/users/XXXXXXXX/videos?fields=uri,duration,pictures.sizes.link,download&uris=/videos/ID1,/videos/ID2&per_page=2
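Since you mentioned axios, here is a rough sketch of that request in TypeScript (the user ID, video IDs and access token are placeholders):

import axios from "axios";

// Placeholders: substitute your own user ID, video IDs and access token.
const USER_ID = "XXXXXXXX";
const ACCESS_TOKEN = "your-vimeo-access-token";

async function fetchVideos(): Promise<void> {
  const res = await axios.get(`https://api.vimeo.com/users/${USER_ID}/videos`, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
    params: {
      fields: "uri,duration,pictures.sizes.link,download",
      // "uris" (not "containing_uri") selects exactly the listed videos.
      uris: "/videos/ID1,/videos/ID2",
      per_page: 2,
    },
  });
  console.log(res.data.data); // array containing the two requested videos
}

fetchVideos();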
I hope this information helps!

Neto API Limitations

I'm currently attempting to integrate with the Neto Ecommerce API. I've hit all sorts of limitations that I never see on other platforms and the latest is to do with custom fields.
The API I'm using is the GetOrders API, and I'm following the requirements to fetch transaction information; however, custom fields appear to be missing. I'm hoping someone out there has made use of this API to extract custom fields and can advise on how to go about getting custom field information.
Any tips appreciated.
var netoString = '{"Filter":{"OrderID":[""],"OutputSelector":["ID","ShippingOption","DeliveryInstruction","RelatedOrderID","cust1"]}}';
Is there an undocumented naming convention used to fetch custom fields or other pattern I can try to see if I can fetch the data?
I am not certain this will be the same for the API, but when using exports the correct format to access a Custom Sales Order Field is "customer_ref1", and for a Custom Customer Field it is "usercustom1" (see the sketch after the list below for how those names might be plugged into OutputSelector).
Note: for the Custom Customer Fields, the numbers do not always line up one-to-one between miscN in the cpanel and usercustomN. The correct matches are:
misc1=usercustom1
misc2=usercustom4
misc3=usercustom5
misc4=usercustom6
misc5=usercustom7
misc6=usercustom11
misc7=usercustom12
misc8=usercustom13
misc9=usercustom14
misc10=usercustom15
misc11=usercustom16
misc12=usercustom17
misc13=usercustom18
misc14=usercustom19
misc15=usercustom20
misc16=usercustom21
misc17=usercustom22
misc18=usercustom23
misc19=usercustom24
misc20=usercustom25
misc21=usercustom10
misc22=usercustom26
misc23=usercustom27
misc24=usercustom28
misc25=usercustom29
misc26=usercustom30
misc27=usercustom31
misc28=usercustom32
misc29=usercustom33
misc30=usercustom34
misc31=usercustom35
misc32=usercustom36
misc33=usercustom37
misc34=usercustom38
misc35=usercustom39
misc36=usercustom40
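If you want to experiment with those names against the API itself, a hedged sketch of the GetOrders payload might look like this (it is purely an assumption that the export field names carry over to the API; send it with whatever HTTP client your existing integration already uses):

// Same GetOrders filter as in the question, with the export-style field
// names added to OutputSelector as an experiment. Whether "customer_ref1"
// and "usercustom1" are accepted by the API at all is an assumption carried
// over from the export behaviour described above.
const getOrdersPayload = {
  Filter: {
    OrderID: [""],
    OutputSelector: [
      "ID",
      "ShippingOption",
      "DeliveryInstruction",
      "RelatedOrderID",
      "customer_ref1", // candidate name for a Custom Sales Order Field
      "usercustom1",   // candidate name for a Custom Customer Field (misc1 in the cpanel)
    ],
  },
};

console.log(JSON.stringify(getOrdersPayload));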

How to get the maximum number of tweets for a user

I have this code:
$url = "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=" . $twitteruser . "&count=500";
But it only gives me 200 records. I found in the Twitter documentation that it can give up to 3,200 tweets. Am I doing something wrong? What should I do to get that many tweets?
Since there is no page system in Twitter's API, to go through timelines you must use the "max_id" parameter.
Here is a helpful link that explains how to work with timelines, with nice illustrations: https://dev.twitter.com/rest/public/timelines.
Edit: here is how you do it.
"To use max_id correctly, an application’s first request to a timeline endpoint should only specify a count."
Make your request to "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=" . $twitteruser . "&count=200" (the API caps count at 200, so asking for 500 makes no difference).
Then, when you get your data back, "keep track of the lowest ID received" and use it as the max_id parameter (the same way you pass count) for your next request. It will give you the next 200 posts with a lower ID than the one you specified. Repeat until you reach the end.
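A rough sketch of that loop in TypeScript (the bearer token and auth style are placeholders, and max_id is decremented by one so the last tweet of the previous page is not fetched twice):

// Sketch of max_id paging against statuses/user_timeline (Node 18+ fetch).
// BEARER_TOKEN is a placeholder; count is capped at 200 per request and the
// timeline tops out at roughly 3,200 tweets overall.
const BEARER_TOKEN = "your-bearer-token";

interface Tweet {
  id_str: string;
  text: string;
}

async function fetchAllTweets(screenName: string): Promise<Tweet[]> {
  const all: Tweet[] = [];
  let maxId: string | undefined;

  while (true) {
    const params = new URLSearchParams({ screen_name: screenName, count: "200" });
    if (maxId !== undefined) params.set("max_id", maxId);

    const res = await fetch(
      `https://api.twitter.com/1.1/statuses/user_timeline.json?${params}`,
      { headers: { Authorization: `Bearer ${BEARER_TOKEN}` } },
    );
    const tweets: Tweet[] = await res.json();
    if (tweets.length === 0) break;

    all.push(...tweets);
    // "Keep track of the lowest ID received" and ask for everything below it.
    const lowestId = tweets[tweets.length - 1].id_str;
    maxId = (BigInt(lowestId) - 1n).toString();
  }
  return all;
}

fetchAllTweets("someuser").then((tweets) => console.log(tweets.length));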

Create direct url to LinkedIn company update

I'm implementing a company newsfeed on a website and ran into the following problem: the LinkedIn API doesn't provide a direct URL to a company update. Looking at the LinkedIn site, there are direct URLs, and they look like this, for example:
https://www.linkedin.com/company/1441/comments?topic=5849556347070205952&type=U&scope=1441&stype=C&a=5uHW&goback=%2Ebzo_*1_*1_*1_*1_*1_*1_*1_1441
Trying things out, it seems that the parameters topic, type, scope, stype and a are mandatory for the URL to work (goback is the only one that isn't).
Using the LinkedIn API's company updates call, I'm able to build the direct URL, except for the a parameter. Its value is always 4 (to me inexplicable) characters long.
Has anyone ever successfully built a direct URL to a company update, or can someone explain the a parameter or how to generate its value?
Updated to new format
You can link directly to any update (company or user) using the following url:
https://www.linkedin.com/feed/update/urn:li:activity:[topic_id]
You can get [topic_id] from the last part of the updateKey in the API response from LinkedIn. When updateKey = UPDATE-c7352-6410848097894756353, your topic_id = 6410848097894756353.
In your example that would become https://www.linkedin.com/feed/update/urn:li:activity:5849556347070205952, which would link directly to the specific update (although that post is too old to work with the new link format).
The url used to be
https://www.linkedin.com/nhome/updates/?topic=[topic_id]
Updated thanks to the comment from @sethpollack.
For anyone trying to get the topic id from the API response object (as already commented on the OP's question), the topic id is the value after the last hyphen of the updateKey property, which can be used with @Daan's answer:
"updateKey": "UPDATE-cXXXX-YYYYYYYYYYYYYYYYYY"
Direct URL:
https://www.linkedin.com/nhome/updates?topic=[YYYYYYYYYYYYYYYYYY]
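As a small TypeScript sketch, the extraction and URL building might look like this (using the new-format URL from the answer above; the updateKey shown is its placeholder example):

// Pull the topic id out of an updateKey and build the direct update URL.
function updateUrlFromKey(updateKey: string): string {
  // Topic id = everything after the last hyphen of the updateKey.
  const topicId = updateKey.substring(updateKey.lastIndexOf("-") + 1);
  return `https://www.linkedin.com/feed/update/urn:li:activity:${topicId}`;
}

console.log(updateUrlFromKey("UPDATE-c7352-6410848097894756353"));
// -> https://www.linkedin.com/feed/update/urn:li:activity:6410848097894756353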
Using the URL format above, you can get the topic_id by opening the update in its own window/tab, looking at the page source in your browser, and searching for the string :activity:; the long number right after that string is the infamous topic_id.

Is it correct that the Instagram Location/Search endpoint does NOT support pagination?

I have read several postings about the fact that the Instagram API returns only 20 results at a time. In many cases, people have suggested that all you need to do is to use the next URL which is returned in the pagination information. I would be fine with that, but the JSON returned by
https://api.instagram.com/v1/locations/search?
does not appear to have any pagination information. I have seen a posting that says that /media/search does not support pagination. I just wanted to confirm that the same is true for /locations/search. Can anyone confirm?
And if this is correct, does anyone have any thoughts about how you can get a list of all Instagram events in a specific area, rather than just the first 20 or so?
The /media/search endpoint does not have pagination, but you can get the next set of 20 pics by using the max_timestamp URL param: take the created_time of the last photo in the API response and apply it as max_timestamp for the next call, and it will return the next 20. That's how I implemented it here: http://www.gramfeed.com/instagram/map
For the locations/search endpoint, the timestamp does not apply; you can change the distance URL param to get more results: http://www.gramfeed.com/instagram/places
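A rough sketch of that max_timestamp loop for /media/search in TypeScript (the access token, coordinates and page count are placeholders):

// Sketch of max_timestamp paging for /media/search (Node 18+ fetch).
// ACCESS_TOKEN, LAT and LNG are placeholders.
const ACCESS_TOKEN = "your-instagram-access-token";
const LAT = "48.858844";
const LNG = "2.294351";

interface Media {
  id: string;
  created_time: string; // unix timestamp as a string
}

async function fetchMediaPages(pages: number): Promise<Media[]> {
  const all: Media[] = [];
  let maxTimestamp: string | undefined;

  for (let i = 0; i < pages; i++) {
    const params = new URLSearchParams({ lat: LAT, lng: LNG, access_token: ACCESS_TOKEN });
    if (maxTimestamp !== undefined) params.set("max_timestamp", maxTimestamp);

    const res = await fetch(`https://api.instagram.com/v1/media/search?${params}`);
    const json = await res.json();
    const media: Media[] = json.data ?? [];
    if (media.length === 0) break;

    all.push(...media);
    // created_time of the last photo becomes max_timestamp for the next call.
    maxTimestamp = media[media.length - 1].created_time;
  }
  return all;
}

fetchMediaPages(3).then((media) => console.log(media.length));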