Trouble with Google Custom Search API - google-search-api

Need: Search Google via the API and get a JSON result that mimics what I get when I search on the web.
My Custom Search engine is set to search the entire web.
My search returns:
Search Term: 072745546181 (which is a UPC label for some Chicken Breasts)
https://www.googleapis.com/customsearch/v1?key=AIzaSyBaPxycT3gj82T5qm66XGgIvtSEP31LISo&cx=015261035819156121642:qj7jmhlymjw&q=072745546181
Web search returns (see results)
Search Term: 072745546181 (which is a UPC label for some Chicken Breasts)
Example 1: https://www.google.de/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=072745546181
Notice the q= at the end is the same: q=072745546181
There must be a simple answer, what am I doing wrong here?
Conversely, searching with terms like "Donald Trump President"
https://www.googleapis.com/customsearch/v1?key=AIzaSyBaPxycT3gj82T5qm66XGgIvtSEP31LISo&cx=015261035819156121642:qj7jmhlymjw&q=donald trump president
returns an okay result I can do something with. No problem here, but why does it fail when searching UPCs?
What should I do?
Update 1.26.17 - Added a 50-point bounty; I can add more. What is the normal rate? I need some help!

This is a fascinating question. I just ran a series of tests that confirm that keywords are treated oddly if they contain more than 8 numerical digits (even if separated by whitespace or hyphens). They are not simply ignored - because this SO page is found - but most websites are not returned. My best guess for this behaviour is that it is a deliberate filter put in by Google to restrict numerical searches to "trusted" websites in order to prevent phone number lookups. It might even be a more aggressive move to limit UPC, government record and patent lookups so automated tools can't compete with current or planned Google services that do the same.
I ran all sorts of tests, including advanced operators like inurl%3A072745546181 and allintitle%3A0727+4554+6181, and targeting sites that appear in the regular search, like url%3Abuycott.com+072745546181, and the behaviour is consistent. It is so consistent that it has to be deliberate.
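If you want to reproduce the test yourself, here is a minimal Python sketch: compare the reported result counts for a long numeric query and an ordinary text query. Key and cx are placeholders (substitute the values from the question), and I'm assuming the standard searchInformation.totalResults field of the JSON response:

import requests

def total_results(query, key="<key>", cx="<cx>"):
    # Query the Custom Search JSON API and return the reported result count.
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": key, "cx": cx, "q": query},
    )
    return resp.json().get("searchInformation", {}).get("totalResults")

print(total_results("072745546181"))            # long all-digit UPC query
print(total_results("donald trump president"))  # ordinary text query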
I'd say with 95% certainty you can't do what you want with Custom Search and it's highly unlikely Google will provide you a workaround.
I would suggest trying another search API provider, maybe the Bing Web Search API, Faroo, or one of these product search APIs.

This is an old one, but still relevant. You have to create a custom search engine to look for UPCs here: https://cse.google.com/all
Once you do that, you need to add sites to search (e.g. https://www.barcodespider.com, https://www.upcitemdb.com)
From there, your cURL request looks like this:
curl -X GET \
  'https://www.googleapis.com/customsearch/v1?key={{googleApiKey}}&cx={{googleUpcSearchEngineCode}}&q=034449787178' \
  -H 'Accept: */*' \
  -H 'Cache-Control: no-cache'
The request will filter the results by searching for the UPC within the sites specified.
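If you would rather consume the response in code, here is a rough Python sketch of the same call. The key and cx values are placeholders, and items, title and link are the standard fields of the Custom Search JSON response:

import requests

params = {
    "key": "{{googleApiKey}}",
    "cx": "{{googleUpcSearchEngineCode}}",
    "q": "034449787178",
}
resp = requests.get("https://www.googleapis.com/customsearch/v1", params=params)
resp.raise_for_status()
# Print the title and URL of each hit from the UPC sites added to the engine.
for item in resp.json().get("items", []):
    print(item["title"], item["link"])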

Related

How to search and filter tweets using the Twitter API?

I am exploring how to search and filter tweets using the Twitter API version 2, which as of this writing has been newly released. The documentation for this particular endpoint is available here.
I tried successfully searching for the following query:
https://api.twitter.com/2/tweets/search/recent?query=puppy
As I needed to be more specific, I checked out the v1.1 docs for rules and filtering and tried to look for tweets containing puppy images (filter:image) and no retweets (-filter:retweets), but I could not get the query working in v2 (preferably) or v1.1 with Postman, even though I tried percent-encoding the special characters.
It is also not clear to me from the documentation (though it is mentioned in the docs) how to specify a certain language such as English (lang=english) or a certain distance in the query ("37.781157,-122.398720,1mi").
Does somebody know how to pass it into the query?
For the language filter you can refer to this post:
https://community.postman.com/t/define-the-language-of-tweets/20643
I am not sure about the image filter, but recently a developer from Postman, sean.keegan, has been talking more about Twitter APIs in Postman. Please do check out https://community.postman.com/search?q=twitter and https://www.youtube.com/watch?v=ySbLo13Fk-c
I hope these will be helpful for you!
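As a starting point, here is a hedged Python sketch of a recent-search call. The v2 query operators used here (has:images, -is:retweet, lang:en) are my best understanding of the v2 equivalents of the v1.1 filters mentioned in the question, so double-check them against the v2 operator documentation; geo operators such as point_radius may require a higher access tier. BEARER_TOKEN is a placeholder, and letting the HTTP library build the query string takes care of the percent-encoding:

import requests

BEARER_TOKEN = "..."  # your app's bearer token

# Assumed v2 operators: tweets with images, excluding retweets, in English.
query = "puppy has:images -is:retweet lang:en"

resp = requests.get(
    "https://api.twitter.com/2/tweets/search/recent",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={"query": query},  # requests percent-encodes the operators for you
)
print(resp.json())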

How to write filter {key} for a 'get items as xlsx' request via Podio API

I'll get this out of the way first: I'm an amateur programmer (at best)... I have some knowledge of how APIs work, but little to no experience manipulating the Podio API directly (i.e. I use Zapier/GlobiFlow a lot and don't write any PHP/Ruby). I'm sure other people can figure this out via the API documentation, but I can't. So I'm really hoping someone can help clarify and give some more detailed instruction.
My Overall objective:
I frequently export Podio data as xlsx from the Podio front end. My team and I use these exports for regular data analysis tasks in Excel. I would like to make this process easier by automating the step of getting an updated Podio export into Excel. My plan is to do this via Excel VBA. I understand from other searching that it is possible to send an HTTP request using VBA, so I want to make sure I understand what I need to send to the Podio API to get what I need. How to write the HTTP request in Excel VBA is intentionally outside the scope of this question (though I'd accept any help on this!).
What I've tried so far:
I know that 'get items as xlsx' is part of the podio API: https://developers.podio.com/doc/items/get-items-as-xlsx-63233
However, I cannot seem to get this to work in the sandbox environment on that page so that I can figure out a valid request URL. I get this message: 'Invalid filtering key'... because I have no idea how to fill in that field. The information on that page is not clear on this, nor is it evident on the referenced 'views' page. There are no examples to follow!
I don't even want to do any filtering. I just want to get ALL items in the app. Or I can give it a pre-existing view_id, but that doesn't seem to work either without a {key}.
I realise this is probably dead simple. Please help a noob? :)
Unfortunately, the interactive API sandbox does not behave appropriately for this particular endpoint. For filtering, this endpoint expects query string parameters where the field-value pairs consist of integer field IDs and the allowed values for each field. Filtering by fields is entirely optional. The sandbox page simply isn't built for this kind of operation with dynamic query string field names; the {key} field you see on that page is just a placeholder for whichever field IDs you would use for filtering.
If you want to experiment with this endpoint, I would encourage you to try another dedicated HTTP client first. I was able to get this simple example working with the command-line program wget:
wget --header="Authorization:OAuth2 $MY_SECRET_TOKEN" \
--content-disposition \
"https://api.podio.com/item/app/16476850/xlsx/"
In this case, wget downloaded an Excel file containing all the items in my app without any filtering applied. The additional --content-disposition argument tells wget to save the output as a file with a name using the information in the server's Content-Disposition response header.
With a filter applied:
wget --header="Authorization:OAuth2 $MY_SECRET_TOKEN" \
--content-disposition \
"https://api.podio.com/item/app/16476850/xlsx/?130654431=galaxy"
In this case, the downloaded file contains only the items where field ID 130654431 (which is a category field) contains the value galaxy.
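If wget isn't convenient (for example while prototyping the request before porting it to VBA), the same call translates directly to any HTTP client. Here is a rough Python sketch using the same example app ID, field ID, and token placeholder as above:

import requests

MY_SECRET_TOKEN = "..."  # your OAuth2 access token

resp = requests.get(
    "https://api.podio.com/item/app/16476850/xlsx/",
    params={"130654431": "galaxy"},  # optional field_id=value filter; omit to get all items
    headers={"Authorization": f"OAuth2 {MY_SECRET_TOKEN}"},
)
resp.raise_for_status()
# Save the returned spreadsheet to disk.
with open("items.xlsx", "wb") as f:
    f.write(resp.content)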

Filter Google Custom Search Engine results by site

I have been having issues using the Custom Search Engine API while trying to use its specific-site search functionality.
I have created (using the web console) a CSE and defined it to search in 2 sites:
*.ebay.com
*.amazon.com
First, when searching for the term 'pcrush' (with curl), I receive results from the amazon.com and ebay.com domains, as expected. This is OK:
curl -X GET 'https://www.googleapis.com/customsearch/v1?key=<my-key>&cx=<my-cx>&q=pcrush'
When I try making the same search but limiting the results to a specific site, I still get results from both eBay and Amazon.
Here, I want to receive only the ebay.com results:
adding &as_sitesearch=ebay.com
adding &as_sitesearch=ebay.com&as_dt=i
adding &as_sitesearch=amazon.com&as_dt=e
I also tried all of the above with the domains written as *.ebay.com and *.amazon.com.
All of them still returned results from both ebay.com and amazon.com.
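For reference, the attempt boils down to this small Python sketch (key and cx are placeholders); printing displayLink makes it easy to confirm which domains actually come back:

import requests

params = {
    "key": "<my-key>",
    "cx": "<my-cx>",
    "q": "pcrush",
    "as_sitesearch": "ebay.com",  # also tried with as_dt=i / as_dt=e as described above
}
resp = requests.get("https://www.googleapis.com/customsearch/v1", params=params)
# List the domain of every result to see whether the restriction took effect.
print([item["displayLink"] for item in resp.json().get("items", [])])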
I should mention we are following the API as described here.
What am I doing wrong?
Thanks in advance.

How to batch rename Tumblr tags?

I have tagged over 700 blog posts with tags containing hyphens, and these tags suddenly stopped working in 2011 because Tumblr decided (without any notice) to forbid hyphens in tags (I guess hyphens are blocked now because spaces in tags, which are allowed, get changed to hyphens). Unfortunately, Tumblr is not willing to globally rename all tags containing hyphens (although these tags are of no use anymore → 404).
Now I want to rename my tags myself.
I tried to do it with the "Mass Post Editor" (tumblr.com/mega-editor), but it's not possible to select posts by tag. I'd have to manually select post after post and check whether a certain tag was used, and if so, delete it and add a new one instead. This would be a huge job (700 tagged posts, but more than 1000 in total).
So I thought that the Tumblr API might help me. I'm no programmer, but I'd be willing to dig into it, if I could get some help here as a starting point.
I think I need the following process:
select all posts that are tagged with x (= a tag containing hyphens)
tag all these posts with y (= a tag without hyphens)
delete the tag x on all these posts
I'd start this process for every affected tag manually.
I see that the method (or whatever you call it) /post supports the request parameter tag:
Limits the response to posts with the specified tag
(I guess I can only hope that this works for tags containing hyphens, too.)
After that I'd need a way to add and remove tags from that result set. /post/edit doesn't say anything about tags. Did I miss something? Isn't it possible to add/remove tags with the API?
Do you have an idea how I could "easily" rename my tags?
Is it possible with the API? Could you give me a starting point, tip etc. how I could manage to do it?
I don't know if this might be helpful, but I noticed that the search function is still able to find posts "tagged" with tags that contain hyphens.
Example: let's say I have the tag foo-bar. It is linked with /tagged/foo-bar (→ 404). I can find the posts with /search/foo-bar (but this is of course not ideal because it might also find posts that contain (in the body text) words similar/equal to the tag name).
I tried to encode the hyphen (/tagged/foo%2Dbar), but no luck.
Just for the record, because this is a popular Google search: I've done it! You can use it at http://dev.goose.im/tags/.
I used a combo of PHP and jQuery, basing my jQuery off of a previous Tumblr API script I wrote a year or two ago, and used this Tumblr PHP OAuth script for the authentication. If anyone wants me to put up the source code, I'd be happy to.
If you aren't a programmer, how much is your time worth to you? As they say, time is money. Not only do you have to figure out how to use the API, you also have to choose a language and learn to write in it. That's no small task. You could hire a freelancer for $50 for an hour's worth of work.
To answer your question, yes, it is possible to do this with the API. The documentation mentions "These parameters are used for /post, /post/edit and /post/reblog methods." and tags is listed as a string of comma-separated words.
What you want to do is get a listing of every single blog post using the /posts method. You'll want to look at the "Request" section to figure out the criteria to pass to this URL. You want it to be as general as possible to get a complete listing of all your posts.
After you get a listing of posts you'll want to iterate over it and modify the tags parameter provided in the response for each post. You'll want to use the id parameter along with /post/edit, which again takes tags as a string.
The simplest language you can use for this task is PHP. You'll want to look at the curl extension to make your requests. You'll want to read up on arrays as you'll be using them a lot. You'll also need to look at explode, implode, str_replace (for the dashes), and foreach for iterating over the result.
When you do this I would highly recommend you use break at the end of your foreach loop so it only affects one post at first. Testing it first will be important, as you don't want to accidentally erase your tags/posts. print and var_dump are good ways to help you debug the code. xdebug is a nice extension that allows you to step through the code line by line as it runs. Netbeans is an IDE that has good xdebug support.
There's also a nice page here to get you started with PHP. You'll need to install PHP on your machine. You don't need to install a web server - for this, the PHP-CLI (command line) SAPI is good enough.
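To make the flow concrete, here is a minimal sketch of the same loop. It is written in Python rather than PHP purely for brevity; the logic maps one-to-one onto the curl/explode/implode/str_replace approach described above. The endpoints (/posts and /post/edit, with tags as a comma-separated string) are the ones referenced in the question and answer; the OAuth credentials, blog name, and the exact response layout (posts nested under response["response"]["posts"], tags returned as a list) are my assumptions, so verify them against the API docs:

import requests
from requests_oauthlib import OAuth1

BLOG = "example.tumblr.com"     # your blog identifier
OLD, NEW = "foo-bar", "foobar"  # tag to rename and its replacement
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "OAUTH_TOKEN", "OAUTH_SECRET")

offset = 0
while True:
    # Page through every post in the blog, 20 at a time.
    r = requests.get(f"https://api.tumblr.com/v2/blog/{BLOG}/posts",
                     params={"limit": 20, "offset": offset}, auth=auth)
    posts = r.json()["response"]["posts"]
    if not posts:
        break
    for post in posts:
        if OLD in post.get("tags", []):
            # Rebuild the tag list with the hyphenated tag replaced.
            tags = [NEW if t == OLD else t for t in post["tags"]]
            requests.post(f"https://api.tumblr.com/v2/blog/{BLOG}/post/edit",
                          data={"id": post["id"], "tags": ",".join(tags)}, auth=auth)
            # As advised above, consider stopping after the first edit while testing,
            # then remove that early exit once you've verified the result.
    offset += 20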

How to make a Google Maps-style address lookup

You've probably all seen the maps.google.com.au address lookup. Start typing into the text box and your address auto-completes in the list before you've finished. It also bolds the sections of each suggestion that match what you are typing.
I've used both the JavaScript API for Maps and the HTTP API. The geocoding seems to do something decent with the matches, but I'm not entirely sure how one would go about getting this to work.
Does anyone have a tutorial or a quick five-step process they would recommend I follow to get this feature going?
The feature you are looking for is "find as you type" or "suggest as you type" or AJAX live search.
Getting this functionality via the Maps API is possible, just like any other find-as-you-type solution: for each key entered into your search box you send a request to the server and see what matches the text entered so far. The problem is that you can only send so many requests to Google before you get a 620 (too many requests) error. A find-as-you-type mechanism is usually easier when you have your own small DB you can query: it is faster and you won't run into request limits.
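To illustrate the "own small DB" idea, here is a tiny Python sketch of the server-side part: match the typed prefix against a local list of addresses and wrap the matching section in <b> tags for the bolding effect. The address list and markup are just placeholders for whatever data and templating you actually use:

ADDRESSES = [
    "1 Example Street, Sydney NSW",
    "10 Example Road, Melbourne VIC",
    "2 Sample Avenue, Brisbane QLD",
]

def suggest(prefix, addresses=ADDRESSES, limit=5):
    # Return addresses starting with the prefix, with the matching part bolded.
    hits = [a for a in addresses if a.lower().startswith(prefix.lower())]
    return [f"<b>{a[:len(prefix)]}</b>{a[len(prefix):]}" for a in hits[:limit]]

print(suggest("1 Ex"))  # ['<b>1 Ex</b>ample Street, Sydney NSW']

On the page itself you would call something like this on each keystroke (ideally debounced) and render the returned suggestions under the text box.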
Some links with tutorials:
Javascript Autocomplete Combobox - find as you type
Suggest as you type
AJAX Live Search