Semantic/Contextual Categorization API [closed] - semantic-web

I'm working on a project that requires categorizing the content of a given URL. Basically, I want to pass a URL to the API and have it return a category (or list of categories) based on the page's content. I think Textwise.com may offer this service. Are there other similar services out there?

OpenCalais from Reuters is another such system.
You can also roll your own using open-source systems like GATE, which lets you avoid things like API limits but may require you to train the system appropriately.

You can use the SimilarWeb Categorization API.
The API returns a given domain's category, determined by content analysis and machine learning, in XML or JSON.
An implementation of a similar usage can be found here.
For example, as a Google Apps Script function:
function api_category(URL, KEY) {
  // Build the SimilarWeb v2 category endpoint for the given domain
  var apiurl = "http://api.similarweb.com/Site/"
    + URL
    + "/v2/category?Format=JSON&UserKey="
    + KEY;
  var response = UrlFetchApp.fetch(apiurl);
  Utilities.sleep(2000); // brief pause to stay within rate limits
  var data = JSON.parse(response.getContentText());
  return data.Category.replace(/_/g, " "); // convert underscores to spaces
}
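For instance, a hypothetical call from elsewhere in the same Apps Script project (the domain and UserKey below are placeholders):
// Placeholder domain and UserKey -- logs the domain's category,
// with underscores already replaced by spaces.
var category = api_category("stackoverflow.com", "YOUR_SIMILARWEB_USER_KEY");
Logger.log(category);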

CommTouch is also an option.

zvelo.com is also an option if it's for a commercial offering. zvelo has been a leading provider of content and context categorization services for years and maintains an extensive URL database.

Related

Is there a Wattpad public API to retrieve the story 'Reads' and 'Votes' counts? [closed]

I've looked at the Wattpad API documentation at http://developer.wattpad.com/docs/api but there doesn't seem to be a public API to retrieve a story's 'Reads' and 'Votes' counters.
Currently I'm using a simple bash script with curl and awk to retrieve the counters, but this seems like a waste of resources because the typical page size appears to be about 60 KB, whereas a JSON response would be much smaller.
There isn't currently a direct API for doing what you're looking for; however, you can get the data indirectly using the story search API.
For example, if you hit the following endpoint (with your auth key, of course):
https://api.wattpad.com:443/v4/stories?query=your%20story%20title&limit=1
you'll get back a JSON payload. The documentation is a little out of date as to what fields you will receive, but you should see voteCount and readCount properties as part of the response.
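As a rough sketch in Node.js (the endpoint, query string, and the voteCount/readCount fields come from the answer above; the Authorization header name and the exact response shape are assumptions, so adjust them to whatever your key and the live API actually expect):
// Rough sketch: look up a story by title and read its counters.
// The endpoint and the voteCount/readCount fields are from the answer above;
// the Authorization header and the response shape (stories[0]) are assumptions.
const https = require("https");

function fetchStoryCounts(title, authKey, callback) {
  const url = "https://api.wattpad.com/v4/stories?query="
    + encodeURIComponent(title) + "&limit=1";
  https.get(url, { headers: { Authorization: authKey } }, (res) => {
    let body = "";
    res.on("data", (chunk) => (body += chunk));
    res.on("end", () => {
      const story = JSON.parse(body).stories[0];
      callback({ reads: story.readCount, votes: story.voteCount });
    });
  }).on("error", (err) => callback(null, err));
}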
We're working on getting the documentation updated, as well as possibly providing more API capabilities, so please stay tuned. Also, please let us know what you end up doing with the API; we're very curious about what people are looking to build with it.

Documenting a Spring HATEOAS API [closed]

Are there any plugins out there (similar to Swagger) which provide the ability to document HATEOAS APIs?
The Swagger interface is quite good, but it doesn't have support for level 3 (hypermedia) REST APIs.
I use spring-restdocs in combination with the HAL Browser.
You don't strictly need HAL for REST Docs, although it is recommended.
Spring REST Docs generates code samples plus link and field descriptors in AsciiDoc format; you can then link to the generated AsciiDoc from inside the HAL Browser.
To see the result in action (although that example is hard-coded), check out foxycart and click the little doc links next to the rels.
After further investigation I discovered the HAL Browser (https://github.com/mikekelly/hal-browser), which is quite good, although your API must return the HAL content type for it to work.
You don't need to configure anything on the server for this tool; just open it in a browser and point it at your API.

Radar Images on Google Maps API [closed]

I have been struggling to find a solution for this after numerous searches. I am looking for a web service or API that will allow me to fetch smoothed NEXRAD (weather radar) data for the United States and display it on a Google Map. I have only been able to find non-commercial or raw (not smoothed) data. The project I am working on requires a radar image comparable to what you would see through WeatherBug.
Has anyone had any experience with data like this, or know of any APIs that are available?
I cannot even find a program that can process the raw data from the National Weather Service into smoothed radar images that I could slice up and use in a Google Maps overlay...
Give this a shot. It is what I am using and it works great.
Here is the link to the page the data comes from; they offer several other products as well:
http://mesonet.agron.iastate.edu/ogc/
// Tile layer that pulls NEXRAD base reflectivity tiles from the Iowa
// Environmental Mesonet tile cache; the timestamp query string prevents caching.
var tileNEX = new google.maps.ImageMapType({
  getTileUrl: function(tile, zoom) {
    return "http://mesonet.agron.iastate.edu/cache/tile.py/1.0.0/nexrad-n0q-900913/"
      + zoom + "/" + tile.x + "/" + tile.y + ".png?" + (new Date()).getTime();
  },
  tileSize: new google.maps.Size(256, 256),
  opacity: 0.50,
  name: 'NEXRAD',
  isPng: true
});
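Once you have a map instance, the layer can be attached as an overlay (assuming `map` is your existing google.maps.Map object):
// Draw the radar tiles on top of the base map.
map.overlayMapTypes.push(tileNEX);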
In case you're still looking... http://www.eldoradocountyweather.com/scripts/weather-scripts.php has some good stuff.

Any API to search a list of domain names? [closed]

Assume there is a list of domain names, but you don't know whether they are already taken by someone else. Assume the list is so big that you don't want to manually type in each name and check whether it is available. How would you get around this? Is there a public API we can use in a program to check availability based on, say, the return value of a method call? If there is, I'd appreciate code snippets.
What you need to do is a WHOIS lookup on the domain. This will tell you whether it's registered and with which registrar. There are many ways to issue the WHOIS query: via the command line, via PHP, or via a web API.
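As a rough sketch of the raw-protocol route in Node.js (whois.verisign-grs.com and the "No match for" marker are specific to .com/.net, so treat them as assumptions to adapt for other TLDs):
// Rough sketch: raw WHOIS lookup over TCP port 43.
// whois.verisign-grs.com serves .com/.net; other TLDs use other servers,
// and the "No match for" availability marker is specific to this server.
const net = require("net");

function isAvailable(domain, callback) {
  let response = "";
  const socket = net.connect(43, "whois.verisign-grs.com", () => {
    socket.write(domain + "\r\n"); // WHOIS protocol: the query followed by CRLF
  });
  socket.on("data", (chunk) => (response += chunk));
  socket.on("end", () => callback(/No match for/i.test(response)));
  socket.on("error", (err) => callback(null, err));
}

// Check one name from the list.
isAvailable("example-name-to-check.com", (available) => {
  console.log(available ? "available" : "taken");
});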

Google API to check number of indexed pages? [closed]

Is there a Google API similar to Yahoo and Bing's API's to check for the number of indexed pages on a specified domain?
For example, for Yahoo if I type in the following URL:
http://search.yahooapis.com/SiteExplorerService/V1/pageData?appid=MTSlade&query=http://www.dave-sellers.co.uk&domain_only=1&results=1
Then it will return some XML reporting the number of indexed pages as 'totalResultsAvailable'.
Any ideas?
Thanks
I'm not sure about an API, but you can view the pages Google has indexed by doing a search like so:
site:http://thesitesurl.com
Here is an example. You could apply some logic to the pagination and the number of items per page, etc. (or simply use the "Results 1 - 100 of about 9,100,000" line). You could even choose to display 100 items per page by using this sort of syntax. I'm not sure if this would fit your exact requirements, but it's better than nothing.
With the still-operating (but deprecated) Google Web Search API you can do this:
http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=www.bbc.co.uk
The result is returned as "text/javascript", which you can parse as JSON. The field you are after is estimatedResultCount. There doesn't seem to be an option to return the results as XML, but all you need to do is convert the JSON to XML; I don't know what language you're using, but there are bound to be utilities for this.
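As a minimal Node.js sketch against that endpoint (the responseData.cursor.estimatedResultCount path reflects the API's old JSON shape, and since the API is deprecated it may no longer respond at all):
// Minimal sketch: query the deprecated AJAX Search API for a domain and
// print the estimated result count. The responseData.cursor path is the
// API's old JSON shape; the endpoint may no longer respond.
const http = require("http");

const url = "http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=www.bbc.co.uk";
http.get(url, (res) => {
  let body = "";
  res.on("data", (chunk) => (body += chunk));
  res.on("end", () => {
    const json = JSON.parse(body);
    console.log("Indexed pages (estimate):", json.responseData.cursor.estimatedResultCount);
  });
});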
If you do not want to use a deprecated API, then use the newer Custom Search API, but you'll need to sign up for an API key:
http://code.google.com/apis/customsearch/v1/overview.html
and here are details on how to construct your query:
http://code.google.com/apis/customsearch/v1/using_rest.html
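As a rough sketch against the Custom Search JSON API (YOUR_API_KEY and YOUR_CX are placeholders you receive when signing up, and reading the total from searchInformation.totalResults is my assumption about the response format, so verify it against the documentation linked above):
// Rough sketch: ask the Custom Search JSON API how many results a
// site-restricted query returns. YOUR_API_KEY and YOUR_CX are placeholders;
// searchInformation.totalResults is an assumption about the response shape.
const https = require("https");

const url = "https://www.googleapis.com/customsearch/v1"
  + "?key=YOUR_API_KEY&cx=YOUR_CX"
  + "&q=" + encodeURIComponent("site:dave-sellers.co.uk");
https.get(url, (res) => {
  let body = "";
  res.on("data", (chunk) => (body += chunk));
  res.on("end", () => {
    const json = JSON.parse(body);
    console.log("Approximate indexed pages:", json.searchInformation.totalResults);
  });
});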