I recently joined a team working on an application that maintains listings with addresses. The user searches, including their zip code, and the application displays the distance to each listing. Currently we use the Google Maps API for this. Reading through questions here on Stack Overflow seems to suggest that this is the best way of doing things:
php/mysql zip code proximity search
Search engine by distance
However, the API documentation seems to expressly forbid this unless we also show a map for each result (and possibly also for each result we filter out, depending on how you read the following statement):
Use of the Distance Matrix API must relate to the display of information on a Google Map; for example, to determine origin-destination pairs that fall within a specific driving time from one another, before requesting and displaying those destinations on a map. Use of the service in an application that doesn't display a Google map is prohibited.
( https://developers.google.com/maps/documentation/distancematrix/ )
What's the best way to accomplish this without running afoul of any API terms?
Have you considered the Geo::PostalCode module (Perl)? It uses the MaxMind database to calculate distances between locations (and there are bindings in other languages).
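The same offline idea is straightforward to sketch in any language: keep a local table of postal-code centroids (MaxMind, GeoNames, and the US Census all publish such data) and compute the great-circle distance with the haversine formula. A minimal Python sketch, where the centroid values below are purely illustrative sample data:

```python
# Minimal sketch of the offline approach: look up each ZIP code's centroid in a
# local table (e.g. loaded from a MaxMind/GeoNames-style CSV) and compute the
# great-circle distance with the haversine formula.
from math import radians, sin, cos, asin, sqrt

# Illustrative sample values only, not a real dataset.
zip_centroids = {
    "10001": (40.7506, -73.9972),
    "94105": (37.7898, -122.3942),
}

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))  # Earth radius ~3959 miles

def zip_distance(zip_a, zip_b):
    (lat1, lon1), (lat2, lon2) = zip_centroids[zip_a], zip_centroids[zip_b]
    return haversine_miles(lat1, lon1, lat2, lon2)

print(round(zip_distance("10001", "94105")))  # ~2565 miles for these sample points
```

This avoids any per-request API terms, at the cost of maintaining the centroid data yourself, and it gives straight-line rather than driving distance.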
Is there a (free) webservice where I can a) give it an address (either full or city, state/province and zip), and b) get the IANA timezone of that address?
Use case: I have a form where a user manages clinics. Each clinic has an address. I want to preselect the IANA timezone of that address, so that I can adjust appointment dates/times associated with that clinic.
I know that the Google Maps API and Bing Maps API can give me the lat/long of an address, and that there are other services that can give me the IANA TZ based on the lat/long. That said, those APIs require subscriptions, keys, and contracts, and it would also require a two-request approach to get a single piece of data.
Address geolocation is a hard problem. It requires deep understanding of address systems and reacting to ever-changing, real-world, messy scenarios.
For example, you could get a database of the Zip codes of the United States and their approximate locations, but it would only be a snapshot. You'd find over time that new Zip codes would be added and missing from your data, and that existing Zip codes had been expanded to include other areas. You'd also find many Zip codes that are "non-locatable", such as those used to send mail to military overseas. You could even find a single Zip code that has addresses in two different time zones.
Take international concerns into account and you'll find all sorts of edge cases. Every country and territory has its own special rules and situations.
It is a problem worth paying a service provider for. Trying to do geolocation in an offline manner might be possible, but it isn't advised.
The second part, figuring out which time zone a location belongs to, is also messy, though it's slightly easier to manage. The Time Zone Boundary Builder (TZBB) is the main open-source project that attempts this, and most libraries referenced here use its data. But even then, it gets updates and relies on borders established by OpenStreetMap data. Some of those borders are in dispute, so a service from Google, Microsoft, or others could give different results because they have map data with different borders. If you care about such things, you might want to test some edge cases against the different providers to see if you're satisfied with the results. You may find that the TZBB data works well, or you might prefer one of the online solutions.
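As an illustration of the TZBB-based route, Python's timezonefinder library bundles the Time Zone Boundary Builder polygons and resolves a lat/long to an IANA zone entirely offline. The coordinates below are just an example, and the address-to-lat/long step still has to come from a geocoding service:

```python
# Sketch of the second step using timezonefinder, one of the offline libraries
# built on Time Zone Boundary Builder data.
from timezonefinder import TimezoneFinder

tf = TimezoneFinder()

# Coordinates roughly in Boston, MA (illustrative values).
tz_name = tf.timezone_at(lat=42.36, lng=-71.06)
print(tz_name)  # expected: "America/New_York"
```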
This is more of a general programming question.
I'm trying to create an app; think of it as a Yelp clone. I have most of it working but I'm missing one important feature: the data of the places around me. For now I'm only focused on food, so I'd like it so that if I search something like "Pizza", it shows me all the pizza joints near me.
I was originally planning to use the Google Places API. However, if you haven't heard, they're changing their pricing, lowering the free tier, and upping the cost by a huge margin.
There's also the problem of saving the data. One workaround I saw a user suggest was to just keep using Google's API but, every time you make a query, store the data in your own DB as well (I only need address, name, latitude, and longitude), so eventually you'd have what you need, in a sense. However, I also want to have something like a simple rating system for each place, like Yelp, but Google (and all the other providers like Mapbox, HERE Maps, etc.) states something along the lines of "info from their API should not be stored or cached for more than 24 hrs", which is very broad and not specific.
So what I was planning to do was call the Google API, grab the three pieces of info I need (address, name, lat/lng), add more fields to store the rating, likes, and whatever else the user will add, then store it all in my database. But that doesn't seem like a solution now.
So does anyone have any ideas or advice? Or know of a service where I can get the details of all the food places? And if possible, can anyone confirm whether storing the name, address, and lat/lng is a violation of their policy? In my eyes it's public data, whereas something like the rating or the pictures that Google provides is clearly Google's property.
For obtaining places you can use OpenStreetMap, e.g. via the Overpass API. Since you can expect significant traffic, you should run your own database(s) instead of using the public API instances.
However, OSM doesn't contain ratings, so you would have to combine this data with some other publicly available rating system.
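For instance, a single Overpass QL query can return the name and lat/long of pizza places around a coordinate. A rough Python sketch against the public endpoint (fine for testing, but as noted above, real traffic should go to your own instance; the coordinate is illustrative):

```python
# Rough sketch of an Overpass query for pizza places within 5 km of a point.
# This hits the public overpass-api.de endpoint; heavier traffic should go to
# your own Overpass/OSM instance.
import requests

query = """
[out:json][timeout:25];
node["amenity"~"restaurant|fast_food"]["cuisine"~"pizza"](around:5000,40.7506,-73.9972);
out body;
"""

resp = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
resp.raise_for_status()

for element in resp.json().get("elements", []):
    tags = element.get("tags", {})
    print(tags.get("name", "<unnamed>"), element["lat"], element["lon"])
```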
I'm interested in finding an API (or even a website that I could screen scrape) that would let me find out popular websites used in a geographical area.
For example, I'd like to be able to do a call like so:
getWebsitesUsedByPeopleNear(latitude, longtitude, maxDistanceAwayInKm, fromDate, toDate)
and it would return a list of websites like so:
http://www.google.com
...
By popular, I mean sites that have a high number of hits over a certain time period, where the hits are coming from people in the zone described above (lat, long, maxDistanceAwayInKm).
EDIT: I think ISPs, analytics collectors, and browser extensions would all be able to get this type of data. For example, Alexa shows some of this type of data, but not at the level of detail I'd be interested in. Theirs appears to be too "global", at a country level; I'd like to see it at a city level.
http://www.alexa.com/topsites/countries/CA
The only people who would have such precise information are the providers of the websites themselves. Even if there were such a service (I know of none), it would depend on all the companies who host websites giving away that data. I highly doubt that this would be the case, which makes any information you get unrepresentative.
I've been running into some issues with location-based searches using the Google Maps API (more of a structural issue on my end than any criticism of the mapping API).
For example, if a user searches for "Victoria, Canada", it will bring up results for "Victoria, Canada" as expected. However, if a user searches for "Canada", Google returns a latitude and longitude for the middle of the country, which is essential for correctly centering the map. However, it will not display any results, since the nearest location is too far away from the point Google returned. I'm filtering out results that are more than about 20 miles away.
Can the Google Maps API return anything that I could use to tell if a user has entered a state or country name? If not, has anyone developed a work around solution?
Ideally, I would like to avoid just ordering the results by nearest location; I don't want items from "Spain" showing up at the bottom of the list when a user searches for "United States." I would try to determine whether the query is a state or country prior to the search, but this seems like a very daunting task given the different possible spellings of country and region names. If I were only expecting English spellings, it would be a much easier approach.
Assuming this is a form on a site that has a controller parse the form and send the request to Google's API, you could break the form out into address, city, state, and country (if needed) form elements, then make the address and city required fields.
I'm taking a similar approach with a site I'm building using the Google Places API, and it seems to work for me.
I'm trying to find out if there is a programmatic way to determine how far down in a search engine's results my site shows up for given keywords. For example, my query would provide my domain name and keywords, and the result would return, say, 94, indicating that my site was the 94th result. I'm specifically interested in how to do this with Google, but also interested in Bing and Yahoo.
No.
There is no programmatic access to such data. People generally roll their own version of such trackers: fetch the Google search page and use regexes to find your position. But nowadays different results are shown in different geographies, and results are personalized.
The gl=us parameter will help you get results from the US; you can change the geography accordingly.
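For what it's worth, here is a very rough sketch of that roll-your-own approach in Python. It assumes you are willing to parse Google's frequently changing result HTML, and scraping results may breach Google's terms of service, so treat it as illustrative only:

```python
# Very rough sketch of the "roll your own" tracker described above: fetch a
# Google results page (gl=us pins the geography) and look for your domain in
# the result links.
import re
import requests

def google_rank(domain, keywords, num_results=100):
    params = {"q": keywords, "num": num_results, "gl": "us"}
    headers = {"User-Agent": "Mozilla/5.0"}  # Google tends to block the default UA
    html = requests.get("https://www.google.com/search",
                        params=params, headers=headers).text

    # Crude extraction of result URLs; the markup this regex assumes is not
    # stable, and personalization/geography will still skew positions.
    urls = re.findall(r'href="(https?://[^"&]+)"', html)
    for position, url in enumerate(urls, start=1):
        if domain in url:
            return position
    return None

print(google_rank("example.com", "my product keywords"))
```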
Before creating this from scratch, you may want to save yourself some time (and money) by using a service that does exactly that [and more]: Ginzametrics.
They have a free plan (so you can test if it fits your requirements and check if it's really worth creating your own tool), an API and can even import data from Google Analytics.