How to use ArcGIS VRP Service with numerous stops on a single route?

I am trying to optimize a route using ArcGIS VRP REST Service.
I have a situation where I want to solve for a single route with more than 200 orders (approx. 1,000) and only one depot, at the end of the route.
The API has a limit of 200 orders per route.
Is there any workaround or other appropriate solution?

When using services from Esri, such as the routing service, you're stuck with the limit they place -- in this case, 200 orders per route.
The only solution I am aware of is to create and use your own Network Analyst service on a local ArcGIS Server.
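If you do go the self-hosted route, the request shape is essentially the same as against Esri's hosted endpoint. A minimal sketch, assuming a placeholder server URL and the standard VRP parameter names (check your own service's REST page for the exact names it exposes):

```typescript
// Sketch of submitting a VRP solve to a self-hosted Network Analyst service.
// The host and service path are placeholders; the orders/depots/routes
// parameters are JSON feature sets per the VRP REST documentation.
const VRP_URL =
  "https://yourserver.example.com/arcgis/rest/services/VehicleRoutingProblem" +
  "/GPServer/SolveVehicleRoutingProblem/submitJob"; // async job endpoint

async function solveVrp(orders: object[], depots: object[], routes: object[]) {
  const params = new URLSearchParams({
    f: "json",
    orders: JSON.stringify({ features: orders }),
    depots: JSON.stringify({ features: depots }),
    routes: JSON.stringify({ features: routes }),
  });
  const res = await fetch(VRP_URL, { method: "POST", body: params });
  return res.json(); // contains a jobId you poll for the solved routes
}
```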

Related

How can I cache GraphQL requests with VueJS and GraphQL-Yoga?

I have a Vue2 app which grabs data from my GraphQL backend. Think User count, Posts made, your Posts, and things like that.
The HTML, CSS, JS, etc. of the Vue2 app is on a CDN and loads very quickly. The GraphQL server, my own, is located in one location and can load slowly if you're far away from it. I want to improve my site's loading times.
How can I form a kind of CDN for my GraphQL layer that caches results in various locations, so that common requests are snappy and fast? I have a rough idea of how I might begin doing this, but I still feel I need existing services/frameworks for guidance or direct use.
I have heard of GraphCool and Hasura, are these things I am looking for?
You have a few options at your disposal:
Use AWS with location-based routing and multi-region EC2 instances. For the most reliable and fastest service, you should have an instance in the following locations: Northern California (USA), Northern Virginia (USA), Sao Paulo (Brazil), Paris (France), Mumbai (India), Hong Kong (China), Tokyo (Japan), Singapore, and Sydney (Australia). You can use a free-tier EC2 instance in all of these zones and pay next to nothing yearly while you're getting started, then scale them up as you need. I recommend the t3.micro, which is one of the absolute cheapest options you can get. This will run you approximately $840 for the year.
Move over to Heroku, which basically lets you do the same thing I've outlined above in AWS, with less overall control.
Use Vuex to store the results in localStorage on the user's computer by combining Vuex with a persistent-storage plugin like vuex-persistedstate (see the sketch at the end of this answer). Combine this with server-sent events to avoid ever having to make a request for updated information beyond the initial one. Note: this will not solve the slow initial load.
Ignore Vuex altogether and just store the result client-side in localStorage, fetching it whenever you'd like. Note: this will not solve the slow initial load.
Why haven't you mentioned Hasura yet?!
Hasura itself is not a standalone cloud service. Instead, Hasura lets you drop a GraphQL engine on top of a pre-existing PostgreSQL database.
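For option 3 above, here is a minimal sketch using Vue 2 + Vuex with vuex-persistedstate; the store shape and the GraphQL query are made up for illustration:

```typescript
import Vue from "vue";
import Vuex from "vuex";
import createPersistedState from "vuex-persistedstate";

Vue.use(Vuex);

// Cache GraphQL results in Vuex and persist them to localStorage, so repeat
// visits can render the last known data instantly while fresh data loads.
export default new Vuex.Store({
  plugins: [createPersistedState()], // defaults to localStorage
  state: {
    dashboard: null as unknown, // last known GraphQL result (hypothetical shape)
  },
  mutations: {
    setDashboard(state, payload: unknown) {
      state.dashboard = payload;
    },
  },
  actions: {
    async loadDashboard({ commit }) {
      // Components render state.dashboard immediately; this refreshes it.
      const res = await fetch("/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query: "{ userCount postCount }" }), // hypothetical query
      });
      commit("setDashboard", (await res.json()).data);
    },
  },
});
```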

Podio API limit

I am working on a product which fetches all the organization, workspace, and app details of the customer. The customer can refresh them at any time.
So let's say I have one customer who has 100 applications across multiple workspaces; it makes around 110 calls to get each application's details, the workspace details, and the organizations.
Now if that customer refreshes the applications multiple times, say 10 times in an hour, that action alone is about 1,000 API calls. If I have 50 such users active and doing this, it will be something like 50,000.
AFAIK I cannot make that many API calls in an hour, so how do I handle this scenario? I know a lot of applications do such things, so I want to understand how everyone handles this.
If you need a higher rate limit, I would encourage you to contact Podio support and ask specifically for what you need. We have internal guidelines for evaluating these kinds of requests and may increase the limit for your user and client ID if appropriate.
In general, though, I would expect your app to implement some kind of batching, transient storage, and/or caching layers, especially if your customers are interacting with Podio exclusively or primarily through your system.
Please see our official statement here: https://developers.podio.com/index/limits
Summary:
The general limit is 5,000 API calls per hour, but if an API call is marked as "Rate limited" in the API reference, the call is deemed resource-intensive and a lower rate of 1,000 calls per hour is enforced. If you hit the rate limits, the API will begin returning 420 HTTP error codes for all API calls. Rate limits are per user per API key.
Contacting support:
If you have a project that requires a higher rate limit, contact support@podio.com with a brief description of your project, your estimated usage, and the client_id of the API key you are using.
Usage tips:
Tips for reducing API usage
Avoid making API requests inside loops. Instead of fetching individual objects inside a loop, fetch a collection of objects in one API operation, e.g. filter items.
Cache results whenever possible. This is especially true when you are displaying data to the public (i.e. everyone sees the same output); see the sketch after this list.
Don't poll for changes. Instead of polling Podio to see if your content has changed, use webhooks or push to receive a notification. This might save you thousands of requests: https://developers.podio.com/doc/hooks
Use logging to see how many requests you're making.
Bundle responses with the "fields" parameter.
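As an illustration of the caching tip, here is a minimal sketch of a TTL cache in front of GET calls, assuming a plain fetch-based client; the 10-minute TTL is an arbitrary choice, not a Podio recommendation:

```typescript
// Memoize GET responses for a TTL so ten refreshes in an hour cost one API
// call instead of ten. The TTL value is an arbitrary example.
const TTL_MS = 10 * 60 * 1000;
const cache = new Map<string, { at: number; body: unknown }>();

async function cachedGet(path: string, token: string): Promise<unknown> {
  const hit = cache.get(path);
  if (hit && Date.now() - hit.at < TTL_MS) return hit.body; // serve from cache

  const res = await fetch(`https://api.podio.com${path}`, {
    headers: { Authorization: `OAuth2 ${token}` },
  });
  if (res.status === 420) throw new Error("Podio rate limit hit; back off");
  const body = await res.json();
  cache.set(path, { at: Date.now(), body });
  return body;
}
```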
You might want to build an API proxy app; you would need a messaging queue and a rate limiter. This would let you keep track of API call consumption across apps and users (a sketch of the rate-limiter piece follows below).
Also worth noting: some API routes are more expensive than others because they are more resource-intensive on the Podio side. The term in use is "rate limited": rate-limited API routes are bound to 1,000 calls an hour, so in effect they cost five times as much as regular routes.
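A minimal sketch of that rate-limiter piece, sized to the limits quoted above; the per-user bucketing and the 5-token cost for rate-limited routes are my own assumptions, not Podio's design:

```typescript
// Per-user token bucket sized to Podio's general 5,000 calls/hour limit.
class TokenBucket {
  private tokens: number;
  private last = Date.now();

  constructor(private capacity: number, private refillPerMs: number) {
    this.tokens = capacity;
  }

  tryTake(cost = 1): boolean {
    // Refill proportionally to elapsed time, capped at capacity.
    const now = Date.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + (now - this.last) * this.refillPerMs
    );
    this.last = now;
    if (this.tokens < cost) return false; // caller should queue or reject
    this.tokens -= cost;
    return true;
  }
}

// 5,000 tokens refilled over an hour; charge rate-limited routes 5 tokens so
// they effectively get 1,000 calls/hour, matching the summary above.
const buckets = new Map<string, TokenBucket>();

function allow(userId: string, rateLimitedRoute: boolean): boolean {
  let b = buckets.get(userId);
  if (!b) buckets.set(userId, (b = new TokenBucket(5000, 5000 / 3_600_000)));
  return b.tryTake(rateLimitedRoute ? 5 : 1);
}
```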
Hope this helps!

Reverse Geocoding with Worklight

I'm currently working on a Worklight project that deals with location-based services. I want to be able to get the ZIP code of a user's current location, specifically for the iOS platform. I researched online and there are many ways to approach this. I currently have it implemented as a custom Cordova plugin that uses native location-manager features and retrieves the ZIP code through reverse geocoding. This approach seems like I'm doing it the long way. I noticed that Google provides an API call for reverse geocoding given just the lat and long. However, there is a limit to how many calls you can make.
Users of the free API:
2,500 requests per 24 hour period.
10 requests per second.
Maps for Business customers:
100,000 requests per 24 hour period.
10 requests per second.
This app needs to have no restrictions on how many times it can look up the ZIP code for a location.
Does Worklight have a simpler or better way of getting the ZIP code for the user's location? (I've checked the Worklight API reference but didn't see anything about retrieving the user's ZIP code.)
Worklight provides a way to implement this using adapters, but not in the API itself. You could, though, use an adapter as something like a local cache of the ZIP codes you already know.
To save money with APIs that are usually billed by the number of calls, you would need some cache or database (most likely CouchDB or MongoDB) to hold what you already know.
That is a mobile (app-side) solution plus a server-side solution; Worklight helps you put the two together.
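A sketch of that cache idea, assuming the Google reverse-geocoding endpoint from the question; rounding the coordinates to roughly a kilometer for the cache key is an arbitrary choice:

```typescript
// Cache reverse-geocoded ZIP codes so repeat lookups near the same spot
// don't burn another Google API call. Two decimal places is ~1 km.
const zipCache = new Map<string, string>();

async function zipFor(
  lat: number,
  lng: number,
  apiKey: string
): Promise<string | null> {
  const key = `${lat.toFixed(2)},${lng.toFixed(2)}`;
  const cached = zipCache.get(key);
  if (cached) return cached;

  const url =
    `https://maps.googleapis.com/maps/api/geocode/json?latlng=${lat},${lng}` +
    `&key=${apiKey}`;
  const data = await (await fetch(url)).json();
  // Scan the returned address components for a postal_code entry.
  for (const result of data.results ?? []) {
    for (const comp of result.address_components ?? []) {
      if (comp.types.includes("postal_code")) {
        zipCache.set(key, comp.long_name);
        return comp.long_name;
      }
    }
  }
  return null;
}
```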

Get many geo maps for addresses

How can I get maps for addresses without request limits? Google provides only 2,500 requests per day. First of all, I want to use free services. Thank you.
You left a ton of info out... What the heck is "maps for addresses"? Do you mean map tiles? Or are you talking about geocoding, like getting addresses for maps?
Is it a website making the calls, or mobile? Where are you executing the code from?
If you are talking about GPS geocoding (getting an address from a GPS coordinate), then there are tricks you can use to get around those limits. If it's based on a key, then it's a 2,500 limit for the key. However, there are APIs you can use that are based on the calling IP (Google is one). If you make the client make the call, then unless a single client is making 2,500 calls, you're good to go.
You will notice here that the geocoding call doesn't require an API key, so the usage limit is going to be based on the calling IP:
https://developers.google.com/maps/documentation/geocoding/#GeocodingRequests
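A sketch of making that call from the client (whether the keyless endpoint still works under Google's current terms is something to verify before relying on it):

```typescript
// Making the geocoding call from the client so the per-IP quota applies to
// each user rather than to one shared server key.
async function geocodeAddress(address: string) {
  const url =
    "https://maps.googleapis.com/maps/api/geocode/json?address=" +
    encodeURIComponent(address);
  const data = await (await fetch(url)).json();
  if (data.status !== "OK") throw new Error(`Geocoding failed: ${data.status}`);
  return data.results[0].geometry.location; // { lat, lng }
}
```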
Here's a similar question: Usage limit on Bing geocoding vs Google geocoding?
Google will start denying your requests at around 2,500. Bing has a much higher daily limit (it used to be 30k; I think it's up to 50k now).
There are a number of free geocoding services. I recommend staggering your requests across multiple services if you need a large number of addresses coded daily (see the sketch below). Here's a list of 54 providers: http://www.programmableweb.com/apitag/geocoding.
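A sketch of that staggering idea; the Provider shape here is hypothetical, and each real service from the list has its own URL format, response shape, and terms:

```typescript
// Stagger requests across several geocoding providers so no single daily
// quota is exhausted. Fill the list with real services from the link above.
interface Provider {
  name: string;
  dailyLimit: number;
  used: number; // reset this counter once per day
  geocode: (address: string) => Promise<{ lat: number; lng: number }>;
}

const providers: Provider[] = [
  /* e.g. entries wrapping Google, Bing, and others from the list above */
];

async function geocode(address: string) {
  // Pick the first provider with quota remaining today.
  const p = providers.find((prov) => prov.used < prov.dailyLimit);
  if (!p) throw new Error("All providers exhausted for today");
  p.used += 1;
  return p.geocode(address);
}
```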

Google Search API for desktop application. Which API to use to make maximum requests per day?

I'm building an application that should query Google search very often. But I'm having trouble choosing which API I should use. There are so many of them: AJAX, REST, Web, SOAP, Custom, and maybe something else. Some of them are deprecated now. From that list, from what I understand, only the AJAX and Custom Search APIs are not. The Custom Search API has a limit of 100 requests per day, a very small amount. I couldn't find any published limits for the AJAX API, but it looks like I can do only about 20 requests per hour. Also not so good.
So, which API should I use in a desktop application to get as many requests as possible? And a second question: what else can I do to increase the limit? Maybe set appropriate HTTP headers, use an API key, or something else?