Is increasing the max. number of instances in the same zone possible?

I am working on an experiment that needs a large number (~120) of Google Compute Engine instances. It does not matter how powerful each instance is. I just use n1-standard-1 instances.
The experiment needs to have all instances in the same zone, but I found that I could only create 22~23 instances in the same zone.
Would there be any way to increase that limit?
Thank you in advance!

You could request a change to your default per-region quota as described here:
https://developers.google.com/compute/docs/resource-quotas#perregion
There is a link at the end of that doc to a Google Docs "quota change request form".
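Before filing the form, it can help to check how close you are to the current per-region limits. Below is a minimal sketch using the Compute Engine API via google-api-python-client; it assumes application default credentials are already configured, and the project and region names are placeholders.

```python
# Sketch: list current per-region quotas before requesting an increase.
# Assumes google-api-python-client is installed and application default
# credentials are configured; PROJECT and REGION are placeholders.
from googleapiclient import discovery

PROJECT = "my-project"   # hypothetical project ID
REGION = "us-central1"   # the region containing your zone

compute = discovery.build("compute", "v1")
region = compute.regions().get(project=PROJECT, region=REGION).execute()

for quota in region.get("quotas", []):
    # Each entry has a metric name, the current limit, and current usage,
    # e.g. {"metric": "CPUS", "limit": 24.0, "usage": 22.0}.
    print(quota["metric"], quota["usage"], "/", quota["limit"])
```

A low default CPUS limit would explain only being able to start 22-23 n1-standard-1 instances in one region.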
Hope that helps.

Related

Performance optimization for route calculations via mouse drag

Our team is currently working on a car-routing web application built on the HERE API. To meet users' expectations, we want to integrate all the typical features everyone is used to nowadays, most importantly the possibility to manipulate a route interactively by dragging waypoints out of it.
While for the most part everything seems to work fine, we are experiencing serious performance issues on long routes combined with large drag distances.
Our application works as follows:
- At first, the user has to provide two addresses.
- The route is initially calculated using a full calculateroute request (representation = 'display').
- When the user drags the route, we request a new route with a waypoint at the mouse position and reduced response data (representation = 'dragNDrop') every 500ms for as long as the dragging lasts.
While this procedure works really well and fast when zoomed in on a small section of the route, it is very slow and laggy when zoomed out to country scale with the whole route displayed. Implementing a throttling mechanism and experimenting with different call rates helped a bit, but not as much as we hoped.
Looking at the consistently smooth performance on wego.here.com, we were hoping that there might be a better way to implement this feature with the HERE API, or maybe some kind of optimization.
We would very much appreciate any help.
The Routing API will provide the best solution for this use case. Can you please share the API response time or the full API request? Please also check the clustering documentation to see if it aligns with your use case:
developer.here.com/documentation/maps/dev_guide/topics/clustering.html
If it does relate to the number of requests (which caused the throttling implementation and increased the number of API requests), please contact us at
developer.here.com/contact-us
Thanks for your reply! This year has been pretty crazy, so this post is a bit late; my apologies for that.
The solution to our problem was a small adjustment to a parameter value: we lowered the 'resolution' parameter to '25:25' for calculateRoute requests while the route is being dragged.
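For reference, a minimal sketch of what such a drag-time request might look like against Routing API v7. The endpoint, authentication parameter, and coordinates are assumptions/placeholders based on the public documentation; adjust them to your own setup.

```python
# Sketch of a reduced-payload route request used while dragging, with
# representation=dragNDrop and a coarse resolution (25 m/px for view and snap).
# Endpoint and parameter names follow the HERE Routing API 7.2 docs;
# API_KEY and coordinates are placeholders.
import requests

API_KEY = "YOUR_HERE_API_KEY"  # placeholder
url = "https://route.ls.hereapi.com/routing/7.2/calculateroute.json"

params = {
    "apiKey": API_KEY,
    "waypoint0": "geo!52.5160,13.3779",   # start (example coordinates)
    "waypoint1": "geo!52.5206,13.3862",   # dragged waypoint at mouse position
    "waypoint2": "geo!52.5309,13.3847",   # destination (example coordinates)
    "mode": "fastest;car;traffic:disabled",
    "representation": "dragNDrop",        # reduced response for interactive drags
    "resolution": "25:25",                # coarser view/snap resolution -> fewer shape points
}

response = requests.get(url, params=params, timeout=5)
response.raise_for_status()
route = response.json()["response"]["route"][0]
print(len(route.get("shape", [])), "shape points")
```

The coarser resolution reduces the amount of shape data returned per drag request, which is what made the country-scale dragging responsive in our case.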

BigQuery API: Query usage per day quota reached for unknown reasons

I'm a BigQuery user, and for the last week I have kept hitting the 'Query usage per day' quota for no apparent reason. According to the Quotas page I have reached 17 TiB, but according to the Billing page I have only been billed for 9 TiB since yesterday. See this and this screenshot.
I've also set up billing export (saved in a table cloudaudit_googleapis_com_data_access_) for my project, and if I add up the processed bytes of all jobs since yesterday I only get to about 10 TiB.
I have no idea where else to look for the reason I keep hitting this quota limit, so some help would be much appreciated.
Billing doesn't seem to update in real time, as you can see in other questions.
I suggest that you wait a little before generating a new report, then compare it with the usage indicated on the Quotas page.
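If you want to cross-check the Quotas number yourself, one option is to sum the billed bytes of recent query jobs. Here is a minimal sketch using the google-cloud-bigquery client and the INFORMATION_SCHEMA.JOBS_BY_PROJECT view; the region qualifier ("region-us") and the one-day window are assumptions to adjust to your project.

```python
# Sketch: sum bytes billed/processed for query jobs over the last day and
# compare it with the number shown on the Quotas page.
# Assumes application default credentials; region qualifier is a placeholder.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  SUM(total_bytes_billed)    / POW(1024, 4) AS tib_billed,
  SUM(total_bytes_processed) / POW(1024, 4) AS tib_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
"""

row = list(client.query(sql).result())[0]
print(f"Billed:    {row.tib_billed:.2f} TiB")
print(f"Processed: {row.tib_processed:.2f} TiB")
```

Comparing both figures against the audit-log table you already export can help narrow down where the extra terabytes are counted.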
If you need to increase your quota right now to keep everything working, you can do the following:
- Go to APIs & Services.
- Find the BigQuery API and click it.
- Go to Quotas in the menu on the left of the screen.
- Edit the desired category's limit.
Please let me know later if Billing still doesn't match the usage.
I hope it helps.

Is it possible that I already exceeded the limit of query for Google Maps Static in just an hour of testing?

I just started playing with the Google Maps API for static images, and after just an hour of testing this image appeared:
http://www.coon.it/drop/limit.png
Is that normal?
My page needs 6 static images; does that mean I have called it something like 170 times?
I don't think that's possible, since the pictures are always the same and the documentation says that repeated requests for the same image don't count.
What can I do?
Thank you
Although there is an absolute limit on the number of images, that limit may be calculated over a shorter period than 24 hours. 2400/day could be interpreted as 100/hour, so you can't use all 2400 in one go [I can't remember what the limit is for Static Maps, but you get the idea].
With most Google services there is also a rate limit, and it's possible that fetching six images almost simultaneously breaks the rate limit. Rate limits vary depending on server load, but spacing requests out to 200ms should be ok.
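If the rate limit is the culprit, spacing the six requests out is straightforward. A rough sketch follows; the map centers, parameters, and API key are placeholders, and the 200 ms gap is just the spacing suggested above.

```python
# Sketch: fetch several Static Maps images with ~200 ms between requests
# instead of firing them all at once. Centers and API key are placeholders.
import time
import requests

BASE = "https://maps.googleapis.com/maps/api/staticmap"
centers = ["40.714,-73.998", "48.857,2.352", "51.507,-0.128",
           "35.689,139.692", "37.774,-122.419", "52.520,13.405"]

for i, center in enumerate(centers):
    params = {"center": center, "zoom": 12, "size": "400x300", "key": "YOUR_KEY"}
    resp = requests.get(BASE, params=params, timeout=10)
    resp.raise_for_status()
    with open(f"map_{i}.png", "wb") as f:
        f.write(resp.content)
    time.sleep(0.2)  # ~200 ms gap to stay under the per-second rate limit
```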
Or your traffic may look like an automated image fetcher (especially if you have run the page 30 times in the last hour). Google doesn't like automated crawlers either.
What should it display? For me it just displays a landscape with a row of trees (which I guess is what you want to see?).

mapping location to a time zone

I need to get the time zone for a given address/location. Assume that the address/location can be geocoded (using Google) to a lat/lng if necessary.
This means that I may not have a zip code.
I was really hoping that Google provided some kind of API for this, but it seems they don't. At a minimum you can Google search for "time in washington, dc" and get the time/TZ, but then I'd have to screen-scrape that, which is not fun :(
I know there are databases available that map locations to time zones, but that'd have to be maintained. Has anyone come up with a tricky solution to this problem?
Thanks!
Google provides an API for this.
https://maps.googleapis.com/maps/api/timezone/<json or xml>?location=<lat>,<lng>&timestamp=<unix timestamp>&sensor=<true or false>
eg:
https://maps.googleapis.com/maps/api/timezone/json?location=39.6034810,-119.6822510&timestamp=1331161200&sensor=false
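A minimal sketch of calling that endpoint and reading the result, using the example coordinates and timestamp above; note that newer versions of the service also expect an API key parameter.

```python
# Sketch: query the Google Time Zone API for a lat/lng and print the zone.
# Uses the example coordinates/timestamp from the URL above; add a "key"
# parameter if the current version of the service requires one.
import requests

params = {
    "location": "39.6034810,-119.6822510",
    "timestamp": 1331161200,   # Unix time, used to resolve DST on that date
    "sensor": "false",
}
resp = requests.get("https://maps.googleapis.com/maps/api/timezone/json",
                    params=params, timeout=10)
data = resp.json()
# The response contains timeZoneId/timeZoneName plus rawOffset and dstOffset (seconds).
print(data["timeZoneId"], data["rawOffset"] + data["dstOffset"])
```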
http://www.geonames.org/ provides an API that returns the timezone given a lat/lng
The question needs to be clarified a bit more. What are you doing this for? What language(s) are you using? But here is how I would approach the problem: first, create a table-like structure that translates longitudes into zones (much like this table). Next, I would query GMT, either locally or on the web somehow, and finally take the offset from the table and add it to GMT. This way there is no "maintenance" of the table, since these longitudes don't change.
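As a rough illustration of that idea, here is a sketch that derives an offset purely from longitude (about 15 degrees per hour). It ignores political timezone boundaries and DST entirely, so treat it only as an approximation.

```python
# Sketch: crude UTC offset from longitude alone (15 degrees per hour).
# Ignores political boundaries and daylight saving time.
from datetime import datetime, timedelta, timezone
from typing import Optional

def rough_local_time(lng: float, utc_now: Optional[datetime] = None) -> datetime:
    offset_hours = round(lng / 15.0)          # nautical-style offset
    if utc_now is None:
        utc_now = datetime.now(timezone.utc)
    return utc_now + timedelta(hours=offset_hours)

print(rough_local_time(-119.68))  # example longitude from the answer above
```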
I've written a small Java class to do this, with the data structure embedded in the code. If you want to avoid using a web service, and accuracy of 22km is good enough, and you're happy to assume that timezone boundaries don't change, then it could help.
https://sites.google.com/a/edval.biz/www/mapping-lat-lng-s-to-timezones

Facebook Graph API

When I use the Facebook API to retrieve post information, I found that the returned information keeps changing.
For example, when I retrieved the information twice with a one-minute interval, one record appeared the first time and had disappeared the second time.
https://graph.facebook.com/search?q=baby&type=post&limit=100&since=2010-05-19&until=2010-05-21
Does anyone know what is happening?
Cheers,
LingChen
Searches of large datasets are (in general) nondeterministic.
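One way to see this is to run the same search twice and diff the returned post IDs. The sketch below uses the endpoint from the question; the public post search has since been restricted, so this is purely illustrative.

```python
# Sketch: run the same Graph API search twice and compare post IDs to see
# how much the result set shifts between calls. The post-search endpoint
# is no longer generally available, so this is illustrative only.
import time
import requests

URL = ("https://graph.facebook.com/search"
       "?q=baby&type=post&limit=100&since=2010-05-19&until=2010-05-21")

def fetch_ids():
    data = requests.get(URL, timeout=10).json().get("data", [])
    return {post["id"] for post in data}

first = fetch_ids()
time.sleep(60)          # one-minute interval, as in the question
second = fetch_ids()

print("only in first run: ", len(first - second))
print("only in second run:", len(second - first))
```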