Would a Google Cloud Platform machine with many different CPUs allow me to run API requests through several different IP addresses?

I am trying to query public utility data from an API (oasis.caiso.com) with a threaded script in R. Apparently this API will refuse requests from certain IP addresses if too many are made. Therefore I need to run many different API requests in parallel across different IP addresses, and am wondering if a machine with many different CPUs on Google Cloud Platform would allow this?
I was looking at the n1-highcpu-96 option from this page: https://cloud.google.com/compute/docs/machine-types
If this is a poor solution can anyone suggest another distributed computing solution that can scale to allow dozens or even hundreds of API queries simultaneously from different IPs?

If I needed multiple IPs to perform "light" API calls I would not scale vertically (with a machine having 96 cores). I would create an instance group with 50 or 100 or n Debian micro or small preemptible instances, with the size depending on the kind of computation you need to perform.
You can set up a startup script, loaded via the instance metadata or baked into a custom image, that connects to the API server, does what it has to do, and saves the result to a bucket. If an instance gets an "API refused" response, I would simply have it kill itself, letting the instance group create a new one for me, possibly with a new IP.
This, I think, is one easy way to achieve what you want, but I guess there are multiple solutions.
I am not sure what you are trying to achieve, and I think you first need to check that it is legal and that the owner of the API agrees.
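A minimal sketch of such a worker in Python, under heavy assumptions: the OASIS query URL, the refusal status codes, and the bucket name are all placeholders you would replace, and whether a replacement instance actually gets a new external IP depends on how the instance group and ephemeral IPs are configured:

```python
import subprocess
import urllib.error
import urllib.request

# Placeholder query URL -- set the real report name and date parameters.
API_URL = "http://oasis.caiso.com/oasisapi/SingleZip?queryname=PRC_LMP"

# Status codes that suggest the server is refusing this IP (an assumption;
# check what the API actually returns when it blocks you).
REFUSAL_CODES = {403, 429}

def is_refusal(status_code):
    """Return True if the response looks like an IP-based refusal."""
    return status_code in REFUSAL_CODES

def run_once():
    try:
        with urllib.request.urlopen(API_URL, timeout=60) as resp:
            data = resp.read()
    except urllib.error.HTTPError as err:
        if is_refusal(err.code):
            # Self-destruct: a managed instance group with autohealing
            # recreates the instance, usually with a fresh ephemeral IP.
            subprocess.run(["sudo", "shutdown", "-h", "now"])
        raise
    # Persist the payload somewhere durable, e.g. a GCS bucket
    # (bucket name is a placeholder).
    subprocess.run(
        ["gsutil", "cp", "-", "gs://my-results-bucket/result.zip"],
        input=data,
    )
```

Each instance would run this in a loop from its startup script, so the group as a whole keeps querying while individual workers recycle themselves.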

Related

Different backend endpoints in APIs depending on Products in Azure API Management

I'm an absolute newbie in Azure API Management and I have a question about how to manage Products and APIs.
Let's imagine this scenario:
I create 3 different Products: one representing my Development environment (DEV), a second representing my Preproduction environment (PRE) and a last one representing my Production environment (PRO).
I create several APIs which I want to publish in my DEV environment and later promote to the others. So I need every API in every different Product to point to a different backend service, as my backend services are different in every environment.
For example:
I have 3 different versions of my backend service: ServiceDEV, ServicePRE and ServicePRO. As I develop my API, I use the one named ServiceDEV as the backend service, and so my API is assigned to the Product DEV. Later I want to keep this DEV version of my API, but I also want to "deploy" that API in the Product PRE to make it act as a façade for ServicePRE, and the same would happen when promoting it to PRO.
The problem with this approach is that I need to clone the APIs and change their settings to make them point to the correct backend endpoint every time I want to promote one of them from one environment to another, thus losing all the versioning for that API, as the cloning operation just clones the current version of the API.
I don't know if policies would meet my needs in this subject.
I hope you get what I mean...
How can I manage this situation?
Am I focusing this subject in a wrong way?
Any idea about how to overcome this?
Thank you!
If you follow this approach then you could indeed use policies to manage different backends for different products. You could create APIs without specifying a backend service URL at all, and later use the set-backend-service policy at product level to direct calls to the proper endpoint.
One limiting factor of this approach is that whatever changes you make to an API in the dev environment (think changing the signature of an operation, or a policy) will be immediately visible in the other environments as well, since it is a single API in all of them. If this is an issue, then consider having duplicate (or triplicate) APIs, one per environment, and later moving their configuration via Azure API calls.
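For illustration, a product-scoped policy for the DEV product could look roughly like this (the backend URL is a placeholder, not a real endpoint):

```xml
<policies>
    <inbound>
        <base />
        <!-- Applied at the DEV product scope: every API under this
             product is sent to the DEV backend. -->
        <set-backend-service base-url="https://servicedev.example.com" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
</policies>
```

The PRE and PRO products would carry the same policy with their own base URLs, so the API definitions themselves stay backend-agnostic.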

Is it possible to have multiple keys of same vendor for same service in one application?

Let me take an example: I wanted to use the Google Places API in my app, but due to the usage limit it only allows 1000 requests/day for one key. So what I did was create multiple keys from different Gmail accounts, and I am using these multiple keys in one app. Is this allowed commercially?
Google already mentions that in the documentation:
You can use one Developers Console project to manage all of your work, or you can create multiple projects, depending on your development and collaboration needs. Consider whether you're collaborating with a different set of people, want to track usage differently, or need to set different traffic controls for different parts of your work. If so, breaking up your work into multiple projects might make sense. That said, you cannot use multiple projects to try to exceed API usage limits.
From: Google Docs: Creating and shutting down projects
So you can't use multiple keys for one project; it's better to use the commercial package offered by Google.
Try using the Google Places API for the web; it doesn't have a request limit. I think you are developing for Android, so use this tutorial: http://wptrafficanalyzer.in/blog/showing-nearby-places-and-place-details-using-google-places-api-and-google-maps-android-api-v2/

Get all images instances - REST API Azure

I searched in the Azure REST API documentation but I didn't find anything.
I created several virtual machines using an image and I want to retrieve all these virtual machines with the Azure REST API.
I'm wondering if there is a URI I can call to get all instances of a virtual machine image?
The ListDisks operation will give you all the Azure disks that are present in your account.
The result set contains objects with properties like AttachedTo which you can use to identify a VM (if any) to which this disk is attached, or SourceImageName which you can use to identify the source image.
There is no direct API call to identify which images are in use. You have to make two calls: the first to get all disks and the second to get all images, then mix and match. (Or try filtering the images based only on the names you are interested in, but I am not sure whether the REST API supports filtering.)
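As a sketch of the mix-and-match step, here is a small Python helper, assuming you have already parsed the ListDisks response into dicts exposing the SourceImageName and AttachedTo properties (the records below are made up):

```python
def images_in_use(disks):
    """Map each source image name to the VMs whose disks came from it.

    Each disk record is assumed to be a dict with 'SourceImageName'
    and 'AttachedTo' keys, mirroring the ListDisks response properties
    (AttachedTo is None for unattached disks).
    """
    usage = {}
    for disk in disks:
        image = disk.get("SourceImageName")
        vm = disk.get("AttachedTo")
        if image and vm:
            usage.setdefault(image, []).append(vm)
    return usage

# Made-up example records:
disks = [
    {"SourceImageName": "my-image", "AttachedTo": "vm-1"},
    {"SourceImageName": "my-image", "AttachedTo": "vm-2"},
    {"SourceImageName": "other-image", "AttachedTo": None},
]
print(images_in_use(disks))  # {'my-image': ['vm-1', 'vm-2']}
```

To filter by a specific image, you would then just look up that image's name in the resulting mapping.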

how to build google gadget with persistent storage

I'm trying to make a Google gadget that stores some data (say, statistics of users' actions) in a persistent way (i.e. statistics accumulate over time and over multiple users). I also want these data to be placed on Google's free hosting, possibly together with the gadget itself.
Any ideas on how to do that?
I know the Google Gadgets API has tools for working with remote data, but then the question is where to host it. Google Wave seemed to be an option, but it is no longer supported.
You should get a server and host it there.
You then have the best control over the code, the performance and the data itself.
There are several hosting providers out there who provide hosting for a reasonable price.
To name some: Hostgator.com (US), Hetzner.de (DE), http://swedendedicated.com (SE; never used it, just a quick search on the internet).

Do I need a Content Delivery Network if my audience is in one city?

So I've asked a question earlier about having some sort of social network website with lots of images, and the problem is that the more users there are, the more images the website will have, and I was afraid it would take a LONG time for the images to load on the client side.
How to handle A LOT of images in a webpage?
So the feedback I got was to get a content delivery network. Based on my limited knowledge of what a content delivery network is, it is a series of computers containing copies of data, and clients access certain servers/computers depending on where they are in the world? What if I'm planning to release my website only for a university, only for students? Would I need something like a CDN for my images to load instantly? Or would I need to rent a REALLY expensive server? Thanks.
The major holdup with having lots of images is the number of requests the browser has to make to the server, and then, in turn, the number of requests the server has to queue up and send back.
While one benefit of a CDN is location (it will load assets from the nearest physical server) the other benefit is that it's another server. So instead of one server having to queue up and deliver all 20 file requests, it can maybe do 10 while the other server is simultaneously doing 10.
Would you see a huge benefit? Well, it really doesn't matter at this point. Having too much traffic is a really good problem to have. Wait until you actually have that problem, then you can figure out what your best option is at that point.
If your target audience will not be very large, you shouldn't have a big problem with images loading. A content delivery network is useful when you have a large application with a distributed userbase and very high traffic. Below that, you shouldn't have a problem.
Hardware stress aside, another valuable reason for using a CDN is that browsers limit the number of simultaneous connections to a single host. Let's say the browser is limited to 6 connections and one page load requires 10 images, 3 CSS files and 3 JavaScript files. If all 10 of those images come from one host, it will take a while to get through all 16 of those requests; if, however, the 10 images are loaded from a CDN that uses different hosts, that load time can be drastically reduced.
Even if all your users are geographically close, they may have very different network topologies to reach your hosting provider. If you choose a hosting provider that has peering agreements with all the major ISPs that provide service in your town, then a CDN may not provide you much benefit. If your hosting provider has only one peer who may also be poorly connected to the ISPs in your town, then a CDN may provide a huge benefit, if it can remove latency from some or all of your users.
If you can measure latency to your site from all the major ISPs in your area to your hosting provider, that will help you decide if you need a CDN to help shorten the hops between your content and your clients.