Web API + Client Architecture

We're building:
A bunch of services exposed through a web API.
A mobile app and a browser app.
Is it common practice for the apps to talk to their own "conduit" servers, which in turn talk to the API services? We're going to be setting up a reverse proxy; is it enough to hit our APIs directly through it (instead of setting up a conduit)? This is definitely a general architecture question.

I'm not sure what you mean by a "conduit", but a lot depends on how complete and hardened your APIs are. Do they already handle things like authentication, abuse detection/control, SSL, versioning, etc.?
There are companies that specialize in providing this API "middleware" (Apigee, Amazon API Gateway, Azure API Management, and many others). Your reverse proxy is a start, and is probably good enough to get going with, at least if you do things like terminate SSL there and lock down your API servers behind a firewall. If you make your API services stateless, you will probably be able to add new layers at a later date without too much pain and complexity.
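To make the reverse-proxy role concrete, here is a minimal sketch (in Python, with made-up internal hostnames and route table) of the path-routing decision a proxy or gateway makes in front of stateless API services. A real proxy such as nginx expresses this in configuration rather than code:

```python
# Hypothetical route table: public path prefix -> internal service base URL.
# The internal hostnames are illustrative; clients never see them.
ROUTES = {
    "/api/users": "http://users.internal:8000",
    "/api/orders": "http://orders.internal:8000",
}

def resolve_backend(path):
    """Pick the internal backend for a public request path.

    Longest prefix wins, so more specific routes shadow general ones.
    Because the services are stateless, any instance behind the chosen
    base URL can serve the request.
    """
    for prefix in sorted(ROUTES, key=len, reverse=True):
        if path.startswith(prefix):
            return ROUTES[prefix] + path[len(prefix):]
    raise LookupError("no backend for " + path)
```

Because all routing knowledge lives in this one layer, swapping the hand-rolled proxy for a managed gateway product later only means moving the route table, not changing the services.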

Related

Using RabbitMQ for communication in a microservice architecture, but should I create an API Gateway on top?

I have a small application that uses a microservice architecture. I am currently using RabbitMQ for communication between the UI and the services, and that works great.
However, I am thinking about creating a new microservice, an API Gateway, that takes the RabbitMQ logic out of the UI and encapsulates it in a service, which would become the entry point to all the other services.
The benefit is that I would encapsulate the logic that gives access to the services, and I would also be able to add authentication in the API Gateway.
However, I would need to use HTTP requests to interact with the gateway, since I am moving the messaging logic out of the UI. Could there be any major drawbacks to this approach?
I have been able to find examples about RabbitMQ and examples about API Gateways, but never the two together; I might just be overthinking it a bit.
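One common way to bridge the two is to have the gateway wrap each incoming HTTP request as an AMQP-style RPC message carrying a `correlation_id` and `reply_to`, then match service replies back to the pending HTTP requests. A sketch of just that translation logic (the field names, queue name, and envelope shape are illustrative, not a fixed protocol):

```python
import json
import uuid

def build_rpc_message(service, action, payload, reply_queue):
    """Wrap an incoming HTTP request body as an RPC message for RabbitMQ.

    The gateway would publish `body` to the target service's queue with
    these message properties, then wait on `reply_queue` for a message
    carrying the same correlation_id.
    """
    correlation_id = str(uuid.uuid4())
    properties = {
        "reply_to": reply_queue,           # where the service should answer
        "correlation_id": correlation_id,  # lets the gateway match the reply
        "content_type": "application/json",
    }
    body = json.dumps({"service": service, "action": action, "payload": payload})
    return properties, body

def match_reply(pending, correlation_id, reply_body):
    """Resolve the pending HTTP request that matches a reply's correlation_id."""
    request = pending.pop(correlation_id, None)
    if request is None:
        return None  # stale or unknown reply; drop it
    return {"request": request, "response": json.loads(reply_body)}
```

The main drawback to watch for is latency: an HTTP request now blocks on a round trip through the broker, so the gateway needs timeouts for services that never reply.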

Is there any benefit in creating a proxy for consuming 3rd-party APIs?

I'm building an Apache Cordova mobile app that consumes a specific Google API.
What I'm planning to do is consume the Google APIs directly from the app.
Is there any benefit to creating a proxy service that consumes the Google APIs and having my app consume the proxy API?
I ask because it seems to be common practice, but I don't see any benefit.
Is it a best practice or a bad practice?
Maybe for a legacy system, or to do transforms, but nowadays many of the "more modern" RESTful API providers have pretty nice client libraries. A proxy seems like one more link in the system that can fail, and extra load to deal with (I wouldn't proxy requests to YouTube or to a storage account for large static assets).
I do find it nice to proxy my own APIs if they're not public. It usually helps remove the whole CORS mess and eliminates the performance penalty of extra preflight requests, but most public APIs don't have heavy CORS issues since they are, well, public and don't restrict allowed origins.
Even for simple JS libraries like jQuery, I usually use third-party CDNs when available rather than bundling them and hosting them on my own CDN through Azure or Google, which I'd have to pay for.
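If you do decide a proxy in front of a third-party API is worth the extra hop, caching is usually the main payoff. A minimal in-process sketch (the TTL value and the plain dict cache are illustrative choices; a real deployment would use a shared cache or an HTTP caching proxy):

```python
import time

class CachingProxy:
    """Minimal per-URL cache in front of a 3rd-party API client.

    `fetch` is any callable taking a URL and returning the response body,
    so the upstream client (or a stub in tests) is pluggable.
    """
    def __init__(self, fetch, ttl_seconds=300):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._cache = {}  # url -> (expires_at, body)

    def get(self, url):
        now = time.monotonic()
        hit = self._cache.get(url)
        if hit and hit[0] > now:
            return hit[1]           # fresh cached copy: no upstream call
        body = self._fetch(url)     # cache miss or stale: go upstream
        self._cache[url] = (now + self._ttl, body)
        return body
```

The same wrapper is also a natural place to hide API keys from the mobile app, which is one of the more defensible reasons for proxying a third-party API at all.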

Unconventional to bundle web socket server with REST API?

For an enterprise REST API (PHP in this case), is it bad practice to include a WebSocket server along with the REST API? The pairing of the two makes a nice mix with event-dispatching services, but I'm not sure whether the two services are different enough to warrant separation. The only con I can see at the moment is that if the REST API were to go down, your WebSocket server would also be down, which removes the possibility of fail-over for any connected clients.
If you're looking for a really robust way to manage WebSockets, check out http://faye.jcoglan.com/ - it has libraries for JavaScript, Ruby, etc., and runs independently of your other servers.
If you don't need that kind of resilience, then I wouldn't worry about mixing your REST and WebSocket APIs on the same server.

Service oriented architecture with api gateway and secure IPC

I have been reading about and developing my understanding of SOA, and I have found this approach useful. However, a couple of things are confusing me:
Background: we are designing an online financial application, and we are currently in the design brainstorming phase. (Audit & logging need to be done.)
1 - How do we ensure secure inter-process communication?
My thinking: restrict external access to these APIs with a firewall, so they can only be called internally by the system.
2 - Which protocol is preferred, REST or SOAP, in the context of private APIs (IPC) and public APIs (exposed to clients, e.g. mobile, web and desktop)?
My thinking: for reads we can use REST, and for ACID-compliant transactions we can use SOAP, as it provides point-to-point security. Or, for IPC, we are thinking of using SOAP since it also provides an audit mechanism.
3 - What is the role of the API gateway? In particular, is it involved in inter-process communication?
I am confused about this, especially regarding IPC. I think all requests, even service-to-service calls, would go through the API gateway. Kindly elaborate if I am wrong.
4 - Is it possible to keep some services public and some private in a microservices architecture? How do we logically separate the two?
From what I have researched, we can do this.
5 - What are the major differences, and the pros and cons, of SOA versus micro-SOA?
My view: MSOA is an extension of SOA.
Thanks in advance.
Can anyone draw a diagram, or provide a link to an MSOA architecture diagram with an API gateway?
Apigee is the primary source of my understanding; beyond that I Google specifics, which head in different directions.
Security is our major concern.
1 - How do we ensure secure inter-process communication?
Internally, whitelisting IPs and firewalls are probably the most secure. If these services scale and have dynamic IPs you may have an issue with that, in which case shared secrets can work, but they need to be put in place and respected across all services. JWTs are quite good for this (similar to SAML, but not as painful), used with an authentication microservice.
Externally: tokens, or OAuth2, depending on how much pain you want to go through.
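A JWT is just a signed claims payload, and the HS256 variant can be sketched with nothing but the standard library. This is only to show the mechanism; in practice you would use a maintained library, and the shared secret is assumed to be distributed to the services out of band:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url without padding, as JWTs use."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(claims: dict, secret: bytes) -> str:
    """Issue an HS256 JWT-shaped token: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = (header + "." + payload).encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return header + "." + payload + "." + _b64url(sig)

def verify_token(token: str, secret: bytes) -> dict:
    """Check the signature and return the claims, or raise ValueError."""
    header, payload, sig = token.split(".")
    signing_input = (header + "." + payload).encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

Any service holding the secret can verify a token locally, which is why this scales better than firewall rules when instances come and go with dynamic IPs.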
2 - Which protocol is preferred, REST or SOAP, in the context of private APIs (IPC) and public APIs?
I would use REST; SOAP is slowly becoming an antiquated standard. You can secure communications point-to-point using TLS or HMAC signing.
3 - What is the role of the API gateway? In particular, is it involved in inter-process communication?
An API gateway is usually used to expose legacy APIs to the public, or to manage a large set of internal services via a single managed interface. An API gateway can also manage tokens for clients, offer a single token for multi-service access, and hide the internal APIs behind the external interface.
Gateways also tend to offer developer portals and some kind of self-enrollment process, as well as control flows for request content (inbound and outbound).
Tyk.io is an open-source API gateway; you can see the kind of features to expect from a gateway on their home page.
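The token-management and obfuscation points can be sketched as a small access-control table: one external token grants access to several internal services whose names and addresses never leave the gateway. All tokens, service names, and addresses below are made up for illustration:

```python
# Hypothetical gateway ACL: one client token -> the internal services it may call.
TOKEN_GRANTS = {
    "client-token-abc": {"accounts", "payments"},
}

# Internal topology the gateway hides from external clients.
SERVICE_ADDRS = {
    "accounts": "http://10.0.1.5:8080",
    "payments": "http://10.0.1.6:8080",
}

def authorize(token, service):
    """Return the internal address if this token may call the service.

    The client only ever presents one token to the gateway; the gateway
    decides which internal services that token reaches.
    """
    if service not in TOKEN_GRANTS.get(token, set()):
        raise PermissionError("token not allowed to call " + service)
    return SERVICE_ADDRS[service]
```

Products like Tyk or Apigee implement this idea with persistent stores, quotas, and key rotation, but the external-token-to-internal-services mapping is the core of it.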
4 - Is it possible to keep some services public and some private in a microservices architecture? How do we logically separate the two?
Yes, you can with an API gateway, so long as there are no inter-service dependencies.
5 - What are the major differences, and the pros and cons, of SOA versus micro-SOA?
I think one is a subset of the other: SOA tends to be interconnected with a messaging structure like an ESB, while micro-SOA is even more specialised and may not use an ESB.

What are the best practices for caching 3rd party API calls?

Our team currently operates 4-5 applications, and all of those applications use various 3rd party services (SimpleGeo, FB graph API, Yelp API, StrikeIron, etc). There is large overlap between applications, and frequently we call the same APIs for the same input parameters multiple times. Obviously that is not ideal: it is slow and it is expensive (some of the APIs are not free).
What are the best practices for caching these API calls across multiple applications? I see several options:
Write a custom app that creates a facade for all of those APIs, and change all of my apps to use it.
Configure some sort of HTTP proxy in a very aggressive caching mode, and perform connections to APIs via that proxy.
Are there any other options I am missing?
Is there anything wrong with option 2? What HTTP proxy would you recommend for it (Squid, Varnish, Nginx, etc)?
You can use any of the three, but I would go with Squid. Squid was created (and is heavily used) for exactly this purpose: a caching forward proxy. Varnish is designed as a reverse proxy (a cache in front of your own back-end), and nginx is more of a web server and load balancer (serving files and dynamic pages).
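Whichever option you pick, cross-application cache hits depend on identical calls producing identical cache keys, so parameter order and encoding must be normalized. A sketch of canonical key generation (the JSON-plus-SHA-256 scheme is an illustrative choice, not a standard):

```python
import hashlib
import json

def cache_key(api_name, endpoint, params):
    """Canonical cache key for a 3rd-party API call.

    json.dumps with sort_keys=True makes the key independent of the
    order in which each application happened to assemble its params,
    so the same call from any of our apps maps to one cached entry.
    """
    canonical = json.dumps(
        {"api": api_name, "endpoint": endpoint, "params": params},
        sort_keys=True,
    )
    return hashlib.sha256(canonical.encode()).hexdigest()
```

With an HTTP proxy (option 2) the URL itself plays this role, which is why query-string normalization matters there too: `?q=pizza&loc=NYC` and `?loc=NYC&q=pizza` are the same call but different cache entries unless something canonicalizes them.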