I'm building a website that internally calls some APIs to interact with data on a server, but I'm not planning to make those APIs officially public.
Even in this case, should I make them RESTful?
There are lots of trade-offs. You say you're prototyping, but you may need to implement it "seriously" later.
Firstly, even for prototyping, you get a lot of benefit from sticking with a consistent API approach, ideally based on client and server libraries in your framework of choice. Common choices are "synchronous/asynchronous", "function-based/resource-based", "JSON/XML" etc. Mixing and matching those choices just makes everything much harder.
Some business domains are great for resource-based API structures. Order management systems, social networks, question-and-answer web sites all work well. Some are not so easy to represent as resources - real-time/IoT applications, chat/messaging systems, etc.
If you decide that "synchronous" and "resource based" are a good fit for your business domain, you may as well take advantage of the libraries that exist to build and consume RESTful APIs. You can decide for yourself how "pure" and "future-proof" you want to make those APIs. You may not care about versioning, for instance.
If "synchronous" and "resource-based" are not a good fit, I'd not try to shoe horn them into a RESTful API design.
The REST interface is designed to be efficient for large-grain hypermedia data transfer, optimizing for the common case of the Web, but resulting in an interface that is not optimal for other forms of architectural interaction. -- Fielding, 2000
REST is software design on the scale of decades: every detail is intended to promote software longevity and independent evolution. Many of the constraints are directly opposed to short-term efficiency. -- Fielding, 2008
That doesn’t mean that I think everyone should design their own systems according to the REST architectural style. REST is intended for long-lived network-based applications that span multiple organizations. If you don’t see a need for the constraints, then don’t use them. -- Fielding, 2008
For a private API, you probably control both the clients and the server, and it's unlikely that you are going to be facing Web Scale[tm] traffic levels where you will need to offload work to caches.
Which means that you aren't likely to get full return on your investment.
While not a code-based question, I feel this question is relevant to the developer community in pursuit of a deeper understanding of APIs and their role in business and the IoT at large.
Can someone please expand on the statement below? Other than in-house dev time, how exactly do APIs save businesses money and foster agility?
"...APIs save businesses money and provide new levels of business agility through reusability and consistency."
Additionally, while we all know that APIs are cool and can be used to build amazing things, I'm seeking to understand this from the perspective of risk vs. reward for a business.
APIs benefit larger organizations or distributed organizations with separate business units or functional units. In that scenario, they allow the different functional units to deploy independently, assuming you do API versioning. This has a very substantial work-queuing benefit in a larger organization.
In a small organization their benefits are questionable, and APIs should probably be extracted from existing systems as duplication arises or when new problems could benefit from old solutions. Having gone through this transition, I can say it's unwise to build APIs without existing applications.
In the context of IoT, APIs make a lot of sense because you have largely dumb devices (supercomputers by 1980s standards) that connect back to smart infrastructure. If that is done in a bespoke or ad hoc way, it's going to be an enormous headache to change things as you release new devices. With versioned APIs separating the devices from the smart infrastructure, you have a greater chance of introducing change without disabling your customers' legacy devices.
In the IoT space, APIs offer the following benefits:
New device types (e.g. from different vendors) can be easily added to the IoT platform. This saves money, because you as the business owner can select from multiple devices and choose the best one for your purpose, i.e. the most cost-efficient one. (This relates to the API between the device and the platform.)
New applications or new features can be added easily. E.g. if you need an additional feature, it can be added on top. Even better, you can ask your internal IT or an external system integrator to do the work, again giving you the choice to select the best offer. (This relates to the API between the platform and other applications.)
From a risk viewpoint, APIs need to be protected like any other endpoint that you expose to the Internet (or intranet). As a minimum, you need authentication (username, password or other means), authorization (access to a subset of the data) and encryption (i.e. use TLS). Depending on your scope, you might need additional governance and API protection (e.g. throttling).
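To make the throttling point slightly more concrete, here is a minimal fixed-window rate-limiter sketch. The class and method names are hypothetical, and in practice you would usually rely on an API gateway product rather than hand-rolled code.

```java
// Hypothetical sketch of per-client throttling: allow at most N requests per minute.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class FixedWindowThrottler {
    private final int limitPerMinute;
    // key = clientId + current minute; value = requests seen in that minute
    // (a real implementation would also evict old windows)
    private final Map<String, Integer> counts = new ConcurrentHashMap<>();

    FixedWindowThrottler(int limitPerMinute) {
        this.limitPerMinute = limitPerMinute;
    }

    /** Returns true while the client is still under its per-minute limit. */
    boolean allow(String clientId) {
        long minute = System.currentTimeMillis() / 60_000;
        String key = clientId + ":" + minute;
        return counts.merge(key, 1, Integer::sum) <= limitPerMinute;
    }
}
```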
I wonder if anyone could share their thoughts on my question regarding web-based APIs (we use the Microsoft stack).
We are currently in the process of building an infrastructure to host web APIs across our business.
As an organisation we have separate business areas that provide services to our customers. These individual areas of our business generally have their own best-of-breed IT system. Offering APIs is something we've long thought about, and we have started the design process.
The APIs we aim to offer shall be web-based (.NET/Web API/WCF, etc.) and will largely (99%) be consumed within our organisation, but some may be exposed externally in the future should the requirement arise (a new mobile app may need to use the services, etc.).
I'd love to hear your thoughts and experiences around how you architected your farms. I understand it's quite an open question without knowing the crux of our requirements, but it's more general advice/experiences I'd like to hear.
In particular, we are trying to decide whether we should design the infrastructure by:
1) Providing each area of the business with its own API server, whereby we deploy each web API as a new application inside IIS.
or
2) Setting up a load-balanced web API farm, whereby we have, say, two or three IIS web servers, all built the same, hosting the same web APIs, so the business areas all effectively share the same servers. Each area would have a segregated site within IIS, and new APIs would be set up as new applications inside their respective web sites.
I don't foresee us having thousands of APIs, but some will be business-critical, so I'm certainly bearing resilience in mind. That is why, as much as I like each business area having its own API server, I'm being swayed towards the option of a load-balanced farm which the whole business shares.
Anyone have any thoughts, experiences etc.?
Thanks!
That's a very interesting question, and I'd love to hear what others might think. I'm no big expert, but here are my two cents.
It seems to me that the answer should be somewhere in between the two options you specified. Specifically, each critical business area should get its own resilient, load-balanced farm, while less critical services can use single-machine deployments. A critical business area may not mean only one API; it can actually be a group of APIs with high cohesion among themselves.
Using option 1 to its full extent can be hard to maintain, while using option 2 fully can be inefficient in terms of redeployment if (or rather, when) business logic changes. Furthermore, I think it would be possible for greedy APIs to hog resources at peak traffic, making other services temporarily less performant (unless you have some sort of dynamic scaling mechanism).
Suppose you are developing a platform which has a web-based interface for its users and APIs for third-party developers. Something similar to Salesforce (or even Facebook).
Both Salesforce and Facebook have a normal web-based interface for their users and APIs for third-party developers.
Ideally, any API internally calls the same function that is used by the web-based interface. For example, the "Create a Project" button and the "CreateProject" API call the same "createProject()" function internally. So you can maintain the same version for both, as in most cases they are tightly integrated.
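A minimal sketch of that shared-function idea (all names here are hypothetical): both the web UI handler and the public API endpoint delegate to the same internal service.

```java
// Hypothetical sketch: one internal createProject() shared by the web UI and the public API.
record Project(String name, String owner) {}

class ProjectService {
    Project createProject(String name, String owner) {
        // validation, persistence, notifications all live in this one place
        return new Project(name, owner);
    }
}

class WebController {
    private final ProjectService service;
    WebController(ProjectService service) { this.service = service; }

    // handler behind the "Create a Project" button
    Project onCreateProjectClicked(String name, String owner) {
        return service.createProject(name, owner);
    }
}

class ApiController {
    private final ProjectService service;
    ApiController(ProjectService service) { this.service = service; }

    // handler behind the public "CreateProject" API call
    Project createProject(String name, String owner) {
        return service.createProject(name, owner);
    }
}
```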
Now, sometimes you add a feature which makes you increment the minor version of the web-based interface, but since you are not implementing that feature in the API, the API version should remain as it is.
How do you handle such cases? Should you maintain separate versions of your web-based interface and APIs for your platform?
It depends, because it is difficult to offer a conclusive answer to this question. But I will share some ideas and drill down into some scenarios to help as best I can.
I would suggest there should be two versions of the API: internal APIs and public APIs. At the caller's end, they would be two physically distinct APIs/endpoints, so that security policies and a lot of other things can be handled without exposing much information, and without relying on code to enforce the distinction based on who is calling from where, as that can quite easily fail.
It is OK if both versions of the APIs consolidate down the line to some extent without involving any security risk, but a separate team of expert engineers can design this consolidation to be seamless yet safe. It's a trade-off between code reuse and everything else. This is very subjective and can turn into endless discussion, but the software evolves very well as a result of this design process if it is agile and iterative.
The APIs should be externalizable and interoperable. If the project is very large, then different teams working on separate parts of the project will interact with each other's work using internal APIs only. No hanky-panky. No direct data access. Only APIs.
This approach will help create awareness across all teams of what's being done in the project, provided the APIs are discoverable, which I personally believe is a very good thing for collaborative teamwork. In fact, it also helps reusability. Testing becomes unified and automated. Every team becomes responsible for its own work, and hence it should be easy to address accountability.
There's more stuff that can go in here but I think this is sufficient at a high level.
If allowed, I would also read this question as:
"Should you have a purely service-oriented architecture or not?"
And my answer would be: **it depends**, because it is difficult to offer a conclusive answer to this...
Do not publish core functions directly via the API; instead, create all API functions as proxy functions, so that changes in the core functions are handled inside the proxy functions.
Change the public API version only if there is a change in the API's input/output.
This way you achieve code reusability without having to increment the public API version frequently.
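A rough sketch of that proxy idea, with hypothetical names: the core function is free to change, while the published proxy keeps its signature and absorbs the difference.

```java
// Hypothetical sketch: the public API exposes a proxy, not the core function itself.
record Project(String name, String owner) {}

class CoreProjects {
    // internal core function; its signature may change between releases
    Project createProject(String name, String owner, boolean sendWelcomeMail) {
        return new Project(name, owner);
    }
}

class PublicProjectApiV1 {
    private final CoreProjects core;
    PublicProjectApiV1(CoreProjects core) { this.core = core; }

    // published contract stays fixed; the proxy supplies defaults for newer core parameters
    Project createProject(String name, String owner) {
        return core.createProject(name, owner, /* sendWelcomeMail */ true);
    }
}
```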
Edit:
If you are talking about the software version number, my answer is yes.
I am currently developing a very simple web service and thought I could write an API for it, so that when I decide to expand it to new platforms I would only have to code the parser application. That said, the API isn't meant for other developers, only for me, but I won't restrict access to it, so anyone can build on it.
Then I thought I could even run the website itself through this API, for various reasons such as lower bandwidth consumption (HTML generated in the browser) and client-side caching. Being AJAX-heavy seemed like an even bigger reason to.
The layout looks like this:
Server (database, programming logic)
|
API (handles user reads/writes)
|
Client application (the website, browser extensions, desktop app, mobile apps)
|
Client cache (further reduces server reads)
After the introduction here are my questions:
Is this a good use of an API?
Is it a good idea to run the whole website through the API?
What choices do I have for safe authentication using the API (and for certain reasons I prefer not to use HTTPS)?
EDIT
Additional questions:
Are there any alternative approaches I haven't considered?
What are some potential issues I haven't accounted for that may arise using this approach?
First things first.
Asking if a design (or in fact anything) is "good" depends on how you define "goodness". Typical criteria are performance, maintainability, scalability, testability, reusability etc. It would help if you could add some of that context.
Having said that...
Is this a good use of an API?
It's usually a good idea to separate out your business logic from your presentation logic and your data persistence logic. Your design does that, and therefore I'd be happy to call it "good". You might look at a formal design pattern to do this - Model View Controller is probably the current default, esp. for web applications.
Is it a good idea to run the whole website through the API?
Well, that depends on the application. It's totally possible to write an application entirely in Javascript/Ajax, but there are browser compatibility issues (esp. for older browsers), and you have to build support for things users commonly expect from web applications, like deep links and search engine friendliness. If you have a well-factored API, you can do some of the page generation on the server, if that makes it easier.
What choices do I have for safe authentication using the API (and for certain reasons I prefer not to use HTTPS)?
Tricky one - with this kind of app, you have to distinguish between authenticating the user, and authenticating the application. For the former, OpenID or OAuth are probably the dominant solutions; for the latter, have a look at how Google requires you to sign up to use their Maps API.
In most web applications, HTTPS is not used for authentication (proving the current user is who they say they are), but for encryption. The two are related, but by no means equivalent...
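To make the application-authentication side concrete, here is a minimal, hypothetical sketch of checking a per-application key on each request; in practice you would lean on an established scheme (OAuth, signed requests) rather than rolling your own.

```java
// Hypothetical sketch: each registered client application is issued a key,
// and every API request must carry it (e.g. in an X-Api-Key header).
import java.util.Set;

class ApiKeyChecker {
    private final Set<String> issuedKeys;

    ApiKeyChecker(Set<String> issuedKeys) {
        this.issuedKeys = issuedKeys;
    }

    /** Returns true if the presented key belongs to a registered client application. */
    boolean isAuthorizedApplication(String presentedKey) {
        return presentedKey != null && issuedKeys.contains(presentedKey);
    }
}
```

Note that without TLS such a key travels in the clear, which is part of why the encryption point above still matters even though it isn't authentication.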
Are there any alternative approaches I haven't considered?
Maybe this fits more under question 5 - but in my experience, API design is a rather esoteric skill - it's hard for an API designer to be able to predict exactly what the client of the API is going to need. I would seriously consider writing the application without an API for your first client platform, and factor out the API later - that way, you build only what you need in the first release.
What are some potential issues I haven't accounted for that may arise using this approach?
Versioning is a big deal with APIs - once you've created an interface, you can almost never change it, especially with multiple clients that you don't control. I'd build versioning in as a first class concept - with RESTful APIs, you can do this as part of the URL.
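As a sketch of building the version into the URL (using the JDK's built-in HTTP server purely for illustration; the paths and payloads are hypothetical):

```java
// Illustrative sketch: /api/v1/... stays frozen once clients depend on it;
// breaking changes are published under /api/v2/... instead of mutating v1.
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class VersionedApi {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/api/v1/projects", exchange -> {
            byte[] body = "[{\"name\":\"demo\"}]".getBytes();            // v1 shape: bare array
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });

        server.createContext("/api/v2/projects", exchange -> {
            byte[] body = "{\"items\":[{\"name\":\"demo\"}]}".getBytes(); // v2 shape: wrapped object
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });

        server.start();
    }
}
```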
Is this a good use of an API?
Depends on what you will do with that application.
Is it a good idea to run the whole website through the API?
No. That way your site will be accessible only through your application, and this implementation prevents compatibility with other browsers.
What choices do I have for safe authentication using the API (and for certain reasons I prefer not to use HTTPS)?
You can use OmniAuth.
Are there any alternative approaches I haven't considered?
Create both frontends: one in your application and another for common browsers.
What are some potential issues I haven't accounted for that may arise using this approach?
I don't know your idea in detail, but I can't see any major danger.
We are developing a middleware SDK, both in C++ and Java to be used as a library/DLL by, for example, game developers, animation software developers, Avatar developers to enhance their products.
Having created a typical API using specific calls for specific functions, I am considering simplifying the API by using a REST-type (GET, PUT, POST, DELETE) or CRUD-type (CREATE, READ, UPDATE, DELETE) interface.
This would work in a similar way to a client-server type REST API where there are only 4 possible API calls but these can take flexible parameters.
This seems to have the benefit of making the API stable in that new calls are not being added and old calls are not being removed. So a consumer of this API need not worry about having to recompile and change their code to suit any updates to our middleware.
The overhead is that there is an extra layer of indirection in the middleware controller to route API calls, and the developer needs to know what parameters are available for each REST call (documentation would of course be supplied). A sketch of what such a uniform interface might look like follows below.
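Something like the following captures the uniform interface being described; the names and the generic parameter map are hypothetical, not a proposal for the actual SDK.

```java
// Hypothetical sketch of a four-call, CRUD-style SDK surface with flexible parameters.
import java.util.Map;

/** Minimal result wrapper so the call signatures never change when payloads do. */
record Result(boolean ok, Map<String, Object> data) {}

interface MiddlewareApi {
    /** CREATE, e.g. create("avatar", Map.of("name", "Bob", "height", 1.8)) */
    Result create(String resourceType, Map<String, Object> params);

    /** READ a single resource by type and id */
    Result read(String resourceType, String id);

    /** UPDATE only the named fields */
    Result update(String resourceType, String id, Map<String, Object> changes);

    /** DELETE the resource */
    Result delete(String resourceType, String id);
}
```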
I have not so far seen this system used outside of web type client server applications so my question is this: Is this a feasible idea?
I am thinking in terms of its efficiency, as well as whether, for example, a game developer would find it easy to use.
Yes, this is a feasible idea. But I'm not sure the benefits would justify the costs. REST is best applied to a networked application scenario, oriented around requests and responses. While there are definite learning curve advantages to a uniform interface, those advantages can be present in almost any well-designed API which provides reasonably abstract procedures.
You also expressed concern for whether a game developer would find a RESTful API easy to use. I'd be dubious. I've implemented many RESTful web services, and helped many developers get up to speed both building them and using them, and the conceptual leap required to grasp REST can be substantial for someone who has been steeped in procedural APIs for years. I'd think that game developers in particular would be very strongly connected to procedural APIs, to the point that attempting to adopt a different paradigm, whatever its benefits, might prove extremely difficult.
Remember that REST is not specific to HTTP, and does not rely on just the 4 HTTP verbs. The verbs you have and can use depend on what protocol you're using.