Oak APIs vs JCR APIs

I am developing a web application based on Apache Jackrabbit Oak (1.3.2), which exposes two sets of APIs:
JCR API
Oak API
My question is: which API should I use for the application? I am able to implement the application's operations using either API. What would be the trade-offs in terms of features/functionality, if any?
Thanks

I would use the standard JCR APIs if possible, and only use the Oak-specific APIs if they bring concrete benefits. Using JCR means you can potentially replace the content repository with a different one, whereas the Oak APIs tie you to that product.
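To make that concrete, here is a minimal sketch (assuming the oak-jcr module on the classpath, Oak's default in-memory store, and the default admin credentials) in which only the two bootstrap lines touch Oak classes; everything after them depends solely on javax.jcr and would work against any JCR repository:

```java
import javax.jcr.Node;
import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;

import org.apache.jackrabbit.oak.Oak;
import org.apache.jackrabbit.oak.jcr.Jcr;

public class JcrVsOakExample {
    public static void main(String[] args) throws Exception {
        // Oak-specific bootstrap: wraps an Oak repository in a JCR facade.
        Repository repository = new Jcr(new Oak()).createRepository();

        // From here on, only the standard javax.jcr API is used.
        Session session = repository.login(
                new SimpleCredentials("admin", "admin".toCharArray()));
        try {
            Node hello = session.getRootNode().addNode("hello");
            hello.setProperty("message", "Hello from JCR");
            session.save();
            System.out.println(hello.getProperty("message").getString());
        } finally {
            session.logout();
        }
    }
}
```

The same operations written against the Oak API (ContentRepository, ContentSession, Root, Tree) would tie the whole block to Oak-specific types.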

Related

How to identify the API version of an application or service

I'm new to API consumption and wasted hours trying to consume a REST 2.0 API resource, only to find out that we are using the 1.0 API. I want to avoid this mistake in the future and am looking for ways to identify which API version is consumable.
In this case it's an on-prem Bitbucket Server. I have been searching for a default way to identify an application's API version, without success.
Is there a good, easy way to identify which API version an application exposes?
No, I don't believe there is a general mechanism that APIs commonly use to communicate their versions.
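That said, some products expose their own discovery endpoint. Bitbucket Server, for example, publishes its version under /rest/api/1.0/application-properties, so a quick probe like the following sketch (Java 11+; the host name is a placeholder) can confirm what the server is running:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BitbucketVersionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host; substitute your on-prem Bitbucket Server URL.
        URI uri = URI.create(
                "https://bitbucket.example.com/rest/api/1.0/application-properties");

        HttpRequest request = HttpRequest.newBuilder(uri)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON body includes a "version" field, among other properties.
        System.out.println(response.body());
    }
}
```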

Java API for Lucidworks fusion

I could not find a Java API for Lucidworks Fusion when searching the internet. Is there one, or are only REST APIs supported in Lucidworks Fusion?
There are Java APIs available for writing plugins, but the REST APIs from Lucidworks Fusion can be leveraged not only from Java but from any language. That's the benefit of having REST APIs, right? Please go through the Fusion 2.1 documentation for all the REST APIs.

Is there any solution for generating RESTful API code for both client and server

The functions for operating a RESTful API are much the same everywhere. Is there any project that can generate the source code for different platforms, such as Android, iOS, and the backend?
I suggest you use an API description language such as Swagger or RAML.
After describing your RESTful application with such a language, you will be able to generate things like server skeletons and client SDKs for different technologies and languages. You can even generate documentation.
With Swagger, swagger-codegen will do that. swagger-ui may also interest you for the documentation part.
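For example, the swagger-codegen CLI can turn a single API description into both sides of the stack. A sketch (petstore.yaml stands in for your own description, the jar file name depends on the version you download, and available generator names such as spring vary by codegen version):

```
# Generate a Java client SDK from the API description
java -jar swagger-codegen-cli.jar generate -i petstore.yaml -l java -o ./client

# Generate a Spring server skeleton from the same description
java -jar swagger-codegen-cli.jar generate -i petstore.yaml -l spring -o ./server
```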
Finally, I would like to mention Restlet Studio, which lets you define the structure of RESTful applications graphically and quickly, and then generate the corresponding Swagger and RAML contents. The APISpark platform provides a mechanism to introspect Restlet applications and generate the corresponding contents in these languages. It also lets you generate a set of server skeletons and client SDKs.
Hope it helps you.
I suggest you use the Spring RESTful web services starter kit, which will manage your back end with a centralized database. Spring also has its own Android libraries for communicating with REST APIs.
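For a taste of that approach, here is a minimal sketch of a Spring Boot REST endpoint (assuming the spring-boot-starter-web dependency; the /items route and its payload are purely illustrative):

```java
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class ItemApi {

    // Illustrative endpoint; a real backend would fetch this from a database.
    @GetMapping("/items/{id}")
    public Map<String, Object> getItem(@PathVariable long id) {
        return Map.of("id", id, "name", "example");
    }

    public static void main(String[] args) {
        SpringApplication.run(ItemApi.class, args);
    }
}
```

The same controller is then consumable from Android (or any other client) as plain JSON over HTTP.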

How to serve a mobile app

What would be a good way to provide a non-trivial backend for mobile apps, regarding both the protocol used for communication and the actual hosting?
Most backend platforms (such as parse.com) provide some basic API for performing trivial CRUD data operations, but if the server logic needs to be more complex than that, what would be a good strategy (preferably .NET/C#, secondarily Java, but not JavaScript or any custom scripting approaches)? SOAP web services (for example, WCF)?
Regarding hosting, I have looked at Azure and AppHarbor, but can't decide between the two. AppHarbor seems like the only place to co-locate the web server and a MongoDB instance in Northern Europe, as Azure (apparently) only provides MongoDB in a US region. Any suggestions?

Fine-grained authorization for web applications

I have a C# .NET application which serves both the company's internal users and external customers. I need to do fine-grained authorization, i.e., control who accesses what resource. So I need something resource-based or attribute-based rather than role-based authorization.
What comes to my mind is to either:
Implement my own authorization mechanism and SQL tables for my .NET application
Use/implement a standard mechanism, such as software that implements XACML (for instance, Axiomatics)
The problem with the first method is that it is neither centralized nor standard, so other systems cannot use it for authorization.
The problem with the second approach is that it is potentially slower (due to the extra calls needed for each resource). I am also not sure how widely a standard such as XACML is supported by applications in the market, which would make future integrations easier.
So, in general what are the good practices for fine-grained authorization for web applications that are supposed to serve both internal users and external customers?
I would definitely go for externalized authorization. It doesn't mean it will be slower. It means you have cleanly separated access control from the business logic.
Overview
XACML is a good way to go. The technical committee (TC) is very active, and companies such as Boeing, EMC, the Veterans Administration, Oracle, and Axiomatics are all active members.
The XACML architecture guarantees you can get the performance you want. Since the enforcement point (PEP) and the decision engine (PDP) are loosely coupled, you can choose how they communicate, what protocol they use, whether to batch multiple decisions, and so on. This means you have the choice to go for the integration which fits your performance needs.
There is also a standard PDP interface defined in the SAML profile for XACML. That guarantees you 'future-proof' access control where you are not locked into any particular vendor solution.
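To illustrate the PEP/PDP split, here is a sketch of a PEP-side authorization call using the JSON Profile of XACML over HTTP (Java 11+, with a text block requiring Java 15+; the PDP endpoint URL is hypothetical, and the attribute IDs are shortened for readability where a real deployment would use the full standard XACML identifiers):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PepExample {
    public static void main(String[] args) throws Exception {
        // Request per the JSON Profile of XACML: who (subject) wants to
        // perform what (action) on which resource.
        String xacmlRequest = """
            {
              "Request": {
                "AccessSubject": {
                  "Attribute": [{ "AttributeId": "subject-id", "Value": "alice" }]
                },
                "Action": {
                  "Attribute": [{ "AttributeId": "action-id", "Value": "read" }]
                },
                "Resource": {
                  "Attribute": [{ "AttributeId": "resource-id", "Value": "/invoices/42" }]
                }
              }
            }
            """;

        // Hypothetical PDP endpoint; real URLs are vendor/deployment specific.
        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("https://pdp.example.com/authorize"))
                .header("Content-Type", "application/xacml+json")
                .POST(HttpRequest.BodyPublishers.ofString(xacmlRequest))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The response carries a Decision: Permit, Deny, NotApplicable,
        // or Indeterminate.
        System.out.println(response.body());
    }
}
```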
Access control for webapps
You can simply drop in a PEP for .NET web apps by using HTTP filters in ISAPI or ASP.NET. Axiomatics has one off-the-shelf for that.
Current implementations
If you check Axiomatics's customers page, you'll see they have PayPal, Bell Helicopter, and more. So XACML is indeed a reality, and it can tackle very large deployments (hundreds of millions of users).
Also, Datev eG, a leading financial services provider, is using Axiomatics's .NET PDP implementation for its services and apps. Since the .NET PDP is embedded in that case, performance is optimal.
Otherwise, you can always choose from off-the-shelf PEPs for .NET that integrate with any PDP - for instance, a SOAP-based XACML authorization service.
High levels of performance with XACML
Last July at the Gartner Catalyst conference, Axiomatics announced the release of their latest product, Axiomatics Reverse Query, which helps you tackle the 'billion record challenge'. It targets access control for data sources as well as rich internet applications. It uses a pure XACML solution, so it remains interoperable with other solutions.
As a matter of fact, Kuppinger Cole will host a webinar on the topic very soon: http://www.kuppingercole.com/events/n10058
Check out the Axiomatics ARQ press release too here: http://www.axiomatics.com/latest-news/216-axiomatics-releases-new-reverse-query-authorization-product-a-breakthrough-innovation-for-authorization-services.html
Definitely look for a drop-in authorization module for your ASP.NET application. I'm not just saying that because I implement drop-in auth systems at BiTKOO, but because I have had to work with home-grown auth implementations in the past. Building your own authorization system for a single application really is not a good use of your time or resources unless you intend to make a career out of implementing security systems.
Externalizing the authorization decision from your app is a good idea from an architectural standpoint. Externalizing the authz decision gives you an enormous amount of flexibility to change your access criteria on the fly without having to shut down your web service or reconfigure the web server itself. Decoupling the web front-end from the authz engine allows you to scale each independently according to the load and traffic patterns of your application, and allows you to share the authz engine across multiple apps.
Yes, adding a network call to your web app will add some overhead to your web response compared to having no authorization at all or using a local database on the web server. That shouldn't be a reason not to consider external authorization. Any serious authorization product you consider will provide some sort of caching capability to minimize the number of network calls required per web request or even per user session across multiple web requests.
In BiTKOO's Keystone system, for example, the user attributes can be cached on the web server per user-session, so there's really only one back-end network request involved on the first page request as part of establishing a user login. Subsequent page requests (within the lifetime of the cached credentials, usually 5 minutes or so) can be handled by the web server without needing to hit the authz service again. This scales well in cloud web farms, and is built on XACML standards.
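A per-session decision cache of that kind is straightforward to sketch. The following is a simplified, hypothetical illustration (not BiTKOO's actual implementation) of caching permit/deny results for five minutes so repeated requests skip the network call:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class DecisionCache {

    /** A cached decision plus the time at which it expires. */
    private static final class Entry {
        final boolean permitted;
        final Instant expiresAt;
        Entry(boolean permitted, Instant expiresAt) {
            this.permitted = permitted;
            this.expiresAt = expiresAt;
        }
    }

    private final Map<String, Entry> cache = new ConcurrentHashMap<>();
    private final Duration ttl = Duration.ofMinutes(5); // matches the ~5 min lifetime above

    /** Returns a cached decision if fresh; otherwise asks the authz service and caches the result. */
    public boolean isPermitted(String sessionId, String resource, String action) {
        String key = sessionId + "|" + resource + "|" + action;
        Entry entry = cache.get(key);
        if (entry != null && Instant.now().isBefore(entry.expiresAt)) {
            return entry.permitted; // served from cache: no network round-trip
        }
        boolean decision = callAuthzService(sessionId, resource, action);
        cache.put(key, new Entry(decision, Instant.now().plus(ttl)));
        return decision;
    }

    /** Placeholder for the real network call to the external authorization service. */
    private boolean callAuthzService(String sessionId, String resource, String action) {
        return false; // deny by default in this sketch
    }
}
```

Note the trade-off this implies: a cached permit can outlive a policy change by up to the TTL, which is why such products typically make the cache lifetime configurable.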
I need to do fine-grained authorization like who accesses what resource. So I need something like resource-based or attribute-based rather than a role-based authorization.
Check out https://zanzibar.academy/. Zanzibar is a system built at Google to solve fine-grained authorization at scale.
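For a flavor of the model: the Zanzibar paper expresses permissions as relation tuples of the form object#relation@user, so fine-grained, per-resource access is stored as data rather than encoded in application logic. A few illustrative tuples (the namespaces and names here are made up):

```
doc:readme#owner@user:alice          // alice owns the readme document
doc:readme#viewer@user:bob           // bob may view it
doc:readme#viewer@group:eng#member   // every member of group:eng may view it
```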
Use/implement a standard mechanism, like a software that has implemented XACML (for instance Axiomatics). The problem with the second approach is that it is potentially slower (due to extra calls needed for each resource).
Auth0 is working on a solution called FGA (https://fga.dev) that will be optimized for low latency. It's built upon the Zanzibar paper.
Disclaimer: I am employed at Auth0.