Creating an API call for a new API in Spartacus - spartacus-storefront

We have exposed some new APIs from SAP Commerce and need to get/post data in some of our components or services.
Are there any pre-defined mechanisms for integrating new APIs, or should we simply use Angular's HttpClient?
We expect there to be a way to get data from the new APIs, similar to the out-of-the-box ones such as "StoreFinderService", which supplies store data, along with several other such services.
Spartacus uses NgRx to handle all APIs, and these services then supply data from the store. Similarly, there should be a way to get data for new APIs created for our business requirements.

Spartacus uses the Angular HttpClient under the hood, and it is the recommended way to interact with HTTP endpoints. However, using HttpClient directly from a component is generally considered bad practice. It is better to separate concerns and delegate backend interaction to a service.
Spartacus offers an extensive architecture for handling backend APIs; you can read more about it at https://sap.github.io/spartacus-docs/connecting-to-other-systems. You could follow this architecture, but for a project it is overcomplicated. It really targets extension points, which makes sense for a product, but most often not for a project.
For projects introducing new features, I would either create and maintain state in a service, or introduce a custom state in NgRx. For the latter, https://github.com/SAP/spartacus-bootcamp/tree/master/src/app/features/state shows some examples of adding custom state to the Spartacus state, or you can read up on NgRx resources.
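If you go the service route, a minimal sketch (Angular/TypeScript) could look like the following. The endpoint path, the WishlistEntry shape and the service name are assumptions for illustration; only Angular's HttpClient and RxJS are used, so adapt them to whatever your SAP Commerce extension actually exposes.

```typescript
// Minimal sketch of a project-level service that owns the state for a custom
// OCC endpoint. The endpoint path and WishlistEntry shape are assumptions.
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';
import { shareReplay } from 'rxjs/operators';

export interface WishlistEntry {
  productCode: string;
  addedAt: string;
}

@Injectable({ providedIn: 'root' })
export class CustomWishlistService {
  // Cache the last response so multiple components share one HTTP call.
  private entries$: Observable<WishlistEntry[]> | undefined;

  constructor(private http: HttpClient) {}

  getEntries(userId: string): Observable<WishlistEntry[]> {
    if (!this.entries$) {
      this.entries$ = this.http
        .get<WishlistEntry[]>(`/occ/v2/mystore/users/${userId}/custom-wishlist`)
        .pipe(shareReplay(1));
    }
    return this.entries$;
  }

  addEntry(userId: string, entry: WishlistEntry): Observable<WishlistEntry> {
    // Invalidate the cache so the next read reflects the new entry.
    this.entries$ = undefined;
    return this.http.post<WishlistEntry>(
      `/occ/v2/mystore/users/${userId}/custom-wishlist`,
      entry
    );
  }
}
```

Components then inject CustomWishlistService instead of touching HttpClient directly, which keeps the backend concerns in one place.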

Related

Is prisma ORM an API?

I have been trying to clarify the definition of an API. From my understanding, an API is any kind of service that acts as a middleman when two applications (such as a web framework and a database) cannot talk to each other directly.
And I think Prisma is one such API. Am I understanding it correctly?
An API is an Application Programming Interface.
Originally, the term typically specified how an application invoked functionality in a programming library. Later, it was generalized to mean other things: for example, specifying how an HTTP-based application must interact with a REST service.
From Wikipedia (https://en.wikipedia.org/wiki/API):
"An application programming interface (API) is a way for two or more computer programs to communicate with each other. It is a type of software interface, offering a service to other pieces of software."
To answer your question:
Prisma is an ORM
Prisma has an API. Prisma client apps use the Prisma API to interact with the underlying database.
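To make that concrete, here is a minimal sketch of the Prisma Client API in use. The `user` model and its `active` field are assumptions: they only exist if your `schema.prisma` defines them.

```typescript
// A minimal sketch of the Prisma Client API. The `user` model and `active`
// field are assumptions based on a hypothetical schema.prisma.
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

async function main() {
  // The application never writes SQL here; it calls the interface Prisma
  // exposes, and Prisma translates the call into a query for the database.
  const users = await prisma.user.findMany({ where: { active: true } });
  console.log(users);
}

main()
  .catch(console.error)
  .finally(() => prisma.$disconnect());
```

The interface your code calls (findMany, $disconnect, and so on) is Prisma's API; the database underneath is the system it mediates access to.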

How do you name API paths of the same microservice but w/ different consumers

Context
Let's imagine a simple microservices architecture (e.g. 2-3 microservices). The microservices are domain-based, an API gateway is in place, and everything is as it should be. At the same time, the microservice APIs are consumed by public mobile applications, an admin UI, and other services for S2S communication, so we have three possible API consumers. Depending on the consumer, the response DTOs differ, but the business process might be the same (e.g. the response of the GET /users endpoint has different DTOs for the consumer application and the admin UI, but technically the data comes from the same DB).
Question
How do you segment APIs in that case? Do you use namespaces like external, internal, etc.?
Also, feel free to share your experience of how you segment APIs.
Thanks in advance!
From my point of view, the APIs should differ depending on the type of consumer that is going to use them.
For example, in your use case, an API intended to provide simple user information shouldn't be the same one used by an administrator. You should define two different APIs in this case, with different paths like internal/users and external/users as you said, and internally these two endpoints can share the same logic.
This separation is good not only for returning different DTOs from each endpoint, but also for defining different security (authentication/authorization) mechanisms for each API, because I suppose those requirements will differ between an admin API and a general user API.
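A rough sketch of that idea in TypeScript (Express is an assumption; your stack may differ). The DTO shapes, the requireAdmin guard and fetchUsers are made up for illustration:

```typescript
// Hedged sketch: two paths, two DTOs, two auth policies, one shared query.
import express from 'express';

const app = express();

interface User {
  id: string;
  email: string;
  lastLoginIp: string;
}

// Shared logic: one place that reads users from the database (stubbed here).
async function fetchUsers(): Promise<User[]> {
  return [{ id: 'u-1', email: 'a@example.com', lastLoginIp: '10.0.0.1' }];
}

// Hypothetical admin-only guard; real token checking would go here.
function requireAdmin(
  _req: express.Request,
  _res: express.Response,
  next: express.NextFunction
) {
  next();
}

// external/users: public consumers get a reduced DTO.
app.get('/external/users', async (_req, res) => {
  const users = await fetchUsers();
  res.json(users.map(({ id, email }) => ({ id, email })));
});

// internal/users: the admin UI gets the full record, behind stricter auth.
app.get('/internal/users', requireAdmin, async (_req, res) => {
  res.json(await fetchUsers());
});

app.listen(3000);
```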
It depends a bit on the philosophy you want to adopt.
The one suggested by #JArgente is good, in that you'd get good separation, and the role of each is (or at least should be) very clear.
The other approach is layering, which (for the OO programmers out there) is a bit like developing overloads for a method. It assumes that the data required by the derived APIs is provided by the base API. So:
Develop a base API that provides all the data this API family needs to provide. This API might be the one that internal users use (e.g. Admin User), and it could require authentication.
Develop a public-facing API that consumes the base API.
Each API has a separate API spec; depending on how you do this, you can leverage inheritance at the spec level.
Each API also has an actual endpoint which triggers some sort of processing - e.g. logic within the API Gateway itself, or logic handled within a downstream component like a microservice.
The public-facing one can be anonymous, as long as something (e.g. the API Gateway) can make an authenticated call against the base API, using some kind of 'service account'.
The advantage here is that you still get good separation between the different APIs and their consumers, but you also get the advantages of inheritance, so that code duplication is reduced (testing effort isn't so diffuse, etc.).
This approach also allows you to run the endpoints on the same API Gateway, or deployed on separate ones (internal vs external).
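For comparison, a small sketch of the layering idea. The base-API URL, the service-account token handling and the User shape are made up, and the global fetch assumes Node 18+ or a browser-like runtime:

```typescript
// Hedged sketch: the public layer is anonymous toward its caller, but
// authenticates itself against the base API with a service-account token.
interface User {
  id: string;
  email: string;
  internalNotes?: string;
}

const BASE_API = 'https://internal.example.com/base/users';
const SERVICE_TOKEN = process.env.SERVICE_TOKEN ?? '';

// Base API call: requires authentication, returns the full record.
async function fetchUsersFromBaseApi(): Promise<User[]> {
  const response = await fetch(BASE_API, {
    headers: { Authorization: `Bearer ${SERVICE_TOKEN}` },
  });
  return response.json() as Promise<User[]>;
}

// Public-facing layer: reuses the base API, but strips internal fields
// before handing the data to anonymous callers.
export async function publicUsersEndpoint(): Promise<Pick<User, 'id' | 'email'>[]> {
  const users = await fetchUsersFromBaseApi();
  return users.map(({ id, email }) => ({ id, email }));
}
```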

How to ensure a RESTful API is backwards compatible?

In my company, we develop an API which is consumed by external customers. We want our API to evolve in a backwards-compatible way without using versions.
We would like to have tests on our side that validate that, whenever our API evolves (e.g. new attributes are added to requests or responses), backwards compatibility is guaranteed. What is the best way of validating that?
I've had a look at contract testing (Pact or Spring Cloud Contract), but these tools seem geared primarily toward consumer-driven testing, which is not what we want to do.

What is the difference between System API and Process Api

Kindly, can anyone differentiate between a System API and a Process API?
Please provide an answer in generic terms, as I am unable to find one on the internet.
A system API abstracts an existing system. It talks to the system in the language of that system (e.g. SOAP, direct Java calls, SAP calls, etc.). To the outside world it offers a clean API (usually REST with HTTP and JSON). When you do a good job implementing your system API, you can exchange your existing system for a different/new one without changing the API your system API offers to the outside world: just implement a new system API with different adapter logic.
A process API should talk REST on "both ends". It calls one or several system APIs to do its job. The process API orchestrates the different jobs.
If you need more information, search for "API-led connectivity".
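A tiny sketch of that "swap the system without changing the system API" point, with every name invented for illustration:

```typescript
// The outward contract of the system API stays fixed.
interface CustomerSystemApi {
  getCustomer(id: string): Promise<{ id: string; name: string }>;
}

// Adapter for the legacy backend (imagine SOAP calls hidden inside).
class LegacyCrmAdapter implements CustomerSystemApi {
  async getCustomer(id: string) {
    return { id, name: 'Customer from the legacy CRM' }; // stubbed lookup
  }
}

// Adapter for a replacement backend (imagine a new REST SDK inside).
class NewCrmAdapter implements CustomerSystemApi {
  async getCustomer(id: string) {
    return { id, name: 'Customer from the new CRM' }; // stubbed lookup
  }
}

// Consumers depend only on CustomerSystemApi, so swapping the adapter
// does not change the API they see.
const systemApi: CustomerSystemApi = new LegacyCrmAdapter();
systemApi.getCustomer('c-1').then(console.log);
```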
A System API is a layer you build on top of a system, which handles all system-specific connection quirks and settings. It then exposes these resources and its logic in a standard format (usually REST, but you're free to choose something else like SOAP) to the rest of your APIs. As Roger Butenuth states:
"When you do a good job implementing your system api, you can exchange
your existing system with a different/new one without changing the api
of your system api to the outside world: Just implement a new system
api with different adapter logic."
A process API is where you keep your logic and orchestration; it does not "talk" to end systems directly but instead connects to system APIs to get its data.
A process API should ideally only talk REST on both sides, and can aggregate data from multiple systems.
An example of a complex process API would be an "items you've ordered" API which takes a user id as its input, then talks to the system API of a CRM system to get the id used by the "order history system API".
However, that API might only return a list of orders without any article information besides an article id. So our process API then enriches this list with article information fetched from the "article information system API", using the ids from the list.
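A hedged sketch of that orchestration, with each function standing in for a call to one of the system APIs (all names and stub data are made up):

```typescript
// Stand-in for the CRM system API: maps the app user id to the id used by
// the order history system.
async function crmGetOrderSystemUserId(appUserId: string): Promise<string> {
  return `oh-${appUserId}`;
}

// Stand-in for the order history system API: returns bare order lines.
async function orderHistoryGetOrders(
  orderSystemUserId: string
): Promise<{ articleId: string }[]> {
  return [{ articleId: 'A-1' }, { articleId: 'A-2' }];
}

// Stand-in for the article information system API.
async function articleInfoGetArticle(
  articleId: string
): Promise<{ id: string; title: string }> {
  return { id: articleId, title: `Title of ${articleId}` };
}

// The process API glues the three system APIs together and enriches the list.
async function itemsYouHaveOrdered(appUserId: string) {
  const orderSystemUserId = await crmGetOrderSystemUserId(appUserId);
  const orders = await orderHistoryGetOrders(orderSystemUserId);
  return Promise.all(
    orders.map(async (o) => ({
      ...o,
      article: await articleInfoGetArticle(o.articleId),
    }))
  );
}

itemsYouHaveOrdered('user-7').then(console.log);
```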
I know it's out of the scope of the question, but for the sake of completeness I'll briefly explain the third variant as well:
An Experience API can be seen as a doorway into your API network; every (type of) client has different information needs and may communicate over different protocols.
It is the Experience API's responsibility to provide ALL the information required by a client in a format they support.
This takes the responsibility away from a client to know where the information needs to be fetched from.
(Customer info from CRM, Order info from proprietary sys one, Article info from article DB)
A bonus of this design concept is that if, for example, the mobile app your company is making gets some new functionality which requires extra data, you can update the "mobile app experience API" while leaving your "super-expensive IBM experience API" unchanged. This cuts down on development costs, since you don't need to implement changes in your other API consumers, which would be the case if you had only one API.
I think the main difference is where you implement business processes and rules/logic.
System APIs, within the scope of your design, are atomic APIs which are used to construct higher-level APIs (experience APIs). The Process API is the orchestration layer, where you can use MuleSoft flows to implement business processes or logic.
System APIs do the heavy lifting of CRUD operations.
Process APIs focus on business logic.
System APIs sit underneath it all: they expose core systems of record that are often not readily accessible because of their complexity and connectivity concerns. These APIs hide that complexity from the consumer while exposing the data, and they insulate downstream consumers from interface changes or rationalization of those systems.
Process APIs encapsulate the underlying business processes that interact with source and target systems or channels through a set of system APIs. For instance, in a purchase order process there is some logic that is common across products, geographies and retail channels, which can and should be distilled into a single service that can then be called.
You can get some more clarity from this article: https://dzone.com/articles/api-the-backbone-of-the-software-industry-know-how
System APIs and Process APIs are both part of API-led connectivity.
A System API is like a wrapper service over a main database or SaaS platform.
A Process API involves application logic, for example to validate search or query parameters.

Can I make an API from a backend that usually uses a RequestFactory servlet?

I am new to web dev but I have managed to build my site using GWT and GAE. I use RequestFactory for client-server communications.
Now, someone wants to make mobile applications that use my backend.
I have found that RequestFactory works very well with Android, but I am somewhat afraid it will not work with other, non-Google front ends (iOS, for instance).
So my question is, can I make an API based on my RequestFactory backend (servlet) that can be used by any client? Any initial pointers as to how to implement it would be appreciated.
I guess technically it would be possible. However, if you want to create an API anyone can use, you probably want an API where you specify both how to communicate with it and the content sent/received by it. With RequestFactory, both the how and the what are shielded by RequestFactory. So if someone wants to communicate with your API and can't use the RequestFactory code in their project, the how and the what of RequestFactory must be reverse engineered, and could change at any time because they are not guaranteed. Not the most elegant way forward.
A better approach is to define an open API where you specify the how and the what: for example, an API based on REST (the how), communicating JSON data in a specified content format (the what). An example of such an API is the Twitter API.
For your own project you could also build on your own API, for example by using RestyGWT. Then you don't have to maintain code for both the RequestFactory interface and the REST interface. For other platforms there are probably several libraries available that make developing against a REST interface easy.
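To illustrate why a documented REST/JSON API is easy for arbitrary clients, here is a hedged sketch of a consumer in TypeScript; the URL and the Task shape are invented, and the global fetch assumes a browser or Node 18+:

```typescript
// Any platform with an HTTP library can consume a specified REST/JSON API;
// the endpoint and payload shape below are purely illustrative.
interface Task {
  id: number;
  title: string;
  done: boolean;
}

async function listTasks(): Promise<Task[]> {
  const response = await fetch('https://your-gae-app.appspot.com/api/tasks');
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return response.json() as Promise<Task[]>;
}

listTasks().then((tasks) => console.log(tasks.length, 'tasks'));
```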