How can I know whether an API service (e.g. the Garmin Health API) is down when calling it from an application? I want to implement a way in my system so that when an API service is down for maintenance or any other reason, the calling application can take some measures, like showing a related message to the users.
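One common approach, sketched below in C# (the health-check URL and helper names are made up for illustration), is to wrap the call in error handling and treat timeouts, connection failures, and 5xx responses such as 503 Service Unavailable as "service down", then show a related message to the user:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class ApiAvailability
{
    private static readonly HttpClient Client = new HttpClient
    {
        Timeout = TimeSpan.FromSeconds(5) // fail fast instead of hanging the caller
    };

    // Returns true if the service answered with a usable response,
    // false if it looks down (timeout, network error, 5xx / maintenance).
    public static async Task<bool> IsServiceUpAsync(string healthUrl)
    {
        try
        {
            var response = await Client.GetAsync(healthUrl);

            // 503 is what many APIs return during maintenance windows.
            if (response.StatusCode == HttpStatusCode.ServiceUnavailable ||
                (int)response.StatusCode >= 500)
            {
                return false;
            }

            return true;
        }
        catch (TaskCanceledException) // request timed out
        {
            return false;
        }
        catch (HttpRequestException)  // DNS failure, connection refused, etc.
        {
            return false;
        }
    }
}

// Usage (hypothetical URL): if the check fails, show a "temporarily unavailable" message.
// bool up = await ApiAvailability.IsServiceUpAsync("https://apis.example.com/garmin/health");
// if (!up) ShowServiceDownMessage();
```

Many providers also publish a status page or a dedicated health/ping endpoint; if the API you call has one, polling that (or checking it when a real call fails) is usually more reliable than inferring health from a single failed request.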
I have two web apps. One acts as a host (let's label it Host); all the Web APIs reside there. The other web app calls those Web APIs (let's label it Client).
What I'm trying to accomplish is this:
The Client calls a Web API in the Host using jQuery AJAX, and the Host processes it. After successful processing, I want to be able to send some message to the Host's client-side so I can update some UI.
That's the part I am unsure about: notifying the Host's client-side so I can make some UI changes when the caller is in another app. I can't think of a way to pass a message so I can raise a popup, change some text, etc.
To achieve the requirement, you can integrate SignalR into the apps.
Clients connect to a hub server and can be added to different groups, which provides a way to broadcast messages to specified subsets of connected clients.
For more information about ASP.NET Core SignalR, you can check the following docs:
https://learn.microsoft.com/en-us/aspnet/core/signalr/introduction?view=aspnetcore-3.1
https://learn.microsoft.com/en-us/aspnet/core/signalr/groups?view=aspnetcore-3.1#groups-in-signalr
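A minimal sketch of the hub-and-groups approach in ASP.NET Core (the hub name, the "HostUi" group, and the NotifyHostUi client method are assumptions for illustration, not existing code):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

// Hub that the Host's browser pages connect to; they join a "HostUi" group
// so they can be targeted separately from any other connections.
public class NotificationHub : Hub
{
    public Task JoinHostUi()
        => Groups.AddToGroupAsync(Context.ConnectionId, "HostUi");
}

// Web API on the Host that the Client app calls via jQuery AJAX.
[ApiController]
[Route("api/[controller]")]
public class OrdersController : ControllerBase
{
    private readonly IHubContext<NotificationHub> _hub;

    public OrdersController(IHubContext<NotificationHub> hub) => _hub = hub;

    [HttpPost]
    public async Task<IActionResult> Create()
    {
        // ... process the Client's request here ...

        // After successful processing, notify the Host's client-side
        // so it can raise a popup, change some text, etc.
        await _hub.Clients.Group("HostUi")
                  .SendAsync("NotifyHostUi", "A client request was processed.");

        return Ok();
    }
}
```

The Host's pages would call JoinHostUi after connecting and register a handler for NotifyHostUi to update the UI.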
I've been doing a ton of research on microservices, but I cannot find a single piece of code written for an API gateway. I understand that between the clients and the services you would have the API gateway, which allows a client to make one request over the Internet to the gateway, and the gateway can then make many requests internally to services which build up a response. Now, from an article on NGINX:
The API Gateway is responsible for request routing, composition, and protocol translation.
Use case
Suppose we support two clients: an Android app and an Angular app (browser). And let's make a tangible user story: the client is an online shopping store.
The shopping store would then have different services broken out onto servers, and each service could be built on a different platform/language with a different database. They are completely self-contained so that they can scale in the cloud really quickly without having to scale the entire application. If there is some intense algorithm that needs to run for payment, then the payment service can quickly spin up a few more servers to balance the load and decrease user wait time.
That service could be written in Java and expose an HTTP/REST API to be consumed. But what if it's written in, say, C++/Go/Node (the language doesn't really matter) and, instead of exposing its API via HTTP, it uses a different protocol? What would that mean for the API gateway, and how would it handle the response?
The client makes a request to the home page, where we have three things loaded:
shopping cart
list view of shoppings items
current specials
The client would make only one request to the API gateway, say to apigateway/apiv1/home, and the gateway would then make three requests to the services:
serviceShopping/apiv1/shoppingList
serviceCart/apiv1/cart
serviceSpecial/apiv1/specials
At this point the three services could each be written in a different language and use a different protocol. How would those three services be requested, and for the response back to the client (a single response), how would the results be combined? A JSON object with a specific schema? This is where I get confused...
Sorry for the long post; it's a simple question, I think, but I needed to set up something I can conceptualize with and explain.
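For reference, a rough sketch of what the composition step could look like on the gateway (C#, ASP.NET Core style; the backend hostnames mirror the hypothetical service URLs above). The gateway fans the three calls out in parallel and stitches the results into one JSON document with an agreed schema; if a backend spoke something other than HTTP, the gateway would call it through that protocol's client library instead, but the aggregation step would look the same:

```csharp
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("apiv1")]
public class HomeGatewayController : ControllerBase
{
    private readonly HttpClient _http;

    public HomeGatewayController(HttpClient http) => _http = http;

    // The single request the client makes: GET apigateway/apiv1/home
    [HttpGet("home")]
    public async Task<IActionResult> Home()
    {
        // Fan out to the three backend services in parallel.
        var shoppingTask = _http.GetStringAsync("http://serviceShopping/apiv1/shoppingList");
        var cartTask     = _http.GetStringAsync("http://serviceCart/apiv1/cart");
        var specialsTask = _http.GetStringAsync("http://serviceSpecial/apiv1/specials");

        await Task.WhenAll(shoppingTask, cartTask, specialsTask);

        // Compose one response with an agreed schema: each backend's
        // payload becomes a named section of the document.
        var composed = new
        {
            shoppingList = JsonDocument.Parse(shoppingTask.Result).RootElement,
            cart         = JsonDocument.Parse(cartTask.Result).RootElement,
            specials     = JsonDocument.Parse(specialsTask.Result).RootElement
        };

        return Ok(composed); // serialized back to the client as a single JSON object
    }
}
```

This is essentially the "composition" responsibility from the NGINX quote; request routing and protocol translation would sit alongside it in the same gateway.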
I currently have a WCF service that the website uses to display data. I also use the same service to run a computation that takes about 8-9 hours, and it saves the computed data into the database as it goes (the process/server stays alive). The WCF service is only used by the website.
Should I switch to ASP.NET Web API? Can Web API run a computation for 8-9 hours and stay alive the way the WCF service does? Basically, I call one web address to initiate the computation and use a different web address to get the data for the clients.
Should I even consider Web API?
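To illustrate the shape being described, one endpoint starts the computation on a background thread and returns immediately, while another endpoint serves whatever has been saved so far. A rough sketch (the routes and the placeholder computation/load methods are assumptions, shown with ASP.NET Core controllers):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/computation")]
public class ComputationController : ControllerBase
{
    private static Task _runningJob; // single long-running job, for illustration only

    // POST api/computation/start -> kicks off the 8-9 hour computation
    [HttpPost("start")]
    public IActionResult Start()
    {
        if (_runningJob != null && !_runningJob.IsCompleted)
            return Conflict("Computation is already running.");

        // Fire-and-forget: the job keeps writing results to the database
        // while this request returns immediately.
        _runningJob = Task.Run(() => RunComputationAndSaveToDatabase());

        return Accepted();
    }

    // GET api/computation/results -> clients read whatever has been saved so far
    [HttpGet("results")]
    public IActionResult Results()
    {
        return Ok(LoadResultsFromDatabase());
    }

    private static void RunComputationAndSaveToDatabase() { /* placeholder */ }
    private static object LoadResultsFromDatabase() => new { };
}
```

One caveat that applies to either framework: if the service is hosted in IIS, the worker process can be recycled mid-job, so an 8-9 hour in-process computation is safer in a self-hosted process or Windows service.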
I have read Why not publish NServiceBus messages from a web application and another similar question about this but I am not clear if this applies to service layer as well. For example, if the service layer is composed of web services or REST services built using WCF or Web API or any other way, should those services publish events or send commands? If those services are hosted in load balanced web servers, the problems outlined in the articles apply to this layer as well. How would the recommendation change or not change?
If I go by the definition of Event vs. Command, the messages I am talking about are events, e.g. "a user was created", and so an event should be published. As a matter of fact, the service that created the user doesn't even know what happens next: maybe another application is supposed to create a customized portal for it, and yet another application is supposed to send a welcome kit to the user. This would be an event and not a command. I guess I am hung up on the definition of a web application versus an application service, when the application service itself is composed of one or many web applications.
The definition of a web application:
A web application is an application that is accessed by users over a network such as the Internet or an intranet.
However, to me, the users can be computers and thus web services are web applications and that is the reason for this question.
EDIT:
Let's consider a concrete example. An ASP.NET website (MVC or Web Forms, it doesn't matter) displays a form to the operator, gets a post with the data for user creation (Name, UserName, Password), and invokes a WCF service to create the user. Between the website and the WCF service we can put a service bus and send a command to create the user (request/response) so that we get all the benefits described in the first article. The WCF service is the actual business processing layer, i.e. it would create the user. That is where I have the question. After the user is created, the service should announce that a user has been created so other systems can react to it and do whatever they are supposed to do. So it fits perfectly the pattern of publishing a message. However, the WCF service is itself a web application and thus has most of the traits of web applications, hence the confusion.
As mentioned in the answer to the SO question you linked to, publishing an event has more to do with where the actual processing takes place. Just as a side note: it is not a matter of Send instead of Publish, since that would imply the two are interchangeable, whereas they have rather different intentions. When you want to publish, you want to publish.
The same question should arise if you find yourself publishing from your web-exposed integration layer: should you be performing the business processing in that code, or rather sending it off to another endpoint for processing? Typically you should just send it off to another endpoint. You may even consider how you would perform the relevant action should anyone else wish to invoke it. For instance, if you are publishing a UserCreatedEvent message, it implies that you created a user. How would a user be created? Would I be forced to use the WCF / Web-Api layer, or can I send a CreateUserCommand message on the bus that is processed by some application endpoint? If it is the former, then you may need to rethink your design. If it is the latter, you should be sending the command from your WCF / Web-Api anyway, and the processing endpoint will perform the Publish bit :)
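In code, the latter shape is roughly the following (an NServiceBus-style sketch; CreateUserCommand, UserCreatedEvent, and the persistence call are placeholders): the WCF / Web-Api layer only sends the command, and the endpoint that actually creates the user is the one that publishes.

```csharp
using System.Threading.Tasks;
using NServiceBus;

// Handled on the application endpoint, not in the WCF / Web-Api layer.
public class CreateUserHandler : IHandleMessages<CreateUserCommand>
{
    public async Task Handle(CreateUserCommand message, IMessageHandlerContext context)
    {
        // The actual business processing lives here.
        await CreateUserInDatabase(message.Name, message.UserName, message.Password);

        // Only the code that really created the user announces it.
        await context.Publish(new UserCreatedEvent { UserName = message.UserName });
    }

    private Task CreateUserInDatabase(string name, string userName, string password)
        => Task.CompletedTask; // placeholder for the real persistence code
}

public class CreateUserCommand : ICommand
{
    public string Name { get; set; }
    public string UserName { get; set; }
    public string Password { get; set; }
}

public class UserCreatedEvent : IEvent
{
    public string UserName { get; set; }
}
```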
Update:
My take on it is that it is more about cohesion / concerns. You would typically interact with your domain, from within your business, via a service bus for commands and events, and via a simple query layer for reads. If you need to expose anything to a third party (or simply via the web), then you use WCF / WS / Web-Api. The point is that you should try to avoid business processing in an integration endpoint (or in a front end like a website); business processing is better suited to application servers. There are usually exceptions to the rule, but if you are in a position to influence the structure, then you are in a better space.
The fact is, whatever code is truly responsible for performing the action should be the code that publishes the event. If you've got an MVC app and the controller itself uses Entity Framework to insert the User record, then that is exactly where the Publish should be, right after the SaveChanges call. If, however, the controller calls a referenced binary or service which does the actions involved in the "add user" call, then the Publish should be there. My thought is that the event should be published right alongside the code that performs the action the event describes.
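For the first case, a minimal sketch (the AppDbContext, User, and UserCreatedEvent types and the injected IMessageSession are assumptions) keeps the Publish right next to the SaveChanges call:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using NServiceBus;

public class UsersController : Controller
{
    private readonly AppDbContext _dbContext;          // Entity Framework context (assumed)
    private readonly IMessageSession _messageSession;  // NServiceBus session

    public UsersController(AppDbContext dbContext, IMessageSession messageSession)
    {
        _dbContext = dbContext;
        _messageSession = messageSession;
    }

    [HttpPost]
    public async Task<IActionResult> Create(string name, string userName)
    {
        // The controller itself performs the action...
        _dbContext.Users.Add(new User { Name = name, UserName = userName });
        await _dbContext.SaveChangesAsync();

        // ...so the Publish sits right after the SaveChanges call.
        await _messageSession.Publish(new UserCreatedEvent { UserName = userName });

        return RedirectToAction("Index");
    }
}
```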
The application that I'm designing will retrieve and store content from a variety of disparate sources on a schedule. In some cases the content will be retrieved based on a time interval (think stock quotes), and in other cases based on a custom schedule (MWF @ 2 pm). Many of the processes lend themselves to MS Workflow, and the built-in SQL tracking service will provide a lot of value. The content sources are sufficiently different that each type of content retrieval will be a custom workflow.
My question is: how should I host, monitor, schedule, and expose the workflows?
Requirements:
Must be able to monitor the health of each content "agent" via admin UI
Must be able to start and stop individual workflows via admin UI
Workflows are recurring based on a schedule, but not necessarily "long-running"
"Service" must have high availability
Windows service, Workflow Service, ASP.Net, WCF are all available to me, and I'm open to other suggestions as well.
WF and WCF can be hosted together in one Windows service.
You can create a set of WCF services to expose the state/information of the workflows running in the Windows service.
The WCF service therefore needs a reference to your workflow exchange contract (that is, some way to reach the workflow engine so it can deliver request information from the client UI).
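A bare-bones sketch of that arrangement (the contract and class names are invented): the Windows service owns the workflow runtime and a WCF ServiceHost that the admin UI talks to.

```csharp
using System.ServiceModel;
using System.ServiceProcess;

// Contract the admin UI calls to monitor and control the workflows.
[ServiceContract]
public interface IWorkflowAdmin
{
    [OperationContract]
    string GetAgentStatus(string agentName);

    [OperationContract]
    void StartWorkflow(string agentName);

    [OperationContract]
    void StopWorkflow(string agentName);
}

// Implementation that forwards calls to the workflow engine (details omitted).
public class WorkflowAdminService : IWorkflowAdmin
{
    public string GetAgentStatus(string agentName) => "Running"; // placeholder
    public void StartWorkflow(string agentName) { /* raise a start event on the workflow */ }
    public void StopWorkflow(string agentName) { /* raise a stop event on the workflow */ }
}

// Windows service that hosts both the workflow runtime and the WCF endpoint.
public class WorkflowHostService : ServiceBase
{
    private ServiceHost _adminHost;

    protected override void OnStart(string[] args)
    {
        // Start the workflow runtime / scheduler here (omitted).

        // Expose the admin contract so the UI can monitor and start/stop agents.
        _adminHost = new ServiceHost(typeof(WorkflowAdminService));
        _adminHost.AddServiceEndpoint(typeof(IWorkflowAdmin),
            new NetTcpBinding(), "net.tcp://localhost:8123/WorkflowAdmin");
        _adminHost.Open();
    }

    protected override void OnStop()
    {
        if (_adminHost != null) _adminHost.Close();
        // Shut the workflow runtime down gracefully here (omitted).
    }
}
```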
Must be able to monitor the health of each content "agent" via admin UI
The admin UI can retrieve that data from the web service described above.
Must be able to start and stop individual workflows via admin UI
Have the workflow instance handle a specific event to start or stop.
Workflows are recurring based on a schedule, but not necessarily "long-running"
Have the workflow instance handle a specific event to do so.
"Service" must have high
availability
A Windows service is a daemon-like application; it runs forever as long as it doesn't crash.
I found this post helpful as well:
http://www.dotnetconsult.co.uk/weblog2/PermaLink,guid,77c334e8-0ec1-4f91-ab7e-0bcfa7f2f47d.aspx
You may want to look into Dublin, Microsoft's upcoming integrated host for workflow services. It's not out yet, but offers some of the features you're looking for.