I have to create a table such that it sends an HTTP request with a parameter (the first column's value) for each row in the table and shows the processing status for that request.
The same backend process runs for each row, with the column value passed as a parameter.
On completion of the request for the first row, the status column in the table should show 'Success', and only then should the request for the second row be sent.
The requests should be sent sequentially, as there is a moderately heavy backend process attached, i.e. the next request should only be sent once the previous one has completed.
I have to achieve this using Angular 5 and .NET Core 2.0.
Please let me know if there is a feature already available in Angular 5 and .NET Core for doing such a thing in an optimized way. I have heard of ReactiveX, but I am not able to figure out the best way to achieve this, since ReactiveX is geared towards asynchronous programming.
Any suggestions or similar examples on this would be helpful.
Thanks
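(For reference, one common way to express this kind of row-by-row sequencing with RxJS is concatMap. Below is a minimal sketch assuming Angular's HttpClient, an invented /api/process endpoint, and an invented RowItem shape; none of these names are from the original question.)

// Minimal sketch (invented names, not from the question): send one request per
// row, strictly one after another, and update each row's status as it completes.
// Imports shown for RxJS 5.5 as used by Angular 5; newer RxJS imports them from 'rxjs'.
import { HttpClient } from '@angular/common/http';
import { from } from 'rxjs/observable/from';
import { of } from 'rxjs/observable/of';
import { concatMap, tap, catchError } from 'rxjs/operators';

interface RowItem {
  key: string;      // first column value, sent as the request parameter
  status: string;   // 'Pending' | 'Processing' | 'Success' | 'Failed'
}

export function processRows(http: HttpClient, rows: RowItem[]) {
  from(rows).pipe(
    // concatMap subscribes to the next inner observable (HTTP call) only after
    // the previous one completes, which gives the required sequential behaviour.
    concatMap(row => {
      row.status = 'Processing';
      return http.get('/api/process?key=' + encodeURIComponent(row.key)).pipe(
        tap(() => row.status = 'Success'),
        catchError(() => { row.status = 'Failed'; return of(null); })
      );
    })
  ).subscribe();
}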
Could someone please explain how Fork from Elsa Workflow works?
I have set up an HTTP Endpoint. After that I send an HTTP request and get an HTTP response in JSON format.
I want to set a condition on that response (for example, name='John') so that I can display only that row from the database.
What Elsa activity should I use? Is the Fork activity the correct choice, or something else?
Fork
The Fork activity simply forks workflow execution into multiple branches. When you add this activity, you specify a list of one or more branch names. These branch names will be scheduled as activity outcomes.
For example, if you add a Fork activity with branches Do Some Request and Timeout, the Fork activity will show these branches as outcomes.
When the Fork activity executes, both branches will execute. This enables scenarios where you want, for example, to wait for some user input or some other job to finish, but not indefinitely: you have a second branch that waits for a timeout event using e.g. the Timer activity.
A sample workflow that describes the usage of the Fork activity in a similar scenario can be found here.
That describes the use case of the Fork activity. But you will want to use the If activity instead.
If
Going back to your use case with the HTTP Request activity and setting a condition, what you want to use instead of Fork is the If activity.
When you connect an If activity to HTTP Request, you will be able to write a JS expression that must evaluate to true or false.
For example, let's say your Send HTTP Request activity performs a GET request on https://reqres.in/api/users/2
Make sure that the Read Content checkbox is ticked.
Also make sure to give your Send HTTP Request activity a name. For example, SendHttpRequest1.
With that in place, you can now write the following JS expression in the Condition field of the If activity:
activities.SendHttpRequest1.ResponseContent().data.first_name == 'Janet'
Note that activities.SendHttpRequest1.ResponseContent() returns an ExpandoObject that represents the received JSON response from the demo API endpoint I used in my example.
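For context, the JSON returned by that demo endpoint looks roughly like this (abbreviated), which is why the condition reads data.first_name:

{
  "data": {
    "id": 2,
    "first_name": "Janet",
    "last_name": "Weaver",
    "email": "janet.weaver@reqres.in"
  }
}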
I have a web app created with Node.js/Express.js/Pug that runs a bash script (mostly an Nmap scan) and displays the results. I'd like to implement some sort of page in between the start and the results to signify that the system is working on the task.
I tried to just add another res.render(...) at the beginning of the route that starts the scan, but I ran into the problem that HTTP cannot send headers twice. Effectively, I can't send two HTTP responses for one request; please let me know if I'm wrong here.
I'm still not very familiar with this stuff; I'm working with a group and this job fell to me, so any help is appreciated.
Typically the route handler would:
trigger the long running script asynchronously
return an "in progress" page
Then the "in progress" page would ask the server if it was done yet via:
Websocket
Ajax polling
Meta refresh polling
You'd need to have the callback from the original asynchronous process keep track of where the result should go (possibly using a GUID that is passed to it and also returned as data in the "in progress" page).
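As a rough sketch only (the /scan and /scan/status routes, the jobs map, and the nmap command line are invented for illustration), the Express side of that flow could look something like:

// Rough sketch, not a drop-in solution: start the scan, immediately return an
// "in progress" page, and let that page poll /scan/status/:id until it is done.
import express from 'express';
import { exec } from 'child_process';
import { randomUUID } from 'crypto';

const app = express();
app.set('view engine', 'pug');

// In-memory job table; a real app might use a DB or cache instead.
const jobs = new Map<string, { done: boolean; output?: string }>();

app.post('/scan', (req, res) => {
  const id = randomUUID();
  jobs.set(id, { done: false });

  // Fire and forget: the HTTP response below is sent before the script finishes.
  exec('nmap -F example.com', (err, stdout) => {
    jobs.set(id, { done: true, output: err ? String(err) : stdout });
  });

  // The single response for this request: the "in progress" page.
  res.render('in-progress', { jobId: id });
});

// The "in progress" page polls this endpoint via AJAX (or meta refresh).
app.get('/scan/status/:id', (req, res) => {
  res.json(jobs.get(req.params.id) ?? { error: 'unknown job' });
});

app.listen(3000);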
I have a subroutine in my Controller
<HttpPost>
Sub Index(Id, varLotsOfData)
    ' Point B.
    ' By the time it gets here - all the data has been accepted by the server.
End Sub
What I would like to do is capture the Id of the inbound POST and mark, for example, a database record to say "Id xx is receiving data".
The POST receive can take a long time as there is lots of data.
When execution gets to point B I can mark the record "All data received".
Where can I place this type of "pre-POST completed" code?
I should add - we are receiving the POST data from clients that we do not control - that is, it is most likely a client's server sending the data, not a web browser client that we have served up from our web server.
UPDATE: This is looking more complex than I had imagined.
I'm thinking that a possible solution would be to inspect the worker processes in IIS programmatically. You can do this via IIS Manager, for example - How to use IIS Manager to get Worker Processes (w3wp.exe) details information?
From your description, you want to show on the client page that the method is executing (and you can also show a loading gif), and when the execution has completed, show a message to the user that it is done.
The answer is simply: use SignalR.
Here you can find some references:
Getting started with signalR 1.x and Mvc4
Creating your first SignalR hub MVC project
Hope this helps.
If I understand your goal correctly, it sounds like HttpRequest.GetBufferlessInputStream might be worth a look. It allows you to begin acting on incoming post data immediately and in "pieces" rather than waiting until the entire post has been received.
An excerpt from Microsoft's documentation:
...provides an alternative to using the InputStream property, which waits until the whole request has been received. In contrast, the GetBufferlessInputStream method returns the Stream object immediately. You can use the method to begin processing the entity body before the complete contents of the body have been received and asynchronously read the request entity in chunks. This method can be useful if the request is uploading a large file and you want to begin accessing the file contents before the upload is finished.
So you could grab the beginning of the post, and provided your client-facing page sends the ID towards the beginning of its transmission, you may be able to pull that out. Of course, this would be reading raw byte data which would need to be decoded so you could grab the inbound post's ID. There's also a buffered counterpart, GetBufferedInputStream, that allows the stream to be read in pieces but also builds a complete request object for processing once it has been fully received.
Create a custom action filter.
Action filters are used for executing filtering logic either before or after an action method is called. They are custom attributes that provide a declarative means to add pre-action and post-action behavior to the controller's action methods.
Specifically, you'll want to look at:
OnActionExecuted – This method is called after a controller action is executed.
Here are a couple of links:
http://www.infragistics.com/community/blogs/dhananjay_kumar/archive/2016/03/04/how-to-create-a-custom-action-filter-in-asp-net-mvc.aspx
http://www.asp.net/mvc/overview/older-versions-1/controllers-and-routing/understanding-action-filters-vb
Here is a lab, but I think it's C#
http://www.asp.net/mvc/overview/older-versions/hands-on-labs/aspnet-mvc-4-custom-action-filters
I am uploading multiple files using JavaScript.
After I upload the files, I need to run several processing functions.
Because of the processing time that is required, I need a UI on the front telling the user the estimated time left of the entire process.
Basically I have 3 functions:
/upload - this is an endpoint for uploading the files
/generate/metadata - this is the next endpoint that should be triggered after /upload
/process - this is the last endpoint. Should be triggered after /generate/metadata
This is basically how I expect the screen to look.
Information such as the percentage remaining and time left should be displayed.
However, I am unsure whether to let the server supply that information or to do a hackish estimate solely using JavaScript.
I would also need to update the screen with messages telling the user things such as:
"Currently uploading" if I am at function 1.
"Generating metadata" if I am at function 2.
"Processing..." if I am at function 3.
Function 2 only occurs after the successful completion of 1.
Function 3 only occurs after the successful completion of 2.
I am already using q.js promises to handle some parts of this, but the code has gotten scarily messy.
I recently came across Backbone, and it allows structured ways to handle single-page app behavior, which is what I wanted.
I have no problems with the server side returning JSON responses for the success or failure of the endpoints.
I was wondering what would be a good way to implement this using Backbone.js.
You can use a "progress" file or DB entry which stores the state of the backend process. Have your backend process periodically update this file. For example, write this to the file:
{"status": "Generating metadata", "time": "3 mins left"}
After the user submits the files, have the frontend start pinging a backend progress function using a simple AJAX call and setTimeout. The progress function will simply open this file, grab the JSON-formatted status info, and then update the frontend progress bar.
You'll probably want the ajax call to be attached to your model(s). Have your frontend view watch for changes to the status and update accordingly (e.g. a progress bar).
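As a rough illustration only (the /progress URL, the attribute names, the 2-second interval, and the 'complete' terminal status are assumptions, and Backbone plus jQuery are assumed to be loaded), the model-plus-polling part might look like:

// Sketch only: a Backbone model that polls the progress endpoint and a view
// that re-renders whenever the status attribute changes.
const ProgressModel = Backbone.Model.extend({
  url: '/progress',   // assumed endpoint returning the JSON shown above
  poll: function () {
    this.fetch().always(() => {
      // 'complete' is an assumed terminal status; adjust to whatever the backend writes.
      if (this.get('status') !== 'complete') {
        setTimeout(() => this.poll(), 2000);   // ping again in 2 seconds
      }
    });
  }
});

const ProgressView = Backbone.View.extend({
  initialize: function () {
    this.listenTo(this.model, 'change', this.render);
  },
  render: function () {
    this.$el.text(this.model.get('status') + ' - ' + this.model.get('time'));
    return this;
  }
});

const progress = new ProgressModel();
new ProgressView({ el: '#progress', model: progress });
progress.poll();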
Long Polling request:
Polling request for updating Backbone Models/Views
Basically, when you upload files you assign a "FileModel" to every given file. The FileModel will start a long polling request every N seconds, until it gets the status "complete".
In Silverlight I have the following problem: if you fire multiple requests to the web service, the responses might not return in an ordered sequence. Meaning, if the first request takes longer than the following ones, its response will be the last to return:
1. Sending request A.. (takes longer for some reason)
2. Sending request B..
3. Sending request C..
4. ...
5. Receiving response B
6. Receiving response C
7. Receiving response A
Now in my scenario, I am only interested in the most recent request being made. So A and B should be discarded and C should be kept as the only accepted response.
What is the best approach to manage this? I came up with this solution so far:
Pass a generated GUID as user object when sending the request and store that value somewhere. As all responses will contain their respective GUID, you can now filter out the stale responses. A request-counter instead of a GUID would work as well.
Now I wonder if there are any better approaches to this. Maybe there are out-of-the-box features to make this possible? Any ideas are welcome.
I take a similar approach in my non-WCF ASP.NET web services, though I use the DateTime of the request instead and then just store the DateTime of the most recent request. This way I can do a direct less-than comparison to determine whether the returning service call is the most recent or not.
I did look into canceling old service calls before making new ones, but there is no CancelAsync call for web services in Silverlight and I have been unable to find an equivalent way of doing this.
Both of these approaches are what I took when I worked on a real-time system with a lot of service calls. Basically, just have some way to keep track of order (incrementing variable, timestamp, etc.), then keep track of the highest received response. If the current response is lower than the highest, drop it.
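The same bookkeeping idea, sketched here in TypeScript purely for illustration (this is not Silverlight code; sendLatest and apply are invented names):

// Illustration of "highest sequence number wins"; not Silverlight code.
let latestRequestId = 0;   // incremented for every request that is sent
let latestApplied = 0;     // highest request id whose response has been accepted

function sendLatest(url: string, apply: (data: unknown) => void) {
  const requestId = ++latestRequestId;
  fetch(url)
    .then(res => res.json())
    .then(data => {
      // Drop any response that is older than one we have already applied.
      if (requestId > latestApplied) {
        latestApplied = requestId;
        apply(data);
      }
    });
}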