We are developing a WCF-based data analytics application which will receive data from multiple instruments, store it in a database, and make it available for data analysis. Since the data flow into the application can be high, we are planning to use a queuing solution; our initial choice is MSMQ. Can you please let me know if there are any alternative solutions?
I need to create a service to collect and consolidate events from other services. From what I have found on the internet, an aggregator service helps you find out what is going on in the application flow. I have some confusion here and need your help: does an aggregator microservice mean that any input or output of a service should be sent to the aggregator service with its time and date? Clouds also offer such services, like Application Insights; doesn't that do the same thing? And even if we store every event, it is going to be a huge amount of data in the database. Is this really the best solution?
So, answering your first question:
Does an aggregator microservice mean that any input or output of a service should be sent to the aggregator service with its time and date?
Not really. Aggregator Microservice is a pattern: basically, another service receives a request, subsequently makes requests to multiple different services, combines the results, and responds to the initiating request.
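As a rough in-process sketch of that pattern (the service names and their responses below are invented for illustration; in a real system each call would be an HTTP/REST request to a separate service):

```python
# Minimal sketch of the Aggregator Microservice pattern.
# The "downstream services" are stand-in functions here.

def profile_service(customer_id):
    # Stand-in for a remote profile service.
    return {"name": "Alice", "tier": "gold"}

def order_service(customer_id):
    # Stand-in for a remote order service.
    return {"orders": [101, 102]}

def aggregator(customer_id):
    # One incoming request triggers requests to multiple services...
    profile = profile_service(customer_id)
    orders = order_service(customer_id)
    # ...and the combined result answers the initiating request.
    return {"customer": customer_id, **profile, **orders}

print(aggregator(42))
```

Note that the aggregator itself holds no event history; it only composes responses, which is why it is not the same thing as a log aggregator.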
So I guess you are looking for a log aggregator: software that consolidates log data from throughout the IT infrastructure into a single centralized platform where it can be reviewed and analyzed.
But in clouds we also have such services, like Application Insights; doesn't that do the same thing? Yes, you can say that it is a similar service.
Even if we store every event, it is going to be a huge amount of data in the database; is this really the best solution? Leave this to your log aggregator tool; it will have a proper mechanism for keeping your data. Most tools store the data in a compact and properly indexed form.
All,
We have a requirement to develop an Azure-based platform in which the user can configure multiple pharmaceutical instruments, start measurements on them, and analyze the measured data. The typical components of the Azure-based platform will be the following:
1 - A .NET 4 based client application running on the computer connected to each instrument. This client application should receive the start-measurement command from the Azure platform, perform the measurement, and send the result back to Azure
2 - A set of services [probably REST-based] which will get the results from the client application and update the database in the cloud
3 - A set of services and business logic which can be used to perform analysis on the data
4 - An ASP.NET web application where the user can view instrument details, start measurements, etc.
There is two-way communication between the Azure platform and the client application, i.e. the client needs to send results up to Azure, and Azure needs to initiate measurements on the instrument via the client application.
In such a scenario, what is the recommended approach for the Azure platform to communicate with the clients? Is it either of the following?
1 - Create a duplex service between the client and the server and provide a callback interface to start the measurement
2 - Create a command queue using an Azure message queue for each client. When a measurement needs to be started, a message is put on the queue. The client app continuously reads from the queue and executes the commands
Or do we have any other ways to do this? Any help is appreciated.
We do not fully understand your scenario and the constraints around it, but as a pointer: we have seen a lot of customers use Azure Storage queues to implement a master-worker scenario. Some component adds a message to the appropriate queue to get work done (take measurements, in your case), and workers poll the queue to process this work (the client computer connected to your instrument, in this case).
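The master-worker flow can be sketched like this (a `queue.Queue` stands in for an Azure Storage queue, and the message shape and instrument name are invented for the illustration; a real worker would poll the storage queue over REST from the instrument PC):

```python
import queue

# Stand-in for an Azure Storage queue; in production the master would
# add messages and the worker would poll the queue service over REST.
command_queue = queue.Queue()

def master_request_measurement(instrument_id):
    # Master side: enqueue a "take measurement" command for a worker.
    command_queue.put({"command": "start_measurement",
                       "instrument": instrument_id})

def worker_poll_once():
    # Worker side (the PC attached to the instrument): poll the queue,
    # run the measurement, and return the result for upload.
    try:
        msg = command_queue.get_nowait()
    except queue.Empty:
        return None  # nothing to do on this polling pass
    return {"instrument": msg["instrument"], "value": 3.14}  # fake reading

master_request_measurement("hplc-01")
print(worker_poll_once())
```

The worker would typically run this poll in a loop with a short sleep, which is what makes the communication asynchronous and keeps the client decoupled from the server.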
In terms of storing the results back, your master component could give the client SAS access to write results to a specific blob in an Azure Storage account, and have your service and business logic monitor the existence of that blob to start your analysis.
The above approach decouples your client from the server and makes communication asynchronous via storage. Again, these are just pointers, and you are the best person to pick the approach that suits your requirements.
For communication between the server and the client, you could use SignalR (http://signalr.net/). There are two forms of messaging systems supported "as a service" on Azure: Service Bus and Message Queues; see http://msdn.microsoft.com/en-us/library/hh767287.aspx
I am a novice in WCF, and I have a project that needs to be migrated to WCF-based communication with a client/server and server-to-server architecture.
My question is: what is the right messaging function for this project to ensure the security of data across the network, a reliable connection, and fast exchange of data?
I was able to find out that WCF has numerous messaging functions.
Below is the architecture of my project:
Note: The clients should be simultaneously updated by both the data processing and feed source servers. The clients also send simultaneous requests to the servers while feeds are still being supplied by the feed source server.
I would appreciate any suggestions or comments.
My first question is why are you putting the Connection Manager (CM) component in-between your clients and the services which they want to use? What is the job it does which means it needs to be right in the middle of everything?
This ultimately means that your CM component will have to handle potentially high volumes of bi-directional traffic across potentially different transport bindings, and it introduces a single point of failure.
What if client A wants only to receive messages from the Feed Source (FS) component? Why should client A have to deal with an intermediary when it just wants to send a subscription notification to receive updates from the FS?
Equally, what if client B wants to send a message to the Data Processing (DP) component? Surely it should just be able to fire off a message to DP?
I think the majority of what you want to do with this architecture can be achieved with one-way messaging, in which case you should use netMsmqBinding (assuming you are in a pure WCF environment).
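To make the point concrete, here is a toy in-process sketch of one-way messaging, not WCF itself (the component names mirror the question; the message shapes are made up). The sender fires a message at the target's inbox and returns immediately, with no reply channel and no intermediary:

```python
from collections import defaultdict

# Each component has an inbox; senders append and return immediately.
inboxes = defaultdict(list)

def send_one_way(target, message):
    # Fire-and-forget: no return value, no waiting on the receiver.
    inboxes[target].append(message)

# Client A subscribes directly with the Feed Source (FS)...
send_one_way("FS", {"type": "subscribe", "client": "A"})
# ...and client B posts work straight to Data Processing (DP),
# with no Connection Manager sitting in the middle.
send_one_way("DP", {"type": "process", "client": "B", "payload": [1, 2, 3]})

print(inboxes["FS"], inboxes["DP"])
```

With netMsmqBinding, the queue plays the role of the inbox, which is also what gives you reliable delivery when the receiver is temporarily down.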
I have a need for an application to access reporting data from a remote database. We currently have a WCF service that handles the I/O for this database. Normally the application just sends small messages back and forth between itself and the WCF service, but now we need to run some historical reports on that activity. The result could be several hundred to a few thousand records. I came across http://msdn.microsoft.com/en-us/library/ms733742.aspx, which talks about streaming, but it also mentions segmenting messages, on which I could not find any more information. What is the best way to send large amounts of data like this from a WCF service?
It seems my options are streaming or chunking. Streaming restricts other WCF features, message security being one (http://msdn.microsoft.com/en-us/library/ms733742.aspx). Chunking means breaking a message up into pieces and then putting those pieces back together at the client. This can be done by implementing a custom channel, of which MS has provided an example here: http://msdn.microsoft.com/en-us/library/aa717050.aspx. It is implemented below the security layer, so security can still be used.
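The chunking idea itself is simple enough to sketch (chunk size and framing here are arbitrary choices for the illustration; the linked MSDN sample does this inside a custom WCF channel, below the security layer):

```python
def chunk(payload: bytes, size: int):
    # Sender side: split one large message into fixed-size pieces.
    return [payload[i:i + size] for i in range(0, len(payload), size)]

def reassemble(chunks):
    # Client side: put the pieces back together, in order.
    return b"".join(chunks)

report = b"x" * 10_000          # pretend this is a few thousand records
pieces = chunk(report, 4096)
assert len(pieces) == 3          # 4096 + 4096 + 1808 bytes
assert reassemble(pieces) == report
```

In the real channel each piece travels as an ordinary (securable) WCF message, which is exactly why chunking keeps message security available while streaming does not.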
I'm playing around with Windows Azure, and I would like to build a cloud server application that receives messages from many different clients, such as mobile and desktop.
I would like to build the clients so that they work in "offline mode", i.e. I would like each client to build up a local queue of messages that are sent to the Azure server as soon as it gets online.
Can I accomplish this using WCF and/or an Azure queuing mechanism, so that I don't have to worry about whether the client is online or offline when I write the code?
You won't need queuing in the cloud to accomplish this. For the client app to be "offline-enabled", you need to do the queuing on the client. For this there are many options: a local database, XML files, etc. Whenever the app senses network availability, you can upload your queue to Azure. And yes, you can use WCF for that.
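A minimal sketch of that client-side queue (the `is_online` flag and the "upload" step are stand-ins; a real app would detect connectivity itself and persist the queue in a local database or XML file so it survives restarts):

```python
import json

class OfflineQueue:
    """Buffer messages locally; flush them when the network comes back."""

    def __init__(self):
        self.pending = []    # in-memory here; persist to disk in practice
        self.uploaded = []   # stand-in for "sent to the Azure service"

    def send(self, message, is_online):
        self.pending.append(message)
        if is_online:
            self.flush()

    def flush(self):
        # Drain the local queue in order, e.g. via a WCF/REST call each.
        while self.pending:
            msg = self.pending.pop(0)
            self.uploaded.append(json.dumps(msg))

q = OfflineQueue()
q.send({"reading": 1}, is_online=False)   # offline: message is queued
q.send({"reading": 2}, is_online=True)    # online again: queue is flushed
assert q.pending == [] and len(q.uploaded) == 2
```

The calling code just calls `send` either way, which is what lets you stop worrying about online/offline state at the call site.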
For the client queue/sync stuff you could take a look at the Sync Framework.
I haven't found a great need for the queue so far. Maybe it's just that I'm not seeing it from my app's point of view. It could also be that the data you can store in the queue is minimal: you basically store short text strings (like record IDs), and then you have to do something with the ID when you pull it from the queue, such as look it up or delete it.
In my app, I didn't use the queue at all, just as Peter suggests. I wrote directly to table storage (accessed via its REST interface using StorageClient) from the client. If you want to look at a concrete example, take a look at http://www.netalerts.mobi/traffic. Like you, I wanted to learn Azure, so I built a small web site.
There's a worker role that wakes up every 60 seconds. Using one thread, it retrieves any new data from its source (screen-scraping a web page). New entries are stored directly in table storage (no need for a queue). Another thread deletes entries in table storage that are older than a specified threshold (there's no issue with running multiple threads against table storage). And then I'm working on a third thread, which is designed to send notifications to handheld devices.
The app itself is a web role, obviously.
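That worker-role loop can be sketched roughly like this (the scrape and purge functions are placeholders, and a Python list stands in for the Azure table; the real app talks to table storage via StorageClient and sleeps 60 seconds between passes):

```python
from datetime import datetime, timedelta, timezone

table_storage = []  # stand-in for an Azure table

def scrape_source():
    # Thread 1: pull any new data from the source (screen scraping).
    table_storage.append({"ts": datetime.now(timezone.utc),
                          "text": "incident"})

def purge_old(max_age: timedelta):
    # Thread 2: delete entries older than the specified threshold.
    cutoff = datetime.now(timezone.utc) - max_age
    table_storage[:] = [e for e in table_storage if e["ts"] >= cutoff]

def worker_pass():
    scrape_source()
    purge_old(timedelta(days=7))
    # Thread 3 would send notifications to handheld devices here.

# In the real worker role: while True: worker_pass(); sleep 60 seconds.
worker_pass()
assert len(table_storage) == 1
```

Because each pass writes straight to table storage, there is nothing for a queue to buffer, which matches the "no need for a queue" observation above.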