Service posting straight to Azure Queue - API

I have a service that posts JSON messages to a given URL. The requests cannot be modified, and because of that I cannot add all the headers I need to post to a queue in Azure.
I have been researching this, but it seems the only way to post, other than with the authorization headers, is to allow a range of IPs to post to the queue; I cannot do that either, because the sender can change IPs and I could lose data, which is not acceptable.
What I'm trying to find out is whether Microsoft has a way (or some service other than queues) that I can use to prevent data loss in case my app is down (this is where the queue comes in), or whether there's some way I can allow this external provider to post to my queue without all the security, or with a minimum of it.
Thanks in advance.

Related

Event notification on new e-mail in IBM Domino

Is it possible to subscribe to mail events on an IBM Domino server?
I need a service similar to the one provided by Microsoft Exchange Event Notification, where you can subscribe to events and get notified when there are changes, e.g. the arrival of a new e-mail. I need the solution to be server-side, since I can't rely on users having their client running.
Unfortunately, as per my comment above, there is no pre-packaged equivalent to the push, pull and streaming subscription services that EWS supports. A Notes client can get notifications via the Notes RPC protocol, and there's also obviously some technology in IBM's Notes Traveler mobile product, but nothing that I'm aware of as a pre-packaged web service or even as a notifications API. You would have to build it. There are a variety of ways you could go about it.
For push or streaming subscriptions, one way would be a Notes C API plugin using the Extension Manager, running on the server and monitoring the mailboxes. You might be able to use a DSAPI plugin into Domino's HTTP stack to manage the incoming connections and feed the data out to subscribers, but honestly I have no idea whether Domino's HTTP stack can handle the persistent connections that are implied in the subscription model. Alternatively, the Extension Manager plugin could quickly hand the data over to code written in any other language you want, running on any web stack you prefer. Of course, you'll have to deal with security through all the linked-together parts.
For pull subscriptions, I guess it's really more of a polling architecture, with state saved somewhere so that only changes since the last call are delivered. You have any number of options for that. You could use Domino's built-in HTTP server, obviously, so you could write your own Domino-hosted web service for this. You could also use the Domino Data Service, which is a REST API, with all necessary state information being stored on the client side. (On a quick look, I don't see a good option for getting all new documents since a specified date-time via the Domino Data Service, but it might be possible.)
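To make the pull idea concrete, here is a minimal polling-client sketch in Java. It assumes a hypothetical custom Domino-hosted service at /mail/newSince (not an actual Domino Data Service endpoint) and keeps the "last seen" watermark on the client side:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.time.Instant;

// Pull-subscription sketch: poll a hypothetical Domino-hosted web service
// and keep the "last seen" state on the client side.
public class MailPoller {

    // Placeholder URL for a custom service you would write on the Domino HTTP stack.
    private static final String BASE_URL = "http://domino.example.com/mail/newSince";

    public static void main(String[] args) throws Exception {
        Instant lastSeen = Instant.now();   // client-side state
        while (true) {
            URL url = new URL(BASE_URL + "?since="
                    + URLEncoder.encode(lastSeen.toString(), "UTF-8"));
            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(con.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    // Assume each line describes one new mail document; notify subscribers here.
                    System.out.println("New mail: " + line);
                }
            }
            lastSeen = Instant.now();       // advance the watermark
            Thread.sleep(30_000);           // poll every 30 seconds
        }
    }
}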
I do worry a bit about scalability of any custom solution for this. My understanding is that Microsoft has quite a bit of caching and optimization in their services in order to address scale. Obviously, you can build whatever you need for that into your own web service, but it will likely add a lot of effort.

Camel route "to" specific websocket endpoint

I have some Camel routes with MINA sockets and Jetty websockets. I am able to broadcast a message to all the clients connected to the websocket, but how do I send a message to a specific endpoint? How do I maintain a list of all connected clients, with a client id as a reference, so I can route to a specific client? Is that possible? Will I be able to specify a dynamic client in the to URI?
Or maybe I am thinking about this wrong and I need to create topics on ActiveMQ and have the clients subscribe to them. Would that mean I create a topic for every websocket client and route the message to the right topic?
Am I at least on the right track here? Are there any examples you can point out? Google was not helpful.
The approach you take depends on how sensitive the client information is. The downside of a single topic with selectors is that anyone can subscribe to the topic without a selector and see all the information for everyone - not usually something that you want to do.
A better scheme is to use a message distribution mechanism (a set of Camel routes) that acts as an intermediary between the websocket clients and the system producing the messages. This mechanism is responsible for distributing messages from a single destination to client-specific destinations. I have worked on a couple of banking web front-ends that used a similar scheme.
In order for this to work you first generate for each user a distinct token/UUID; this is presented to the user when the session is established (usually through some sort of profile query/message).
It's essential that the UUID can be worked out as a hash of the clientId rather than being stored in a DB, as it will be used all the time and you want to make sure this is worked out quickly.
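For example, in Java a deterministic UUID can be derived from the clientId with a name-based UUID, so no database lookup is needed (a sketch; the class and method names are placeholders):

import java.nio.charset.StandardCharsets;
import java.util.UUID;

// Derive the per-client topic suffix deterministically from the clientId.
public final class ClientTopics {
    public static String suffixFor(String clientId) {
        return UUID.nameUUIDFromBytes(clientId.getBytes(StandardCharsets.UTF_8))
                   .toString().replace("-", "");
    }
}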
The user then uses that information to connect to specific topics that use that UUID as a suffix. For example two users subscribing to an orderConfirmation topic would each subscribe to their own version of that topic:
clientA -> orderConfirmation.71jqsd87162iuhw78162wd7168
clientB -> orderConfirmation.76232hdwe7r23j92irjh291e0d
To keep track of "presence", your clients would need to periodically send a heartbeat message containing their clientId to a well-known topic that your distribution mechanism listens on. Clients should not be able to subscribe to this topic for reads (see ActiveMQ Security). The message distribution mechanism needs to keep in memory a data structure that contains the clientId and the time a heartbeat was last seen.
When a message is received by the distribution mechanism, it checks whether the clientID for which it received the message has a "live/present" session, determines the UUID for the client, and broadcasts the message on the appropriate topic.
Over time this will create a large number of topics on your broker that you don't want hanging around when the user has gone away. You can configure ActiveMQ to delete these if they have been inactive for some time.
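As a rough sketch of what that distribution mechanism could look like as Camel routes (the destination names, header names and liveness window below are assumptions, not part of the answer):

import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.camel.builder.RouteBuilder;

// Sketch of the distribution mechanism: track presence via heartbeats and
// fan messages out to client-specific topics named with the derived UUID.
public class DistributionRoutes extends RouteBuilder {

    // clientId -> timestamp of the last heartbeat seen
    private final Map<String, Long> lastSeen = new ConcurrentHashMap<>();

    @Override
    public void configure() {
        // Clients publish heartbeats (with a clientId header) to a well-known topic.
        from("activemq:topic:presence.heartbeat")
            .process(exchange -> lastSeen.put(
                exchange.getIn().getHeader("clientId", String.class),
                System.currentTimeMillis()));

        // Messages from the producing system are routed to the client's own topic.
        from("activemq:queue:orderConfirmation.inbound")
            .filter(exchange -> isLive(exchange.getIn().getHeader("clientId", String.class)))
            .process(exchange -> exchange.getIn().setHeader("clientUuid",
                uuidFor(exchange.getIn().getHeader("clientId", String.class))))
            .recipientList(simple("activemq:topic:orderConfirmation.${header.clientUuid}"));
    }

    // A client is "present" if a heartbeat was seen within the last minute.
    private boolean isLive(String clientId) {
        Long seen = clientId == null ? null : lastSeen.get(clientId);
        return seen != null && System.currentTimeMillis() - seen < 60_000;
    }

    // Same hash-based UUID idea as described above.
    private String uuidFor(String clientId) {
        return UUID.nameUUIDFromBytes(clientId.getBytes(StandardCharsets.UTF_8))
                   .toString().replace("-", "");
    }
}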
You definitely do not want to create a separate endpoint for each client.
A topic and a subscription with a selector is an elegant way to solve this; I would say the best one.
You need a single topic, to which every client subscribes with a selector like clientId in ('${myClientId}', 'EVERYONE'). Now when you want to publish a message to a specific client, you set the clientId property to the id of that client. If you want to broadcast, you set it to 'EVERYONE'.
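A minimal subscriber using that scheme might look like this sketch (the broker URL, topic name and client id are placeholders):

import javax.jms.Connection;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.Topic;
import org.apache.activemq.ActiveMQConnectionFactory;

// One shared topic; each client subscribes with a selector on the clientId property.
public class ClientSubscriber {

    public static void main(String[] args) throws Exception {
        String myClientId = "clientA";

        Connection connection =
            new ActiveMQConnectionFactory("tcp://localhost:61616").createConnection();
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Topic topic = session.createTopic("orderConfirmation");

        // Only messages addressed to this client (or to everyone) are delivered.
        String selector = "clientId IN ('" + myClientId + "', 'EVERYONE')";
        MessageConsumer consumer = session.createConsumer(topic, selector);

        consumer.setMessageListener(message -> System.out.println("Got: " + message));
    }
}

On the publishing side you would simply call message.setStringProperty("clientId", ...) with the target client's id (or 'EVERYONE') before sending to the shared topic.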
I hope I understand the problem right...

WCF routing and various endpoints

I have a little problem and don't know where to start.
I need to make a subscription service that returns a unique address to which the consumer will send SOAP messages after subscribing. It works like this: you send a SOAP request to the address http://foo.org/Subscribe and in response you get the address http://foo.org/SubscriptionManager/1; the next consumer will get http://foo.org/SubscriptionManager/2, and so on.
How can I implement that via WCF? I guessed that WCF has something like ASP.NET routing, where I could route links like http://foo.org/SubscriptionManager/ and access the trailing 2 as a parameter, but I haven't found anything like that.
I look forward to any help.
The question I have is why do you want to route users to different endpoints?
The whole idea of returning a service URI for the consumer to call is not good design in my opinion.
You are forcing your consumers to do more work - they must make an extra call and interrogate the response just to find out which endpoint they have to call.
If your requirement is to spread load between two services you should offer a single load-balanced endpoint which will then send requests to the other endpoints.
Alternatively, if your requirement is to route certain users to one or other of the subscription services based on some rules then you can have a look at WCF-Routing.

What is the best way to route NServiceBus messages to specific clients?

Let's say I have a ClientRequestMessage message that contains a request for a specific Client. A web application will generate these requests and they need to be sent to the correct Client for handling. I can think of a few options for this.
I could have a single queue that all messages go to and specific client handlers check a property (like ClientId) to decide whether they care about it. This feels wrong on many levels to me though.
I could publish a message to all of the clients and they could decide whether or not they care about it during handling. This seems like too much traffic and wastes each client's time handling messages they shouldn't care about in the first place though.
I could have client-specific queues that these messages get routed to. This one feels the best to me, but I am unsure of how to do it. I'd like to keep it simple and avoid client-specific message types, but I am not sure how to tell NServiceBus "for client A send it to client A's queue, and for client B send it to client B's queue".
So my question is, what is the best (most efficient? easiest to manage?) way to set this up? I am pretty sure I need to use the distributor, but not positive so thought I would ask.
BONUS QUESTION:
Let's say each client has multiple handlers. How can I make sure only one of them handles a given message? Would I need a distributor per client?
If what you really want is a solution that lets you have just a single message type, with a filter on the message based on clientId so it is only routed to the client it relates to, then I would use PServiceBus (pservicebus.codeplex.com). It makes it easy to specify a set of subscriptions for each of your clients, where their messages are all filtered by clientId into a specific queue or whatever transport you have available. The example below shows filtering a ChatTopic by the UserName property; the subscriber only receives the message at the specified transport when the published message's UserName property is not TJ. You are also allowed to use complex filters, such as GreaterThan("MyComplexProperty.Blah.ID", 5).
Subscriber.New("MyUserName").Durable(false)
    .SubscribeTo(Topic.Select<ChatTopic>().NotEqual("UserName", "TJ"))
    .AddTransport("Tcp",
        Transport.New<TcpTransport>(
            transport => {
                transport.Format = TransportFormat.Json;
                transport.IPAddress = "127.0.0.1";
                transport.Port = port;
            }), "ChatTopic")
    .Save();
You can tell NSB where to put messages by using the MessageEndpointMappings configuration section. You can map a specific message type or a whole assembly to a queue. If you don't want to create specific message types and map them, then I would recommend the publish approach. The overhead of removing a message from the queue is pretty minimal.
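For illustration only, a mapping in the endpoint's app.config could look roughly like this, assuming client-specific message assemblies and queue names (all names here are placeholders):

<UnicastBusConfig>
  <MessageEndpointMappings>
    <!-- Everything in the hypothetical ClientA.Messages assembly goes to client A's queue -->
    <add Messages="ClientA.Messages" Endpoint="ClientAQueue" />
    <add Messages="ClientB.Messages" Endpoint="ClientBQueue" />
  </MessageEndpointMappings>
</UnicastBusConfig>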
If your "client" has many instances of NSB to pick up messages then you will need to use a Distributor. Check out the distributed Pub/Sub documentation.

How a WCF request can be correlated with multiple Workflow instances?

The scenario is as follows:
I have multiple clients that can register themselves on a workflow server, using WCF requests, to receive some kind of notifications. The notification information will be received from an external system using another receive activity. The workflow should then take the notification information and call back all registered clients using a send activity and callback correlations (the clients expose callback interfaces implemented on their side, and the endpoint addresses are passed initially with the registration requests). A "long-running workflow service" approach is used with a persistent store.
Now, I'm looking for some way to correlate the incoming notification information received from the external system with the workflow instances persisted earlier, when the registration requests were made, so that all clients will be notified using the endpoints that were passed with those requests. Is WF 4.0 capable of resuming and executing multiple workflow instances when the notification information is received, without my storing the endpoints manually and looping through them? If yes, how can I do that?
Also, if my approach is not correct, then please advise me on the best practice for building such a system with WCF services.
Your help is highly appreciated.
When you use request correlation with workflow services the correlation key must always match a single workflow instance, you can't have multiple workflow instances react to a single message. So you either need to multicast the message using all the different correlation keys or resume you workflow instances in some other way. That other way could be to store the request somewhere, like a SQL table, and have the workflows periodically check that location if they need to notify the client.