RavenDB with NServiceBus for multiple endpoints

I have multiple endpoints and one MVC web project.
How should I use RavenDB?
I want to keep one saga project as an endpoint, alongside several other endpoints.
No endpoint needs persistence except the saga project: it is the process controller, so it has to store saga data. It manages the process and updates the saga data, and that is what I want to use RavenDB for.
However, I believe I have to set up persistence for each endpoint, which means instantiating a document store for every one of them unnecessarily. Is that correct?
Which option should I pick: embedded RavenDB or RavenDB hosted in IIS?
I would prefer embedded RavenDB since it does not need a port to be opened on the server.
I think that would also make server deployment easier. Is that true?
Do I have to create a new document store instance for each endpoint while configuring the bus? I am not clear on how to manage the database across the endpoints.
Is there a sample?
How should I proceed? Any thoughts?

I am using six endpoints, and for each endpoint I am instantiating a new RavenDB document store with a unique ResourceManagerId. I would have liked to use embedded RavenDB, but I am not sure how to use the embedded version with six endpoints and one MVC project: which process would act as the server, where the data folder should live, and so on.
Development is going well so far, but I am not sure how the server setup will look with NServiceBus and IIS-hosted RavenDB.
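Here is roughly what I have per endpoint today, as a minimal sketch (assuming NServiceBus 5.x with the NServiceBus.RavenDB package; the server URL, database name, endpoint classes and ResourceManagerId are placeholders). Only the saga endpoint points at RavenDB; the other endpoints, which store nothing, fall back to in-memory persistence:

```csharp
// Minimal sketch, assuming NServiceBus 5.x with the NServiceBus.RavenDB package.
// The server URL, database name, endpoint classes and ResourceManagerId are
// placeholders, not values from a real system.
using System;
using NServiceBus;
using NServiceBus.Persistence;
using Raven.Client.Document;

// The one endpoint that actually needs storage: the saga / process controller.
public class SagaEndpointConfig : IConfigureThisEndpoint
{
    public void Customize(BusConfiguration busConfiguration)
    {
        // One RavenDB server (standalone or IIS-hosted), one database per
        // endpoint that needs persistence, one unique ResourceManagerId each.
        var documentStore = new DocumentStore
        {
            Url = "http://ravendb.example.local:8080",
            DefaultDatabase = "ProcessController.Sagas",
            ResourceManagerId = new Guid("2f2c3321-f251-4975-802d-11fc9d9e5e37")
        };
        documentStore.Initialize();

        busConfiguration.UsePersistence<RavenDBPersistence>()
                        .SetDefaultDocumentStore(documentStore);
    }
}

// The other endpoints store nothing, so they don't need a document store at all.
public class WorkerEndpointConfig : IConfigureThisEndpoint
{
    public void Customize(BusConfiguration busConfiguration)
    {
        busConfiguration.UsePersistence<InMemoryPersistence>();
    }
}
```

My understanding is that embedded RavenDB keeps the data files inside a single host process, so six endpoints plus an MVC project could not share one embedded instance, which is why a standalone or IIS-hosted RavenDB server that the saga endpoint reaches over HTTP looks like the simpler deployment to me.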

Related

Register Externally hosted app in PCF Service Registry

I am trying to add an ASP.NET 4.x app hosted externally (using AWS Elastic Beanstalk) into the Service-registry of an existing PCF.
Edit: Is this possible? If so, can someone give me an example of how this can be done?
Assuming you have network connectivity in all directions between apps in PCF and the external app, yes this should be quite possible.
However, if you're using Spring Cloud Eureka, your externally-hosted app will need to get valid OAuth credentials so that it can authenticate prior to registering.
The comment by Daniel Mikusa is very appropriate for how I achieved this.
For Pivotal SCS, you would want to create a service instance (if you don't have one already), then create a service key for your external app. That will give you all of the binding info/creds you need to connect from your remote service. A service key is the same as binding a service to an app, except it's not tied to an app, so it works well for situations like this. Just give your service key a good name, so you know that it's being used by an external app when you come back and see it a year from now.
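To make the comment concrete, here is a rough sketch of what the external app ends up doing: fetch an OAuth token using the credentials from the service key (created with `cf create-service-key`), then register itself via the registry's Eureka REST API. Every URL, client id and app name below is a placeholder for the values in your own service key, and a library such as Steeltoe's Eureka client can do the same with less hand-rolled code.

```csharp
// Rough sketch only: self-registration of an externally hosted ASP.NET app
// with a Spring Cloud Services registry. All URLs, client ids, IPs and the
// app name are placeholders for the values found in the service key.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class EurekaSelfRegistration
{
    static async Task Main()
    {
        var http = new HttpClient();

        // 1. client_credentials grant against the UAA endpoint from the service key.
        var tokenResponse = await http.PostAsync(
            "https://uaa.sys.example.com/oauth/token",            // access_token_uri (placeholder)
            new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "client_credentials",
                ["client_id"] = "service-registry-client-id",     // from the service key
                ["client_secret"] = "service-registry-client-secret"
            }));
        var tokenJson = await tokenResponse.Content.ReadAsStringAsync();
        var accessToken = ExtractAccessToken(tokenJson);

        // 2. Register the instance with Eureka's REST API.
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        var instance = @"{ ""instance"": {
            ""hostName"": ""myapp.us-east-1.elasticbeanstalk.com"",
            ""app"": ""EXTERNAL-APP"",
            ""ipAddr"": ""203.0.113.10"",
            ""vipAddress"": ""external-app"",
            ""status"": ""UP"",
            ""port"": { ""$"": 80, ""@enabled"": ""true"" },
            ""dataCenterInfo"": { ""@class"": ""com.netflix.appinfo.InstanceInfo$DefaultDataCenterInfo"", ""name"": ""MyOwn"" }
        } }";
        var register = await http.PostAsync(
            "https://eureka-1234.sys.example.com/eureka/apps/EXTERNAL-APP", // registry uri (placeholder)
            new StringContent(instance, Encoding.UTF8, "application/json"));
        Console.WriteLine(register.StatusCode);
    }

    // Trivial parse to keep the sketch dependency-free; use a JSON library in real code.
    static string ExtractAccessToken(string json) =>
        json.Split(new[] { "\"access_token\":\"" }, StringSplitOptions.None)[1].Split('"')[0];
}
```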

Does Openshift Origin 1.1 REST API allow to create new applications based on templates?

We are developing a custom console to manage development environments. We have several application templates preloaded in OpenShift, and whenever a developer wants to create a new environment, we would need to tell OpenShift (via the REST API) to create a new application based on one of those templates (what oc new-app template does).
I can't find anything in the REST API specification. Is there any alternative way to do this?
Thanks
There is no single API that today creates all of that in one go. The reason is that the create flow is intended to span multiple disjoint API servers (today, Kube and OpenShift resources can be created at once, and in the future, individual Kube extensions). We wanted to preserve the possibility that a client was authenticated to each individual API group. However, it makes it harder to write easy clients like this, so it is something we plan on adding.
Today the flow from the CLI and web UI is:
1. Fetch the template.
2. Invoke the POST /processedtemplates endpoint.
3. For each "object" returned, invoke the right create call.

Creating a content hub and client application using Piranha CMS

First off, I need to mention that I'm not sure if what I'm trying to achieve is even supported by Piranha CMS (that's partly what I'm trying to determine here). They mention the ability to create a standalone content hub on their website, but my assumptions of what is possible with that model might be incorrect. What I've already done is created an ASP.NET MVC application that is hosting Piranha CMS and I've published it to Azure websites for testing purposes--that part works as expected. The content management interface is the only user facing piece here--it is meant only to serve as the content hub for the client application (just the one for now as this is just proof of concept work).
I am now trying to build a client ASP.NET MVC application that pulls content from the hub. This is where I'm thinking that my assumptions may have been wrong. I was thinking that I'd be able to install the Piranha CMS nuget package(s) on the client as well, and I'd be able to configure the framework to get content from the hub in the same way that it would if the content were hosted on the client site. I realize that I could get the content from the hub using Piranha's REST api, but what I want to do is to be able to use the more friendly entity model based api for this.
So my question is whether it is possible (within reason) to setup Piranha CMS in the way that I've described. If it is, how exactly do I configure the client such that it is aware of the location of the content hub?
There is currently no .NET client API that consumes the REST services, as the simplest scenario would be to deploy the .NET applications together with the server. In the setups I've done, native apps and HTML5 knockout/angular applications have used the REST APIs for getting JSON data. You should, however, be able to write such a module yourself, performing the HTTP calls and deserializing the JSON, without any problems.
Regards
Håkan
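For what it's worth, such a module could look roughly like the sketch below. The route and the DTO shape are placeholders rather than Piranha's actual REST contract, so substitute the endpoints documented for your hub.

```csharp
// Minimal sketch of a hand-rolled client module: call the hub's REST API over
// HTTP and deserialize the JSON into your own DTOs. The "rest/content/get"
// route and the ContentDto shape are placeholders, not Piranha's real contract.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class ContentDto   // placeholder shape for whatever the hub returns
{
    public Guid Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
}

public class ContentHubClient
{
    private readonly HttpClient _http;

    public ContentHubClient(string hubBaseUrl)
    {
        _http = new HttpClient { BaseAddress = new Uri(hubBaseUrl) };
    }

    public async Task<IList<ContentDto>> GetContentAsync()
    {
        // Hypothetical REST route on the hub; substitute the real one.
        var json = await _http.GetStringAsync("rest/content/get");
        return JsonConvert.DeserializeObject<List<ContentDto>>(json);
    }
}
```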

Versioning WCF service with a routing service

I have been tasked with working out a versioning strategy for a new suite of WCF services. Much of the reading I have done recommends the use of an intermediate routing service to forward on calls to the appropriate service version. I understand how this works but am questioning how much benefit this gives in terms of shielding clients from breaking changes.
For example, I have two versions of a service with different endpoints:
mycompany.com/API/v1.0/GetCustomer
mycompany.com/API/v2.0/GetCustomer
A routing service solution would look for a custom 'version' header in the SOAP message and route the call to the appropriate service. If no version header is found then the latest version could be used.
Alternatively, I could just expose both endpoints and the consumer can call the required version.
How is one solution better than the other? Adding the routing service seems to add an extra level of configuration. Both solutions require the consumer to change their code if they wish to upgrade...
It may be better to ask this in a separate post but maintaining multiple versions of a service means we will need concurrent versions of contracts in our source code. Is there any recommended way of managing this (e.g. namespacing)?
Thanks
You can use routing in combination with WCF Action filtering. You can look at http://www.dotnetcurry.com/ShowArticle.aspx?ID=470
Alternatively, you could rebuild your service so that the GetCustomer operation takes a version parameter, and maintain it as a single service.
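On the side question about keeping concurrent contract versions in source, one common arrangement is a CLR namespace per version, with each contract carrying its own WCF namespace, so the v1.0 and v2.0 endpoints can be exposed side by side (directly or behind a routing service). A sketch with hypothetical names:

```csharp
// Sketch only: version-specific CLR namespaces, each contract with its own
// WCF namespace, so both endpoint versions can live in one solution.
using System.Runtime.Serialization;
using System.ServiceModel;

namespace MyCompany.Api.V1
{
    [ServiceContract(Namespace = "http://mycompany.com/API/v1.0")]
    public interface ICustomerService
    {
        [OperationContract]
        CustomerDto GetCustomer(int id);
    }

    [DataContract(Namespace = "http://mycompany.com/API/v1.0")]
    public class CustomerDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }
}

namespace MyCompany.Api.V2
{
    [ServiceContract(Namespace = "http://mycompany.com/API/v2.0")]
    public interface ICustomerService
    {
        [OperationContract]
        CustomerDto GetCustomer(int id);
    }

    [DataContract(Namespace = "http://mycompany.com/API/v2.0")]
    public class CustomerDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
        [DataMember] public string Email { get; set; }   // field added in v2.0
    }
}
```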

Multiple service with Single endpoint in Client side (Consuming Application)

I have multiple WCF services, but when consuming them in the web application each service has its own endpoint. I need a single endpoint in my web application to access all the services. Is there any other way besides a partial class?
No, this can't be done. You can't merge multiple services into one endpoint.
You mention a partial class definition. That actually means you merge your different services together and expose them as one single service. This can of course be done and would give you one single access point (since it's just one service).
This way you could still logically separate the code into different files and let them each implement a single interface. The service as a whole would implement a combined interface of all the different elements, and that's what you would expose as a service endpoint.
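A sketch of that combined-contract arrangement, with hypothetical names, might look like this:

```csharp
// Sketch only: each concern keeps its own interface and its own file via a
// partial class, while the host exposes one endpoint with the combined contract.
using System.ServiceModel;

[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    string GetCustomer(int id);
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    string GetOrder(int id);
}

// The single contract exposed on the one endpoint.
[ServiceContract]
public interface ICombinedService : ICustomerService, IOrderService { }

// CustomerService.cs
public partial class CombinedService : ICombinedService
{
    public string GetCustomer(int id) => "Customer " + id;
}

// OrderService.cs
public partial class CombinedService
{
    public string GetOrder(int id) => "Order " + id;
}
```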
--EDIT--
I personally wouldn't merge the services into one big service, because that one big service will have to process all the incoming requests. What if you want a certain configuration setting for one service, or you want to do some load balancing and move several services to another server?
If the problem is maintaining the web.config files, you should have a look at web.config transformations. That way you can automate the process of configuring your application for different deployment scenarios.