Is it possible to pass a parameter in the data source of CDA - MDX?

I have master/slave databases and want to parameterize the URL of the data access in the CDA document.
I want to do something like this: if the master is down, change the IP from my web application and route to the slave DB, or vice versa.
"jdbc:mysql://"+${IP}+"/warehouse_dev"
I don't want to write the same query again and again for different data sources.

As far as I understand, you're talking about the URL value for the nonXMLBody element. As far as I know, the standard does not define any parameter notation, so maybe the best solution is to manage it at the network level, using for example a load balancer or some other network solution that offers this service.
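That way the data access keeps a single, fixed connection string, and switching between master and slave happens behind a virtual hostname or IP that the balancer (or your own switching logic) repoints. As a rough illustration, with db-vip.example.com being a placeholder address owned by the balancer, the URL would simply be:

    jdbc:mysql://db-vip.example.com:3306/warehouse_dev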
Hope this is useful.
Martí

Related

How to connect multiple Parse servers to the same MongoDB?

I would like to have two separate Parse servers (each configured with a different app ID) connect to the same MongoDB so they can see the same set of users, so that I can create 2 different apps that share the same user base.
Is this something Parse would support? Are there any expected conflicts or config caveats? I was unable to find info about this on Parse's GitHub.
Thanks
There's nothing to do besides setting the database URL option to the same value on both servers and making sure your database is accessible from both servers.
I'm not sure why you would need two different applicationIds, as you want the same data and, most likely, the same logic running on both apps.
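Roughly, the two servers would be bootstrapped the same way and differ only in their app ID and keys. A minimal sketch, in which the hostnames, keys, and ports are placeholders and the exact mounting code depends on your parse-server version:

    // index.js for the first instance (values below are placeholders)
    const express = require('express');
    const { ParseServer } = require('parse-server');

    const api = new ParseServer({
      databaseURI: 'mongodb://mongo-host:27017/shared-db', // same URI on both instances
      appId: 'appA',                                       // the second instance would use 'appB'
      masterKey: 'masterKeyA',
      serverURL: 'http://server-a:1337/parse',
    });

    const app = express();
    app.use('/parse', api); // newer parse-server versions require api.start() and mounting api.app
    app.listen(1337);

The second instance is identical apart from its appId, masterKey, and serverURL; both then read and write the same collections in the shared database.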
No, Parse Server does not support sharing classes between applications.
What you could do is have one of the instances (or maybe a third one) handle authentication and store your user information. I am pretty sure this would mean you would have to manually set user info on your requests and on the objects you save on the other two instances.
Another option is for each of the instances to have an afterSave hook on the user class that saves and updates the info on the other instance (sketched below). This seems easier to do and maintain.
I would choose the second option.
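A rough Cloud Code sketch of that second option, with placeholder URLs, keys, and field names; in practice you would also need to guard against the two hooks echoing changes back and forth:

    // cloud/main.js on instance A: push user changes to instance B
    Parse.Cloud.afterSave(Parse.User, async (request) => {
      const user = request.object;
      await Parse.Cloud.httpRequest({
        method: 'POST',
        url: 'http://server-b:1337/parse/functions/syncUser', // peer instance (placeholder)
        headers: {
          'X-Parse-Application-Id': 'appB',
          'Content-Type': 'application/json',
        },
        body: { username: user.get('username'), email: user.get('email') },
      });
    });

    // cloud/main.js on instance B: receive the update and upsert the local user
    Parse.Cloud.define('syncUser', async (request) => {
      const { username, email } = request.params;
      const query = new Parse.Query(Parse.User);
      query.equalTo('username', username);
      let user = await query.first({ useMasterKey: true });
      if (!user) {
        user = new Parse.User();
        user.set('username', username);
        user.set('password', 'generated-placeholder'); // new users still need a password
      }
      user.set('email', email);
      await user.save(null, { useMasterKey: true });
    });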

Oracle WebCenter Content: what is the difference between the URLs "/wcc" and "/cs"?

My company decided to migrate to Oracle Fusion Middleware, and we are starting to use the Oracle WebCenter components, especially WCC (v11.1).
We can access our documents via 2 different entry points:
http://server:port_1/wcc
http://server:port_2/cs (and this URL is also accessed via RIDC)
The GUIs are very different from one app to the other, but the main functionality seems to be the same in both: I can browse, view, and download stored documents, I have access to the metadata, and I can do an advanced search or upload a file...
Why does Oracle provide 2 web apps to manage the content? Why are these URLs on 2 different WebLogic servers (or domains)? Should we use one address rather than the other?
Thank you.
They are simply two different GUIs. Some functionality is available in one that may not be available in the other.
Use whichever interface best fits your needs (including what users may prefer).
/cs/ is for the native (original) UI.
/wcc/ is for the new(ish) ADF WebUI. Also see this blog post.

Merge two Endeca Servers (Endeca 3.1) into one, including their current data

Let me explain in more detail:
1st: I'm running Endeca 3.1, so "Endeca Server" here refers to 3.0's Data Domain.
I'm required to use an Endeca Server that already exists (it came with a demo VM I downloaded). All the info on it, including groups, attributes, and data, must be merged into our Endeca Server. (It can also be the other way around; I could merge my Endeca Server into this one.)
So far, I've tried to do the following:
1) Clone the Endeca Server
2) Use the putCollection config operation to create a collection on it with the same name I have on mine.
3) Load configurations using the LoadCollection & LoadAttributes graphs from the OEID POC Template 3.1. I point to the new collection in the Configuration.xls file.
This is where I encounter an issue. The LoadAttributes graph gets a timeout message from the server's web service. Then the config WSDL becomes inaccessible for a while. I can't get beyond this point.
I've been able to load data into the collection, but I need to load the attributes first.
Thanks in advance for your replies.
Regards
There are a few techniques.
Have you tried exporting the data domain and then importing it?
You can use the endeca-cmd tools to export to a file, and then import from that file. This would enable you to add 2 datastores into one server.
If you want to combine 2 datastores, then that is a different question.
The simplest approach in 3.1, if the data collections are small: extract them as CSV (via a data table), convert to XLS, and add them via self-provisioning into separate collections within a single data store. If you are running in the VM, this is potentially the easiest approach.
This can also be done using Integrator.
You don't need to load the attributes unless you are using multi-value types. You can call against the Conversation Web Service to extract data and then load it using 'bulk-load'. I would not worry too much about creating the attributes unless this becomes essential due to their type or complexity. If you cannot call against the Conversation Web Service, then again extract as CSV and load using Integrator.

BizTalk 2010 - Using external source for credentials

On my BizTalk server I use several different credentials to connect to internal and external systems. There is an upcoming task to change the passwords for a lot of systems and I'm searching for a solution to simplify this task on my BizTalk server.
Is there a way I could adjust the File/FTP adapters to read the information from an XML file, so that I can change it only in the XML file and everything will be updated? Or is there an alternative I could use, such as PowerShell?
Has someone else had this task as well?
I'd rather not create a custom adapter, but if there is no alternative I will go for that. Using dynamic credentials for the send port can be solved with an orchestration, but I need this for the receive port as well.
You can export the bindings of all your applications. All the passwords for the FTP and File adapters will be masked out with a series of * (asterisks).
You could then edit your binding down to just those ports you want to update, replace the masked-out passwords with the correct passwords, and, when you want the passwords changed, import them.
Unfortunately, unless you have already prepared tokenised binding files, the above is a manual effort.
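If you do go down the tokenised route, the substitution itself is easy to script with any text-processing tool. As a rough sketch (plain Node.js, no BizTalk tooling), assuming you have prepared a binding export in which each password is replaced by a made-up token of the form $(TokenName):

    // fill-bindings.js - substitute $(TokenName) placeholders in a tokenised binding export
    const fs = require('fs');

    // values would typically come from a vault or environment variables; names here are examples
    const credentials = {
      FtpProdPassword: process.env.FTP_PROD_PASSWORD,
      PartnerSftpPassword: process.env.PARTNER_SFTP_PASSWORD,
    };

    let binding = fs.readFileSync('MyApp.BindingInfo.tokenised.xml', 'utf8');

    // replace every $(TokenName) occurrence with its value, failing loudly if one is missing
    binding = binding.replace(/\$\((\w+)\)/g, (match, name) => {
      if (!credentials[name]) throw new Error(`No value provided for token ${name}`);
      return credentials[name];
    });

    fs.writeFileSync('MyApp.BindingInfo.xml', binding);

The resolved file is then what you import. (Values containing XML special characters would of course need escaping first.)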
I was going to recommend that you take a look at Enterprise Single Sign-On, but on second thought, I think you probably just need to 'bite the bullet' and make the change in the various adapters.
ESSO would be beneficial if you had a single adapter with multiple endpoints/credentials, but I infer from your question that this isn't the case (i.e. you're not just using a single adapter). I also don't think re-writing the adapters to include functionality to read usernames/passwords from a file is feasible IMHO - just changing the passwords would be faster, by an order of weeks or months ;-)
One option that is available to you, however, depending on which direction the adapter is being used in: if you need to change credentials on send adapters, you could consider setting usernames/passwords at runtime via the various Adapter Property Schemas (see http://msdn.microsoft.com/en-us/library/aa560564.aspx for the FTP Adapter properties, for example). You could then easily create an encoding send pipeline component that reads an XML file containing credentials and updates the message context properties accordingly; the message would then be sent with the appropriate credentials to the required endpoint.
There is also the option of using ESSO as your (encrypted) config store instead of XML files, a database, etc. Richard Seroter has a really good post on this from way back in 2007 (it's still perfectly valid, though).

How to handle multiple data sources in one WCF Domain Service?

I'm working on creating a WCF Domain Service which at the moment provides access to a database. I created the Entity Model, added the DomainService (LinqToEntitiesDomainService) and everything works so far.
But there are cases when my data doesn't come from the DB but from somewhere else (for instance, an uploaded file). Are there any best practices out there for handling these different data sources properly without resorting to writing two completely different data providers? It would be great to access both types through one interface. Is there already something I can use?
I'm fairly new to this, so any advice apart from that is highly appreciated.
In how many cases does the data come from a file? How many files? How will you know if a file is there? Are you going to poll the directory? What format are the files? (XML support is possible.)
Microsoft's documentation suggests that you can create a custom host endpoint, but I don't know what limitations there are.