Mule ESB pluggability

I am new to Mule ESB. I want to know whether I can upgrade a Mule application without redeploying it. I am talking about pluggability. Suppose my application is already running and now some more features or a new client flow need to be added. Can I add this new flow like a plugin? Is it possible to do this without downtime?
For example, in my Mule application I have used one HTTP connector to connect with one client.
Now, if a new HTTP connector needs to be added, can I do it without redeploying?

You can now modify your configuration files and custom classes and have them reloaded without having to restart Mule.
Mule checks every three seconds for updated configuration files under the $MULE_HOME/apps directory, and when it finds one, it reloads the configuration file and the JARs in that application's lib directory.
https://docs.mulesoft.com/mule-user-guide/v/3.2/hot-deployment
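For example (a minimal sketch with illustrative names, assuming the application already declares the HTTP namespace and has an existing listener config): adding a flow like the one below to the deployed application's configuration file and saving it should be picked up on Mule's next poll, without restarting the runtime.
<!-- fragment added to an already-deployed app's config file; Mule reloads the file when its timestamp changes -->
<flow name="newClientFlow">
    <http:listener config-ref="existingHttpListenerConfig" path="/new-client" doc:name="HTTP"/>
    <logger message="Handling the newly added client flow" level="INFO" doc:name="Logger"/>
</flow>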

You can "try" to update or replace an application folder contents and touch/modify the application’s configuration file to have Mule reload the config and automatically re-deploy the application.

No, you can't. What you can do is deploy a new application and use the same HTTP connector, if you define it in a common domain for both applications.
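As a sketch (the names, port and namespace declarations are illustrative and trimmed, assuming Mule 3.x with the HTTP connector): the shared listener configuration goes into the domain's mule-domain-config.xml, and each application references it by name.
<!-- mule-domain-config.xml of the shared domain (schemaLocation omitted for brevity) -->
<domain:mule-domain
        xmlns="http://www.mulesoft.org/schema/mule/core"
        xmlns:domain="http://www.mulesoft.org/schema/mule/ee/domain"
        xmlns:http="http://www.mulesoft.org/schema/mule/http">
    <!-- both applications reference this config via config-ref on their http:listener -->
    <http:listener-config name="sharedHttpListenerConfig" host="0.0.0.0" port="8081"/>
</domain:mule-domain>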

This is not possible on a single server. You can achieve it through blue-green deployment: martinfowler.com/bliki/BlueGreenDeployment.html .
To make this work you need two servers and a proxy in front. You take one server offline in the proxy, update it, then re-enable it. Then you do the same with the second server. We are facing the same issue in Talend ESB.

It's not possible, but you can add the new flow and redeploy the application.

Related

WSO2 ESB HTTP Endpoint does not get added to the Synapse configuration

I have built an ESB project with an HTTP endpoint, but for some reason the endpoint that I have defined and added to the ESB solution does not seem to be reflected in the project and is not picked up when I deploy to the server. The endpoint is basically not being used. I have also checked under the Defined Endpoints tab in the server's Enterprise Integrator console,
Home > Manage > Service Bus > Endpoints
but it isn't there. Numerous restarts have not helped, and neither has undeploying and redeploying the .car file. Can someone point out where I might have gone wrong? As usual, thanks in advance.
It could be different things; things to check:
1. Extract the .car file (it's just a .zip, so rename it or use 7zip to extract it) and see if your endpoint is there; a minimal endpoint artifact looks like the sketch below.
2. Check if the serverRole in the pom file is correct (it should be EnterpriseIntegrator, I think).
3. Sometimes renaming artifacts causes problems, as the file does not get renamed correctly or a reference is not updated in one of the project files. Try removing the endpoint and use 'search in files' to remove any lingering references in pom files.
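For reference, a minimal sketch of what the endpoint artifact inside the .car might look like (the name and URL are illustrative, assuming a plain HTTP endpoint):
<?xml version="1.0" encoding="UTF-8"?>
<!-- <endpoint-name>.xml inside the extracted .car; if this file is missing, the endpoint was never packaged -->
<endpoint name="BackendHttpEndpoint" xmlns="http://ws.apache.org/ns/synapse">
    <http method="get" uri-template="http://backend.example.com/api/resource"/>
</endpoint>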

Handle multiple connectors in one Mule application using a Mule domain project

I have an application which consists of two HTTP connectors with different hosts and ports. How do I handle this with a shared resource, i.e. a Mule domain project?
Have a look at https://docs.mulesoft.com/mule-user-guide/v/3.8/shared-resources.
The general idea is simple:
1. Create a domain project (in AnypointStudio: New -> Mule Domain project)
2. Move your connector configuration from the project to the domain project (use cut/paste in XML, not the graphical editor)
3. Reference the domain project from your Mule project (set the domain property in mule-deploy.properties).
And don't forget for deployment: The domain must be deployed before you deploy your project.
@Kishan Kumar Soni, the referenced documentation explains how to use shared resources with Mule and is not meant for a single connector only. You can move your two http:listener-config elements into the shared resources config (domain-config) file, make sure they have unique names, then reference them in your application(s) as desired. It will work.
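A minimal sketch of the application side (the flow and config names are illustrative), assuming the two listener configs now live in the domain as described above:
<!-- application config: the listener configs are defined in the domain and only referenced here -->
<flow name="clientAFlow">
    <http:listener config-ref="clientA_HTTP_Listener" path="/clientA" doc:name="HTTP"/>
    <logger message="Request received on the clientA listener" level="INFO" doc:name="Logger"/>
</flow>
<flow name="clientBFlow">
    <http:listener config-ref="clientB_HTTP_Listener" path="/clientB" doc:name="HTTP"/>
    <logger message="Request received on the clientB listener" level="INFO" doc:name="Logger"/>
</flow>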

How to manage database credentials for a Mule project

I am using the Database connector component, with the Vault component to store the database credentials. As per the documentation of both components, I have created a separate properties file for each environment to store that environment's encrypted credentials.
Following is the structure of my Mule project.
Now the problem with this structure is that I have to build a new deployable zip file whenever I have to update the database credentials for any environment.
I need a solution where I can keep all credentials encrypted and centralized, and where I don't have to create a build every time after updating the credentials. We can afford to restart the server, but building a new zip and deploying it is really cumbersome.
The second problem with this approach is that a developer needs to know the production DB credentials to update them in the properties file, which is also a security issue.
Please suggest an alternate approach to credentials management for Mule projects.
I'm going to recommend you do NOT try to change the secure solution provided to you by MuleSoft. To alleviate the need for packaging and deployment, you would have to extract the properties files outside of the deployment, and this would be a huge risk. Regardless of where you store the property files within the deployment, if you change the files you have to package and re-deploy. The only way around your problem is moving the files outside of the deployment and securely storing them. Mule has provided a solution; while it may be cumbersome, they are securing these files first with encryption and secondly within the server container. You can move the property files out, but you would have to provide a custom implementation, and you would be assuming great risk to your protected resources.
Set a VM argument, e.g. environment.type=local, for your local machine in Anypoint Studio.
Read this variable wherever you are reading your properties file, so that the environment type is resolved dynamically, as below:
<secure-property-placeholder:config location="classpath:properties/sample-app-${environment.type}.properties" doc:name="Secure Property Placeholder"/>
In order to set the environment type on your production server (or wherever you are using the Mule runtime), open \conf\wrapper.conf and add the argument wrapper.java.additional.<n>=-Denvironment.type=production. If you already have additional properties in this file, you may need to set the value of <n> appropriately, for example 13 or 14.
This way you don't need to generate different deployment artefacts for different environments, because the correct properties file is picked up via the environment-specific VM argument.
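Putting it together, a minimal sketch (the property names, encryption key and the MySQL configuration are illustrative; adjust to your connector and Mule version):
<!-- environment-specific encrypted properties, resolved via the VM argument -->
<secure-property-placeholder:config name="Secure_Property_Placeholder"
    key="${runtime.encryption.key}"
    location="classpath:properties/sample-app-${environment.type}.properties"
    doc:name="Secure Property Placeholder"/>
<!-- the database config reads the decrypted values; only the properties file differs per environment -->
<db:mysql-config name="MySQL_Configuration"
    host="${db.host}" port="${db.port}"
    user="${db.user}" password="${db.password}" database="${db.database}"
    doc:name="MySQL Configuration"/>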

How to share connector configurations between applications in Mule ESB?

Instead of using a Mule domain project (which supports sharing connector configurations only for a limited set of connectors, such as JMS and HTTP), I need to share the connector configuration of the Object Store connector between applications. I am trying to access common data in multiple applications, hence I need this. Please help me.
Try session objects; you can send session data between different Mule applications.
You could use configuration properties through a properties file in the domain project, such as domain.properties. Example properties:
domain.value1=true
domain.value2=my text
This file would go under the src/main/resources folder of your domain project.
You can refer to this property file in your application through the global configuration element property-placeholder. Example:
<context:property-placeholder location="..\..\domains\<domain project name>\domain.properties"/>
Then simply refer to a property using ${domain.value1}.
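A minimal usage sketch (the flow and logger are illustrative), assuming the property-placeholder above has been declared in the application:
<!-- any attribute in the application's config can now use the shared values -->
<flow name="showSharedPropertyFlow">
    <logger message="Shared text from the domain: ${domain.value2}" level="INFO" doc:name="Logger"/>
</flow>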

Need advice regarding deployment on multiple remote machines

Currently I am using MSDeploy to build and deploy on several machines using TeamCity. In my current scenario, I need to build, package and deploy on Dev. After this I need to deploy the same package on the Test and Live servers (which are on different domains). I understand how we do it, but the problem is that web.config transformation for the Test and Live configs only happens when we build a package for them. It means the package that was created for Dev cannot be reused, as the web.config transformation only ran for the Dev web.config. I also know that we can change the web.config when un-packaging, but those parameters are very limited. We have a lot of changes, not just the connection string or DB changes.
Another option is to add a step that builds packages for Test and Live as part of the Dev deployment, but that means a lot of copying to remote servers, once for Test and once for Live, which is very time consuming because of the different domains.
Can you please advise on the best solution in this scenario, so that I can use TeamCity to publish to Dev, Test and Live using the same package with different web.configs in one go?
To configure items at deployment time which are not automatically created for you, you can add a file named parameters.xml to your project and extend what you want to make available at deployment time.
Here's some documentation on the approach: Using Deployment Parameters for Web.Config File Settings.
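A minimal sketch of such a parameters.xml (the parameter name, connection string name and XPath are illustrative), declaring a value that can be supplied per environment at deploy time instead of being baked into the package:
<?xml version="1.0" encoding="utf-8"?>
<!-- parameters.xml at the project root: exposes a deploy-time parameter that MSDeploy applies to web.config on install -->
<parameters>
  <parameter name="AppDb-ConnectionString"
             description="Connection string for the application database"
             defaultValue="Server=localhost;Database=AppDb;Integrated Security=true">
    <parameterEntry kind="XmlFile"
                    scope="\\web\.config$"
                    match="/configuration/connectionStrings/add[@name='AppDb']/@connectionString" />
  </parameter>
</parameters>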