Need advice regarding deployment on multiple remote machines - msbuild

Currently I am using MSDeploy to build and deploy to several machines using TeamCity. In my current scenario I need to build, package and deploy to Dev, and after that deploy the same package to the Test and Live servers (which are on a different domain). I understand how this is done, but the problem is that web.config transformation only happens when a package is built. That means the package created for Dev cannot be reused, because only the Dev transform was applied to its web.config. I also know that we can change the web.config when un-packaging, but those parameters are very limited, and we have a lot of changes, not just the connection string or database settings.
Another option is to add a step that builds separate packages for Test and Live as part of the Dev deployment, but that means a lot of copying to remote servers, once for Test and once for Live, which is very time consuming because of the different domains.
Can you please advise on the best solution in this scenario, so that I can use TeamCity to publish to Dev, Test and Live with the same package and different web.configs in one go?

To configure items at deployment time that are not automatically created for you, you can add a file named parameters.xml to your project and extend what you want to make available at deployment time.
Here's some documentation on the approach: Using Deployment Parameters for Web.Config File Settings.
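As a minimal sketch of the idea (the parameter name, app setting key and file regex below are illustrative assumptions, not taken from your project), a parameters.xml at the project root declares which web.config values can be set when the package is installed:

```xml
<!-- parameters.xml at the project root: exposes web.config values at deploy time. -->
<parameters>
  <!-- Hypothetical app setting; point the match at your own web.config entries. -->
  <parameter name="LogLevel" description="Logging verbosity" defaultValue="Warning">
    <parameterEntry kind="XmlFile"
                    scope="\\web\.config$"
                    match="/configuration/appSettings/add[@key='LogLevel']/@value" />
  </parameter>
</parameters>
```

You can then keep one SetParameters.xml file per environment (Dev, Test, Live) and feed the appropriate one to msdeploy with -setParamFile, so the same package is deployed everywhere and only the parameter values differ.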

Sonar Analysis of website before deployment

I have a website (consisting only of static files) and have automated its deployment when changes are pushed to the master branch, using a Jenkins multibranch pipeline.
I'm planning to add an extra set of validations before deployment, and I've come across Sonar. Sonar can't be run on static files on its own; it requires these files to be served by a web server such as Apache2, because it also verifies HTTP headers.
Consequently, as long as my changes are not deployed in production, I will not be able to run Sonar on a particular development branch, and would have to wait until the branch is merged into master to obtain the results.
In this case, can you please give hints on how I can get validations results before deployment?
I would set up a test environment on another machine that mirrors your production environment as closely as possible. Publish there first and run Sonar. If everything checks out, then deploy to production. This is a basic continuous deployment scenario.

How to manage database credentials for a Mule project

I am using the Database connector component, with the vault component to store the database credentials. As per the documentation of both components, I have created a separate properties file for each environment, holding that environment's encrypted credentials.
(Project structure screenshot omitted: there is one properties file per environment.)
The problem with this structure is that I have to build a new deployable zip file whenever I have to update the database credentials for any environment.
I need a solution where I can keep all credentials encrypted and centralized, so that I don't have to create a build every time the credentials are updated. We can afford to restart the server, but building a new zip and deploying it is really cumbersome.
The second problem with this approach is that a developer needs to know the production database credentials to update them in the properties file, which is also a security issue.
Please suggest an alternative approach to credentials management for Mule projects.
I'm going to recommend you do NOT try to change the secure solution MuleSoft has provided. To remove the need for packaging and deployment you would have to move the properties files outside of the deployment, and that would be a huge risk. Regardless of where you store the property files within the deployment, if you change the files you have to package and re-deploy; the only way around that is to move the files outside of the deployment and store them securely elsewhere. Mule's solution may be cumbersome, but it secures these files first with encryption and second within the server container. You can move the property files out, but you would have to provide a custom implementation, and you would be assuming great risk to your protected resources.
Set a VM argument, e.g. -Denvironment.type=local, for your local machine in Anypoint Studio.
Read this variable wherever you load your properties file, so that the environment type is resolved dynamically, along these lines (the element name is reconstructed; the original snippet was truncated):
<secure-property-placeholder:config location="classpath:properties/sample-app-${environment.type}.properties" doc:name="Secure Property Placeholder"/>
In order to set the environment type on your production server (or wherever you are using the Mule runtime), open $MULE_HOME/conf/wrapper.conf and add the argument wrapper.java.additional.<n>=-Denvironment.type=production. If such properties already exist in this file, set the value of n to the next unused index, for example 13 or 14.
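For example (a sketch: the index 13 simply assumes slots 1-12 are already taken in your wrapper.conf):

```
# $MULE_HOME/conf/wrapper.conf
# Hypothetical index; use the next unused wrapper.java.additional.<n> slot.
wrapper.java.additional.13=-Denvironment.type=production
```

On a developer machine, the equivalent is adding -Denvironment.type=local to the VM arguments of the Studio run configuration, as mentioned above.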
This way you don't need to generate different deployment artefacts for different environments, because the correct properties file is picked up via the environment-specific VM argument.

Deploying multiple/single instances of a website

We have an ASP.NET website, and we want to be able to deploy (and remove) multiple instances of the site on the same IIS machine.
We also have a number of customers who need to install the product on their own systems.
I was hoping WiX would be able to handle this, but it appears you can only have one instance installed at a time.
What options are available to me? Right now we use FinalBuilder to set up a generic "install package" that uses a batch file, which the user populates with their environment settings, plus tools like sed and awk to update config files and more scripts to deploy to IIS.
It works, but it's very cumbersome. I was hoping to find more of a GUI/command-line interface to replace this process.
It sounds like MSDeploy will work for your use case. It can deploy multiple instances to the same IIS server and can also delete instances.
The following post is specifically about service versioning but you could use the same technique to install several instances of a web app.
http://www.dotnetcatch.com/2016/03/03/simple-service-versioning-with-webdeploy/
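As a rough sketch (the site, application and package names below are hypothetical), each instance is just a separate IIS application, so you can sync the same package to as many paths as you need and delete any of them later:

```
:: Publish the same package as two independent instances (hypothetical names/paths).
msdeploy.exe -verb:sync -source:package="MySite.zip" -dest:iisApp="Default Web Site/MySite-Customer1"
msdeploy.exe -verb:sync -source:package="MySite.zip" -dest:iisApp="Default Web Site/MySite-Customer2"

:: Remove one instance's content (use the provider that matches how it was created).
msdeploy.exe -verb:delete -dest:contentPath="Default Web Site/MySite-Customer2"
```

Combined with a setParameters file per instance, this gives you a repeatable command-line replacement for the batch/sed/awk process.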

What would cause SSIS to ignore Package Configuration Connections?

I have a very simple SSIS package that has 2 connections defined in the Connection Managers section: an MS Access data source and an MS SQL data source destination. All this package does is truncate a table in the SQL destination and import data from MS Access into the SQL table. This works as expected during development within VS2013.
I have also enabled Package Configurations for the package and have a couple of XML configuration files (one for each connection) in a folder on the root of the C: drive. The connection strings in the configuration files differ based on the server where they reside, but the folder structure exists on both servers, so the package can execute against whichever server it is run from.
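For reference, each of these files is an ordinary .dtsConfig document along these lines (a sketch: the connection manager name and connection string are placeholders, not the actual values from this package):

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Placeholder connection manager name and server; each server keeps its own copy. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[SqlDestination].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=DEVSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```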
I've checked the box to enable Package Configurations and deployed the package to two different servers, one for development and the other for QA. When I execute the package via the SSMS Integration Services package execution on my development server, the package uses the development table. But when I execute the same package in my QA environment, it also uses the development table.
Since the development connection is the one embedded in the package via the Connection Managers, it appears (presumably, anyway) that the package is using the embedded connection and ignoring the configuration files.
I have also explicitly added the path to the configuration file in the Configurations section of the Execute Package Utility to see if it made any difference, but the results are the same: the configuration file is not acknowledged. So it again appears that the package is using the embedded connections defined in the Connection Managers.
I suppose I "may" be able to remove the connections from the Connection Managers section, turn off validation at design time and deploy again, in an effort to force the package to use the config files, but that seems like a hack at best, provided it would even work.
Not that I think it should make a difference, but to provide more detail, here is a bit more about my server configuration:
Development - SQL 2014 [ServerName]
Quality Assurance - SQL 2014 [ServerName]\[InstanceName]
I don't recall ever having this issue before, hence my reason for posting.
OK, since I am working against a deadline, I was hoping to get an answer sooner rather than later. Since that wasn't the case, and because I've seen variations of this question asked before without a definitive answer (at least for this scenario), I performed some tests and am posting the results for others who may need this information.
The following conditions cause an SSIS package to ignore its configuration files even when Package Configurations are enabled. These findings are based on actual tests and confirmed for SQL 2014, although prior versions may behave the same way.
Disclaimer: these tests focused on the configuration files as they pertain to actual server connections (e.g. connection strings) and not on other variables, although it's conceivable that any other values in the configuration file would be affected the same way.
Executing the package from within SSMS, connected to the Integration Services component, by selecting Run Package: whatever connection value was in the package prior to deployment to the server is the one that is used, irrespective of the configuration files.
Note: this holds true even if configurations are added in the Configurations section prior to execution. Although there is a mention there that the configurations are not imported and cannot be edited, the fact is that in testing they were not used either.
If a SQL Server Agent job step is of type SQL Server Integration Services Package and no configuration file references are added on the Configurations tab, the job executes with whatever values were in place at the last build in BIDS prior to deployment (the embedded values).
If the package uses multiple configuration files but some are omitted on the Configurations tab of the job, the job uses the configuration files that are designated, but defaults to the last values used in development (the embedded values) for those that are not present in the context of the job.
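For reference, the Execute Package Utility is essentially a front end for dtexec, and the command-line equivalent of adding files in its Configurations section is the /Conf switch, repeated once per file (the paths below are placeholders). Per the findings above, listing the files explicitly this way still did not override the embedded values in these tests:

```
rem Placeholder paths; /Conf may be repeated once per configuration file.
dtexec /File "C:\Packages\ImportAccess.dtsx" ^
       /Conf "C:\SSISConfig\SourceConnection.dtsConfig" ^
       /Conf "C:\SSISConfig\DestinationConnection.dtsConfig"
```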
Some of these behaviors are not very obvious, and I'd imagine they could be a frustrating puzzle for someone who follows the rules of most online tutorials on package configuration files and expects more straightforward results.
Identifying the root cause took me a lot of time in testing, and although I'm not an expert, I'm certainly far from a novice with SSIS.
At any rate, I hope this saves someone else hours of work and investigation.

MyEclipse deployment issue: can't see changes

I'm trying to work on my project. When I change files, the changes are made in the directory from which I imported the project into my workspace, but they aren't showing up on the server, because the server is using the folder inside .metadata/webapps/myapp.
What have I done wrong here?
It sounds like you're deploying to the integrated Tomcat server. The deployment is normally to the .metadata/.me_tcat/webapps/myapp folder. Is this what you meant? If not, I suggest you remove the deployment and then deploy again, letting the deployment location default. The .metadata/.me_tcat folder is simply where the integrated Tomcat installation looks for applications when it starts up.
You can check the deployment location in the Servers view: expand the integrated Tomcat server, and the deployment location for your project is shown there.
The deployed location should be kept in sync with your code. If that's not happening, it may be that the changes you're making are of a kind that requires a redeployment. I'm not sure exactly which conditions allow changes to be propagated automatically, but I'd guess there are situations where they can't be picked up this way; they should be picked up on a redeploy.
To check, create a new web project, which will create some default configuration and a default index.jsp file. Deploy this to the integrated Tomcat server. Now make a change in the index.jsp file and save it. Check if the change is made in the deployment location, or simply run the server and check that your new index.jsp gets displayed.
For more help, I suggest you post to the forums at www.myeclipseide.com/forums