How to share settings with Run configurations (like inheritance) - intellij-idea

I have configured about 100 different Run configurations in my Node project. Recently, some environment variables were introduced into the project.
How can I share the same information between all run configurations without editing them one by one?

There is no way to perform such an operation in the IntelliJ IDEA UI. However, the run configurations are stored as .xml files under the .idea directory in your project root. You can run a batch operation on these files to make the changes you need.
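For reference, each shared run configuration lives in its own file under .idea/runConfigurations (the exact layout varies with the configuration type and IDE version). A sketch of what a Node.js configuration with an environment variable looks like follows; the variable name and value here are made up. A search-and-replace script can then inject the same envs block into every file:

<component name="ProjectRunConfigurationManager">
  <configuration default="false" name="my-server" type="NodeJSConfigurationType">
    <!-- hypothetical variable to be shared across all configurations -->
    <envs>
      <env name="API_URL" value="https://example.test" />
    </envs>
  </configuration>
</component>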

Related

In IntelliJ IDEA, how to copy non-source assets to output folder during build?

I have a project in IntelliJ IDEA with a couple of modules inside, and one of my modules has two build configurations. One of them needs to copy a <projectroot>/tools folder to its out/production/<BuildConfigurationName> folder. Can IDEA somehow automate this?
The accepted answer below is incorrect. IDEA can do this (without Ant/Gradle) via the artifacts system (accessed via the Build menu or project settings). Any one artifact job copies multiple files/folders/build outputs to a chosen location (optionally jarred) and can be set to run automatically on make.
Artifacts can even be chained, i.e. output from one as input to another.
Can IDEA somehow automate this?
Not directly, no. Ultimately IDEA is an IDE and not a build tool. While it can do a lot during a build, it does not have the ability to copy non-source files to an alternate directory, let alone a dynamically named directory.
If you marked the tools directory as a source directory (and none of its contained file types were set in the "Ignore files and folders" setting at the bottom of the "File Types" settings dialog), IDEA would then copy the tools directory to the out directory. But renaming requires a more sophisticated build tool.
Ultimately, the "ideal" or "best practices" solution would be to build your project using a build tool like Maven, Gradle or Ant for which this type of thing would be a snap.
If that is not an option, or for some reason you really want IDEA to do the build, the best thing you could do is write a simple Ant script to do the copy for you. (Or possibly Gradle; I do not have much experience with Gradle yet. Maven could do it, but it'd be a bit cumbersome compared to Ant.) In any Run/Debug configuration, you can set the Ant target to run before or after the IDEA "make" in the Before Launch section. (You can set that as a default for any newly created configurations by configuring it in Defaults on the left.) If you run your build manually, you can assign a shortcut to the Ant build and then run it and the make in sequence. Alternatively, you could record a Macro (Edit > Macros) to run both in sequence and then (optionally) assign the macro a keyboard shortcut.
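As a rough sketch of that Ant approach (the out.dir property is an assumption here, passed in to match the dynamically named output folder):

<project name="copy-tools" default="copy-tools">
  <!-- invoke with e.g. -Dout.dir=out/production/MyBuildConfiguration -->
  <target name="copy-tools">
    <copy todir="${out.dir}/tools">
      <fileset dir="${basedir}/tools"/>
    </copy>
  </target>
</project>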

TFS Build dropping extra files including csproj in target folder

I have an automated build process set up to run from a build definition in TFS, which publishes a web application and generates/executes a database project script successfully via publish profiles that are passed as msbuild arguments in the build process definition. Everything is now running as expected except that several unnecessary files are being deployed to the target folder, including the .csproj file, all of the config transforms, and the properties folder which contains all of my publish profiles.
This is strange because:
1. It's definitely not including ALL files/folders; it mostly appears to include files used by the publish profile, like the transforms, while applying the transform correctly and excluding any explicitly excluded file (as defined in the pubxml).
2. The process works perfectly if I publish from the project in Visual Studio 2013.
I have the profile configured to only include files needed by the application, and I've confirmed in the csproj file that this property is there.
I tried excluding the properties folder from deployment in the pubxml file, but this causes the build to crash because it can't find the assembly file. What I've gathered is that the process is keeping all files it needs to complete the build, and dropping all of those files in my destination folder. FWIW, I'm using the "file system" method and I'm not sure yet if web deploy will make a difference. I haven't been able yet to connect to the target server with web deploy, but that's a separate problem to solve. Is there something in the build that I can configure so that my destination folder has only the files it needs to run the application, and not the files needed to BUILD the application?
FYI, I also have not been using a drop folder. I'm not sure if that makes a difference, but it might be the only thing I haven't tested; it doesn't seem necessary, since I'm using a publish profile and don't want to use the default TFS build configuration.
I found a solution that works well enough, after reading this: http://www.asp.net/web-forms/tutorials/deployment/advanced-enterprise-web-deployment/excluding-files-and-folders-from-deployment
This solution was a little uglier than I wanted, since it requires hard-coding the names of the excluded files, but it does the trick and only requires identifying the files and folders in one location instead of altering a publish profile for each target environment. I created a wpp.targets file and used the ExcludeFromPackageFolders and ExcludeFromPackageFiles elements to identify the extra files. Ironically, if I don't also name the wpp.targets file in the exclude element, THAT file is included in my package. It's possible MSDeploy doesn't have the same issues with TFS as the file system method, but after spending half a day trying to work through a different set of issues and permissions workarounds, we decided that file system is a cleaner publishing method.
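For anyone following along, a minimal wpp.targets along the lines of the linked tutorial might look like this; the file and folder names are placeholders, and note that the targets file itself has to appear in the excludes, as described above:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- folders to keep out of the package, e.g. the publish-profiles folder -->
    <ExcludeFromPackageFolders Include="Properties">
      <FromTarget>MyWebApp.wpp.targets</FromTarget>
    </ExcludeFromPackageFolders>
    <!-- individual files, including this targets file itself -->
    <ExcludeFromPackageFiles Include="MyWebApp.wpp.targets">
      <FromTarget>MyWebApp.wpp.targets</FromTarget>
    </ExcludeFromPackageFiles>
  </ItemGroup>
</Project>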

TeamCity config

We're pretty new to TeamCity at work. We have a build & deployment package set up which is using MSBuild/MSDeploy to ship changes to our web servers. However, we have a few issues (apologies for putting a few questions in the same post). For clarification, our solution looks like so:
Project Folder
WebApp (includes .csproj file. Includes a folder called "media" - this folder is not in SVN)
Libraries (includes referenced assemblies)
Our issues:
There is a specific folder within the Libraries folder that must be copied into the bin directory after build (because of an assembly redirect). We have always used a PostBuild event, however this doesn't work in TeamCity.
The folder "media" within the WebApp folder is not included in SVN. When the TeamCity package is executed it deletes this folder. I would like to prevent TeamCity from deleting just this folder.
When we run the TeamCity task, we get an ERROR_FILE_IN_USE error for one of the files TeamCity is trying to delete during the sync task. I have read about using the app_offline.htm file to combat this - but quite how, I'm not sure.
I'm going to guess that some of these settings can be command line parameters in the MSBuild job - I think it would be better to store these in the csproj file rather than just in TeamCity, if that is possible?
thanks in advance
Al
A few questions on the information provided:
Can you clarify what you mean by "the post-build command doesn't work"? Does it fail, or does it just not do what you expect?
How have you set up your post-build command? Does it reference specific file paths? TeamCity executes MSBuild in the same way as you would from the command line or from Visual Studio.
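For comparison, a post-build event that avoids machine-specific paths by using MSBuild macros might look like this in the .csproj (the folder name is made up; adjust it to your redirect assemblies):

<PropertyGroup>
  <!-- copy the redirected assemblies next to the build output on any machine -->
  <PostBuildEvent>xcopy /E /Y "$(SolutionDir)Libraries\SomeFolder\*" "$(TargetDir)"</PostBuildEvent>
</PropertyGroup>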
Regarding the MSDeploy folder issue, you can configure MSDeploy with a skip action; here's a link to another post describing how to do this:
Prevent MSDeploy (selectively) from deleting folders on target IIS server
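In short, a skip rule can be declared in the publish profile or a wpp.targets file. A sketch for the "media" folder from the question could look like the following (the AbsolutePath is treated as a regular expression; depending on how publishing is invoked, you may also need to pass /p:UseMsdeployExe=true for the rule to be honored):

<ItemGroup>
  <!-- tell MSDeploy not to delete the media folder on the target -->
  <MsDeploySkipRules Include="SkipMediaFolder">
    <SkipAction>Delete</SkipAction>
    <ObjectName>dirPath</ObjectName>
    <AbsolutePath>media$</AbsolutePath>
  </MsDeploySkipRules>
</ItemGroup>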
Because MSDeploy is trying to deploy into a folder being used by IIS, you are also seeing the file locking issue. There are two solutions:
1. Add a TeamCity step to stop IIS (using PowerShell) before deploying. This will cause downtime.
2. Deploy to a different folder and then switch IIS to point to your new folder. This is a much better solution, as you also get rollback.
A much easier solution to all of this is to use a Deployment Tool such as Octopus Deploy to deploy your application. You can learn more about Octopus Deploy at http://octopusdeploy.com/

How can I get Eclipse to use my IVY_HOME variable when downloading ivy dependencies?

My company makes extensive use of Ivy to download dependencies. Some of these dependencies are huge (~500MB) and take a while to download from the remote repositories.
To build our application we have an Ant script that will first resolve all the dependencies and then deploy to the server.
I have set an "IVY_HOME" environment variable so that all the dependencies are downloaded to D:\ivy_home instead of C:\Users\{username}\.ivy2 - this is because D: is my SSD, which is significantly faster, and it is where my local server directories are located - so copying files from ivy_home to the server is super fast.
But for some reason, when I am using the IvyDE plugin inside Eclipse, it always wants to download a separate copy of all the dependencies and puts them on my C:\, which is causing several issues:
Local publishes from the Ant script will not be picked up in Eclipse, since they are placed in a different location
Dependencies already downloaded to D: will not get picked up, which makes the Ivy resolve inside Eclipse much slower than it needs to be
The dependencies are on a slower drive in Eclipse, so performing searches and executing these jars is also slower
How about creating a symlink to redirect the .ivy2 folder in Users to D:? I've tried it myself and it seems to work fine.
Open cmd as administrator, and then execute this line:
mklink /d C:\Users\{username}\.ivy2 D:\.ivy2
I'd create an ivysettings.xml file and specify the location of my cache using the caches directive. See the following answer for an example:
can I turn off the .ivy cache all together?
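For example, a minimal ivysettings.xml pointing the cache at the drive from the question might look like:

<ivysettings>
  <!-- put the repository cache under D:\ivy_home instead of the default user dir -->
  <caches defaultCacheDir="D:/ivy_home/cache"/>
</ivysettings>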
Why don't you set up Ivy globally with an ivysettings.xml along with a property file?
This property file could have this:
ivy.default.ivy.user.dir=D:\ivy_home
For individual projects you could uncheck "Enable project specific settings" for each IvyDE library management, so they would use the Ivy global settings, with one extra Eclipse environment configuration.
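Sketching that out, the global ivysettings.xml would just pull in the property file (the path is an assumption, using Ivy's ivy.settings.dir variable to locate a file next to the settings):

<ivysettings>
  <!-- ivysettings.properties sits next to this file and contains
       ivy.default.ivy.user.dir=D:\ivy_home -->
  <properties file="${ivy.settings.dir}/ivysettings.properties"/>
</ivysettings>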

Maven best practice for generating artifacts for multiple environments [prod, test, dev] with CI/Hudson support?

I have a project that needs to be deployed into multiple environments (prod, test, dev). The differences mainly consist of configuration properties/files.
My idea was to use profiles and overlays to copy/configure the specialized output. But I'm stuck on whether I should generate multiple artifacts with specialized classifiers (e.g. "my-app-1.0-prod.zip/jar", "my-app-1.0-dev.zip/jar") or create multiple projects, one project for every environment.
Should I use maven-assembly-plugin to generate multiple artifacts for every environment ?
Anyway, I'll need to generate all of them at once, so it seems that profiles do not fit... still puzzled :(
Any hints/examples/links will be more than welcomed.
As a side issue, I'm also wondering how to achieve this in a CI server (Hudson/Bamboo): generating and deploying these artifacts for all the environments to their proper servers (e.g. using the Hudson SCP plugin)?
I prefer to package configuration files separately from the application. This allows you to run the EXACT same application and supply the configuration at run time. It also allows you to generate configuration files after the fact for an environment you didn't know you would need at build time. e.g. CERT
I use the "assembly" tool to zip up each domain's config files into named files.
I would use the version element (like 1.0-SNAPSHOT, 1.0-UAT, 1.0-PROD), and thus tags/branches at the VCS level, in combination with profiles (for environment-specific things like machine names, user names, passwords, etc.), to build the various artifacts.
We implemented a m2 plugin to build the final .properties using the following approach:
The common, environment-unaware settings are read from common.properties.
The specific, environment-aware settings are read from dev.properties, test.properties or production.properties, thus overriding default values if necessary.
The final .properties file is written to disk with the Properties instance after reading the files in the given order.
Such a .properties file is what gets bundled, depending on the target environment.
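Their plugin is custom, but the same layering can be sketched with a plain Ant concat, since java.util.Properties keeps the last value when a key appears twice (target.env is an assumed property selecting dev, test, or production):

<target name="build-properties">
  <!-- later files override earlier ones when the result is read back as Properties -->
  <concat destfile="target/final.properties" fixlastline="yes">
    <filelist dir="config" files="common.properties,${target.env}.properties"/>
  </concat>
</target>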
We use profiles to achieve that, but we only have the default profile - which we call the "development" profile, and which has the configuration files in it - and a "release" profile, where we don't include the configuration files (so they can be properly configured when the application is installed).
I would use profiles to do it, and I would append the profile in the artifact name if you need to deploy it. I think it is somewhat similar to what Pascal had suggested, only that you will be using profiles and not versions.
PS: Another reason why we have dev/ release profiles only, is that whenever we send something for UAT or PROD, it has been released, so if there is a bug we can track down what the state of the code was when the application was released - it is easier to tag it in SVN than trying to find its state from the commit history.
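A bare-bones version of that profile setup in the pom.xml might look like this (the resource location is an assumption):

<profiles>
  <!-- default profile: bundle the local configuration files -->
  <profile>
    <id>development</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <build>
      <resources>
        <resource>
          <directory>src/main/config</directory>
        </resource>
      </resources>
    </build>
  </profile>
  <!-- release profile: ship without configuration, to be supplied at install time -->
  <profile>
    <id>release</id>
  </profile>
</profiles>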
I had this exact scenario last summer.
I ended up using profiles for each higher environment, with classifiers. The default profile was a "do no harm" development build. I had a DEV, INT, UAT, QA, and PROD profile.
I ended up defining multiple jobs within Hudson to generate the region specific artifacts.
The one thing I would have done differently was to architect the projects a bit differently, so that the region-specific build was outside of the modularized main project. That way it would simply pull in the latest artifacts for each specific build rather than rebuild the entire project for each region.
In fact, when I setup the jobs, the QA and PROD jobs were always setup to build off of a tag. Clearly this is something that you would tailor to your specific workplace rules on deployment.
Try using https://github.com/khmarbaise/multienv-maven-plugin to create one main WAR and one configuration JAR for each environment.