Ant scripts cccheckin/cccheckout using the CCRC plugin to Eclipse?

Is it possible to use Ant scripts to checkin/checkout source code elements while using the CCRC plugin to eclipse? I am getting an error message saying that the element the script is attempting to check out is not part of the VOB, but of course it is there and I can check it out manually.

It should be possible to use those Ant ClearCase tasks with CCRC views ("web views", which are analogous to snapshot views).
A script like this one should work:
<project name="Testing ClearCase " default="CC" basedir=".">
<target name="CC">
<property name="FileSrc" value="MyView/MyVob/MyDir"/>
<property name="dist" value="dist"/>
<cccheckout viewpath="${FileSrc}/myFile"
reserved="false"
nowarn="true"
comment="Auto Build from script"
failonerr="false" />
<copy file="${dist}/myFile" tofile="${FileSrc}/myFile"/>
<cccheckin viewpath="${FileSrc}/myFile"
comment="Checked in by myFile.xml ANT script"
nowarn="false"
failonerr="false"
identical="true"/>
</target>
</project>
But you need to make sure your current directory is (in this script) just above where you update your CCRC web view "MyView".
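For illustration, the directory layout the script above assumes looks roughly like this (the top-level name build-dir is made up; dist and the view paths come from the script):

build-dir/                 <-- run Ant from here (basedir=".")
├── MyView/                <-- CCRC web view, updated before the build
│   └── MyVob/
│       └── MyDir/
│           └── myFile
└── dist/
    └── myFile             <-- freshly built file that gets checked in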
The only issues I know of are:
if CCRC tries to check out a file of a replicated VOB
if the parent directory of a file to be checked in was renamed from another view

The Ant ClearCase tasks in VonC's answer use the cleartool command (getClearToolCommand() in org.apache.tools.ant.taskdefs.optional.clearcase.ClearCase.java). When I invoke a cleartool operation, even from within or above the CCRC view, I get the error message from the question.
Now (as some years have passed since VonC's answer) there is a CCRC CLI that can be used instead (http://www-01.ibm.com/support/docview.wss?uid=swg24021929; set CCSHARED to your top-level \eclipse directory). The commands are similar to those provided by cleartool, although it appears not to support UCM, so to solve your problem of doing a checkout I first had to set an activity on the stream using the CCRC Eclipse plugin.
To get the CCRC CLI to work with the Ant ClearCase tasks would require changing the task to:
Call rcleartool rather than cleartool.
Since cleartool points to an .exe and rcleartool is a .bat that loads a jar, ProcessBuilder won't be able to process the new command (I tested with rcleartool.bat and cmd /c rcleartool.bat) unless you convert the jar to an exe.
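If patching the Ant task is not an option, one workaround sketch is to call the CCRC CLI straight from Ant with <exec>. This assumes rcleartool.bat is on the PATH and that its checkout subcommand mirrors cleartool's (verify both against your rcleartool version):

<target name="rcc-checkout">
    <!-- Launch through cmd so the .bat resolves; Ant's <exec> launcher
         may sidestep the ProcessBuilder limitation noted above. -->
    <exec executable="cmd" failonerror="true">
        <arg value="/c"/>
        <arg value="rcleartool"/>
        <arg value="checkout"/>
        <arg value="MyView/MyVob/MyDir/myFile"/>
    </exec>
</target>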

Related

Executing post-build step IntelliJ IDEA CE 2020.3?

I am using IntelliJ IDEA CE 2020.3 to build a simple JAR file. After the build, I'd like to copy the created JAR to a library directory.
I am using the Ant plugin that comes bundled with the IDE. I can't seem to find the underlying Ant build/control files that make the whole thing work. I assume Ant uses the .xml files that are part of the IDE's project settings, but this is unclear.
In any case, is there a way to add the "copy" step that I mention above?
Do I have to use the built-in Ant as-is, or take the build over completely myself?
Can I edit the default build that ships with the IDE?
All,
So after some experimenting, I found that if I manually create the build.xml file, I can execute post-build steps. For my example here, I created the following, simple build.xml and added it to the top-level IntelliJ project directory (where the .iml file lives):
build.xml (manually created)
<project name="mylib" default="copy-file">
<target name="copy-file">
<copy file="out/artifacts/mylib_JAR/mylib.jar" tofile = "./mylib.jar" />
</target>
</project>
Note that the directories are relative to the project directory.
IntelliJ IDEA enabled me to add the copy-file task to augment the default, built-in build. To configure your tasks via the IDE, open the Ant tool window via the View/Tool Windows/Ant main menu item.
I hope this helps someone out there!

Atlassian Bamboo get data from repository and then SCP to server

Similar to this issue but that is for Windows and there is no correct answer.
I want to do a simple deployment using Bamboo (cloud version). The idea is we checkout the data from a Bitbucket repository and then use SCP to publish content.
The problem is when SCP runs, I keep getting:
There were no files
I have set up our Bitbucket repo and Bamboo can connect fine. However, I'm not sure where Bamboo checks out the files. I didn't set up our Bamboo instance, but I found in the config settings that the default path is:
/home/bamboo/bamboo-agent-home/xml-data/build-dir/
I would have thought this should be really straight forward. The repo gets checked out into /a/path/somewhere/ and SCP uploads from /a/path/somewhere. The problem is I don't know where the path is and I cannot find any documentation that tells me where it is.
My previous experience with the Bamboo SCP Task has shown that it does not work well with Windows SSH servers. I tried freeSSHd and SolarWinds SCP server and I could not connect to either of them through the SCP Task.
As a workaround I used a Maven build task with the following ant-run configuration:
<configuration>
    <tasks>
        <!-- Ant's scp task needs the ant-jsch optional library (and jsch) on the classpath. -->
        <scp todir="username:password@xxx.xxx.xxx.xxx:/" trust="true" failonerror="false">
            <fileset dir="dirname">
                <include name="**/*"/>
            </fileset>
        </scp>
    </tasks>
</configuration>
I had this task in a different Bamboo stage. To pass files (artifacts) from one stage to another:
click Create definition under the Artifacts tab in the first job configuration (which generates the files you want to pass)
click Create dependency under the Artifacts tab in the second job configuration, which runs the previously mentioned ant-run task
I think you want to use the SCP Task as one of the tasks (the final one) of your Deployment project. To use that task you'll need to share one or more artifacts that are produced by the associated Build project.

Ivy: <ivy:settings> vs. <ivy:configure>

I have a master Ivy project that others include in their project via an svn:externals property. The project contains the Ivy jar, the default ivysettings.xml file that connects to our repository, and a few Ant macros that allow me to standardize the way we build jars, etc. (For example, users use <jar.macro> vs. <jar>. The <jar.macro> uses the same parameters, but also automatically embeds the pom.xml in the jar and adds Jenkins build information into the Manifest.)
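For a concrete picture, such a macro might look like the following sketch (the attribute names and manifest entries are illustrative, not the actual macro; BUILD_NUMBER and BUILD_URL are standard Jenkins environment variables):

<!-- Assumes <property environment="env"/> has been declared. -->
<macrodef name="jar.macro">
    <attribute name="destfile"/>
    <attribute name="basedir"/>
    <sequential>
        <jar destfile="@{destfile}" basedir="@{basedir}">
            <!-- Embed the pom.xml in the jar's META-INF. -->
            <metainf dir="." includes="pom.xml"/>
            <manifest>
                <!-- Record Jenkins build information in the Manifest. -->
                <attribute name="Build-Number" value="${env.BUILD_NUMBER}"/>
                <attribute name="Build-URL" value="${env.BUILD_URL}"/>
            </manifest>
        </jar>
    </sequential>
</macrodef>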
We also use Jenkins as our continuous integration system. One of the things I want to do is to clean the Ivy cache for each build, so we don't have any jar issues due to cache problems. To do this, I've set up my ivysettings.xml file to define a separate cache for each Jenkins executor:
<ivysettings>
    <property name="env.EXECUTOR_NUMBER" value="0" override="false"/>
    <caches
        defaultCacheDir="${ivy.default.ivy.user.dir}/cache-${env.EXECUTOR_NUMBER}"
        resolutionCacheDir="${ivy.dir}/../target/ivy.cache"/>
    <settings defaultResolver="default"/>
    <include file="${ivy.dir}/ivysettings-public.xml"/>
    <include url="${ivy.default.settings.dir}/ivysettings-shared.xml"/>
    <include url="${ivy.default.settings.dir}/ivysettings-local.xml"/>
    <include url="${ivy.default.settings.dir}/ivysettings-main-chain.xml"/>
    <include url="${ivy.default.settings.dir}/ivysettings-default-chain.xml"/>
</ivysettings>
I originally used the <ivy:settings> task to configure our projects with Ivy. However, all of the Jenkins executors were using the same Ivy cache which caused problems. I switched from <ivy:settings> to <ivy:configure> and the problem went away. Apparently, <ivy:configure> sets up Ivy immediately (and thus sets up the caches correctly) while <ivy:settings> doesn't set Ivy up until <ivy:resolve> is called.
I've seen some emails on Nabble about <ivy:configure> being deprecated (or maybe not). I see nothing in the Ivy online documentation stating <ivy:configure> is being deprecated.
So, when would you use <ivy:settings> vs. <ivy:configure>? In my case, since I needed separate caches for each Jenkins executor, I needed to use <ivy:configure>, but is there a reason I might prefer <ivy:settings> over <ivy:configure>? And is <ivy:configure> deprecated?
Here's what I found:
<ivy:settings> is newer and the preferred way.
<ivy:configure> may or may not be deprecated.
<ivy:settings> doesn't set my Ivy settings until <ivy:resolve> is called while <ivy:configure> sets all Ivy settings as soon as the task is executed.
The last one is my issue. Since I have parallel Jenkins builds going on, and I want to start out each build with a completely clean cache, I use customized cache settings depending upon the Jenkins executor number. The caches are labeled cache-0 through cache-5.
However, since <ivy:settings> isn't executed until I call <ivy:resolve>, my customized cache settings aren't picked up. I call <ivy:cleancache> before <ivy:resolve>, which causes the builds to clean out a common cache. Hilarity ensues. Using <ivy:configure> fixes this problem.
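To make the ordering concrete, here is a minimal sketch (assuming the Ivy Ant tasks are bound to the usual antlib namespace and that ivy.dir points at the settings directory, as in the ivysettings.xml above):

<target name="init-ivy">
    <!-- Load settings immediately so the per-executor cache takes effect... -->
    <ivy:configure file="${ivy.dir}/ivysettings.xml"/>
    <!-- ...before the cache is wiped and the resolve runs. -->
    <ivy:cleancache/>
    <ivy:resolve/>
</target>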

MSBUILD - block until a file exists on an ftp server?

As part of a build process, I would like to block the build until a file gets created (exists) remotely at an FTP location, after which I would continue the build. (Preferably with some kind of timeout limit.)
Suggestions?
Is this even possible using only the standard MSBuild tasks and/or the MSBuild Extension Pack / Community Tasks?
Your best bet is to build a small custom exe (you can even compile it as a build step) that polls for the file you are looking for. Then you use the PreBuild target, or a custom target in a pre-build step to verify that the file exists.
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <Target Name="WaitOnFTP">
        <Exec Command="MyFTPWaiter.exe"/>
    </Target>
</Project>
Other, more MSBuild-oriented suggestions are to remake that exe as a custom task, or even as an inline task in MSBuild 4.0.
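For instance, the whole poll-and-wait could live in an MSBuild 4.0 inline task instead of a separate exe. A rough sketch (the task name WaitForFtpFile, its parameters, and the URL are invented for illustration, and probing via an FTP GetFileSize request is just one plausible existence check):

<UsingTask TaskName="WaitForFtpFile" TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
        <FileUrl Required="true"/>
        <TimeoutSeconds ParameterType="System.Int32" Required="true"/>
    </ParameterGroup>
    <Task>
        <Code Type="Fragment" Language="cs"><![CDATA[
            // Poll the FTP server until the file exists or the timeout elapses.
            bool found = false;
            DateTime deadline = DateTime.UtcNow.AddSeconds(TimeoutSeconds);
            while (!found && DateTime.UtcNow < deadline)
            {
                try
                {
                    var request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(FileUrl);
                    request.Method = System.Net.WebRequestMethods.Ftp.GetFileSize;
                    using (request.GetResponse()) { found = true; } // file is there
                }
                catch (System.Net.WebException)
                {
                    System.Threading.Thread.Sleep(5000); // not yet; retry in 5s
                }
            }
            if (!found)
            {
                Log.LogError("Timed out waiting for " + FileUrl); // fails the task
            }
        ]]></Code>
    </Task>
</UsingTask>

<Target Name="WaitOnFTP">
    <WaitForFtpFile FileUrl="ftp://user:password@server/drop/ready.txt" TimeoutSeconds="600"/>
</Target>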
FWIW, I've encountered a similar solution done by a peer who didn't want large binaries used by integration tests in version control and he required the use of a custom downloader in the build to get the files from a SMB share. It worked well enough.
Custom Tasks
Inline Tasks

Why does headless PDE Build omit directories I've specified in build.properties's bin.includes?

One of my Eclipse plug-ins (OSGi bundles) is supposed to contain a directory (Database Elements) of .sql files. My build.properties shows:
bin.includes = META-INF/,\
.,\
Database Elements/
(...which looks right to me.)
When I build and run from within my interactive Eclipse IDE, everything works fine: calls to Bundle.getEntry(String) and Bundle.findEntries(String, String, bool) return valid URL objects; my tests are happy; my code is happy.
When I build via a headless Ant script (using PDE Build), those same calls end up returning null. My tests break; my code breaks. I find that Database Elements is quietly but simply missing from my plug-in's JAR package. (META-INF and the built classes still make it in there fine.) I scoured the build log (even eventually invoking ant -verbose on the relevant portion of the build script) but saw no mention of anything helpful.
What gives?
It appears there was a bug (though I was unable to turn up a Bugzilla citation) in the PDE Build Ant-script generation process as of 3.2 that produced a build.xml fragment like this from the bin.includes:
<copy todir="${destination.temp.folder}/my_plugin" failonerror="true" overwrite="false">
    <fileset dir="${basedir}" includes="META-INF/,Database Elements/"/>
</copy>
The relevant Ant documentation says that includes contains a "comma- or space-separated list of patterns". Thus (since my directory name contains a space and was copied literally into the includes attribute value) I think the copy task was trying to include a file named Database and a directory named Elements/. Neither existed, so they were quietly ignored. I suspect the same problem would have bitten if I had a comma in my directory name, but I did not test this.
Since I use Eclipse 3.5 interactively, I decided to finally decouple my headless build's Eclipse instance from my target platform (which remains at 3.2 for the moment) and to update my headless PDE Build to 3.5 (by attempting to produce a minimal PDE Build configuration from my interactive instance's plug-ins). Now, the generated build.xml contains this instead:
<copy todir="${destination.temp.folder}/my_plugin" failonerror="true" overwrite="true">
    <fileset dir="${basedir}">
        <include name="META-INF/"/>
        <include name="Database Elements/"/>
    </fileset>
</copy>
The relevant Ant documentation this time indicates that the only special characters in an individual include are * and ?. Indeed, the bug seems to have been fixed sometime between 3.2 and 3.5: my 3.5-based headless PDE Build now produces a plugin that contains Database Elements; my tests are happy; my code is happy; I'm happy.