TFS CI build trigger: include path using a variable

Is it possible to use a variable as build trigger? I've tried and the build doesn't get triggered. If I remove the variable and insert a value, the build gets triggered as expected.
Aren't variables allowed here? $(Mapping.ServerPath) is set to MyRepo/Branches/MyBranch, while entering $/MyRepo/Branches/MyBranch directly triggers the build correctly.

No, and for a reason: the path you specify here is the static location that gets polled for changes, and the CI trigger fires when something under that path changes.
You can use wildcards if you need to cover more than one branch.
If the path were a variable, when would you supply its value?
If it just holds a static value defined elsewhere, you may as well fill the value in directly.
And if you want to pass the path in when queuing the build, you are no longer using the CI trigger as intended.
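For the branching scenario described in the question, a wildcard include path (rather than a variable) is the usual way to avoid editing the trigger every sprint. Purely as an illustration, the include filter could look something like this (wildcard support in trigger path filters would need to be verified on your TFS 2018 installation):

    $/MyRepo/Branches/*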

No, it is not supported.
There is a UserVoice suggestion you can follow: Allow Variables in Repository, Variables and Triggers Tab.

We have a microservice architecture with dozens of builds, so it makes sense to be able to use a variable that we can update when we start our next iteration. With our branching strategy we get a new branch for each sprint and for each release, and changing the CI trigger in every build every couple of weeks is inefficient.
We are using on-premises TFS 2018, and from everything I've seen this is not supported.

Related

Is there a way to make Gitlab CI run only when I commit an actual file?

New to Gitlab CI/CD.
What is the proper construct to use in my .gitlab-ci.yml file to ensure that my validation job runs only when a "real" checkin happens?
What I mean is, I observe that the moment I create a merge request (which of course creates a new branch), the CI/CD process runs. That is, the branch creation itself, even though no files have changed, causes the .gitlab-ci.yml file to be processed and pipelines to be kicked off.
Ideally I'd only want this to happen when there is actually a change to a file, a file addition, and so on; in common-sense terms, I don't want CI/CD running on operations that don't actually change the state of the software under development.
I'm passably familiar with except and only, but these don't seem to be able to limit things the way I want. Am I missing a fundamental category or recipe?
I'm afraid what you ask is not possible within GitLab CI.
There could be a way to use the CI_COMMIT_SHA predefined variable, since that will be the same in your new branch as in your source branch.
Still, the pipeline will run before a custom script or condition can determine or compare SHAs.
GitLab runs pipelines for branches or tags, not commits: pushing to a repo triggers a pipeline, and creating a branch is in fact pushing a change to the repo.
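If you want to experiment with the SHA idea anyway, a minimal sketch using the predefined CI_COMMIT_BEFORE_SHA variable (which GitLab sets to all zeros on the first pipeline for a new branch) might look like the following. The job name and script are placeholders, and whether only/except with variables behaves acceptably here depends on your GitLab version:

    validate:
      script:
        - ./run-validation.sh   # placeholder for your real validation step
      except:
        variables:
          # Skip the job when this is the first pipeline for a brand-new branch.
          - $CI_COMMIT_BEFORE_SHA == "0000000000000000000000000000000000000000"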

How to skip Ivy publish without causing error?

I would like to skip publishing an artifact if it already exists in the repository, but as far as I can see from the documentation there isn't a way to do this. There is an overwrite attribute, but if set to false that causes the publish to fail if the artifact exists. I definitely don't want to overwrite the artifact, either.
I've looked into using <ivy:info> and <ivy:findrevision> to see if the artifact exists and set a property I can use on my publish target (as an unless attribute, for example), but neither of these tasks allows me to specify the repository to check.
I'd rather not resort to using an external taskdef, like antcontrib's try/catch tasks.
Does anyone have any other suggestions?
Both ivy:info and ivy:findrevision accept a settingsRef attribute, so you could use an extra settings file that only references the resolver you need (loaded via ivy:settings or ivy:configure) and pass that settingsRef to the task.
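A rough sketch of that approach, assuming the Ivy Ant tasks are already declared via the antlib namespace and using made-up resolver, module, and file names:

    <!-- Minimal settings file that only defines the repository to check -->
    <ivy:settings id="publish.settings" file="ivysettings-publish.xml"/>

    <!-- Sets the property only if the revision already exists in that repository -->
    <ivy:findrevision organisation="com.example" module="mymodule"
                      revision="${version}" property="artifact.exists"
                      settingsRef="publish.settings"/>

    <!-- Publish only when the artifact was not found -->
    <target name="publish" unless="artifact.exists">
        <ivy:publish resolver="shared" pubrevision="${version}" overwrite="false">
            <artifacts pattern="dist/[artifact]-[revision].[ext]"/>
        </ivy:publish>
    </target>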
Why would you run the "publish" task if you don't intend saving what you built?
I use the buildnumber task to ensure that my version number is incremented automatically based on what was previously published.

Reusing tasks in SSIS

How can a task be reused in SSIS without copy/paste?
For example, I'd like to use the tasks I've defined in an event handler for one executable in another executable, but not with all executables in the package. So far, I haven't found any solutions other than writing a complete custom component, which seems like overkill. Any suggestions?
Have you considered using an event at the package level, and filtering to only fire when your particular condition requires it?
E.g. you could use the OnPostExecute event by putting a dummy task in your flow with a name that starts with a specific string, like "RunMyTasks", and then checking System::SourceName to see if it starts with "RunMyTasks". If it does, branch to run your tasks; otherwise branch to handle the event as you normally would.
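For instance, the expression on the branching precedence constraint inside the event handler could be something along these lines, with the marker string being whatever prefix you pick:

    FINDSTRING(@[System::SourceName], "RunMyTasks", 1) == 1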
You could do a similar thing using OnVariableValueChanged - this might be better (although you'd need to test it). Create a variable with RaiseChangedEvent=TRUE. Create a script task / component to change the value of the variable; finally, put your task set into the event handler.
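The script that flips the variable only needs a single assignment; the variable name below is invented for the example, and it must be listed in the Script Task's ReadWriteVariables:

    // Changing the value raises OnVariableValueChanged because the
    // hypothetical User::RunMyTasks variable has RaiseChangedEvent = True.
    Dts.Variables["User::RunMyTasks"].Value = !(bool)Dts.Variables["User::RunMyTasks"].Value;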
Check the scoping notes at the bottom of Jamie's post here.
If you can use third-party solutions, check the commercial CozyRoc SSIS+ library. It includes enhanced Script Task Plus, which allows export of script to external file and then link and reuse in other packages.

SSIS Intermittent variable error: The system cannot find the file specified

Our SSIS packages are structured as one control package and many child packages (about 30) that are invoked from the control package. The child packages are invoked with an Execute Package Task, one per child package. Each Execute Package Task uses a File Connection Manager to specify the path to the child package's dtsx file, and there is one File Connection Manager per child package. Each File Connection Manager has an expression defined for its ConnectionString property. The expression looks like this:
#[Template::FolderPackages]+"MyPackage.dtsx"
The file name is different for each package. The variable (FolderPackages) is specified in the SSIS package configuration file.
The error generated at run time is:
Error 0x80070002 while loading package file "MyPackage.dtsx". The system cannot find the file specified.
The package that fails differs from run to run, and sometimes no packages fail at all. This happens when running on exactly the same environment, data, etc.
I ran FileMon during this error and found that when the error happens, SSIS tries to read the dtsx file from the wrong place, namely from system32. I checked that this is identical to what would happen if the #[Template::FolderPackages] variable were empty, but because the very same variable is used for every child package and works for some while sometimes failing for others, I have no explanation for this.
Anything obvious, or time to raise a support call with Microsoft?
Are you using Expressions on the SSIS variables directly? Variables with Expressions are calculated each time the variable is referenced by the consuming object which needs to use it. That is where the race condition bug exists, because sometimes the expression doesn't get evaluated if another thread is already evaluating a different variable, and the default value for the variable is provided to the consumer object.
If that matches your design, these two bugs on the connect site discuss the problem, and the workarounds:
https://connect.microsoft.com/SQLServer/feedback/details/332372/ssis-variable-expressions-dont-always-evaluate
A second one at
connect.microsoft.com/SQLServer/feedback/details/406534/ssis-2008-variable-expressions-dont-always-evaluate
A summary of the workarounds:
- Note the parallel tasks in your SSIS control flow that use these expression variables. If two tasks run side by side and each relies on the same variable, and that variable has an expression to set its value, you can hit this.
- Manually sequentialize such tasks so that they don't run in parallel, i.e. add precedence constraints (green arrows) on the control flow so the tasks run in order Task1, Task2, Task3, rather than side by side on parallel paths or inside the same container with no paths between them.
- Avoid variable expressions: assign the variable values in the required order with a home-made script task that does the same kind of work, so that the variables are not evaluated via expressions (the thing that can hit this race condition). In other words, manually assign the values at a point in the control flow just before they are used. The point of expressions on variables is to set a value dynamically whenever the variable is used, so this achieves a similar design goal, just manually (a sketch of such a script task follows this list).
- Reduce threads to minimize the potential: set the Data Flow task's EngineThreads property to 1 and the package's MaxConcurrentExecutables to 1. This helps serialize execution to one task at a time, but has the side effect of slower performance.
- Create and set values on distinct copies of the variable at different scope levels in the design (e.g. Master::Var1, Child1::Var1, Child2::Var1), so that they evaluate in different execution scopes and avoid expression evaluation on parallel threads.
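As a rough illustration of the script-task workaround above, a Script Task placed immediately before the consuming tasks could do the concatenation itself; User::ChildPackagePath is an invented variable name, and both variables must appear in the task's ReadOnlyVariables/ReadWriteVariables lists:

    // Inside the Script Task's Main() method (C# script, SSIS 2008 style).
    public void Main()
    {
        // Read the configured folder at a known point in time...
        string folder = Dts.Variables["Template::FolderPackages"].Value.ToString();

        // ...and assign the full path directly, so no expression has to be
        // evaluated later on a parallel thread.
        Dts.Variables["User::ChildPackagePath"].Value = folder + "MyPackage.dtsx";

        Dts.TaskResult = (int)ScriptResults.Success;
    }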
A bit of a stab in the dark but...
I've had a similar issue with variables where readonly=false and multiple components were reading the variable at the same time and causing locking issues.
I consistently recreated the problem by running a pair of data flows that did nothing but reference the variable inside a For Loop container; changing the variable to read-only resolved the problem.
If you temporarily hardcode the package name does this resolve the issue?
Turns out after sending trace info to Microsoft that we are encountering heap corruption. I'll update this question if we get to the bottom of it.
The current suggestion is to disable heap lookaside for dtexec.exe.
The official answer to this issue is that it is a bug in SQL 2005 and 2008. Many tasks accessing the same variable cause a race condition, and some tasks get the default value for the expression instead of the evaluated value.
The workaround is to ensure that the default value (the value defined in the property sheet for whatever property you are having trouble with) is the value that will work in your production environment.
This way, when the race condition happens in prod, SSIS will fall back to the package value, which will still work.
In dev? Well you're just going to have to deal with that manually until we get a bug fix from Microsoft.
There is a KB article relating to this issue: http://support.microsoft.com/kb/2448991 which states when and where this was fixed.

TFSBuild:How to trigger a build only when a particular file is checked in?

We have a particular file, say X.zip that is only modified by 1 or 2 people. Hence we don't want the build to trigger on every check-in, as the other files are mostly untouched.
I need to check for a condition prior to building, whether the checked-in item is "X.zip" or not.. if yes, then trigger a build, else don't. We use only CI builds.
Any idea on how to trigger the build only when this particular file is checked in? Any other approaches would be greatly appreciated, as I am a newbie in TFS...
Tara.
I don't know of any out-of-the-box feature that can do this; what you would need to do is write your own custom MSBuild task that executes before the build runs (a pre-build action).
The task would then use the TFS API to check whether the changeset that triggered the build contains the file you want, and if it is not found, set the task result to failed.
This isn't really ideal, as it will indicate a build failure to Team Build, which, depending on whether you're using check-in policies, may be unhelpful. It would also be harder to work out at a glance which builds failed because of the task and which failed because of a real problem.
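To make that concrete, a very rough sketch of such a task is below. It assumes the changeset number is handed in as a property, the collection URL is illustrative, and the exact client classes depend on your TFS/Visual Studio version:

    // Hypothetical pre-build MSBuild task. Needs references to Microsoft.Build.Framework,
    // Microsoft.Build.Utilities, Microsoft.TeamFoundation.Client and
    // Microsoft.TeamFoundation.VersionControl.Client.
    using System;
    using System.Linq;
    using Microsoft.Build.Framework;
    using Microsoft.Build.Utilities;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;

    public class FailUnlessFileChanged : Task
    {
        [Required]
        public string CollectionUrl { get; set; }   // e.g. http://tfs:8080/tfs/DefaultCollection

        [Required]
        public int ChangesetId { get; set; }        // the changeset that triggered the build

        [Required]
        public string FileName { get; set; }        // e.g. X.zip

        public override bool Execute()
        {
            var collection = new TfsTeamProjectCollection(new Uri(CollectionUrl));
            var vcs = collection.GetService<VersionControlServer>();

            // Look through the triggering changeset for the file of interest.
            Changeset changeset = vcs.GetChangeset(ChangesetId);
            bool found = changeset.Changes.Any(c =>
                c.Item.ServerItem.EndsWith(FileName, StringComparison.OrdinalIgnoreCase));

            if (!found)
            {
                Log.LogError("Changeset {0} does not touch {1}; failing the pre-build check.",
                    ChangesetId, FileName);
            }
            return found;
        }
    }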
Alternatively, you could change the build to run less frequently rather than on every check-in, which will reduce load on your build server.
Otherwise you may want to dig into CruiseControl.NET; it may have better support for conditional builds.
If you could move X.zip into its own folder, you could set up a CI build with a workspace that only looked at the folder containing X.zip.
You would then need to add an explicit call to tf get to download the rest of the code as Team Build only downloads what the workspace is looking at.
But this might be simpler than the custom task approach?
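The extra get could be as simple as one command-line step (the server path here is illustrative):

    tf get $/MyRepo/Src /recursive /noprompt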