I am moving from "XAML" builds to DevOps YAML builds and trying to replicate what I had in TFS 2012. In the XAML build I had several solutions in the "Items to build" list, and the build was triggered on any check-in. From what I can tell, a Pipeline was designed to build a single solution. I've "unlinked" the Pipeline from a single solution and was planning on adding an additional build task for each solution to build. Is this the proper way? If this is not the best way to do this, I'm open to suggestions. Using Azure DevOps 2019 and Visual Studio 2017.
Theoretically, you can do this. But you would have to add a build task for each solution, possibly wrapped in conditions, and you would have to manage the triggers yourself. The YAML file would end up huge, hard to maintain, and difficult to follow the dependencies in.
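Just to illustrate what that looks like, here is a minimal sketch of one pipeline building several solutions (the solution paths are made up):

```yaml
# One pipeline, one VSBuild task per solution. Workable, but it grows
# fast and every solution shares the same trigger.
trigger:
  branches:
    include:
    - master

steps:
- task: VSBuild@1
  inputs:
    solution: 'src/SolutionA/SolutionA.sln'   # hypothetical path
    configuration: 'Release'
- task: VSBuild@1
  inputs:
    solution: 'src/SolutionB/SolutionB.sln'   # hypothetical path
    configuration: 'Release'
```

Every check-in rebuilds everything, and each new solution adds another task to the same file.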
I would suggest using a separate YAML file for each solution build instead. You can have many .yml files in the same repository, each targeting a different solution; see for example:
Multiple YAML build pipelines in Azure DevOps
Can I have multiple build pipelines for the same repository?
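A minimal sketch of one such per-solution pipeline, assuming a solution under src/SolutionA (the paths and trigger filter are examples; adjust them to your layout):

```yaml
# build-solution-a.yml - one pipeline per solution, triggered only
# when that solution's folder changes.
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - src/SolutionA    # prefix match: only check-ins under this folder trigger

steps:
- task: VSBuild@1
  inputs:
    solution: 'src/SolutionA/SolutionA.sln'
    configuration: 'Release'
```

Each file stays small, and the path filter gives you roughly the per-solution trigger behavior you had with "Items to build".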
Besides, since you are moving from "XAML" builds and may not be familiar with this build process yet, you can always start with the Designer approach and pull out the system-generated YAML. Here's the YAML schema reference, which might also help you.
I have an MVC 5 application we're moving from on-premises to the Azure cloud. Currently, we have several publish profiles, one per environment, which we select using a PowerShell script. One of our goals is to make the build scripts and infrastructure as simple as possible, so I was wondering if I could set the publish profile to be used with only my appveyor.yml file:
Is there a way to set the publish profile from the appveyor.yml file?
If not what are my choices?
You can run your PowerShell script as part of the desired build step in the pipeline. You can run commands right from the YAML file (or the UI), or check your PowerShell script into the repository and run the .ps1 file. You might also consider using secure variables to avoid checking things like connection strings into the repo in clear text.
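For example, a minimal appveyor.yml along these lines (the script path, parameter name, and encrypted value are placeholders, not real ones):

```yaml
# appveyor.yml - run a checked-in PowerShell script as the build step.
environment:
  # Encrypted with AppVeyor's "Encrypt YAML" tool; never stored in clear text.
  DB_CONNECTION_STRING:
    secure: 3x4mpl3EncryptedValue==

build_script:
  # publish.ps1 and -PublishProfile are hypothetical names for your script.
  - ps: .\build\publish.ps1 -PublishProfile Staging
```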
However, this custom script/profiles approach will not let you use the built-in WAP artifact packaging, and you will also need to use a custom script instead of the automatic MSBuild mode. That is OK, just a little more scripting. You will also need to publish the artifacts so they are available for deployment.
Maybe an easier option is to let AppVeyor do all the build and WAP artifact packaging/publishing automatically, and then use the built-in Web Deployment with Web Deploy parametrization instead of multiple publish profiles.
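A sketch of that built-in route (the project name, server URL, site name, and credentials are placeholders):

```yaml
# appveyor.yml - let AppVeyor package the WAP automatically, then
# deploy the artifact with the built-in Web Deploy provider.
build:
  project: MyApp.sln
  publish_wap: true        # produce a Web Deploy package as an artifact

deploy:
  provider: WebDeploy
  server: https://staging.example.com:8172/msdeploy.axd
  website: mysite
  username: deploy-user
  password:
    secure: 3x4mpl3EncryptedValue==
```

Environment-specific values then come from Web Deploy parameters rather than separate publish profiles.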
But if you decide to go with custom scripts and multiple publish profiles, you can still use the built-in Web Deployment with artifacts created by your scripts.
I am having an issue getting Selenium end-to-end tests to work after an automated deployment using Visual Studio Team Services (VSTS).
I have a build working that generates a build artifact. It is triggered from VSTS but runs on an on-premises build server. I have a deployment working that deploys to an on-premises development web server. All of this works, including unit tests running after the build.
The problem comes when I try to add testing after the deployment. The tests are to be run on the build server and point at the dev server website. The deployment has two phases: a deploy, and then an agent phase that runs a Test Assemblies task using the build agent on the build server. The problem seems to be that the test DLLs are not being included in the build artifact, so they are never found when the test process runs. The deploy setup is as follows.
I have a Copy Files task before the Publish Artifact task in the build definition that seems to copy the files into the right place, but they are not included in the zip file artifact. I've looked at several websites and posts on here, but I still seem to be missing a vital bit of knowledge that will get this working.
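If it helps, the relevant part of the definition in YAML form would be roughly this (the patterns and folder names are simplified placeholders, not my exact values):

```yaml
# Stage the test assemblies alongside the web package so they end up
# inside the published artifact.
- task: CopyFiles@2
  inputs:
    sourceFolder: '$(Build.SourcesDirectory)'
    contents: '**/bin/$(BuildConfiguration)/*Tests*.dll'
    targetFolder: '$(Build.ArtifactStagingDirectory)/tests'

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'
```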
Using the log did help, as recommended, so thanks for that.
I have managed to get this working. I separated the Selenium tests out into a separate solution and built that separately, creating its own build artifact. I then added this to the agent task in the deploy, and it worked. The only thing I need to sort out now is the correct search path to find the test DLLs. It's not quite as dynamic as I would like at the moment, but I can play tunes on that until I get it right.
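Something along these lines in the Test Assemblies task should do it once the path is right (the artifact folder and DLL pattern here are guesses, not my final values):

```yaml
- task: VSTest@2
  inputs:
    # Search inside the downloaded UI-test artifact rather than the sources.
    searchFolder: '$(System.ArtifactsDirectory)/ui-tests'
    testAssemblyVer2: |
      **\*EndToEnd*.dll
      !**\obj\**
```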
I accept this is working around rather than solving the original problem, but needs and timescales must. I think moving the end-to-end UI tests out of the main solution makes sense anyway, though no doubt others may think differently.
Thanks for your help everyone
I'm evaluating Bamboo to replace our Jenkins setup and have a couple of questions. I have a .NET solution that generates two artifacts: a packaged website and an MSI. I have three environments I deploy to: test, stage, and production. Our Jenkins server in turn has three jobs, one for each environment. Each job builds the solution, copies in configuration files for the environment it will be deployed to, and then deploys the artifacts.

Reading the documentation and other material (https://answers.atlassian.com/questions/19562/plans-stages-jobs-best-practices), I'm getting mixed signals about how deployment should work with Bamboo. It seems to me that deployment plans expect artifacts to already exist and then deploy them, but build plans include deployment steps as well. How is all of this supposed to interact?
The reason I'm confused is that I have environment-specific configuration files that get packaged during a build. Any direction on how this should work?
I posted the question to the Atlassian board as well and got an answer I think I like the best:
Jason Monsorno (Aug 30 '13):
Deployment projects in Bamboo seem to be dependent on the existence of an artifact. The catch is that you don't necessarily need to use that artifact, so you could use an empty artifact and do completely independent steps. Deployment projects are still fairly new to Bamboo, and your structure may favor the "normal" workflow, where each environment would be a separate manual stage.

Deployment projects do have a separate workflow and versioning. To use deployment projects in your scenario, I'd suggest making the artifact the entire checkout; each deployment environment can then build a copy of the artifact. The space-saving but less time-efficient option would be to save just the current revision in a file as the artifact, and use that to check out and build in each deployment environment.
We have just set up TeamCity to build all of our projects, and we need a system to deploy them to our staging and testing servers. The projects are ASP.NET. We also need to deploy our databases; is there something out there that can do this?
Thanks in advance. Help is much appreciated.
This looks like it would cover the deployment of the ASP.NET application code from TeamCity.
As for databases, I am assuming you mean setting up TeamCity to run new database migration scripts. You can run post-build events in TeamCity, but it depends on how you configure the build. If you use MSBuild, for instance, you can hook the migration scripts into a post-build target; the other build runners, like NAnt, should have something similar. Hope this helps.
I have written several scripts for my Hudson builds. I have placed them in the workspace of the particular job I am working on.
I was hoping to find out the best place to put the scripts. Is somewhere in the file system the best place? What if we move build machines? Does Hudson designate a place for scripts?
Please and thank you.
I would suggest putting them inside your project folder /hudson/jobs/MyProject instead of inside the workspace. The workspace could be overwritten.
Do you use source control? If so, you can put them in there and get Hudson to pull them from there...
If these scripts are related to a particular project, bundle them with the project. Don't put them somewhere else.
If these scripts are used for more than one project, put them in your source control as a separate project. Then you can pull them down every time you pull your project. If your SCM plugin for Hudson does not support configuring two separate sources (as the Subversion plugin does), just pull the build scripts using a command-line tool for your SCM as your first build step.

Build scripts need to be versioned the same way your code is versioned.