TFS test result entry: "Web Test Manager" or other test execution options

QUESTIONS:
Do you have any direct or indirect experience with Web Test Manager by Sela Software Labs?
Positive or negative experience is fine. I'm just looking for some facts to base production decisions on.
How risky is it to install Sela "Web Test Manager" on our production server?
WTM is a TFS Web Access extension. It extends website capabilities to include editing test steps and running tests.
Any other alternatives to executing tests and logging test results in TFS that we should consider?
Scenario: I have 2-3 Developers starting to run test cases as early as this week. We have 3 MTM (Microsoft Test Manager) licenses we use for testing (2 testers, one dev with VS2010 Ultimate). Purchasing another two full copies of Microsoft Test Professional for each of our VS2010 Pro/Premium (not Ultimate) devs just for periodically running test cases and doing light test case editing is not reasonable. We do not need trace listeners for general test pass runs.
Option #1: Sela Software Labs (Sela Group, Sela International) developed Web Test Manager several months ago, but there are very few product reviews and little customer feedback publicly available.
Sela WTM website: http://www.sela.co.il/alm/products_WTM.html
Single review posted on Microsoft partner marketing site: http://pinpoint.microsoft.com/en-us/applications/web-test-manager-wtm-is-the-only-tool-that-enables-to-test-with-tfs-2010-directly-from-the-browser-now-for-the-first-time-you-can-manage-your-tests-12884914644
Option #2: Developers track their results in individual spreadsheets, which testers then re-enter using MTM. This is not appealing at all, and it introduces several tedious, error-prone steps.
Other options?

Are you talking about automated tests or manual tests? For automated testing, I believe you don't need additional VS licenses.
Fancy version: Set up a test machine with the Visual Studio Test Agent (and Build/Workflow agent if you need to do deployment), a TFS Test Controller, and trigger test runs from a dev machine. If you do it right, results get automatically published to TFS and attached to the correct TestCase workitems and TFS build objects. Check out Visual Studio Test and Lab Management for more info on that (definitely some extra overhead in building out the infrastructure, but it's really slick once you've made the investment). You should be able to trigger the Build-Deploy-Test workflows from any VS license that has TFS access, I think.
Less fancy, but still doable: Still install the Test Agent, but don't worry about actually wiring it up to a Test Controller. The Test Agent installation will at least give you MSTest and the ability to /publish the results of test execution up to TFS for reporting/result storage.
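To give a rough idea, a run-and-publish invocation looks something like the following; the test container, collection URL, build name, and team project here are placeholders you'd swap for your own:
    rem Hypothetical example: run unit tests with MSTest and publish the results to TFS 2010
    "%VS100COMNTOOLS%..\IDE\MSTest.exe" /testcontainer:MyProduct.Tests.dll ^
        /publish:http://tfsserver:8080/tfs/DefaultCollection ^
        /publishbuild:"Nightly_20120101.1" /teamproject:MyProject ^
        /platform:"Any CPU" /flavor:Release
The /platform and /flavor values have to match an existing TFS build for the publish step to succeed, so the results end up attached to the right build.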
If you're looking to do manual test execution/reporting, I unfortunately don't have a lot of suggestions. Most of my team's investment has been in automation, so I don't have a ton of experience working with the manual testing interfaces. :/

I'm evaluating WTM for use in our company. We have the same concerns you mention, and for those issues WTM seems to be a good option, and really the only one.
You're right that, unfortunately, there are almost no reviews, which is why I want to share my experience with WTM here.
Installation is quick and easy; I had no problems.
There is not much documentation on the web page, which is a pity. I hope Sela improves that in the future (see the Visual Assist X extension as an example of a good public page). At least there is good PDF documentation in the WTM folder after installation.
Technically the extension is very good, but there are still some limitations (listed in the documentation) and enhancements to be made (e.g. filtering, and test step editing for customized Test Case TFS work item types).
For now, I think it is a good option for testers who don't really need all of MTM's features.

I created an open source tool called the TFS Test Steps Editor. Originally it was developed to work around MTM's lack of ability to insert line breaks in test steps. I just released a new version that has the ability to publish test results for manual tests.
MTM is a pretty big pain to use: it's slow and buggy, and worst of all, it will sometimes lose data while attempting to publish a result. My tool saves all of your in-progress test execution to disk as you work, and you can export a .ZIP of your results for backup or to re-open on another machine and resume testing.

ASP.NET Core front-end developer workflow with VSCode and VS 2019

I haven't done any cshtml front-end development for a few years.
What's the current, generally accepted way for ASP.NET Core front-end developers to work across a range of tools on Windows?
By that, I mean a way to have the front-end JS build and the .NET project(s) also build and to work rapidly in the browser and code.
My thinking is:
We have a much better command-line story around dotnet today.
Some folk like VS Code.
Some folk prefer VS 2019, and some like either, depending.
We need to work on UI aspects sometimes.
But we also need to attach a debugger and debug the server logic sometimes.
The build server setup should be simple and unproblematic, relying mostly on build logic held in the repo.
Tooling, and kicking off the whole build and serve process should be understandable and familiar.
It should be pretty simple to get going after a team noob clones the repo.
My initial thought would be to set up NPM and then use something like Gulp to kick off everything, including running dotnet run.
Then, when running under the Visual Studio 2019 debugger, use the Task Runner Explorer to kick off the Gulp stuff but skip the dotnet run part.
(It's a shame there doesn't seem to be a command line to start VS (Code or 2019) and attach the debugger.)
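For the gulpfile, something along these lines is what I'm picturing; the task names, project path, and the front-end step are placeholders for whatever the real build needs:
    // gulpfile.js - hypothetical sketch: build front-end assets, then shell out to "dotnet run"
    const gulp = require('gulp');
    const { spawn } = require('child_process');
    // placeholder front-end step: copy/bundle client scripts into wwwroot
    function frontend() {
      return gulp.src('ClientApp/src/**/*.js').pipe(gulp.dest('wwwroot/js'));
    }
    // shells out to the .NET CLI; this is the part to skip when VS 2019 attaches its own debugger
    function dotnetRun(done) {
      const proc = spawn('dotnet', ['run', '--project', 'MyWebApp'], { stdio: 'inherit' });
      proc.on('close', () => done());
    }
    exports.frontend = frontend;
    exports.default = gulp.series(frontend, dotnetRun);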
Now I'm expecting to get a "primarily opinion based" SO beating, but there are general trends and ideas that go into designing all these tools for how they can all play ball together and what the dev story looks like.
You've pretty much already described the process. However, I'll add a few things:
You don't need the dotnet run bit. Visual Studio and VS Code are both capable of debugging directly.
You can bind the gulp tasks to build events in Task Runner Explorer, so you really don't even need to think about running those directly. I'm not as sure about this aspect of VS Code, but there's probably an extension to handle it, if it's not already built in.
If you want true ease of development, the best thing you can do is use Docker. Just add a Dockerfile to each project that actually runs (i.e. not a class library) and set up the steps to build and run it there. In Visual Studio, you can right-click the project and choose Add > Docker Support, and it will actually generate a ready-made Dockerfile, though you may need to add a step or two to handle the client-side build steps. In any case, this then becomes truly click and run, with nothing to worry about. The story is even better when you use docker-compose, as then Visual Studio and VS Code can spin up your entire application stack all at once, including external dependencies such as a database, Redis instance, etc. If you haven't used Docker before, start now. It's absolutely revolutionary for development.
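As a rough illustration of the shape of it, here is a hypothetical multi-stage Dockerfile with a client-side build stage bolted on; the image tags, paths, and project names are placeholders, not exactly what Visual Studio generates:
    # Stage 1: client-side build (assumes an npm "build" script exists)
    FROM node:12 AS client
    WORKDIR /src/ClientApp
    COPY ClientApp/package*.json ./
    RUN npm install
    COPY ClientApp/ ./
    RUN npm run build
    # Stage 2: compile and publish the .NET project
    FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
    WORKDIR /src
    COPY . .
    COPY --from=client /src/ClientApp/dist ./MyWebApp/wwwroot
    RUN dotnet publish MyWebApp/MyWebApp.csproj -c Release -o /app
    # Stage 3: slim runtime image
    FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
    WORKDIR /app
    COPY --from=build /app .
    ENTRYPOINT ["dotnet", "MyWebApp.dll"]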
One note on CI/CD: as much as possible, you should add a YAML file to describe your CI/CD pipeline. Depending on the actual provider you're using for build/release, there might be some differences, so consult the relevant documentation. (Azure DevOps, for example, doesn't currently support describing release pipelines in YAML, though you can still do your build that way.) In any case, this allows you to configure all of this in code and have it committed to source control.
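For instance, a minimal Azure Pipelines build definition might look roughly like this; the paths and step names are assumptions about the project layout:
    # azure-pipelines.yml - hypothetical sketch of a build-and-test pipeline
    trigger:
      - master
    pool:
      vmImage: 'windows-latest'
    steps:
      - script: npm install && npm run build
        workingDirectory: ClientApp
        displayName: 'Client-side build'
      - script: dotnet build --configuration Release
        displayName: 'dotnet build'
      - script: dotnet test --configuration Release --no-build
        displayName: 'dotnet test'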
You may consider the same for your infrastructure. Azure has ARM templates, AWS has CloudFormation, and GCP has Deployment Manager. There are also third-party tools like Terraform or Ansible. All of these, in some form or fashion (usually JSON or YAML), allow you to define all the characteristics of the infrastructure you're going to deploy to and commit that to source control. This makes deployment, and things like creating new environments, a breeze.

Continuous integration software for a CMake project hosted on GitHub

We are looking for software to run our test cases automatically.
We want something that runs on our own server (or a commercial service), automatically fetches the newest commit from GitHub, compiles the project with CMake, and runs CTest on our test cases. The results should then be visualized on a nice website.
I had a look at CDash, but the documentation is so poor that I could not even get it to fetch the latest commit from GitHub.
So my questions are:
Is there a good tutorial for CDash, other than the weak wiki page?
What software is available for running tests on new commits to GitHub, and what are the advantages and drawbacks of each?
In answer to your second question, Jenkins is a robust and extensible continuous integration tool that can be integrated tightly with GitHub using a plug-in (or loosely using standard Git support). It also supports CMake via a plug-in. Whether it has disadvantages that make it less useful for you depends on your organization and build process, but I've found it to be highly customizable to a wide variety of processes. I recommend taking a look at it.
There's also a third-party CTest plugin available for Jenkins.
CDash works in tandem with CTest. If you are already using CMake, it should be fairly easy to submit your test results to CDash. I'd recommend reading the CTest documentation:
http://www.vtk.org/Wiki/CMake_Testing_With_CTest
You can either install your own CDash server or use Kitware's hosted server at my.cdash.org. You can test your server with a sample project available at:
http://www.cdash.org/cdash/resources/software.html
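To give a rough idea of what submission involves: a CTestConfig.cmake next to your top-level CMakeLists.txt, plus a dashboard run of ctest, is usually all it takes. The values below are placeholders; copy the real ones from your own CDash project page:
    # CTestConfig.cmake - hypothetical values for a project submitting to CDash
    set(CTEST_PROJECT_NAME "MyProject")
    set(CTEST_NIGHTLY_START_TIME "00:00:00 UTC")
    set(CTEST_DROP_METHOD "http")
    set(CTEST_DROP_SITE "my.cdash.org")
    set(CTEST_DROP_LOCATION "/submit.php?project=MyProject")
    set(CTEST_DROP_SITE_CDASH TRUE)
    # then, from the build directory, run a dashboard build and submit:
    #   ctest -D Experimental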

Recommendations for Continuous integration for Mercurial/Kiln + MSBuild + MSTest

We have our source code stored in Kiln/Mercurial repositories; we use MSBuild to build our product and we have Unit Tests that utilize MSTest (Visual Studio Unit Tests).
What solutions exist to implement a continuous integration machine (i.e. Build machine).
The requirements for this are:
A build should be kicked off when necessary (i.e. code has changed in the repositories we care about)
Before the actual build, the latest version of the source code must be acquired from the repository we are building from
The build must build the entire product
The build must build all Unit Tests
The build must execute all unit tests
A summary of success/failure must be sent out after the build has finished; this must include information about the build itself but also about which Unit Tests failed and which ones succeeded.
The summary must contain which changesets were in this build that were not yet in the previous successful (!) build
The system must be configurable so that it can build from multiple branches (or repositories).
Ideally, this system would run on a single box (our product isn't that big) without any server components.
What solutions are currently available? What are their pros/cons? From the list above, what can be done and what cannot be done?
Thanks
TeamCity, from JetBrains, the makers of ReSharper, will do all of that. You will have to configure what it specifically means to "build your product", but you can configure everything you specified with it.
The software can alert you to failed builds, even down to alerting only the person responsible for checking in code that broke the build. It even comes with handy web pages you can view to see only your own changes, which builds they've been through successfully, which ones are pending, and which ones are currently being executed.
Since it is a distributed product, you can make it grow with your organization and product. If at some point you discover that you're waiting for the build to complete too much, because a lot of builds are being queued up, you can add more build agents. The build agents are basically separate client programs you install on additional machines, that execute the actual build configurations.
It comes in two flavors, the professional version and the enterprise version. The professional version is free and can contain up to 20 build configurations, 20 users, and 3 build agents. The enterprise version has unlimited users and build configurations, and you can also use LDAP-based security (think domain-verified users). There are also some other bonuses in the enterprise version. You can also buy licenses for more build agents if you need more than the initial 3.
Now, if "no server components" means you don't want it to act like a web server, you're going to be hard pressed to find something that will react to your commits.
However, if you mean that you don't want to have to install a server OS, then TeamCity can work on workstation versions of Windows as well. That isn't to say that you shouldn't consider setting up a proper server for it, but it will run on a workstation if that is what you require.
Our product BuildMaster does all of the things you listed by design and there is a free, somewhat limited edition (e.g. you can only have a limited number of issue tracking providers integrate with it, the database change script packaging tool isn't included in the free version, etc.) for 5 users or fewer.
What you've described is the basics of a CI Tool, so every CI Tool should be OK.
I use CruiseControl.NET, but its Mercurial support is buggy and it is not very straightforward at first glance. I am nevertheless happy with it. Other tools that come to mind are Hudson, Team Build (from TFS), and TeamCity.
I have not tried other tools, but you can see pros/cons here:
TeamCity vs CC.net
Hudson vs CC.net, Link 1 and Link 2
CC.net vs TFS
EDIT: I forgot to mention that Hudson and CruiseControl.NET are open-source projects, so you can easily write plugins and patches for your installation.
EDIT 2: The Mercurial bugs appear to be fixed in the upcoming 1.6 version of CCNet (changes committed to the trunk this week).
There's always BuildBot, which I like (and have contributed some code to). It's fairly easy to set up and run on any OS, it handles simple tasks like the ones you describe, and it's remarkably flexible if you need more.
What you might find missing are the batteries-included log scrapers and/or report generators that more commercial CI servers come with, especially for enterprise frameworks.
It scales pretty well too; Mozilla and Chromium use it, among others.

Any good command-line tools (for a build server) for validating websites?

My team creates a number of dynamic/data-driven websites. We use CruiseControl.NET to download the code, create test data, run unit tests, and install each site into IIS for manual testing. However, we haven't found a good tool (or tools) that can actually run through some simple tests of the websites, such as checking for broken links or invalid HTML.
Are there any good tools that we can incorporate into our build process to automate basic website testing? E.g. check for broken links, check for HTML/JavaScript/CSS coding errors, and so on? Load testing would be great too.
Looking for something totally generic; we don't need to write/record scripts for playback. Just something to cover the basics.
Thank you!
-James
For link checking you could always look at http://linkchecker.sourceforge.net/; if that isn't suitable, they list other alternatives.
It also seems to be an active project.
JSLint does JavaScript validation, and there are a couple of options for executing it from the command line, so that might be worth a look too: http://www.jslint.com/
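As a sketch of how these might slot into a build step (the URL and file path are placeholders, the Node-based JSLint wrapper is just one of the command-line options, and exact flags vary by version):
    rem check the freshly deployed test site for broken links, internal and external
    linkchecker --check-extern http://teststage.example.local/
    rem run JSLint over a script via a command-line wrapper (here the "jslint" npm package)
    jslint wwwroot\js\site.js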

How do I set up a build server on the cheap/free?

Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual Source Safe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
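As a rough idea of the kind of MSBuild wrapper I mean (the solution name and deploy path are placeholders, not anything TeamCity requires):
    <!-- build.proj - hypothetical wrapper script a TeamCity build configuration can invoke -->
    <Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <Target Name="Build">
        <MSBuild Projects="MyWebsite.sln" Properties="Configuration=Release" />
      </Target>
      <Target Name="Deploy" DependsOnTargets="Build">
        <!-- placeholder: copy the compiled output to the test site -->
        <Exec Command="xcopy /E /Y MyWebsite\PrecompiledWeb \\testserver\wwwroot\" />
      </Target>
    </Project>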
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS 6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN, and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
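A stripped-down NAnt fragment for that kind of work might look like this; the server names, paths, and addresses are placeholders:
    <!-- hypothetical NAnt targets: run NUnit, copy the package, notify the team -->
    <project name="MyProduct" default="package">
      <target name="test">
        <nunit2>
          <formatter type="Xml" />
          <test assemblyname="build\MyProduct.Tests.dll" />
        </nunit2>
      </target>
      <target name="package" depends="test">
        <copy todir="\\fileserver\builds\MyProduct">
          <fileset basedir="build\package">
            <include name="**/*" />
          </fileset>
        </copy>
        <mail from="build@example.local" tolist="team@example.local" mailhost="smtp.example.local"
              subject="MyProduct build packaged" message="New build copied to the file server." />
      </target>
    </project>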
For automated testing we use Watin.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.NET. It's free, and whatever customized daily/continuous routine you want it to perform, you can always add with scripts.
Remember, it's not just about daily (nightly) builds, but also about letting you catch build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
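To show the shape of it, a project block wired to VSS looks roughly like this; paths, project names, and credentials are placeholders, and the page above documents the full attribute list:
    <!-- ccnet.config excerpt - hypothetical project definition -->
    <cruisecontrol>
      <project name="MyWebsite">
        <triggers>
          <intervalTrigger seconds="120" />
        </triggers>
        <sourcecontrol type="vss" autoGetSource="true">
          <project>$/MyWebsite</project>
          <username>builduser</username>
          <password>secret</password>
          <ssdir>\\vssserver\VSS</ssdir>
          <workingDirectory>C:\Builds\MyWebsite</workingDirectory>
        </sourcecontrol>
        <tasks>
          <msbuild>
            <executable>C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\MSBuild.exe</executable>
            <projectFile>MyWebsite.sln</projectFile>
            <buildArgs>/p:Configuration=Release</buildArgs>
          </msbuild>
        </tasks>
      </project>
    </cruisecontrol>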
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
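Roughly, the skeleton for the steps you listed might look like this; the VSS share, site paths, and database names are placeholders:
    rem nightly-build.bat - hypothetical sketch of the manual steps, run as a scheduled task
    set SSDIR=\\vssserver\VSS
    rem 1. get the latest source and database script
    "C:\Program Files\Microsoft Visual SourceSafe\ss.exe" Get $/MyWebsite -R -GLC:\Builds\MyWebsite
    rem 2. back up the current test site
    xcopy /E /I /Y C:\inetpub\teststage C:\Backups\teststage
    rem 3. precompile and publish the site to the test server folder
    "%WINDIR%\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler.exe" -v /MyWebsite -p C:\Builds\MyWebsite -u C:\inetpub\teststage
    rem 4. apply database changes, if any
    sqlcmd -S testsql -d MyWebsiteDb -i C:\Builds\MyWebsite\Database\release.sql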
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted SVN services like CVSDude. You'll probably become well versed in MSBuild in the process too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality you could find yourself mired in it fulltime for weeks.
Whatever tools you use, house them in a virtual machine (e.g., VMware).
When the equipment inevitably goes south, you can copy the image onto any machine and not miss a beat because your build server decided to take the day off, assuming, of course, that you back up the image.