I'm using MSTest for unit testing my Compact Framework (3.5) application and DLLs. When I test some DLLs the tests just run, but for others the emulator is launched first. Can anyone tell me what determines whether the emulator is launched?
The testrunconfig file tells MSTest which platform to deploy the tests to. However, if you have the configuration set to both build and deploy all of the DLLs, then the DLLs will attempt to deploy to their default target, not the target from the testrunconfig (yes, it's stupid and confusing).
The general rules I follow are:
Go through each project and set the target device to the same thing.
Use the Configuration Manager, and set to deploy only those items that won't be deployed due to being a dependency.
Set the testrunconfig to match the target device from above.
I am currently developing a Windows Store App that will eventually be targeted at the ARM devices when they are available. For now, I have been developing and testing from Visual Studio on my desktop computer and everything works fine. However, when I try to create an app package that I can pass along to others within my company for testing purposes, the application will not run properly.
The solution includes two projects. The first is a C++ project that is set to build a DLL; its purpose is to expose the Direct2D and DirectWrite libraries, which seem to be inaccessible from a C# project. The second is the C# project that references this DLL for drawing functions and includes a XAML interface and most of the program logic. All of this works flawlessly on my development machine from within Visual Studio (and also when installing the package).
When I send the package files to other individuals within the company, the installation appears to work fine by installing with the PowerShell script. The tile appears in the start screen and the program will launch for a few seconds. The C# and XAML interface appears, but the DirectX portion of the application is not visible and the entire application shuts down within a few seconds. This makes me believe that the dll may not be installing or referenced correctly upon installation. I have checked the package file, and the dll file is included in the package after the build process is complete.
I have packaged a few different test programs (MSDN samples) that have all installed on their machines, but with the same result: they will not run (again, all samples run fine on my development machine when building them). The only test project that worked properly was a simple C# project that did not use DirectX at all. All of the DirectX samples I tried failed (including the native C++ samples that do not use C# at all).
To be clear, the process I use for building is going to Project -> Store -> Create App Packages and choosing the No option for uploading to the Windows Store.
Does anyone have any ideas on what might be going wrong with the build or installation process?
Thanks in advance for any help!
Does it work with the Metro Sideloader? I am not sure if it just adds a UI to the PowerShell script, but it works for my team and me for testing...
Good luck!
Are you side loading a Debug version of your DirectX app onto a machine that does not have the Windows SDK installed? Visual Studio's default DirectX projects and the samples on MSDN both request the D3D11_CREATE_DEVICE_DEBUG flag when creating the D3D Device. Device creation will fail if the Windows SDK is not installed on the machine running the code.
Here are a few different options that will allow you to unblock yourself. Any one of these should give you the desired result:
Create a Release package and deploy that instead of a Debug package.
or - Go to DirectXBase.cpp and remove the D3D11_CREATE_DEVICE_DEBUG flag from the code (see the sketch after this list).
or - Install the Remote Debugging tools for Visual Studio on the target machines. This will install the necessary SDK components to allow creation of D3D Debug devices. The other cool thing about this option is that once you're set up you won't have to create packages manually and side load them anymore. Just tell Visual Studio the name of your ARM machine and press F5 to deploy it remotely. More information here: http://msdn.microsoft.com/en-us/library/vstudio/bt727f1t.aspx
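For the second option, rather than deleting the flag outright, a common variant is to request the debug layer only in debug builds, so release packages never ask for it. A minimal sketch, assuming device-creation code along the lines of the Visual Studio DirectX templates (the function and variable names are illustrative, not your exact code):

    #include <d3d11.h>
    #include <wrl/client.h>

    HRESULT CreateDeviceForPackage(Microsoft::WRL::ComPtr<ID3D11Device>& device,
                                   Microsoft::WRL::ComPtr<ID3D11DeviceContext>& context)
    {
        UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT; // required for Direct2D interop
    #if defined(_DEBUG)
        // Request the debug layer only in debug builds; on a machine without the
        // SDK components installed, this flag makes device creation fail outright.
        creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
    #endif
        D3D_FEATURE_LEVEL featureLevel;
        return D3D11CreateDevice(
            nullptr,                   // default adapter
            D3D_DRIVER_TYPE_HARDWARE,
            nullptr,                   // no software rasterizer module
            creationFlags,
            nullptr, 0,                // default feature levels
            D3D11_SDK_VERSION,
            &device, &featureLevel, &context);
    }

With this guard in place, the Debug configuration behaves exactly as before on your development machine, while Release packages sidestep the missing-SDK failure.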
How are you deploying the native DLL with your project? Are you using project-to-project references? Can you verify that your DLL is ending up in the final package, in the root of the package application directory?
I recommend using Sysinternals Procmon to watch your application load on the target machine. If it crashes or fails, you can look back through the log for the DLL it is trying, and failing, to load. Typically this shows up as a repeated series of DLL load probes (it will try to load the DLL from the application directory, and then proceed to try a number of other paths).
I am trying to make a sample application (for Mac) using the PLCrashReporter framework. The application works fine on my system, but it crashes on other systems because they don't have that framework installed. Please let me know how to ship the framework to other systems with our application.
For this you need to go to Build Phases, add a Copy Files phase with the destination set to Frameworks, and drag the framework from your Xcode project into that Copy Files phase, removing it from Link Binary With Libraries. Now go ahead and make your build; it is ready to distribute.
I use Xcode 4 to build a Cocoa application with a private dylib/framework.
In my development Mac, I put the dylib in the /usr/local/lib directory, and drag it into the project.
The app compiles and runs perfectly on my computer.
To distribute this app to other Macs, I create a Copy Files build phase that says "copy that dylib to the Frameworks directory".
The application is built successfully, and I indeed see the dylib is copied to the Frameworks directory in the app bundle.
The problem is when I run this app on another regular Mac, one that does not have this dylib installed. I get an error saying:
dyld: Library not loaded: /usr/local/lib/mylib.dylib
The issue comes from the fact that you copy the framework into the app bundle, so it is available at a location like:
<you_app_path>/Contents/Frameworks
but you try to load it from /usr/local/lib, where it is not available on your deployment machine. From Apple's Framework Programming Guide:
To embed a framework in an application, there are several steps you must take:
You must configure the build phases of your application target to put the framework in the correct location.
You must configure the framework target’s installation directory, which tells the framework where it will live.
You must configure the application target so that it references the framework in its installation directory.
Now, you say that the build phase is OK; I assume also that you set the application target's build settings correctly. What is left is configuring the framework target's installation directory.
If you did not build the framework yourself, you should be able to fix this by changing the framework's install path so that it is defined relative to the loader (your app), to something like @loader_path/../Frameworks (or @executable_path/../Frameworks). You can do that by means of install_name_tool, as sketched below.
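A sketch of that fix from Terminal, assuming the install name shown in your error and an app binary named MyApp (the app name is illustrative; substitute your own bundle and dylib names):

    # See which install names the app binary currently references
    otool -L MyApp.app/Contents/MacOS/MyApp

    # Fix the identity of the copy inside the bundle
    install_name_tool -id @loader_path/../Frameworks/mylib.dylib \
        MyApp.app/Contents/Frameworks/mylib.dylib

    # Point the app binary at the bundled copy instead of /usr/local/lib
    install_name_tool -change /usr/local/lib/mylib.dylib \
        @loader_path/../Frameworks/mylib.dylib \
        MyApp.app/Contents/MacOS/MyApp

Running otool -L again afterwards should show the @loader_path-relative entry in place of the /usr/local/lib one.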
If you are compiling the private framework yourself, you can instead set its install location in the framework target's Xcode build settings (the Installation Directory / INSTALL_PATH setting), e.g. to @executable_path/../Frameworks.
I have a bunch of C# VS2010 projects compiling automatically on a TeamCity build server.
The build server compiles the projects and then runs automatic unit tests on the output.
Problem is, part of the tests are trying to communicate with WCF services on the local server.
The tests fail because the build server only builds the projects and does not publish the resulting services to the IIS 7 instance running alongside TeamCity.
Is there a simple way to automatically tell TeamCity (maybe through MSBuild.exe) to publish my *.svc files every time the code finishes compiling?
Thank you [=
Simplest thing to do would be to point IIS7 at the TeamCity checkout directories, as sketched below -- the build happens there, so you can run the tests against the services without simulating deployment. You also might want to create two stages of tests: one with more traditional unit tests that run before "deployment", and a second set that runs after the first set is successful and "deployment" happens.
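For example, with IIS 7's appcmd you could map an application at the checkout location once, after which every build is served from there automatically (the site name, app path, and physical path are illustrative, not from your setup):

    %windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" ^
        /path:/BuildServices /physicalPath:"E:\TeamCity\BuildAgent\work\<checkout-dir>\MyServices"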
Deploying out of TeamCity can definitely work, though exactly how depends on your network and application topology.
To deploy the web services you can use Web Deploy to package and install your services in IIS; a sample MSBuild invocation is sketched below. However, it seems that the real issue is the dependency your tests have on your services. You should abstract your service interfaces and use a mocking framework and your favourite DI container in your tests, so the services don't need to be up (see the second sketch).
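For the Web Deploy route, a hedged sketch of the MSBuild invocation TeamCity could run after compilation, using the VS2010 web publishing pipeline properties (the project name, site path, and in-proc publish method are illustrative; InProc assumes the build agent and IIS share a machine, as here):

    msbuild MyServices.csproj /p:Configuration=Release /p:DeployOnBuild=true ^
        /p:DeployTarget=MSDeployPublish /p:MSDeployServiceUrl=localhost ^
        /p:MSDeployPublishMethod=InProc /p:DeployIisAppPath="Default Web Site/MyServices"

And for the mocking approach, a minimal sketch with NUnit and Moq (IPricingService and OrderCalculator are invented for illustration; the point is that the test never touches a live WCF endpoint):

    using Moq;
    using NUnit.Framework;

    public interface IPricingService
    {
        decimal GetPrice(string sku);
    }

    // The code under test receives the dependency through its constructor,
    // so tests can substitute a mock for the real WCF client proxy.
    public class OrderCalculator
    {
        private readonly IPricingService _pricing;

        public OrderCalculator(IPricingService pricing)
        {
            _pricing = pricing;
        }

        public decimal TotalFor(string sku, int quantity)
        {
            return _pricing.GetPrice(sku) * quantity;
        }
    }

    [TestFixture]
    public class OrderCalculatorTests
    {
        [Test]
        public void Total_multiplies_the_mocked_price_by_quantity()
        {
            var pricing = new Mock<IPricingService>();
            pricing.Setup(p => p.GetPrice("ABC")).Returns(42m);

            var calculator = new OrderCalculator(pricing.Object);

            Assert.AreEqual(84m, calculator.TotalFor("ABC", quantity: 2));
        }
    }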
HTH.
Using TeamCity 6.5.1
NUnit version 2.5.10
Win2008 x64
Project is using .NET 4.0
Trying to execute the built-in TeamCity NUnit test runner, I receive the following error:
NUnit error running tests in 'E:\TeamCity\LocalBuildAgent\BuildAgent\work\698a8f459eac8cd9\MyProject\bin\Release\MyProject.Tests.dll' assembly
System.BadImageFormatException: Could not load file or assembly 'E:\TeamCity\LocalBuildAgent\BuildAgent\work\698a8f459eac8cd9\MyProject\MyProject.Tests\bin\Release\MyProject.Tests.dll' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.
My stack is pretty much identical to yours...so, I am going to take a shot in the dark here.
Go into the build step you designated in TeamCity for running NUnit and find the .NET Runtime section; make sure Platform is set to "auto (MSIL)" and, most importantly, that the version is set to v4.0.
I have seen your exact error when attempting to run unit tests for a 4.0 project against the 2.0 framework setting.
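You can check the same thing outside TeamCity with the standalone runner's /framework switch, which selects the CLR version the tests load under (the assembly path is illustrative):

    nunit-console.exe /framework:net-4.0 MyProject.Tests.dll

Running the same assembly with /framework:net-2.0 should reproduce the BadImageFormatException, which is a quick way to confirm the runtime mismatch is the culprit.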
If it isn't that, I would suggest checking directory permissions, and that the System or Network Service account that I think TeamCity runs under (unless specified otherwise) can access the directory your tests DLL resides in.