How to add C++/CX library to Windows Store solution targeting Any CPU

So I have a Windows Store app using C# targeting Any CPU, so that the single app will run on any Windows 8 x86/x64 desktop/tablet or ARM tablet. I need to add some special code in C++, which doesn't seem to have the option to target Any CPU. The code will compile and run on x86/x64, and if I change the entire solution to ARM it will compile and run there too. So I'm looking for a way to make the C++ target Any CPU, which I think is probably impossible, or to have the C++ library compiled multiple times (x86, x64 and ARM) and have all of them included in the appx package. I have spent about 3 hours reading Windows Store development docs on C++/CX and haven't found any way to do this yet. Of course I'll keep looking, but I'm hoping someone else has seen how to do this and can point me in the right direction.

There's no way you can create a single package targeting AnyCPU when you're calling into a native library. You need to create three different packages, one for each target architecture. When you're uploading the app to the store, you can include all three packages.
To simplify the process of building all three packages, you could create a Visual Studio extension (vsix) with all three builds of your native library. In this case the native library for the right platform will be automatically included in each package. Here's a quick tutorial on how to do it.
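On the C# side nothing needs to change per architecture; the managed code just calls into the C++/CX WinRT component, and the packaging step swaps in the matching native binary. A rough sketch of what that call site might look like (NativeLib and MathHelpers are made-up names for illustration):

```csharp
using NativeLib; // hypothetical namespace of the C++/CX WinRT component

public sealed class ImagePipeline
{
    public double Process(double input)
    {
        // The calling code is identical for x86, x64 and ARM; only the native
        // binary included in each architecture-specific package differs.
        var helper = new MathHelpers();      // hypothetical WinRT ref class
        return helper.FastTransform(input);  // hypothetical method on it
    }
}
```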

Related

Setting up cross platform development for Mono/ARM

I'm going to develop and compile a C#/Mono app on Windows 7 with Visual Studio and then run this app on a Linux device. I've googled a lot, but one point still confuses me: how should I set up my development environment? I have Mono for Windows installed on my laptop, and now there are two possibilities:
create a regular Windows C#/.NET project which will use references from the Windows\MS.NET framework, build this project using msbuild, then copy and run the app on Linux
create a Mono target for VS, create a project which will use references from ProgFiles(x86)\Mono\lib, build this project using xbuild, etc.
Which way should I choose? It seems to me that option #2 is preferable, but I do not understand why.
Neither of your solutions is very good. I would choose a third one:
Develop your code on Linux, using the MonoDevelop IDE.
There are many reasons why this option is the best, such as:
Mono for Windows is suboptimal: you will find that some things don't work (which do work on Linux) and others are much slower than normal.
Mono is not 100% compatible with MS.NET: Some things are unsupported in Mono (e.g. System.Management) or have too many bugs to be considered stable (e.g. WCF). So it's better that you test on Mono as soon as possible, i.e. while developing and debugging locally.
MonoDevelop is still a very good IDE which can compete with VS in some areas (e.g. Code Completion).
Well, I myself use Mono on Linux/ARM and I do all my development under Visual Studio, just compiling for AnyCPU and taking a bit of care about what I use.
You can even debug your program on the target machine from Visual Studio using MonoDebugger; it is starting to work decently.
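One concrete reading of "taking a bit of care": keep the shared code free of Windows-only assumptions, and detect Mono at runtime where behaviour differs. A minimal sketch (the Mono.Runtime type check is a common community idiom, not an official API):

```csharp
using System;
using System.IO;

static class PlatformInfo
{
    // Common idiom: the Mono.Runtime type only exists when running on Mono.
    public static bool IsMono
    {
        get { return Type.GetType("Mono.Runtime") != null; }
    }
}

class Program
{
    static void Main()
    {
        // Use Path.Combine instead of a hard-coded "data\\settings.ini" so the
        // same AnyCPU binary behaves correctly on Windows and on Linux/ARM.
        string settingsPath = Path.Combine("data", "settings.ini");
        Console.WriteLine("Running on Mono: {0}, settings at: {1}",
                          PlatformInfo.IsMono, settingsPath);
    }
}
```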

DirectXMath and Win8 SDK in VS2010 project

I've been working on an engine in Visual Studio 2012 that supports rendering with Direct3D 9 and Direct3D 11. However, I'm getting some new people to help with the project, and they would prefer to work in Visual Studio 2010 because that's the version they own and use. So I decided to convert the project to be built with VS2010's v100 platform toolset.
I'm getting close to getting it to work but I can't include DirectXMath.h, necessary for the DirectXTK and some utility functions I'm using. It's part of the Windows 8 SDK and included in VS2012, but VS2010 doesn't seem to find it.
Does anyone know how to get it included using environment variables so that it works for everybody on the team, and in a way that works on Win7 too?
Thanks.
To let new teammates code in VS2010, you have several options:
You don't need to change the platform toolset to the old one and rewrite your codebase. VS2010 developers can just install the Windows 8 SDK and use the v110 toolset. To help them, configure "VC++ Directories" in the project properties as described in this article (change the macro variables that point to the old Windows SDK to explicit locations of the new Windows SDK):
In “Executable Directories” replace $(WindowsSdkDir)bin with $(ProgramFiles)\Windows Kits\8.0\bin\x86
In “Include Directories” add $(ProgramFiles)\Windows Kits\8.0\Include\um;$(ProgramFiles)\Windows Kits\8.0\Include\shared at the beginning and remove $(WindowsSdkDir)include
In “Library Directories” replace $(WindowsSdkDir)lib with $(ProgramFiles)\Windows Kits\8.0\lib\win8\um\x86
In “Exclude Directories” replace $(WindowsSdkDir)include with $(ProgramFiles)\Windows Kits\8.0\Include\um;$(ProgramFiles)\Windows Kits\8.0\Include\shared
When targeting x64, replace x86 with x64
If you really want to downgrade the toolset from v110 to v100, then you will need to use the old standalone DirectX SDK. Previously, the Windows SDK and the DirectX SDK were separate; they were merged in the Windows 8 SDK. When merging, Microsoft decided to remove some things and also renamed some files; for example, the standalone SDK ships the math library as #include <xnamath.h> rather than DirectXMath.h.
You can combine both: create multiple project/platform configurations and implement conditional compilation via #ifdef where the VS2010 configuration would fail to find headers or to compile. For example, you can use C++11 features in the VS2012 branch of the code, but only C++03 features in the VS2010 branch.
I would prefer the first option, but it is up to you to decide.
P.S. As far as I remember, project files from VS2012 (.vcxproj) cannot be opened in VS2010 (it knows only .vcproj), so you cannot share them. You will probably want to install VS2010, create a .vcproj, and maintain both files. It can be painful when you change project options in one and forget to change them in the other, so be careful. Also, consider moving your whole team to a single IDE, or at least a single build system (for example, CMake).
Happy coding!

Can Windows Store apps be compiled as x86 instead of AnyCPU?

I have some .NET code that I am looking into porting into being a Windows Store app.
This code does a few different things and one of the things it does has a dependency on being compiled as x86 instead of AnyCPU.
Is this going to be a problem? Can a Metro app be compiled as x86 and still be distributed on the Windows Store? Is being compiled as x86 going to stop it from being able to run on Windows RT? Would I have to come up with a version without this subset of functionality to run on Windows RT? If I can get the code into its own assembly, can I just have the Windows RT version not use it? (So the main executable is AnyCPU and this one assembly is x86.)
Can a Metro app be compiled as x86 and still be distributed on the Windows Store?
Yes. However, it would only be installable on x86 installations of Windows.
Is being compiled as x86 going to stop it from being able to run in Windows RT?
Since Windows RT is designed to run only on ARM CPUs, compiling for x86 will stop it from being able to run on Windows RT.
Would I have to come up with a version without this subset of functionality to run in Windows RT? If I can get the code into its own assembly can I just have the Windows RT version not use it?
You could use conditional compilation symbols to include/exclude functionality as required in your code. https://stackoverflow.com/a/6587823/61385 shows an example of how to do this.
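A minimal sketch of that idea, assuming a custom X86_ONLY_FEATURES symbol that you define only in the x86 build configuration (Project Properties → Build → Conditional compilation symbols); the symbol and type names here are illustrative, not from the linked answer:

```csharp
public static class FeatureFlags
{
#if X86_ONLY_FEATURES
    // x86 configuration: the code paths that need the x86-only assembly compile in.
    public const bool HasLegacyImporter = true;
#else
    // AnyCPU/ARM configurations: those code paths are compiled out.
    public const bool HasLegacyImporter = false;
#endif
}

public class StartupConfig
{
    public void ConfigureFeatures()
    {
        if (FeatureFlags.HasLegacyImporter)
        {
            // Enable the UI and logic that rely on the x86-only assembly.
        }
        // Otherwise the feature simply isn't offered on Windows RT.
    }
}
```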
Just compile whatever libs you need, and when you upload to the Store, upload only the packages you want. Check your AppPackages folder and look for the .appxupload files.

Windows 8 Metro App Side Load Deployment

I am currently developing a Windows Store App that will eventually be targeted at the ARM devices when they are available. For now, I have been developing and testing from Visual Studio on my desktop computer and everything works fine. However, when I try to create an app package that I can pass along to others within my company for testing purposes, the application will not run properly.
The solution includes two projects. The first is a C++ project that is set to build a DLL. Its purpose is to expose the Direct2D and DirectWrite libraries, which seem to be inaccessible from a C# project. The second project is the C# project that references this DLL for drawing functions and includes a XAML interface and most of the program logic. All of this works flawlessly on my development machine from within Visual Studio (and also when installing the package).
When I send the package files to other individuals within the company, the installation appears to work fine via the PowerShell script. The tile appears on the start screen and the program launches for a few seconds. The C# and XAML interface appears, but the DirectX portion of the application is not visible, and the entire application shuts down within a few seconds. This makes me believe that the DLL may not be installed or referenced correctly. I have checked the package file, and the DLL is included in the package after the build process completes.
I have packaged a few different test programs (MSDN samples) that have all installed on their machines, but we get the same result: they will not run (again, all samples run fine on my development machine when building them). The only test project that worked properly was a simple C# project that did not use DirectX at all. All of the DirectX samples I tried failed (including the native C++ samples that do not use C# at all).
To be clear, the process I use for building is going to Project -> Store -> Create App Packages and choosing the No option for uploading to the Windows Store.
Does anyone have any ideas on what might be going wrong with the build or installation process?
Thanks in advance for any help!
Does it work with the Metro Sideloader? I am not sure if it just adds a UI to the PowerShell script, but it works for my team and me for testing...
Good luck!
Are you side loading a Debug version of your DirectX app onto a machine that does not have the Windows SDK installed? Visual Studio's default DirectX projects and the samples on MSDN both request the D3D11_CREATE_DEVICE_DEBUG flag when creating the D3D Device. Device creation will fail if the Windows SDK is not installed on the machine running the code.
Here are a few different options that will allow you to unblock yourself. Any one of these should give you the desired result:
Create a Release package and deploy that instead of a Debug package.
or - Go to DirectXBase.cpp and remove the D3D11_CREATE_DEVICE_DEBUG flag from the code.
or - Install the Remote Debugging tools for Visual Studio on the target machines. This will install the necessary SDK components to allow creation of D3D Debug devices. The other cool thing about this option is that once you're set up you won't have to create packages manually and side load them anymore. Just tell Visual Studio the name of your ARM machine and press F5 to deploy it remotely. More information here: http://msdn.microsoft.com/en-us/library/vstudio/bt727f1t.aspx
How are you deploying the native DLL with your project? Are you using project-to-project references? Can you verify that your DLL is ending up in the final package, in the root of the package application directory?
I recommend using Sysinternals Procmon to watch your application load on the target machine. If it crashes or fails, you can look in the log history for which DLL it is trying to load and failing. Typically this will show up as a repeated series of DLL load probes (it will try and load the dll from the application directory, and then proceed to try a number of other paths).

MonoDevelop: same project for MonoMac and GTK# possible?

Perhaps my question is totally naive, and this is the reason why I couldn't find any information with Google or anywhere else - but nonetheless, I think it is worth asking here.
I want to develop a C# application which behaves naturally in Mac and Windows (Linux would also be nice, but is not directly needed). My main operating system for development should be Mac OS X and therefore I want to go with MonoDevelop.
I can setup a project for MonoMac - works fine.
I can setup a different project for GTK# - works fine.
My question now is: what do I have to do to get a project with both a MonoMac and a GTK# frontend? I will go with the MVC pattern and want to work in one project. As a result, building my project would produce a Mac executable (based on the MonoMac stuff) and a Windows executable (based on GTK#).
Am I completely wrong with my approach?
What do I have to do to achieve my goal?
Yes, for a multi-platform app with the best possible look-n-feel on each platform, you would need one executable per platform. Using an MVC approach is the best way to do this - you can have a solution containing a library project with all the shared code - models, processing code, business logic, etc - and a project for each "frontend" executable containing the platform-specific views and shell.
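As a rough sketch of that split (all type names below are invented for illustration), the shared library holds the models and controllers, and each frontend project implements a small view interface with its own toolkit:

```csharp
// --- Shared library project (referenced by every frontend) ---
public class Document
{
    public string Text { get; set; }
}

// Each platform frontend provides its own implementation of this view.
public interface IMainView
{
    void ShowDocument(Document doc);
}

// Business logic lives here, with no reference to GTK#, MonoMac, or WPF.
public class MainController
{
    private readonly IMainView view;

    public MainController(IMainView view)
    {
        this.view = view;
    }

    public void Open(string text)
    {
        view.ShowDocument(new Document { Text = text });
    }
}

// --- GTK# frontend project: an IMainView built from Gtk widgets ---
// --- MonoMac frontend project: an IMainView built from NSViews/NSWindow ---
```

Each frontend executable then just constructs its platform-specific view and hands it to MainController, so the shared project never depends on a particular UI toolkit.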
If a really good native experience on Windows is higher priority than Linux support, I'd recommend using WPF or Windows Forms instead of GTK#. This would mean you'd have to split development between Windows and MacOS - you would need to open the same project in Visual Studio, SharpDevelop or MonoDevelop on Windows, and edit the WPF/WinForms project and the shared library there.
OTOH, GTK# has the advantage you could start off writing a single frontend that would work on all three platforms, and then write the platform-specific ones afterwards.