I am trying to make a sample application (for Mac) using the PLCrashReporter framework. The application works fine on my system, but it crashes on other systems because they don't have that framework. Please let me know how to ship the framework to other systems with our application.
For this you need to go to Build Phases, add a Copy Files phase with its destination set to Frameworks, and drag the framework from your Xcode project into that Copy Files phase, removing it from Link Binary With Libraries. Now go ahead and make your build; it is ready to distribute.
I am currently developing a Windows Store App that will eventually be targeted at the ARM devices when they are available. For now, I have been developing and testing from Visual Studio on my desktop computer and everything works fine. However, when I try to create an app package that I can pass along to others within my company for testing purposes, the application will not run properly.
The solution includes two projects. The first is a C++ project that is set to build a dll file; its purpose is to expose the Direct2D and DirectWrite libraries that seem to be inaccessible from a C# project. The second project is the C# project that references this dll for drawing functions and includes a XAML interface and most of the program logic. All of this works flawlessly on my development machine from within Visual Studio (and also when installing the package).
When I send the package files to other individuals within the company, installing with the PowerShell script appears to work fine. The tile appears in the start screen and the program will launch for a few seconds. The C# and XAML interface appears, but the DirectX portion of the application is not visible, and the entire application shuts down within a few seconds. This makes me believe that the dll may not be installed or referenced correctly upon installation. I have checked the package file, and the dll file is included in the package after the build process is complete.
I have packaged a few different test programs (MSDN samples) that have all installed on their machines, but we get the same result: they will not run (again, all samples run fine on my development machine when building them). The only test project that worked properly was a simple C# project that did not use DirectX at all. Any of the DirectX samples that I tried have all failed (including the native C++ samples that do not use C# at all).
To be clear, the process I use for building is going to Project -> Store -> Create App Packages and choosing the No option for uploading to the Windows Store.
Does anyone have any ideas on what might be going wrong with the build or installation process?
Thanks in advance for any help!
Does it work with the Metro Sideloader? I am not sure if it just adds a UI to the PowerShell script, but it works for my team and me for testing...
Good luck!
Are you side loading a Debug version of your DirectX app onto a machine that does not have the Windows SDK installed? Visual Studio's default DirectX projects and the samples on MSDN both request the D3D11_CREATE_DEVICE_DEBUG flag when creating the D3D Device. Device creation will fail if the Windows SDK is not installed on the machine running the code.
Here are a few different options that will allow you to unblock yourself. Any one of these should give you the desired result:
Create a Release package and deploy that instead of a Debug package.
or - Go to DirectXBase.cpp and remove the D3D11_CREATE_DEVICE_DEBUG flag from the code.
or - Install the Remote Debugging tools for Visual Studio on the target machines. This will install the necessary SDK components to allow creation of D3D Debug devices. The other cool thing about this option is that once you're set up you won't have to create packages manually and side load them anymore. Just tell Visual Studio the name of your ARM machine and press F5 to deploy it remotely. More information here: http://msdn.microsoft.com/en-us/library/vstudio/bt727f1t.aspx
How are you deploying the native DLL with your project? Are you using project-to-project references? Can you verify that your DLL is ending up in the final package, in the root of the package application directory?
I recommend using Sysinternals Procmon to watch your application load on the target machine. If it crashes or fails, you can look in the log history for which DLL it is trying to load and failing. Typically this will show up as a repeated series of DLL load probes (it will try and load the dll from the application directory, and then proceed to try a number of other paths).
Perhaps my question is totally naive and that is the reason why I couldn't find any information with Google or anywhere else - but nonetheless, I think it is worth asking here.
I want to develop a C# application which behaves naturally in Mac and Windows (Linux would also be nice, but is not directly needed). My main operating system for development should be Mac OS X and therefore I want to go with MonoDevelop.
I can setup a project for MonoMac - works fine.
I can setup a different project for GTK# - works fine.
My question now is what I have to do to get a single project that can produce both a MonoMac and a GTK# frontend. I will go with the MVC pattern and want to work in one project, so that building it results in a Mac executable (based on the MonoMac stuff) and a Windows executable (based on GTK#).
Am I completely wrong with my approach?
What do I have to do to achieve my goal?
Yes, for a multi-platform app with the best possible look-n-feel on each platform, you would need one executable per platform. Using an MVC approach is the best way to do this - you can have a solution containing a library project with all the shared code - models, processing code, business logic, etc - and a project for each "frontend" executable containing the platform-specific views and shell.
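To make that concrete, here is a minimal sketch of what the shared library project might contain (all type and member names here are hypothetical, not taken from an existing codebase):

    // Shared library project: referenced by every frontend, with no UI framework references.
    namespace MyApp.Core
    {
        // A plain model class shared by all platforms.
        public class Invoice
        {
            public string Customer { get; set; }
            public decimal Total { get; set; }
        }

        // The controller talks to the UI only through this interface,
        // so it never needs to know whether MonoMac, GTK# or WPF sits behind it.
        public interface IInvoiceView
        {
            void ShowInvoice(Invoice invoice);
        }

        public class InvoiceController
        {
            readonly IInvoiceView view;

            public InvoiceController(IInvoiceView view)
            {
                this.view = view;
            }

            public void Load()
            {
                // Business logic and data access live here, unchanged across platforms.
                view.ShowInvoice(new Invoice { Customer = "ACME", Total = 42.00m });
            }
        }
    }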
If a really good native experience on Windows is higher priority than Linux support, I'd recommend using WPF or Windows Forms instead of GTK#. This would mean you'd have to split development between Windows and MacOS - you would need to open the same project in Visual Studio, SharpDevelop or MonoDevelop on Windows, and edit the WPF/WinForms project and the shared library there.
OTOH, GTK# has the advantage you could start off writing a single frontend that would work on all three platforms, and then write the platform-specific ones afterwards.
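As a rough illustration of the frontend side (again with hypothetical names, building on the sketch above), a GTK# executable project would only implement the view interface and wire up the controller:

    // GTK# frontend project: implements the shared view contract with GTK widgets.
    using Gtk;
    using MyApp.Core;

    public class InvoiceWindow : Window, IInvoiceView
    {
        readonly Label summary = new Label();

        public InvoiceWindow() : base("Invoices")
        {
            Add(summary);
            SetDefaultSize(300, 100);
            DeleteEvent += (o, args) => Application.Quit();
        }

        public void ShowInvoice(Invoice invoice)
        {
            summary.Text = string.Format("{0}: {1:C}", invoice.Customer, invoice.Total);
        }

        public static void Main()
        {
            Application.Init();
            var window = new InvoiceWindow();
            new InvoiceController(window).Load();
            window.ShowAll();
            Application.Run();
        }
    }

The MonoMac (or WPF/WinForms) frontend would be a second, equally thin project that implements the same interface with its own widgets.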
I'm using MonoTouch (MonoDevelop 2.6) to develop an iPhone app. I've created an iPhone window-based project, a MonoTouch library project and an NUnit project. I'd like to add a reference from my NUnit project to my library project so that I can write some unit tests against my UI-agnostic code.
Sadly, the library and UI projects use the MonoForiPhone runtime and can't be added as references to the NUnit project (which uses the Mono/3.5 target framework). The projects are greyed out under Edit References with the message "Incompatible framework...".
Likewise, if I create a regular .NET library for my business logic, the UI cannot reference the project.
How can I create unit tests against my iPhone application?
Add a MonoTouch library project which contains the non-UI code.
Add an NUnit project.
Add the files to the NUnit project (by creating links).
Now I can run at least one of the tests.
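For example, a test in the NUnit project against one of the linked, UI-agnostic files might look like this (Calculator is just a placeholder for whatever class your library actually contains):

    using NUnit.Framework;

    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void Add_ReturnsSum()
        {
            // Calculator.cs is linked into this project from the MonoTouch library,
            // so the same source file is compiled here against the Mono/3.5 profile.
            var calculator = new Calculator();
            Assert.AreEqual(5, calculator.Add(2, 3));
        }
    }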
What I would suggest is to create two projects, one to target MonoTouch and one which targets .NET 3.5/4/whatever for NUnit testing purposes.
In case you missed it, there is now a NUnitLite runner available for MonoTouch which is designed to work for UI agnostic code and executed on devices (or simulator).
See: .NET Unit test runner for iOS
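If you use that runner, the wiring usually lives in the test app's AppDelegate, roughly like this (adapted from the Touch.Unit sample code; exact namespaces and member names may differ between versions, so treat this as a sketch):

    using System.Reflection;
    using MonoTouch.Foundation;
    using MonoTouch.UIKit;
    using MonoTouch.NUnit.UI;

    [Register("AppDelegate")]
    public partial class AppDelegate : UIApplicationDelegate
    {
        UIWindow window;
        TouchRunner runner;

        public override bool FinishedLaunching(UIApplication app, NSDictionary options)
        {
            window = new UIWindow(UIScreen.MainScreen.Bounds);

            // The runner discovers [TestFixture] classes in the assemblies you add
            // and presents them in a navigable list on the device or simulator.
            runner = new TouchRunner(window);
            runner.Add(Assembly.GetExecutingAssembly());

            window.RootViewController = new UINavigationController(runner.GetViewController());
            window.MakeKeyAndVisible();
            return true;
        }
    }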
I'm using the MSTest system for unit testing my compact framework (3.5) application and DLLs. When I test some DLLs it just runs but for some it loads the emulator first. Can anyone tell me what determines whether the emulator is launched?
The testrunconfig file tells mstest which platform to deploy the tests to. However, if you have the configuration set to both build and deploy all of the DLLs, then the DLLs will attempt to deploy to their default target, not the target from the testrunconfig (yes, it's stupid and confusing).
The general rules I follow are:
Go through each project and set the target device to the same thing.
Use the Configuration Manager, and set it to deploy only those items that won't already be deployed as dependencies
Set the testrunconfig to match the target device from above
I have two Xcode projects: a framework and a client application.
My application depends on my framework and everything works fine with that — the framework is recompiled every time the app is, the projects' build paths are set correctly, it's completely okay.
Now the framework has started using a third-party dylib file, and it's linked against that dylib.
I've even added a build phase to copy that library into the framework's resources dir.
When I try to run the application, everything compiles correctly, then I get this:
dyld: Library not loaded: /usr/local/lib/libplplot.9.dylib
Referenced from: /Users/railsmaniac/Projects/Study/Calculus of approximations/Builds/Debug/XNMaths.framework/Versions/A/XNMaths
Reason: image not found
How can I fix it?
Adding the library into client application's resources doesn't fix the problem.
I can just place the library into the required location, but I prefer to keep it IN the framework.
Is it possible?
It looks like your application is expecting the library to be found at a specific path on the system. If you are on OS 10.5+ you can use the new @rpath functionality to allow your application to link dynamically to your library.
See this post for further details. It also shows the "old" way of doing this.