I have an ASP.NET web application in C# (no MVC or WebForms) compiled in Release mode for AnyCPU; PDB files are enabled and deployed with the application.
When Enable 32-Bit Applications on the AppPool has its default value of False, exception stack traces have correct line numbers.
When the flag is set to True, the stack traces have incorrect line numbers.
Just to make it clear: the only thing I change is the value of the Enable 32-Bit Applications flag in the AppPool configuration of my web application.
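(For completeness, the same switch can also be toggled programmatically. Below is a minimal sketch using the Microsoft.Web.Administration API, assuming that assembly is referenced and with "MyAppPool" standing in for the real pool name; it has to run elevated.)

```csharp
using System;
using Microsoft.Web.Administration;   // reference Microsoft.Web.Administration.dll

class AppPoolToggle
{
    static void Main()
    {
        using (ServerManager iis = new ServerManager())
        {
            ApplicationPool pool = iis.ApplicationPools["MyAppPool"];
            pool.Enable32BitAppOnWin64 = true;   // false = default 64-bit worker process
            iis.CommitChanges();                 // applies the change (the pool recycles)
        }
    }
}
```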
I have tried this on two machines:
Windows 8 with IIS 8.5.9600.16384
Windows Server 2008 R2 with IIS 7.5.7600.16385
In my particular case it is OK to just reconfigure the AppPool (we have already migrated from x86 to AnyCPU and this obsolete configuration is just a mistake), but I am still interested in why this happens. (Maybe there is some bug in IIS; I was not able to find this behavior mentioned anywhere.)
Update: it seems I have figured it out, but this is a temporary reprieve:
The problem is almost certainly due to code optimization (I have written the code in a way that rules out other options: the jitter reorders functions. It is not the compiler, because I do not recompile the application between tests).
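Something along these lines (not my actual application code) is enough to see the symptom: build it once in Release, run the same binary as a 32-bit and as a 64-bit process, and compare the line numbers reported in the stack trace.

```csharp
using System;

class LineNumberRepro
{
    static void Main()
    {
        try
        {
            Outer();
        }
        catch (Exception ex)
        {
            // Compare this output between x86 and x64 runs of the same binary.
            Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
            Console.WriteLine(ex.StackTrace);
        }
    }

    // Small methods are attractive inlining/reordering targets for the jitter.
    static void Outer()
    {
        int x = 1 + 2;      // filler line so a shift in reported line numbers is visible
        Inner(x);
    }

    static void Inner(int x)
    {
        throw new InvalidOperationException("boom " + x);
    }
}
```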
Most of the optimization is done by the jitter, and the x86 optimizations are more aggressive than the x64 optimizations, hence the difference in the resulting code. If Microsoft decides to make the x64 optimizations more aggressive, the line numbers will break there as well.
So the answer seems to be:
There are basically two steps of optimization in C#: the compiler (csc.exe, when C# code is translated into IL) and the jitter (when IL is translated into machine code). The jitter does most of the optimizations (article). There is also a great post by Eric Lippert about which optimizations you might expect.
x86 and x64 jitters do different optimizations (CLR via C# Fourth Edition by Jeffrey Richter, Part V Threading, section Volatile Constructs, page 764)
Thus you can get correct line numbers under x64 (because its jitter does not optimize the code as aggressively) and incorrect ones under x86 (whose jitter is more mature).
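For reference, the usual way to point at that second stage is MethodImplAttribute, which tells the jitter to leave a specific method alone; whether that restores accurate line numbers in this particular scenario is not verified here.

```csharp
using System.Runtime.CompilerServices;

class JitterHintExample
{
    // Asks the jitter not to inline or optimize this one method. This only
    // illustrates where the JIT optimization stage applies; it is not a
    // confirmed fix for the wrong line numbers described above.
    [MethodImpl(MethodImplOptions.NoInlining | MethodImplOptions.NoOptimization)]
    static void MethodWithAccurateFrames()
    {
        // ... code whose stack frames you want to keep recognizable ...
    }
}
```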
Summary: I have not found a way to work around it.
Related
I have a project done in VB.NET and I want to publish it for distribution. I know that when I build the solution it creates an .exe, but that requires local resources. If I build for Release I know it works, but it still needs the .NET platform installed. Is there any way to make a true standalone .exe, or something that would run on a person's computer even if they do not have .NET installed? Also, a ClickOnce application is not a wanted solution.
Is there any converter program that can do this for me?
The .NET Framework is the basic prerequisite to run a .NET program, just as having Windows installed is the prerequisite to run a Windows program. Bear in mind that any recent Windows version includes the .NET Framework (and, actually, it tends to be delivered as a top-priority update and thus is installed automatically by Windows Update in many cases). A big proportion of the programs written for Windows during the last 10 years are built on the .NET Framework; a relevant proportion of web sites (like this one, for instance) are built on ASP.NET, and thus the given server has to include the .NET Framework.

If overall compatibility is a concern for you, you might rely on a somewhat older .NET version: the latest one available in VS 2010 (4.0) should be fine for most modern computers, but you can even rely on the previous one (3.5) to be completely sure.

Lastly, bear in mind that a .NET program can also be run under an OS other than Windows (Linux or MacOS, for example), although, from my past experience, these are not too reliable situations. Nonetheless, in case you are interested in other OSes, you should do some research/testing on this front to see if the available options offer what you are looking for.
SUMMARY: the .exe file generated by Visual Studio is actually what you call a "standalone .exe". One of its defining features is the .NET version (which can be changed in the project settings); a program can only be run on computers with a .NET Framework (or equivalent) equal to or newer than the one it was built against. The 4.0 version should be OK for most new/properly-updated computers; the 3.5 version would work with virtually any computer (although, logically, it includes fewer features than 4.0).
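If you want to check which .NET Framework is present on a target machine, a small sketch like the one below (reading the documented registry keys; an installer would normally do this in its bootstrapper instead) can help. The exact values you test for depend on the version you target.

```csharp
using System;
using Microsoft.Win32;

class NetFxCheck
{
    static void Main()
    {
        // The v4\Full key exists once .NET 4.0 (full profile) or later is installed;
        // the "Release" DWORD is only written by 4.5 and newer versions.
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
        {
            if (key == null)
            {
                Console.WriteLine(".NET Framework 4.x not found (3.5 or older may still be present).");
            }
            else
            {
                Console.WriteLine("Version: " + key.GetValue("Version"));
                Console.WriteLine("Release: " + (key.GetValue("Release") ?? "n/a (pre-4.5)"));
            }
        }
    }
}
```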
---------------------------- UPDATE AFTER COMMENTS --------------------
From some comments, I have understood that my statement wasn't as clear as I thought, and this is the reason for this update.
.NET is pre-installed in Windows only since Vista; XP does not include the .NET runtime by default. The reason for not mentioning this issue in my answer was that a Windows XP machine without .NET is highly unlikely. Firstly, because this is a top-priority, automatic update, and thus one of the first times the computer is connected to the internet, Windows Update will take care of it. And secondly, because this is the basic framework for any Microsoft programming over the last 10 years, and thus a Windows computer without it will barely be able to run anything. With this last sentence I don't mean that most programs are built on .NET, but that for a Windows-based environment most of today's basic requirements do include .NET.
It was also pointed out that there are some compatibility problems between different .NET versions (that various side-by-side versions are required). The basic Microsoft approach to the different .NET versions is backwards compatibility, which means that a given .NET version can run any program built with that version or an older one. This is theoretically right, but not always right in practice. My approach to this problem is relying on a somewhat older .NET version (3.5) and not using too-new/untested features (e.g., WPF). If you want a for-sure, overall-compatible program you should work quite a lot on this front (compatibility between versions is one of the most typical problems of any programming platform), instead of expecting Microsoft to take care of everything. Thus, in principle, just one .NET version (the latest one) has to be installed (which, on the other hand, is not the case for a big proportion of computers; for example: a computer including the 3.5 version being updated, over the years, to 4.0 and 4.5 while keeping the previous versions).
Lastly, I want to highlight that my intention with this answer is not to defend any programming approach over another; I am just describing what is there from the point of view of your question ("can I remove the .NET part?" -> no, you cannot; there is no sensible way to do that). If you want to rely on a different programming platform you should get informed about it (I am sure that Camilo Martin will be more than happy to help you on this front). If you prefer to rely on .NET, be sure that you can generate an overall-compatible program (to be run on Windows).
An easy way to create an .exe in VB.NET 2010:
Create a new project
Select Windows Application and save it to a proper path
Complete the project, then select File -> Save All
Select Build -> Start Build
Your project's .exe is created under your project's save path:
Select WindowsApplication1
Select the bin folder
Select the Debug folder
And in the Debug folder your .exe file is ready.
Quick Basic once made an executable (.exe) directly from its code, but I wouldn't recommend converting to Quick Basic. You can look at Mono to see if they have anything yet (Mono allows you to use compiled VB.NET on other operating systems).
Ezirit Reactor makes a single executable, but it's not free.
You can bundle the .NET Framework into your distribution so that users don't have to download it.
Why do you need an executable (.exe)? If the reason is for security and to minimize chances of reverse engineering, then get a good obfuscator.
I've just been working on migrating a staging web site from IIS6 to IIS8.
IIS8 comes with an option, Enable 32-Bit Applications, which is a true/false flag. The explanation of this flag is:
[enable32BitAppOnWin64] If set to True for an application pool on a
64-bit operating system, the worker process(es) serving the
application pool run in WOW64 (Windows on Windows64) mode. In WOW64
mode, 32-bit processes load only 32-bit applications.
Now if I set this to False my web site stops serving and I get a 500, with the following error message logged:
ISAPI Filter
'C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_filter.dll'
could not be loaded due to a configuration problem. The current
configuration only supports loading images built for a AMD64 processor
architecture. The data field contains the error number. To learn more
about this issue, including how to troubleshooting this kind of
processor architecture mismatch error, see
Now I guessed that there must be an assembly with the x86 flag set, so I followed the instructions from this post and used CorFlags to check. But they all return Any CPU, i.e.
Version : v4.0.30319
CLR Header : 2.5
PE : PE32
CorFlags : 9
ILONLY : 1
32BIT : 0
Signed : 0
There are slight variations, but that's the gist.
So why do I need to set Enable 32-Bit Applications to True?
So I've done some more investigation using Process Explorer (this question helped), and it appears that, with Enable 32-Bit Applications set to False, even though CorFlags says the assemblies do not require 32-bit, several of the DLLs do have an image type of 32-bit.
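A quick way to double-check what CorFlags reports, without leaving managed code, is to ask each assembly for its PE kind. This is only a sketch and the bin path is a placeholder:

```csharp
using System;
using System.IO;
using System.Reflection;

class BitnessScan
{
    static void Main()
    {
        string bin = @"C:\inetpub\wwwroot\MySite\bin";   // placeholder path
        foreach (string file in Directory.GetFiles(bin, "*.dll"))
        {
            try
            {
                Assembly asm = Assembly.ReflectionOnlyLoadFrom(file);
                PortableExecutableKinds peKind;
                ImageFileMachine machine;
                asm.ManifestModule.GetPEKind(out peKind, out machine);
                // ILOnly + I386 is what CorFlags reports as "Any CPU".
                Console.WriteLine("{0}: {1} / {2}", Path.GetFileName(file), peKind, machine);
            }
            catch (BadImageFormatException)
            {
                // Unmanaged DLLs land here; their bitness has to be read from the
                // PE header instead (dumpbin /headers, or Process Explorer as above).
                Console.WriteLine("{0}: not a managed assembly", Path.GetFileName(file));
            }
        }
    }
}
```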
I believe I've gotten to the bottom of this, eventually!
So it appears that this machine is missing some of the x64 configuration. In particular, the "ISAPI Filters" configuration contained the standard .NET 4 aspnet_filter.dll (C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_filter.dll) but not the x64 version (C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_filter.dll).
From talking with our infrastructure guys, they suggested the best way to get this set up correctly is to "uninstall the .Net 4.0 feature and re-install it". Bear in mind this requires a reboot!
** post was edited, more info below
I've just watched two great videos about Advanced .NET Debugging (by Brian Rasmussen) and I am trying to repeat some steps, but I just don't know how to proceed with this error:
An attempt to set a processes DebugPort or ExceptionPort was made,
but a port already exists in the process.
I've found some answers on Google and I generally understand what the error says, but I just don't understand one weird fact: when I compile my simple app targeting < .NET 4.0, I can attach just as the video shows; trying to do the same after I compile targeting .NET 4.0 stops me from attaching.
One of Google's answers says "try to attach from WinDbg using noninvasive mode", but Brian does not use any such checkbox. It just works in his videos.
What's the difference? Where's the catch? Is it Windows 7 vs Vista? Maybe different compile settings matter?
I am using VS 2010 with the Microsoft SDK, WinDbg x86 downloaded from MSDN, and symbols correctly configured to the HTTP symbol server. The system is Windows Vista x86.
Resources (exact time >= 8:15):
http://channel9.msdn.com/posts/MDCC-TechTalk-Advanced-NET-Debugging-part-2
Edit:
The error shows when attaching to a process that was started from VS. When attaching to a process that was started outside VS, WinDbg doesn't show any content.
Edit2:
WinDbg had some refresh problems on my system. By using the "Windows \ [Undock | Dock all]" menu option a few times I was able to see the missing content of the attached process.
So the only question now is: what's the difference when attaching to a process started from VS, compiled once targeting < 4.0 and once targeting 4.0? Why can WinDbg not attach to the 4.0 process other than in "noninvasive" mode? What has changed in VS 2010?
I take it you're debugging from Visual Studio (F5) and then trying to attach. You can only have one debugger attached to a process at a time, so that is why you get this error. If you want to launch the process from VS, run it without debugging (Ctrl+F5). If you do that, you should be able to attach from WinDbg.
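If you want to confirm whether a debugger is already attached before pointing WinDbg at the process, a small check along these lines works (the PID is whatever process you are about to attach to):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class DebuggerCheck
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool CheckRemoteDebuggerPresent(IntPtr hProcess, ref bool isDebuggerPresent);

    // Returns true if some debugger (Visual Studio, WinDbg, ...) is already
    // attached to the process with the given id.
    static bool HasDebuggerAttached(int pid)
    {
        bool attached = false;
        using (Process p = Process.GetProcessById(pid))
        {
            CheckRemoteDebuggerPresent(p.Handle, ref attached);
        }
        return attached;
    }
}
```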
EDIT: I am sorry, I missed the point about various versions of .NET behaving differently in this respect, so let me try to address your questions again. The reason it "just works" in the video is that I use "run without debugging" every time I launch from VS. So if you simply want to follow the examples in the videos, all you need to do is run without debugging.
I started using WinDbg/SOS on CLR 2 and x86. Launching an x86 .NET process from VS back then would trigger the error, so I made a habit of just launching without debugging.
However, as you have discovered, there are scenarios where you can actually attach to a process that is being debugged by VS. I can reproduce the scenarios you describe, but I can also attach to an x64 .NET 2 process started with debugging from VS2008, yet I cannot attach to the same process if the platform is set to x86.
Apparently there are subtle differences that I haven't been aware of, and it doesn't seem to be related exclusively to the .NET version, as I can attach to an x64 .NET 2 process even while it is under the control of the VS debugger.
I'll update my answer if I find additional details.
I have a C#/WCF application (hosted in a Windows service) which was deployed and tested on a 32-bit Windows server. Now I need to deploy it to production. My network team suggested deploying it on a 64-bit Windows Server to take full advantage of the server's capabilities.
My questions:
Is there any performance gain in deploying an application on a 64-bit OS? If yes, how much?
Do I need to do anything special to make my application 64-bit OS compatible? If yes, what?
P.S. My application is compiled with the "Any CPU" option (does it matter?).
There are blogs full of information out there on this. A quick Bing will bring up thousands of talking points: http://www.bing.com/search?q=x64+vs+x86+server&src=IE-SearchBox&FORM=IE8SRC But, to be brief:
Is there any performance gain in deploying an application on a 64-bit OS? If yes, how much?
The most noticeable benefit is memory utilization: your service/app, and all of the server's other services/applications, have more room to play. This is only true if you have 4 GB or more of RAM; with less than that, 64-bit actually wastes memory, because the pointers and references in every allocation are twice the size.
The benefit at the raw-performance level is that each CPU instruction can operate on up to 64 bits of information instead of 32 bits: double the information. This is significantly more noticeable with multi-threaded applications, e.g. your WCF service hosted in IIS, which is multi-threaded for incoming requests. :)
Do I need to do anything special to make my application 64-bit OS compatible? If yes, what?
Short answer: nothing whatsoever. :) And that's the benefit of .NET when you compile with the default "Any CPU" option!
When you compile code into assemblies, you are compiling the code into Intermediate Language (IL), not actual machine code. The .NET CLR (Common Language Runtime) installed on the specific server/workstation/device you are deploying to is what takes your IL code and compiles it into native instructions for that specific platform: x86, x64, or IA-64 (or AMD64, ARM, etc., where supported). You do not have to do anything!
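A tiny probe like this (not required for deployment, just informative) shows what an "Any CPU" build actually resolved to at run time:

```csharp
using System;

class PlatformProbe
{
    static void Main()
    {
        // The same "Any CPU" binary prints different answers on x86 and x64 machines.
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("Pointer size:   " + IntPtr.Size + " bytes");
    }
}
```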
As for coding practices, there is nothing specific to do either.
Referencing 3rd Party Native Assemblies?
Now, the only concern is if you are referencing any 3rd-party assemblies through COM or the like that are compiled to native code (i.e. 3rd-party components written in unmanaged languages). It becomes tricky to reference a 32-bit native assembly from the CLR on an x64 machine (basically, you have to force your application to compile as 32-bit to access it). There are other work-arounds, though, which are outside the scope of this answer.
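To make the failure mode concrete: in a 64-bit process, the first call into a 32-bit native DLL fails with a BadImageFormatException. The DLL and entry point below are made up purely for illustration.

```csharp
using System;
using System.Runtime.InteropServices;

class NativeInteropExample
{
    // "Legacy32.dll" / "DoLegacyWork" are hypothetical names for a 32-bit native library.
    [DllImport("Legacy32.dll")]
    static extern int DoLegacyWork(int value);

    static void Main()
    {
        try
        {
            Console.WriteLine(DoLegacyWork(42));
        }
        catch (BadImageFormatException)
        {
            // Raised when a 64-bit process loads a 32-bit image. Building the host
            // with /platform:x86 (forcing a 32-bit process) is the usual workaround.
            Console.WriteLine("Bitness mismatch: this host must run as a 32-bit process.");
        }
    }
}
```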
That's why I either: stick to all-.NET references, reference only 3rd-party assemblies written in .NET, just write it myself, or beg the author of the 3rd-party component to release both 32-bit and 64-bit compiled versions. The latter becomes difficult to test on your x86 (32-bit) machine, as you can only reference the 32-bit versions there but will have to deploy the 64-bit versions.
More of a headache, when dealing with your own WCF project and those 3rd-party native assemblies, is that the built-in WCF hosting service in Visual Studio (as well as Cassini) is 32-bit only, as is Visual Studio's IntelliSense. Yeah, it's fun using 3rd-party native assemblies and trying to debug applications on an x64 machine. Good times!
I am working on an application that has more than a few DLLs written in VB6. The VB6 code includes COM DLLs and OCX controls. The rest of the code is in C++ and C#. I have been assigned the task of making the application code base compatible with 64-bit architectures. Loads of help material is available for C/C++ code, so that's not the problem at hand. But it isn't easy to rewrite all of this VB6 code in .NET or some other language to make it compatible with 64-bit, and I don't understand all the underlying logic either, so just assume that a rewrite is out of the question.
On the other hand, we all know that VB6 DLLs won't work in a 64-bit environment. So what are my options?
1) Convert each DLL into an EXE which will be loaded as 32-bit and can interact with the rest of my 64-bit application via COM interfaces. Do you foresee any problems with this approach?
2) Edit the registry and load all the VB6 DLLs out-of-process, making them load in dllhost.
3) Make a single 32-bit EXE, reference all of these VB6 DLLs from that EXE, load it in a 32-bit address space, and have the 64-bit part of my application communicate with the 32-bit EXE.
The major problem that comes to mind with all of the above-mentioned approaches is what to do with the OCX controls.
Any ideas?
If there are no new ideas, then which of the above-mentioned approaches would you prefer, and why?
If you have lots of existing VB6 code that used to run in-process, I'd first question whether migrating to 64-bit is really worth the effort. 64-bit has many advantages for server apps, but for desktop apps 32-bit is often completely sufficient. And as WOW64 can be expected to be available for at least a decade, there is little speaking against running 32-bit apps on 64-bit Windows.
The point is, although it is possible that by using out-of-process servers you could tweak your app so that it runs at least partially in 64-bit mode, this will probably have a significant impact on performance (and also on memory overhead). Odds are that the customer therefore gets absolutely no benefit from choosing the 64-bit version of your app.
That said, I'd say 2) or 3) would be the natural choice. 2) is certainly easier to implement, but 3) gives you more control over how many out-of-process servers are created and how their lifetime is managed.
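A rough sketch of what option 2) looks like from the 64-bit side, once the VB6 component has been given a DllSurrogate so COM hosts it in a 32-bit dllhost.exe. The ProgID and method name are placeholders, and the registry work (including the 32-bit registry view) still needs care:

```csharp
using System;

class SurrogateClient
{
    // Pre-condition (registry, done once): under the component's CLSID add an
    // "AppID" value, and under HKCR\AppID\{that-guid} add an empty "DllSurrogate"
    // string value. COM then launches a 32-bit dllhost.exe to host the VB6 DLL.
    static void Main()
    {
        // "MyVb6Lib.LegacyThing" is a hypothetical ProgID of the VB6 class.
        Type t = Type.GetTypeFromProgID("MyVb6Lib.LegacyThing");
        dynamic legacy = Activator.CreateInstance(t);   // activation goes out-of-process
        Console.WriteLine(legacy.DoWork());             // late-bound call, marshalled over COM
    }
}
```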
I'm migrating from SQL 2000 x86 to SQL 2008 x64.
Here we are facing a similar issue. I've decided to use DCOM.
How does it work?
A 32-bit server will host the assembly, and the 64-bit machine will call it using DCOM.
There is a performance penalty, but compared to the effort of a rewrite, it is certainly worth doing.
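As a sketch of what the 64-bit caller looks like (the server name, ProgID and method are placeholders; the component is registered and hosted on the 32-bit machine):

```csharp
using System;

class RemoteDcomClient
{
    static void Main()
    {
        // Resolve and activate the legacy COM class on the remote 32-bit host.
        Type t = Type.GetTypeFromProgID("MyVb6Lib.LegacyThing", "legacy-host-01");
        dynamic proxy = Activator.CreateInstance(t);

        // Every call below travels over DCOM, which is where the performance
        // penalty mentioned above comes from.
        Console.WriteLine(proxy.DoWork());
    }
}
```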