CIL instructions unexpected return value - oop

I am trying to emit CIL instructions manually for learning purposes, but have run into a small problem.
I have a simple structure with an interface, "MyInterface", with a single method, "Handle"; a class called "plusTwo" which implements "MyInterface"; and a "Program" class with an entry method. The IL dump looks like this:
.class interface public abstract auto ansi MyInterface<TInput,TOutput>
{
  .method public hidebysig newslot abstract virtual
          instance !TOutput Handle(!TInput A_1) cil managed
  {
  } // end of method MyInterface::Handle
} // end of class MyInterface

.class public auto ansi plusTwo
       extends [mscorlib]System.Object
       implements class MyInterface<int32,int32>
{
  .method public hidebysig newslot virtual final
          instance int32 Handle(int32 x) cil managed
  {
    // Code size 13 (0xd)
    .maxstack 2
    IL_0000: ldarg 0
    IL_0004: nop
    IL_0005: nop
    IL_0006: ldc.i4 0x2
    IL_000b: add
    IL_000c: ret
  } // end of method plusTwo::Handle
} // end of class plusTwo

.class public auto ansi Program
       extends [mscorlib]System.Object
{
  .method public static int32 Main() cil managed
  {
    .entrypoint
    // Code size 34 (0x22)
    .maxstack 2
    .locals init (int32 V_0)
    IL_0000: newobj   instance void plusTwo::.ctor()
    IL_0005: ldc.i4   0xa
    IL_000a: callvirt instance !1 class MyInterface<int32,int32>::Handle(!0)
    IL_000f: stloc    V_0
    IL_0013: nop
    IL_0014: nop
    IL_0015: ldloc    V_0
    IL_0019: nop
    IL_001a: nop
    IL_001b: ldc.i4   0x5
    IL_0020: add
    IL_0021: ret
  } // end of method Program::Main
} // end of class Program
In the above I expect the output to be 17, but instead I get a large, seemingly random integer that changes with each run. It looks like a memory address or something. Can someone tell, by looking at the generated IL above, what the problem might be? Any help would be greatly appreciated.
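A hedged reading of the dump above: in a CIL instance method, argument 0 is the implicit this reference, and the first declared parameter is argument 1 (per ECMA-335). The ldarg 0 in Handle therefore loads the object reference itself and adds 2 to it, which is why the result looks like an address that changes per run. Loading the parameter instead should yield 10 + 2 + 5 = 17:

.method public hidebysig newslot virtual final
        instance int32 Handle(int32 x) cil managed
{
  .maxstack 2
  IL_0000: ldarg 1    // load parameter x, not the implicit 'this' at index 0
  IL_0004: ldc.i4 0x2
  IL_0009: add
  IL_000a: ret
} // end of method plusTwo::Handle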

Related

Cannot intercept the method with Byte Buddy when using @Advice.Origin Method

Using the following example, I am not able to intercept the method calls when I have @Advice.Origin Method method as an argument in my method.
public static void premain(String arguments, Instrumentation instrumentation) throws IOException {
    new AgentBuilder.Default()
        .type(ElementMatchers.nameEndsWith("Controller"))
        .transform((builder, type, classLoader, module) -> {
            return builder.method(ElementMatchers.any()).intercept(MethodDelegation.to(AccessInterceptor.class));
        })
        .installOn(instrumentation);
}
@RuntimeType
public static Object intercept(@Advice.Origin Method method, @SuperCall Callable<?> callable) throws Exception {
    System.out.println("intercept");
    return callable.call();
}
If I remove @Advice.Origin Method method, the code starts working:
@RuntimeType
public static Object intercept(@SuperCall Callable<?> callable) throws Exception {
    System.out.println("intercept");
    return callable.call();
}
There is a difference between @Advice.Origin and @Origin. Advice can do less than delegation, but it inlines its code. You need to adjust your imports: MethodDelegation binds the delegation annotation @Origin, not the advice annotation @Advice.Origin.
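A sketch of that adjustment, assuming Byte Buddy's standard package layout (net.bytebuddy.implementation.bind.annotation — verify against your version):

import java.lang.reflect.Method;
import java.util.concurrent.Callable;
import net.bytebuddy.implementation.bind.annotation.Origin;
import net.bytebuddy.implementation.bind.annotation.RuntimeType;
import net.bytebuddy.implementation.bind.annotation.SuperCall;

public class AccessInterceptor {
    @RuntimeType
    public static Object intercept(@Origin Method method, @SuperCall Callable<?> callable) throws Exception {
        // @Origin (delegation), not @Advice.Origin (advice), is what MethodDelegation can bind
        System.out.println("intercept " + method.getName());
        return callable.call();
    }
}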

The equivalent of c# virtual in vb when calling method *within* base class

I understand how to use VB's Overridable and Overrides, to get similar functionality to c#'s virtual when calling methods of a class. However, consider the c# console code below, which calls an overridden method from within the class itself:
class Program {
    static void Main(string[] args) {
        new test();
        new test2();
        Console.ReadKey();
    }
}

public class test {
    public test() {
        hello();
    }
    public virtual void hello() {
        Console.WriteLine("hello from base method");
    }
}

class test2 : test {
    public override void hello() {
        Console.WriteLine("hello from overridden method");
    }
}
The result I get, predictably, in c# is:
hello from base method
hello from overridden method
The problem is, I can't work out how to duplicate this functionality in VB.NET. Keep in mind here that hello() is being called from within the base class code, which runs the overridden method. That is what I can't seem to accomplish in VB.
No matter what I try in VB, the base class's hello() is always called, not the overridden hello().
Class Test:

Public Class Test
    Public Sub New()
        Hello()
    End Sub
    Public Overridable Sub Hello()
        Console.WriteLine("hello from base method")
    End Sub
End Class

Class Test2:

Public Class Test2
    Inherits Test
    Public Overrides Sub Hello()
        Console.WriteLine("hello from overridden method")
    End Sub
End Class

Sub Main:

Sub Main()
    Dim x As New Test
    Dim y As New Test2
End Sub

Why is IsConst emitted twice in char * const a

I've disassembled the following C++/CLI code in ildasm:
Managed(char * const a)
{
}
and the disassembled IL looks like this:
.method public hidebysig specialname rtspecialname
instance void .ctor(int8 modopt([mscorlib]System.Runtime.CompilerServices.IsSignUnspecifiedByte)* modopt([mscorlib]System.Runtime.CompilerServices.IsConst) modopt([mscorlib]System.Runtime.CompilerServices.IsConst) a) cil managed
Removing some insignificant parts:
.method public hidebysig specialname rtspecialname
instance void .ctor(int8* modopt(IsConst) modopt(IsConst) a) cil managed
So while there is only one const in the original code, it is emitted twice in the IL. Why is that so?

SessionFactory thread-safety issue

Sometimes I get this error stack trace at my web app:
[ArgumentException: An item with the same key has already been added.]
System.ThrowHelper.ThrowArgumentException(ExceptionResource resource) +52
System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add) +10695474
System.Collections.Generic.Dictionary`2.Add(TKey key, TValue value) +10
NHibernate.Util.ThreadSafeDictionary`2.Add(TKey key, TValue value) +93
NHibernate.Type.TypeFactory.GetType(NullableType defaultUnqualifiedType, Int32 length, GetNullableTypeWithLength ctorDelegate) +88
NHibernate.Type.TypeFactory.<RegisterDefaultNetTypes>b__c(Int32 l) +82
NHibernate.Type.TypeFactory.BuiltInType(String typeName, Int32 length) +46
NHibernate.Mapping.SimpleValue.GetHeuristicType() +168
NHibernate.Mapping.SimpleValue.get_Type() +49
NHibernate.Mapping.SimpleValue.IsValid(IMapping mapping) +30
NHibernate.Mapping.PersistentClass.Validate(IMapping mapping) +87
NHibernate.Mapping.RootClass.Validate(IMapping mapping) +21
NHibernate.Cfg.Configuration.ValidateEntities() +183
NHibernate.Cfg.Configuration.Validate() +13
NHibernate.Cfg.Configuration.BuildSessionFactory() +36
Framework.Data.Code.BaseSessionFactoryProvider..ctor() +74
Framework.Data.Code.SessionFactoryProvider..ctor() +29
Framework.Data.Code.NestedSessionManager..cctor() +43
My SessionFactoryProvider is a thread-safe singleton:
public interface ISessionFactoryProvider
{
    ISessionFactory GetSessionFactory();
}

public abstract class BaseSessionFactoryProvider : ISessionFactoryProvider
{
    protected readonly ISessionFactory factory;

    public ISessionFactory GetSessionFactory()
    {
        return factory;
    }

    protected BaseSessionFactoryProvider()
    {
        factory = GetConfig().BuildSessionFactory();
    }

    public abstract Configuration GetConfig();
}

public class SessionFactoryProvider : BaseSessionFactoryProvider
{
    public static ISessionFactory SessionFactory
    {
        get { return Instance.factory; }
    }

    public override Configuration GetConfig()
    {
        return new Configuration().Configure();
    }

    public static SessionFactoryProvider Instance
    {
        get
        {
            return NestedSessionManager.sessionManager;
        }
    }

    class NestedSessionManager
    {
        internal static readonly SessionFactoryProvider sessionManager =
            new SessionFactoryProvider();
    }
}
Also, in my app I bind SessionFactoryProvider to ISessionFactoryProvider via Ninject:
kernel.Bind<ISessionFactoryProvider>().To<SessionFactoryProvider>().InSingletonScope();
So my question is: why do I get this error?
In my comments, I said I saw no flaw in your singleton implementation.
In fact, there is an obvious one: your SessionFactoryProvider parameterless constructor is not private, because SessionFactoryProvider does not declare any constructor of its own.
So the compiler generates a public parameterless constructor for SessionFactoryProvider (see Should we always include a default constructor in the class?).
So any code can instantiate a new SessionFactoryProvider through this public constructor (this should be easy to test), and that is exactly what Ninject uses to instantiate the class. (This was the point that puzzled me: how does Ninject instantiate the class? It should not be able to instantiate a class without a public constructor.)
I guess this is how you end up with duplicate SessionFactoryProvider instances, each of which runs the configuration and builds a session factory concurrently.
You should implement your SessionFactoryProvider with a public constructor, leaving out any consideration of singleton implementation, and then just rely on Ninject and its InSingletonScope() to provide the singleton functionality.
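A minimal sketch of that suggestion, reusing the question's type names (the Configure() details are unchanged from the question) and letting Ninject own the singleton lifetime:

public class SessionFactoryProvider : ISessionFactoryProvider
{
    private readonly ISessionFactory factory;

    // Plain public constructor: Ninject builds the one and only instance
    // because of InSingletonScope() in the binding below.
    public SessionFactoryProvider()
    {
        factory = new Configuration().Configure().BuildSessionFactory();
    }

    public ISessionFactory GetSessionFactory()
    {
        return factory;
    }
}

// Composition root:
kernel.Bind<ISessionFactoryProvider>().To<SessionFactoryProvider>().InSingletonScope();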

Strange behaviour when using dynamic types as method parameters

I have the following interfaces that are part of an existing project. I'd like to make it possible to call the Store(..) function with dynamic objects. But I don't want to change the Interface hierarchy (if at all possible).
public interface IActualInterface
{
    void Store(object entity);
}

public interface IExtendedInterface : IActualInterface
{
    //Interface items not important
}

public class Test : IExtendedInterface
{
    public void Store(object entity)
    {
        Console.WriteLine("Storing: " + entity.ToString());
    }
}
and the following code:
IExtendedInterface extendedInterfaceTest = new Test();
IActualInterface actualInterfaceTest = new Test();
Test directTest = new Test();

dynamic employee = new ExpandoObject();
employee.Name = "John Smith";
employee.Age = 33;
employee.Phones = new ExpandoObject();
employee.Phones.Home = "0111 123123";
employee.Phones.Office = "027 321123";
employee.Tags = new List<dynamic>() { 123.4D, 99.54D };

try
{
    extendedInterfaceTest.Store(employee);
}
catch (RuntimeBinderException rbEx)
{
    Console.WriteLine(rbEx.Message);
}

//Casting to (object) works okay as the call is not resolved at runtime
extendedInterfaceTest.Store((object)employee);
//this works because IActualInterface declares 'Store'
actualInterfaceTest.Store(employee);
//this also works okay (directTest is of the concrete type Test)
directTest.Store(employee);
When I call extendedInterfaceTest.Store(employee), it raises a RuntimeBinderException. Why does the interface type make a difference when it's the same underlying object? I can make the call through IActualInterface and through the concrete type, but not through IExtendedInterface.
I understand that when calling a function with a dynamic parameter the resolution happens at runtime, but why the different behaviours?
What you need to remember is that dynamic resolution basically does the same process as static resolution, but at runtime. Anything that couldn't be resolved by the CLR won't be resolved by the DLR.
Let's take this small program, inspired by yours, which doesn't use dynamic at all:
namespace ConsoleApplication38 {
    public interface IActualInterface {
        void Store(object entity);
    }

    public interface IExtendedInterface : IActualInterface {
    }

    public class TestInterface : IExtendedInterface {
        public void Store(object entity) {
        }
    }

    public abstract class ActualClass {
        public abstract void Store(object entity);
    }

    public abstract class ExtendedClass : ActualClass {
    }

    public class TestClass : ExtendedClass {
        public override void Store(object entity) {
        }
    }

    class Program {
        static void TestInterfaces() {
            IActualInterface actualTest = new TestInterface();
            IExtendedInterface extendedTest = new TestInterface();
            TestInterface directTest = new TestInterface();
            actualTest.Store(null);
            extendedTest.Store(null);
            directTest.Store(null);
        }

        static void TestClasses() {
            ActualClass actualTest = new TestClass();
            ExtendedClass extendedTest = new TestClass();
            TestClass directTest = new TestClass();
            actualTest.Store(null);
            extendedTest.Store(null);
            directTest.Store(null);
        }

        static void Main(string[] args) {
            TestInterfaces();
            TestClasses();
        }
    }
}
Everything compiles fine. But what did the compiler really generate? Let's see using ILdasm.
For the interfaces:
// actualTest.Store
IL_0015: callvirt instance void ConsoleApplication38.IActualInterface::Store(object)
// extendedTest.Store
IL_001d: callvirt instance void ConsoleApplication38.IActualInterface::Store(object)
// directTest.Store
IL_0025: callvirt instance void ConsoleApplication38.TestInterface::Store(object)
We can see here that the C# compiler always emits the call against the interface or class where the method slot is defined. IActualInterface has a method slot for Store, so it is used for actualTest.Store. IExtendedInterface doesn't, so IActualInterface is used for that call too. TestInterface declares its own Store with the newslot IL modifier, effectively assigning a new slot in its vtable for that method, so that slot is used directly since directTest is of type TestInterface.
For the classes:
// actualTest.Store
IL_0015: callvirt instance void ConsoleApplication38.ActualClass::Store(object)
// extendedTest.Store
IL_001d: callvirt instance void ConsoleApplication38.ActualClass::Store(object)
// directTest.Store
IL_0025: callvirt instance void ConsoleApplication38.ActualClass::Store(object)
For all three static types, the same call is generated, because the method slot is defined on ActualClass.
Let's now see what we get if we write the IL ourselves, using the type we want rather than letting the C# compiler choose it for us. I've modified the IL to look like this:
For interfaces:
// actualTest.Store
IL_0015: callvirt instance void ConsoleApplication38.IActualInterface::Store(object)
// extendedTest.Store
IL_001d: callvirt instance void ConsoleApplication38.IExtendedInterface::Store(object)
// directTest.Store
IL_0025: callvirt instance void ConsoleApplication38.TestInterface::Store(object)
For classes:
// actualTest.Store
IL_0015: callvirt instance void ConsoleApplication38.ActualClass::Store(object)
// extendedTest.Store
IL_001d: callvirt instance void ConsoleApplication38.ExtendedClass::Store(object)
// directTest.Store
IL_0025: callvirt instance void ConsoleApplication38.TestClass::Store(object)
The program assembles fine with ilasm. However, it fails to pass peverify and crashes at runtime with the following error:
Unhandled Exception: System.MissingMethodException: Method not found:
'Void ConsoleApplication38.IExtendedInterface.Store(System.Object)'.
   at ConsoleApplication38.Program.TestInterfaces()
   at ConsoleApplication38.Program.Main(String[] args)
If you remove this invalid call, the derived-class calls work fine without any error: the CLR is able to resolve the base method from a call made through the derived type. Interfaces, however, have no such inheritance of method slots at runtime, so the CLR isn't able to resolve the method call made through the extended interface.
In theory, the C# compiler could emit the call directly against the runtime type that defines the slot. That would avoid the problems with calls through intermediate base classes discussed on Eric Lippert's blog. However, as demonstrated, this is not possible for interfaces.
Let's get back to the DLR. It resolves the method exactly the same way the CLR does. We've seen that IExtendedInterface.Store couldn't be resolved by the CLR; the DLR cannot resolve it either. This is normally hidden by the fact that the C# compiler emits the right call for you, so always be careful when using dynamic unless you know exactly how resolution works in the CLR.