I cannot understand how Dart Editor analyzes source code

Dart Editor version 1.2.0.release (STABLE). Dart SDK version 1.2.0.
This source code produces a runtime exception.
void main() {
test(new Base());
}
void test(Child child) {
}
class Base {
}
class Child extends Base {
}
I assumed that the analyzer would generate something like this:
The argument type 'Base' cannot be assigned to the parameter type 'Child'
But I can only detect this error at runtime, when this exception occurs (post factum):
Unhandled exception:
type 'Base' is not a subtype of type 'Child' of 'child'.

The analyzer is following the language specification here.
It only warns if the static type of the argument expression is not assignable to the type of the function parameter.
In Dart, an expression of one type is assignable to a variable of another type if either type is a subtype of the other.
That is not a safe type check. It does not find all possible errors. On the other hand, it also does not disallow some correct uses like:
Base foo = new Child();
void action(Child c) { ... }
action(foo); // Perfectly correct code at runtime.
Other languages have safe assignment checks, but they also reject some correct programs. You then have to add (unsafe, runtime-checked) cast operators to tell the compiler that you know the program is safe. It's a trade-off, and Dart has chosen to be permissive and avoid most casts.
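As a minimal sketch of that trade-off (self-contained apart from the Base and Child classes defined in the question): with a stricter checker you would have to write the downcast yourself, and that cast is still only checked at runtime.
void action(Child c) { /* ... */ }
void main() {
  Base foo = new Child();
  action(foo as Child); // explicit downcast: accepted statically, throws at runtime if foo is not a Child
}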

Let's try to be polite and answer the question without any prejudice.
I think I understand what you expected, and here is my angle on what the error means:
You are invoking the method with the argument of type Base
The method is expecting an argument of type Child
Base is not equal to Child, nor is it a subtype of Child (in fact, it is the other way around: Child is a subtype of Base)
It is working as expected, as it only makes sense to provide an object of the expected type (or its subtypes, i.e. specialisations).
Update:
After reading your question again, I realised that you are pointing out that the editor is not finding the type problem. I assume this is because Dart programs are dynamic, and hence certain checks are not done before runtime.
Hope it helps ;-)

Related

Type inference in Kotlin lambdas fails when using `it` special variable

I fail to understand why the following compiles:
directory.listFiles { it -> it.name.startsWith("abc") }
but this doesn't:
directory.listFiles { it.name.startsWith("abc") }
Am I correctly assuming that in the first case, the type of it is inferred via the name property? Why is this not happening in the second case?
It is because there are two possible FunctionalInterfaces that can be used with File.listFiles:
listFiles(FileFilter) - this interface's method is accept(File pathname)
listFiles(FilenameFilter) - this interface's method is accept(File dir, String name)
The compiler cannot work out which one you want to use. So why is this better when you write it ->?
Well, the compiler inspects the parameters of the two interface methods and can now see that you expect one argument ("SOMETHING ->"), so the only matching call is the FileFilter variant.
How would you use the FilenameFilter? You'd use this syntax:
directory.listFiles { dir, name -> name.startsWith("abc") }
The magic here is not it (that's a coincidence) but that you declared just one parameter.
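To illustrate that last point, here is a small self-contained sketch (the helper names filterAbc and filterAbcByName are made up): any single parameter name, not just it, resolves the call to the FileFilter overload, while two parameters resolve it to FilenameFilter.
import java.io.File

// One lambda parameter: the FileFilter overload is chosen, whatever the parameter is called.
fun filterAbc(directory: File): Array<File>? =
    directory.listFiles { file -> file.name.startsWith("abc") }

// Two lambda parameters: the FilenameFilter overload is chosen instead.
fun filterAbcByName(directory: File): Array<File>? =
    directory.listFiles { _, name -> name.startsWith("abc") }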

Kotlin Function Generics - Upper Bound Not Working

I faced an issue regarding the usage of Kotlin generics in functions:
fun <T : CharSequence> doSomething(): T {
return String() as T
}
class Something(intValue: Int)
Something(doSomething()) // Doesn't show any compile error
Now when it is executed, it throws this error:
java.lang.ClassCastException: class java.lang.String cannot be cast to class java.lang.Number
I wanted to know why the Kotlin compiler is not reporting an error for this incompatible cast.
I think what you are seeing is the major compiler bug KT-47664. Though in the bug report they used a much more complex example to demonstrate the issue, the cause of the bug is the same, namely that the compiler has inferred an empty intersection type (the intersection of CharSequence and Int is empty) as the type argument.
The algorithm apparently treats an empty intersection type the same as any other type, doesn't think anything special of it, and so type inference succeeds.
This bug has been fixed by KT-51221 Deprecate inferring type variables into an empty intersection type. From what I understand from reading the reports, there will now be a warning if an empty intersection type is inferred. However, the fix is only included in Kotlin 1.7.20+, which, at the time of writing, is not released yet :(
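Until you are on a version with that fix, here is a hedged sketch of how to surface the problem earlier (the explicit type argument below is only for illustration): give the compiler a concrete expectation for T instead of letting it infer an empty intersection.
@Suppress("UNCHECKED_CAST")
fun <T : CharSequence> doSomething(): T = "" as T

class Something(intValue: Int)

fun demo() {
    val s: String = doSomething()     // T is inferred as String: fine
    // Something(doSomething<Int>())  // rejected: Int violates the CharSequence bound
    // Something(doSomething())       // compiles today but fails at runtime (the reported bug)
}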

Embed a type from another package into mine, and initialize it with a literal

I read how to initialize an embedded type, and a related Q&A.
My problem is that when I compile this code, I get:
[Error] unknown field 'feature.DefaultSshHelper' in struct literal of type dala02
type FDH feature.DefaultSshHelper
type dala02 struct {
Md5_v string
feature.DefaultSshHelper
//FDH
}
var x_01_h1_p = &dala02{
Md5_v: "",
feature.DefaultSshHelper: feature.DefaultSshHelper{
//FDH: FDH{
// blabla
},
}
// use it by a interface []feature.CmdFioHelper{x_00_h1_p}
At first, I thought it was an export problem, so I added the line 'type FDH feature.DefaultSshHelper'. Now we have this error:
[Error] cannot use x_01_h1_p (type *dala02) as type feature.CmdFioHelper in array or slice literal:
*dala02 does not implement feature.CmdFioHelper (missing Getnextchecker method)
But a pointer to feature.DefaultSshHelper does implement feature.CmdFioHelper (an interface). So a pointer to dala02 should also implement it, right? (Reference from Effective Go:)
There's an important way in which embedding differs from subclassing. When we embed a type, the methods of that type become methods of the outer type, but when they are invoked the receiver of the method is the inner type, not the outer one.
The question is how to fix this compile error; which line is wrong? I'm not an expert in Go, thanks for your advice :). BTW, I did find a workaround.
When you refer to embedded fields, you have to leave out the package name of the embedded type, as the unqualified type name acts as the field name.
Spec: Struct types:
A field declared with a type but no explicit field name is an anonymous field, also called an embedded field or an embedding of the type in the struct. An embedded type must be specified as a type name T or as a pointer to a non-interface type name *T, and T itself may not be a pointer type. The unqualified type name acts as the field name.
So simply write:
var x_01_h1_p = &dala02{
Md5_v: "",
DefaultSshHelper: feature.DefaultSshHelper{
// blabla
},
}
Your other attempt, type FDH feature.DefaultSshHelper, falls short because this type declaration creates a new type with zero methods: the type FDH does not "inherit" the methods of feature.DefaultSshHelper. And thus any type that embeds FDH will also lack the methods of feature.DefaultSshHelper.
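To make that concrete, here is a minimal, self-contained sketch (Helper and Getnextchecker stand in for the real feature.DefaultSshHelper and its interface method):
package main

type Helper struct{}

func (h *Helper) Getnextchecker() {} // *Helper has this method

type FDH Helper // a new, distinct type: *FDH does NOT get Getnextchecker

type dala02 struct {
	Helper // embedded field named "Helper"; its methods are promoted to *dala02
}

func main() {
	d := &dala02{Helper: Helper{}}
	d.Getnextchecker() // works via promotion from the embedded field
	// var f FDH
	// f.Getnextchecker() // compile error: f.Getnextchecker undefined
}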

CLI/C++ function overload

I am currently writing a wrapper for a native C++ class in CLI/C++. I am on a little GamePacket class at the moment. Consider the following class:
public ref class GamePacket
{
public:
GamePacket();
~GamePacket();
generic<typename T>
where T : System::ValueType
void Write(T value)
{
this->bw->Write(value);
}
};
I want to be able to call the function as follows in C#, using my wrapper:
Packet.Write<Int32>(1234);
Packet.Write<byte>(1);
However, I can't compile my wrapper. Error:
Error 1 error C2664: 'void System::IO::BinaryWriter::Write(System::String ^)' : cannot convert argument 1 from 'T' to 'bool'
I don't understand this error; where does the System::String^ come from? I'm seeing a lot of overloads of the Write() method. Does C++/CLI not call the correct one, and if so, how can I make it call the correct one?
Reference MSDN: http://msdn.microsoft.com/en-us/library/system.io.binarywriter.write(v=vs.110).aspx
Templates and generics don't work the same.
With templates, the code gets recompiled for each set of parameters, and the results can be pretty different (different local variable types, different function overloads selected). Specialization makes this really powerful.
With generics, the code only gets compiled once, and the overload resolution is done without actually knowing the final type arguments. So when you call Write(value), the only things the compiler knows are that
value can be converted to Object^, because everything can
value derives from ValueType, because your constraint tells it
Unfortunately, using just that information, the compiler can't find an overload of Write that can be used.
It seems like you expected it to use Write(bool) when T is bool, Write(int) when T is int, and so on. Templates would work like that. Generics don't.
Your options are:
a dozen different copies of your method, each of which has a fixed argument type that can be used to select the right overload of BinaryWriter::Write (see the sketch after this list)
find the overload yourself using reflection, make a delegate matching the right overload, and call it
use expression trees or the dynamic language runtime to find and make a delegate matching the right overload, and then call it
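For the first option, a rough sketch of what the forwarding overloads could look like inside GamePacket (bw is assumed to be the BinaryWriter^ member used in your class, and the particular overloads shown are just examples):
// Non-generic overloads: each has a concrete parameter type, so the compiler
// can pick the matching BinaryWriter::Write overload at compile time.
void Write(bool value)          { this->bw->Write(value); }
void Write(unsigned char value) { this->bw->Write(value); } // System::Byte
void Write(int value)           { this->bw->Write(value); } // System::Int32
void Write(double value)        { this->bw->Write(value); }
// ...one overload per BinaryWriter::Write overload you want to expose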

Java: Why does the method type in a .class file contain the return type, not only the signature?

There is a "NameAndType" structure in the constants pool in .class file.
It is used for dynamic binding.
All methods that a class can "export" are described as "signature + return type".
Like
"getVector()Ljava/util/Vector;"
That breaks my code when the return type of a method in some .jar is changed, even if the new type is narrower.
For example:
I have the following code:
List l = some.getList();
External .jar contains:
public List getList()
Then the external jar changes the method signature to
public ArrayList getList().
And my code dies at runtime with NoSuchMethodException, because it can't find
getList()Ljava/util/List;
So, I have to recompile my code.
I do not have to change it. Just recompile absolutely the same code!
That also gives the ability to have two methods with the same signature but different return types! The compiler would not accept it, but it is possible via direct bytecode generation.
My question is why?
Why did they do it?
I have only one idea: to avoid sophisticated type checking at runtime.
You would need to look up the hierarchy and check whether there is a parent with the List interface.
That takes time, and only the compiler has it. The JVM does not.
Am I right?
Thanks.
One reason may be that method overloading (as opposed to overriding) is determined at compile time. Consider the following methods:
public void doSomething(List util) {}
public void doSomething(ArrayList util) {}
And consider code:
doSomething(getList());
If Java allowed the return type to change and did not throw an exception, the method called would still be doSomething(List) until you recompiled; then it would be doSomething(ArrayList). That would mean working code changes behavior just because it was recompiled.
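To make that concrete, here is a small self-contained sketch (Library and Caller are made-up names): the overload that runs is fixed at compile time from getList()'s declared return type, so a return-type change in the library breaks linkage for already compiled callers and silently selects the other overload once they are recompiled.
import java.util.ArrayList;
import java.util.List;

class Library {
    public List<String> getList() { return new ArrayList<>(); }   // version 1
    // public ArrayList<String> getList() { ... }                 // version 2: same "signature", new descriptor
}

class Caller {
    void doSomething(List<String> l)      { System.out.println("List overload"); }
    void doSomething(ArrayList<String> l) { System.out.println("ArrayList overload"); }

    void run(Library lib) {
        doSomething(lib.getList()); // resolved at compile time against the declared return type
    }
}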