NDepend CQL Query for missing IDisposable implementation

I realize that the query this question is looking for won't be enough to find every little problem with IDisposable implementations, but every early warning counts, so I'll take what I can get.
I'd like to know if anyone has come up with a CQL query for NDepend that will list all classes that don't implement IDisposable but have one or more fields that do. A class could end up on the result list of this query either through a bug (i.e. someone forgot to check the field types for IDisposable implementations), or through code evolution (i.e. a class used in a field somewhere gets IDisposable tacked on at a later date without all usages being updated).
The simple query to find all classes that don't implement IDisposable is:
SELECT TYPES WHERE !Implement "System.IDisposable"
However, this of course doesn't check whether a class should implement IDisposable according to the rule above.
Does anyone have such a query? I'm still getting to grips with CQL so this part eludes me.

Lasse, thanks to CQLinq (Code Rule over LINQ) capabilities, matching types that should implement IDisposable is now possible. Two related default rules are now provided, and you can easily write your own related rules:
Types with disposable instance fields must be disposable
Disposable types with unmanaged resources should declare finalizer
// <Name>Types with disposable instance fields must be disposable</Name>
warnif count > 0
let iDisposable = ThirdParty.Types.WithFullName("System.IDisposable").FirstOrDefault()
where iDisposable != null // iDisposable can be null if the code base doesn't use System.IDisposable at all
from t in Application.Types where
   !t.Implement(iDisposable) &&
   !t.IsGeneratedByCompiler
let instanceFieldsDisposable =
    t.InstanceFields.Where(f => f.FieldType != null &&
                                f.FieldType.Implement(iDisposable))
where instanceFieldsDisposable.Count() > 0
select new { t, instanceFieldsDisposable }

// <Name>Disposable types with unmanaged resources should declare finalizer</Name>
warnif count > 0
let iDisposable = ThirdParty.Types.WithFullName("System.IDisposable").SingleOrDefault()
where iDisposable != null // iDisposable can be null if the code base doesn't use System.IDisposable at all
let disposableTypes = Application.Types.ThatImplement(iDisposable)
let unmanagedResourcesFields = disposableTypes.ChildFields().Where(f =>
    !f.IsStatic &&
    f.FieldType != null &&
    f.FieldType.FullName.EqualsAny("System.IntPtr","System.UIntPtr","System.Runtime.InteropServices.HandleRef")).ToHashSet()
let disposableTypesWithUnmanagedResource = unmanagedResourcesFields.ParentTypes()
from t in disposableTypesWithUnmanagedResource
where !t.HasFinalizer
let unmanagedResourcesTypeFields = unmanagedResourcesFields.Intersect(t.InstanceFields)
select new { t, unmanagedResourcesTypeFields }

OOP: Inheriting from immutable objects

Background
Suppose I have some set of fields which are related to each other, so I make a class to gather them. Let us call this class Base. There are also certain methods, operating on these fields, which will be common to all derived classes. Additionally, let us suppose we want Base and all its derived classes to be immutable.
In different contexts, these fields support additional operations, so I have different derived classes which inherit the fields and provide additional methods, depending on their context. Let us call these Derived1, Derived2, etc.
In certain scenarios, the program needs instances of a derived class, but the state of the fields must satisfy some condition. So I made a class RestrictedDerived1 which makes sure that the condition is satisfied (or changes the parameters to conform if it can) in the constructor before calling its base constructor, or throws an error otherwise.
Further, there are situations where I need even more conditions to be met, so I have SuperRestrictedDerived1. (Side note: given that some conditions are met, this class can more efficiently compute certain things, so it overrides some methods of Derived1.)
Problem
So far so good. The problem is that most of the methods of all these classes involve making another instance of some class in this hierarchy (not always the same as the one that the method was called on, but usually the same one) based on itself, but with some modifications which may involve somewhat complex computation (i.e. not just changing one field). For example one of the methods of Derived1 might look like:
public Derived1 foo(Base b) {
    TypeA fieldA = // calculations using this and b
    TypeB fieldB = // more calculations
    // ... calculate all fields in this way
    return new Derived1(fieldA, fieldB, /* ... */);
}
But then down the hierarchy RestrictedDerived1 needs this same function to return an instance of itself (obviously throwing an error if it can't be instantiated), so I'd need to override it like so:
@Override
public RestrictedDerived1 foo(Base b) {
    return new RestrictedDerived1(super.foo(b));
}
This requires a copy constructor and unnecessarily allocates an intermediate object which will be immediately destroyed.
Possible solution
An alternative solution I thought of was to pass a function to each of these methods which constructs some type of Base, and then the functions would look like this:
// In Derived1
public Derived1 foo(Base b, BaseCreator creator) {
    TypeA fieldA = // calculations using this and b
    TypeB fieldB = // more calculations
    // ... calculate all fields in this way
    return creator.create(fieldA, fieldB, /* ... */);
}

public Derived1 foo(Base b) {
    return foo(b, Derived1::create);
}

public static Derived1 create(TypeA fieldA, TypeB fieldB, /* ... */) {
    return new Derived1(fieldA, fieldB, /* ... */);
}
// In RestrictedDerived1
@Override
public RestrictedDerived1 foo(Base b) {
    return (RestrictedDerived1) foo(b, RestrictedDerived1::create);
}

public static RestrictedDerived1 create(TypeA fieldA, TypeB fieldB, /* ... */) {
    return new RestrictedDerived1(fieldA, fieldB, /* ... */);
}
My question
This works; however, it feels "clunky" to me. My question is: is there some design pattern or concept or alternative design that would facilitate my situation?
I tried to use generics, but that got messy quickly, and didn't work well for more than one level of inheritance.
By the way, the actual classes these refer to are 3D points and vectors. I have a base class called Triple with doubles x, y, and z (and some functions which take a lambda and apply it to each coordinate, constructing a new Triple from the result). Then I have a derived class Point with some point-related functions, and another derived class Vector with its own functions. Then I have NonZeroVector (extends Vector), which is a vector that cannot be the zero vector (since other objects that need a vector sometimes need a guarantee that it's not the zero vector, and I don't want to have to check that everywhere). Further, I have NormalizedVector (extends NonZeroVector), which is guaranteed to have a length of 1 and will normalize itself upon construction.
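For concreteness, here is a minimal, runnable Java sketch of the hierarchy described above (the method names and the exact operations are illustrative, not the asker's actual code):
// Illustrative sketch only; method names and operations are hypothetical.
class Triple {
    final double x, y, z;

    Triple(double x, double y, double z) {
        this.x = x;
        this.y = y;
        this.z = z;
    }

    // Applies a function to each coordinate and builds a new Triple.
    // The problem in the question: the result is always a Triple,
    // even when this method is called on a Vector or a Point.
    Triple map(java.util.function.DoubleUnaryOperator f) {
        return new Triple(f.applyAsDouble(x), f.applyAsDouble(y), f.applyAsDouble(z));
    }
}

class Point extends Triple {
    Point(double x, double y, double z) { super(x, y, z); }
    // point-related functions...
}

class Vector extends Triple {
    Vector(double x, double y, double z) { super(x, y, z); }

    double length() { return Math.sqrt(x * x + y * y + z * z); }
}

class NonZeroVector extends Vector {
    NonZeroVector(double x, double y, double z) {
        super(x, y, z);
        if (x == 0 && y == 0 && z == 0) {
            throw new IllegalArgumentException("zero vector not allowed");
        }
    }
}

class NormalizedVector extends NonZeroVector {
    NormalizedVector(double x, double y, double z) {
        // Normalizes upon construction so that length() == 1.
        super(x / Math.sqrt(x * x + y * y + z * z),
              y / Math.sqrt(x * x + y * y + z * z),
              z / Math.sqrt(x * x + y * y + z * z));
    }
}
With this layout, a method like Triple.map always returns a plain Triple, which is exactly the type-loss problem the question describes.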
MyType
This can be solved using a concept variously known as MyType, this type, or self type. The basic idea is that the MyType is the most-derived type at runtime. You can think of it as the dynamic type of this, but referred to statically (at "compile time").
Unfortunately, not many mainstream programming languages have MyTypes, but e.g. TypeScript does, and I was told Raku does as well.
In TypeScript, you could solve your problem by making the return type of foo the MyType (spelled this in TypeScript). It would look something like this:
class Base {
    constructor(public readonly fieldA: number, public readonly fieldB: string) {}

    foo(b: Base): this {
        return new this.constructor(this.fieldA + b.fieldA, this.fieldB + b.fieldB);
    }
}

class Derived1 extends Base {
    constructor(fieldA: number, fieldB: string, protected readonly repeat: number) {
        super(fieldA * repeat, fieldB.repeat(repeat));
    }

    override foo(b: Base): this {
        return new this.constructor(
            this.fieldA + b.fieldA, this.fieldB + b.fieldB, this.repeat
        );
    }
}

class RestrictedDerived1 extends Derived1 {
    constructor(fieldA: number, fieldB: string, repeat: number) {
        super(fieldA * repeat, fieldB.repeat(repeat), repeat);
        if (repeat >= 3) {
            throw new RangeError(`repeat must be less than 3 but is ${repeat}`)
        }
    }
}
const a = new RestrictedDerived1(23, 'Hello', 2);
const b = new Base(42, ' World');
const restrictedDerived = a.foo(b); // Inferred type is RestrictedDerived1
Slightly b0rken Playground link
Implicit factories
In a language with type classes or implicits (like Scala), you could solve your problem with implicit Factory objects. This would be similar to your second example with the Creators, but without the need to explicitly pass the creators around everywhere. Instead, they would be implicitly summoned by the language.
In fact, your requirement is very similar to one of the core requirements of the Scala Collections Framework, namely that you want operations like map, filter, and reduce to only be implemented once, but still preserve the type of the collection.
Most other Collections Frameworks are only able to achieve one of those goals: Java, C#, and Ruby, for example, only have one implementation for each operation, but they always return the same, most-generic type (Stream in Java, IEnumerable in C#, Array in Ruby). Smalltalk's Collections Framework is type-preserving, but has duplicated implementations for every operation. A non-duplicated, type-preserving Collections Framework is one of the holy grails of abstraction designers / language designers. (It's no coincidence that so many papers presenting novel approaches to OO use a refactoring of the Smalltalk Collection Framework as their working example.)
F-bounded Polymorphism
If you have neither MyType nor implicit builders available, you can use F-bounded Polymorphism.
The classic example is how Java's clone method should have been designed:
interface Cloneable<T extends Cloneable<T>> {
    public T clone();
}

class Foo implements Cloneable<Foo> {
    @Override
    public Foo clone() {
        return new Foo();
    }
}
JDoodle example
However, this gets tedious very quickly for deeply-nested inheritance hierarchies. I tried to model it in Scala, but I gave up.
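To illustrate, here is a rough Java sketch (not the asker's code; the field and the computation are made up) of F-bounded polymorphism applied to a hierarchy like the one in the question:
// Rough sketch of F-bounded polymorphism; names are illustrative.
abstract class Base<SELF extends Base<SELF>> {
    final double fieldA;

    Base(double fieldA) {
        this.fieldA = fieldA;
    }

    // Each concrete subclass supplies a factory for its own type.
    abstract SELF create(double fieldA);

    // Implemented once, yet returns the subclass's own type.
    SELF foo(Base<?> b) {
        return create(this.fieldA + b.fieldA);
    }
}

class Derived1 extends Base<Derived1> {
    Derived1(double fieldA) {
        super(fieldA);
    }

    @Override
    Derived1 create(double fieldA) {
        return new Derived1(fieldA);
    }
}

// For RestrictedDerived1 to extend Derived1 and still have foo return RestrictedDerived1,
// Derived1 itself would have to become generic (class Derived1<SELF extends Derived1<SELF>>
// extends Base<SELF>), and likewise for every further level - which is where it gets tedious.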

Comparison operator overloading for class in D?

I am currently learning D and struggling to understand how operator overloading works for a class. Overriding opCmp makes sense and works correctly for a struct, but for a class it requires taking the right-hand side as an Object instead of as my type.
This means I can't access any members to do a comparison. What's the point in the overload then? Am I missing something?
Sure you can access your members:
class MyClass {
    int member;

    override int opCmp(Object other) {
        if (auto mcOther = cast(MyClass)other) {
            // other, and thus mcOther, is an instance of MyClass.
            // So we can access its members normally:
            return member < mcOther.member ? -1
                 : member > mcOther.member ? 1
                 : 0;
        } else {
            // other is not a MyClass, so we give up:
            assert(0, "Can't compare MyClass with just anything!");
        }
    }
}
The reason opCmp for classes takes Object as a parameter is that it's introduced in the Object class, from which every D class derives. Introducing opCmp there was a sensible choice back in the day, but less so now. However, since we don't want to break every piece of D code out there that uses opCmp (and opEquals, toHash and toString) with classes, we're kinda stuck with that choice.

Best way of handling multiple object instances

Due in part to the fact that I cannot create data classes without parameters in Kotlin, I use objects for those cases, e.g.
sealed class Node {
    object Leaf : Node()
    data class Branch(val left: Node, val right: Node) : Node()
}
The issue is that sometimes, I end up with multiple instances of the Leaf class. Obviously this should not generally happen, but it occurs when serializing and deserializing with some frameworks and with some test cases.
Now, you might argue that I should fix those cases, but it's hard to know where they might be, and not always possible or desirable to modify the deserialization semantics of frameworks.
As such, I want all instances of my objects to act as the same value, much like a parameter-less data class would (or a parameterless case class in Scala).
My best solution so far is the following, included in every object I create that might encounter this issue:
object SaneLeaf {
    override fun hashCode() = javaClass.hashCode()
    override fun equals(other: Any?) = other?.javaClass == javaClass
}
Obviously, this is verbose and error prone, since it doesn't seem possible to abstract away those implementations to an interface. A super-class would work in cases where the object doesn't need to extend another class, but that's often not the case.
Alternatively, I can create a data class with a Nothing parameter. But that seems like even more of a hack.
How have others dealt with this issue? Are there plans to add hashCode and equals implementations to objects that follow the presumed semantics of those classes (all instances should be equal / same hashCode)?
I believe having multiple instances of an object's underlying class is really an issue you should fix, but there's a simpler workaround that allows you to have the equality semantics you described.
You can define an abstract class that performs the equality logic and make the sealed class inherit from it, like this:
abstract class SingletonEquality {
    override fun equals(other: Any?): Boolean =
        this::class.objectInstance != null && other?.javaClass == this.javaClass ||
            super.equals(other)

    override fun hashCode(): Int =
        if (this::class.objectInstance != null)
            javaClass.hashCode()
        else
            super.hashCode()
}
And the usage:
sealed class Node : SingletonEquality() {
    object Leaf : Node()
    data class Branch(val left: Node, val right: Node) : Node()
}
Here, Leaf will inherit the equals implementation from SingletonEquality and get compared just the way you want.

Dart serializing immutable objects

I would like to serialize this immutable class:
class CatalogueItem {
  final Uri source;
  final DateTime analyis;
  final Period fromTo;

  CatalogueItem.create(this.source, this.analyis, this.fromTo);
}
But I cannot, as it is not a simple class. From the website https://www.dartlang.org/articles/serialization/:
Simple: All of the objects to be serialized are data transfer objects
(DTOs) with a default constructor.
So I have to add a default constructor - which means I have to drop the final keywords and my class is no longer immutable.
class CatalogueItem {
  Uri source;
  DateTime analyis;
  Period fromTo;

  CatalogueItem.create(this.source, this.analyis, this.fromTo);
  CatalogueItem() {}
}
Is there any way around this one?
I think the default constructor is only necessary for deserialization (never used a package for (de)serialization). Serialization shouldn't need it.
The default constructor wouldn't help anyway: if the deserialization package needs one, it presumably creates an instance using the default constructor and then sets the field values afterwards, which can't work with final fields.
I don't know if a serialization package supports a custom toJson() method/fromJson() constructor but I think this would be the easiest way to go.
class CatalogueItem {
  final Uri source;
  final DateTime analysis;
  final Period fromTo;

  CatalogueItem.create(this.source, this.analysis, this.fromTo);

  factory CatalogueItem.fromJson(Map json) {
    return new CatalogueItem.create(
        json['source'] == null ? null : Uri.parse(json['source']),
        json['analysis'] == null ? null : DateTime.parse(json['analysis']),
        json['fromTo'] == null ? null : new Period.fromJson(json['fromTo']));
  }

  Map toJson() {
    return {
      'source': source == null ? null : '$source',
      'analysis': analysis == null ? null : '$analysis',
      'fromTo': fromTo == null ? null : fromTo.toJson()
    };
  }
}
https://github.com/google/built_value.dart may do what you want -- it is specifically for creating immutable classes and serializing them.
Note that this requires a slightly different form for the class; this is what allows built_value to generate an implementation for you, as well as serializers.
abstract class CatalogueItem
    implements Built<CatalogueItem, CatalogueItemBuilder> {
  static Serializer<CatalogueItem> get serializer =>
      _$catalogueItemSerializer;

  Uri get source;
  DateTime get analyis;
  Period get fromTo;

  factory CatalogueItem([updates(CatalogueItemBuilder b)]) =
      _$CatalogueItem;
  CatalogueItem._();
}
The generated implementation is immutable (uses final), and also provides operator==, hashCode and toString.
More detailed example: https://github.com/google/built_value.dart/blob/master/example/lib/example.dart
One option is to read further in the article and use the serialization package, which does handle such cases.

Would this still be considered a Chain-of-Responsibility pattern?

I have been using a design pattern for quite some time and have been calling/referring to it as a "Chain-of-Responsibility pattern" but now I realise there are differences, and it may not be appropriate to do so. So my question is 1, "is the following an instance of this pattern, or should it be called something else?", and 2, "is there any reason I should prefer the traditional way?".
I often use the following pattern when developing software. I have an interface that defines a functor, something like this.
interface FooBar{
    boolean isFooBar( Object o );
}
These are usually search, filtering, or processing classes; usually something like Comparator. The implementation method is usually functional (i.e. side-effect free). Eventually, I find myself creating an implementation of the interface that looks like:
class FooBarChain implements FooBar{
    FooBar[] foobars;

    FooBarChain( FooBar... fubars ){
        foobars = fubars;
    }

    boolean isFooBar( Object o ){
        for( FooBar f : foobars )
            if( f.isFooBar( o ) )
                return true;
        return false;
    }
}
It's not always booleans either - I've used this pattern with mutable objects as well - but there is always a short-circuiting condition (e.g. it returns true, the String is empty, a flag gets set, etc.).
Until now I have generally been calling this a "Chain of Responsibility" pattern, considering the issue of inheriting from a base class to be an implementation detail. However, today I realised an important difference: the objects along the chain cannot interrupt the rest of the chain. There is no way for an implementation to say "this is false, and I can guarantee it will be false for any condition" (n.b. it short-circuits only on true).
So, should this be called something other than a chain-of-responsibility pattern? And are there any concerns or issues I should consider when using this approach over the traditional one of having the instances pass the message along?
I wouldn't call this a Chain of Responsibility.
In Chain of Responsibility, the "short-circuit" is roughly "I can handle this, so the next guy in the chain doesn't have to" rather than being a return value of any kind. It's normal for each object in the chain to know who is next in the chain and to pass control to that next object as necessary. They normally do something rather than returning a value.
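For contrast, here is a minimal Java sketch of that classic arrangement (the handler class and its methods are illustrative, not taken from the question):
// Classic Chain of Responsibility, sketched for illustration only.
abstract class Handler {
    private final Handler next;   // each handler knows its successor

    protected Handler(Handler next) {
        this.next = next;
    }

    void handle(Object request) {
        if (canHandle(request)) {
            process(request);         // "I can handle this, so the next guy doesn't have to"
        } else if (next != null) {
            next.handle(request);     // otherwise pass control to the next object in the chain
        }
    }

    protected abstract boolean canHandle(Object request);
    protected abstract void process(Object request);
}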
The example you've presented is perfectly reasonable, though I'm not sure it's a named pattern. I'm not too clear right now on the other variants you describe.
What you have is a chain-of-responsibility, but you can make a 'pure' chain of responsibility by adding a few small changes.
You can create an enum that will represent the 3 different results that you are expecting from this function.
public enum Validity{
    Invalid,
    Indeterminate,
    Valid
}
You can change the interface to be chain-able like so:
public interface ChainFooBar{
    public boolean isFooBar(Object o);
    public Validity checkFooBar(Object o);
}
Most of your FooBars would then have to implement a method like this:
public abstract class AbstractFooBar implements ChainFooBar{
    @Override
    public Validity checkFooBar(Object o){
        return this.isFooBar(o) ? Validity.Valid : Validity.Indeterminate;
    }
}
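To show what the third value buys you, here is a hypothetical handler (not part of the original answer): a check that knows nothing downstream could possibly succeed can return a definite Invalid and stop the whole chain.
// Hypothetical example: a check that can give a definitive negative answer.
public class StringOnlyFooBar extends AbstractFooBar{
    @Override
    public boolean isFooBar(Object o){
        return o instanceof String && ((String) o).isEmpty();
    }

    @Override
    public Validity checkFooBar(Object o){
        if(!(o instanceof String)){
            return Validity.Invalid;      // definite "no": the chain stops and yields false
        }
        return super.checkFooBar(o);      // otherwise Valid or Indeterminate as usual
    }
}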
Then you can change your chain to check for either of the definite answers.
public class FooBarChain implements FooBar{
    private final ChainFooBar[] fooBars;

    public FooBarChain(ChainFooBar... fooBars){
        this.fooBars = fooBars;
    }

    @Override
    public boolean isFooBar(Object o){
        for(ChainFooBar fooBar : this.fooBars){
            Validity validity = fooBar.checkFooBar(o);
            if(validity != Validity.Indeterminate){
                return validity == Validity.Valid;
            }
        }
        return false;
    }
}
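Finally, a brief hypothetical usage, assuming the StringOnlyFooBar sketch above (any other ChainFooBar implementations would slot in the same way):
// Hypothetical usage of the three-valued chain.
public class Demo {
    public static void main(String[] args) {
        FooBarChain chain = new FooBarChain(new StringOnlyFooBar());

        System.out.println(chain.isFooBar(""));      // true:  the check reports Valid
        System.out.println(chain.isFooBar("abc"));   // false: every check was Indeterminate
        System.out.println(chain.isFooBar(42));      // false: Invalid short-circuits the chain
    }
}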