Why are case objects serializable and case classes not? - serialization

I am playing with this example http://scala.sygneca.com/code/remoteactors to learn how remote actors work in Scala (2.8.0). In particular, I slightly modified how the messages sent by the actors are defined, as follows:
sealed trait Event extends Serializable
case object Ping extends Event
case object Pong extends Event
case object Quit extends Event
and everything works as expected. Unfortunately, if I define the events as case classes instead of case objects, as in:
sealed trait Event extends Serializable
case class Ping extends Event
case class Pong extends Event
case class Quit extends Event
my example stops working. More precisely, it seems that while case objects are serializable, case classes aren't. Indeed, when I try to run my example with this last modification I get the following exception:
scala.actors.remote.DelegateActor#148cc8c: caught java.io.NotSerializableException: scalachat.remote.Ping$
java.io.NotSerializableException: scalachat.remote.Ping$
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1156)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
at scala.actors.remote.JavaSerializer.serialize(JavaSerializer.scala:46)
at scala.actors.remote.NetKernel.namedSend(NetKernel.scala:38)
at scala.actors.remote.NetKernel.forward(NetKernel.scala:71)
at scala.actors.remote.DelegateActor$$anonfun$act$1$$anonfun$apply$1.apply(Proxy.scala:182)
at scala.actors.remote.DelegateActor$$anonfun$act$1$$anonfun$apply$1.apply(Proxy.scala:123)
at scala.actors.ReactorTask.run(ReactorTask.scala:34)
at scala.actors.ReactorTask.compute(ReactorTask.scala:66)
at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:147)
at scala.concurrent.forkjoin.ForkJoinTask.quietlyExec(ForkJoinTask.java:422)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.mainLoop(ForkJoinWorkerThread.java:340)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:325)
Is there a reason why case objects can be made serializable and case classes can't? Is there a way to make my example work with case classes as well?
Edit: as suggested by Victor and confirmed by Aaron, I was sending the companion object as the message instead of an instance of the class. Moreover, inspecting the compiled code with javap makes it evident that while the class is serializable:
public class scalachat.remote.Ping extends java.lang.Object implements scalachat.remote.Event,java.io.Serializable,scala.ScalaObject,scala.Product
the companion object is not:
public final class scalachat.remote.Ping$ extends scala.runtime.AbstractFunction0 implements scala.ScalaObject
Now the question is: how can I specify that I want to use the class instead of the companion object? I also added an empty pair of parentheses when sending the message, as suggested by Aaron:
pong ! Ping()
but nothing changed. In the end I also added a dummy parameter to the case class:
case class Ping(i: Int) extends Event
sending the message as:
pong ! Ping(0)
but still without seeing any difference. Any suggestions?

@serializable case class Foo
I was also surprised that case objects were serializable by default.
Edit: After reading the exception properly I suspect that:
You're trying to send the generated companion object of the case class over the wire, instead of an instance of the case class.

Case classes without parameters are meaningless and deprecated. And I see no Serializable trait in Scala (2.8), just the serializable annotation. Does it work if you fix these things?


When to use singletons in OOP?

When reading about singletons, I have found this explanation given as a reason to use a singleton:
since these object methods are not changing the internal class state, we
can create this class as a singleton.
What does this really mean? When do you consider that some method is not changing the internal class state? When it is a getter? Can someone provide code examples of a class whose methods do not change its internal state, and which can therefore be used as a singleton, and of a class that should not be a singleton?
Usually, when people explain the singleton pattern, they use a DB connection class as an example. That makes sense to me, because I know that I want to have only one DB connection per application instance. But what if I want to provide an option to force a new connection when I instantiate the DB connection class? If I have some setter method, or a constructor parameter, that forces my class to open a new connection, is that class still a candidate for being a singleton?
I am using PHP, but I can understand examples written in Java or C# as well.
This is the article reference; you can Ctrl+F for "internal". Basically, the author explains why the FileStorage class is a good candidate to be a singleton. I do not understand this sentence:
"These operations do not change the internal class state, so we can
create its instance once and use it multiple times."
and therefore I do not understand when to use singletons.
In their example, they have a FileStorage class:
class FileStorage
{
    public function __construct($root) {
        // whatever
    }

    public function read() {
        // whatever
    }

    public function write($content) {
        // whatever
    }
}
And they say that this class can be a singleton since its methods read() and write() do not change the internal class state. What does that mean? If they are not setters, is the class automatically a singleton?
The quote reads:
These operations do not change the internal class state, so we can create its instance once and use it multiple times.
This means that the object in question has no interesting internal state that could be changed; it’s just a collection of methods (that could probably be static). If the object has no internal state, you don’t have to create multiple instances of it, you can keep reusing a single one. Therefore you can configure the dependency injection container to treat the object as a singleton.
This is a performance optimization only. You could create a fresh instance of the class each time it’s needed. And it would be better – until the object creation becomes a measurable bottleneck.
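To make the distinction concrete, here is a minimal Java sketch (both class names are invented for the example): the first class has no mutable state, so a single shared instance can serve the whole application; the second mutates a counter on every call, so sharing one instance would couple all callers together.
// Stateless: the result depends only on the argument, so one shared
// instance is safe to reuse everywhere.
class PriceFormatter {
    public String format(double amount) {
        return String.format("%.2f EUR", amount);
    }
}

// Stateful: every call changes the counter, so a shared instance would
// make all callers see and affect each other's state.
class RequestCounter {
    private int count = 0;

    public synchronized int next() {
        return ++count;
    }
}
In dependency-injection terms, the first class can safely be registered as a singleton, while the second usually should not be.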

Class inheritance in Mule Java / POJO Component

I am trying to use a base class with common methods and to extend it in other, more specific classes.
public class MyClass extends MyBaseClass {}
The extended class I am using as a Java component in a flow:
<component class="org.example.MyClass" doc:name="Java"/>
When I am not calling methods from the parent class, everything works well. But each time I try calling one, Mule throws an exception:
DefaultJavaComponent{vtigersapFlow1.component.9760166}. Message payload is of type: String
In my base class I am using:
@Lookup
private MuleContext muleContext;
and a NullPointerException is thrown when I do:
muleContext.getRegistry().get("system.uri");
It's possible that the @Lookup annotation is not honoured correctly in a class hierarchy.
Instead of looking up values in the registry, you should receive them by injection, i.e. get system.uri by injection.
You may find that switching to Spring beans will help a lot in doing so. component class= is really for basic stuff.
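As a rough sketch of the injection idea (assuming the component is turned into a Spring-managed bean and that the value currently stored under system.uri can be supplied as a bean property; the exact Mule wiring is omitted):
// Hypothetical base class: the container hands the value in, so no
// subclass ever needs to touch the registry or the MuleContext.
public class MyBaseClass {

    private String systemUri;

    // Called by Spring/Mule when the bean is configured.
    public void setSystemUri(String systemUri) {
        this.systemUri = systemUri;
    }

    protected String getSystemUri() {
        return systemUri;
    }
}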

How to assign values to an implementation of an interface when the interface doesn't have a setter method?

I have an interface, Show, an implementation class called ShowImpl, and another implementation class called ManageShowImpl. I have completed all the methods inside ManageShowImpl and now I am doing JUnit testing. One of the methods I defined in ManageShowImpl is, for example, addShows(Show... shows). I want to assign values to the Show[] array, but the Show interface has no setter method (which is not supposed to be in an interface). Can some expert tell me how I can add values to the Show[]?
If I understood your issue correctly, I think you can simply set the values in your constructor:
public class ShowImpl implements Show {
    private Show[] shows;

    public ShowImpl(Show... shows) {
        this.shows = shows;
    }

    @Override
    public void someInterfaceMethod() {
        // ...
    }
}
(I am not a JUnit expert, or even a beginner, but maybe I can inspire a few to answer. I have done a fair amount of testing.)
Given a class with a constructor, you can always create an instance, fill it with whatever data you want, and test it any way you want. Interfaces are a lot more limited. Testing aside, this is a very good thing: it limits the damage someone can do if they get hold of an interface implementation; it safely encapsulates the data. But you cannot test an interface in isolation. You need to create an instance of an implementing class first. At that point you should fill in your array. Then pass it to a test method as an interface instance to test the interface.
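A minimal JUnit 4 sketch of that idea, reusing the names from the question (the no-argument ManageShowImpl constructor and the getShows() accessor are assumptions made only for this example):
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class ManageShowImplTest {

    @Test
    public void addShowsStoresAllGivenShows() {
        // Create instances of the implementing class, not of the interface.
        Show first = new ShowImpl();
        Show second = new ShowImpl();

        ManageShowImpl manager = new ManageShowImpl();
        manager.addShows(first, second);

        // getShows() is a hypothetical accessor used only to verify the result.
        assertEquals(2, manager.getShows().length);
    }
}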

In an ideal OO design, should inherited objects implicitly call super() in each of their methods?

That is, when child.update() is called, should the instance of the derived class implicitly call all of its superclasses' update() on itself first?
There's no good answer (in the languages I know). Sometimes you want to replace the super method. Sometimes you want to slip something in before it executes, and sometimes after. It does seem the extending class needs to know more about the details of the class it's overriding than it should have to. (This gets awkward with closed-source systems.) Also, the base class really wants to control the behaviour of the calling class sometimes, to force the super method to be called, which isn't right either. I think the best thing is for the super class to document its overridable methods as best it can so the overriding programmer can guess what to do.
The closest I've come to handling this properly is to make the target method so it cannot be overridden, then have it call a method or methods that do nothing but that can be overridden. Then the overriding class can override whichever methods interest it without being able to undermine the superclass.
The ultimate programming language will have a fool-proof solution to this problem.
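A small Java sketch of that approach (all names invented): the public method is final so it cannot be undermined, and subclasses override only the do-nothing hooks.
abstract class Widget {

    // Final, so subclasses cannot bypass the fixed sequence.
    public final void update() {
        beforeUpdate();
        doUpdate();
        afterUpdate();
    }

    // Hooks that do nothing by default; subclasses override only the
    // ones they care about.
    protected void beforeUpdate() { }
    protected void doUpdate() { }
    protected void afterUpdate() { }
}

class Button extends Widget {
    @Override
    protected void doUpdate() {
        System.out.println("redrawing the button");
    }
}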
No. Someone might need to override update() and want to prevent any call to the parent at all. In that case an implicit call would not only hurt performance, it might also do things you don't want.
It really depends on what the superclass / base-class function does. Sometimes I call it first, sometimes I call it last. Sometimes I call it conditionally, and once in a while, I don't call it at all.
Many times (this is coming from a C# background), the base class function just raises an event, and the child class overrides that method to get the event-like functionality. There are cases where the child doesn't want that event to be raised:
class Base {
    public event EventHandler UnhandledError;

    protected virtual void OnUnhandledError(Error error) {
        if (UnhandledError != null)
            UnhandledError(this, EventArgs.Empty);
    }
}

class Derived : Base {
    protected override void OnUnhandledError(Error error) {
        if (HandleError(error))
            return; // We took care of it. Don't raise the event.

        // We couldn't handle it. Let the base class raise the event.
        base.OnUnhandledError(error);
    }
}
You are not wrapping a class into another, you are inheriting from a super-class.
When overriding super-class methods, you should call super.method() only when you need to extend the behavior of the parent method().

adapter pattern and dependency

I have a small doubt about the adapter class. I know what the goal of an adapter class is and when it should be used. My doubt is about class construction. I've checked some tutorials, and all of them say that I should pass the "Adaptee" class as a dependency to my "Adapter", e.g.:
class SampleAdapter implements MyInterface
{
    private AdapteeClass mInstance;

    public SampleAdapter(AdapteeClass instance)
    {
        mInstance = instance;
    }
}
This example is copied from Wikipedia. As you can see, AdapteeClass is passed to my object as a dependency. The question is: why? If I'm changing the interface of an object, it's obvious that I'm going to use the "new" interface and won't need the "old" one, so why do I need to create an instance of the "old" class outside my adapter? Someone may say that I should use dependency injection so I can pass whatever I want, but this is an adapter: I need to change the interface of a concrete class. Personally, I think the code below is better.
class SampleAdapter implements MyInterface
{
    private AdapteeClass mInstance;

    public SampleAdapter()
    {
        mInstance = new AdapteeClass();
    }
}
What is your opinion?
I would say that you should always avoid the new operator in a class when it comes to complex objects (except when the class is a Builder or Factory), to reduce coupling and make your code more testable. Of course, objects like a List or Dictionary, or value objects, can be constructed inside a class method (which is probably the purpose of that class method!).
Let's say, for example, that your AdapteeClass is a remote proxy. With the second approach, your unit tests will have to use the real proxy class, because there is no way to replace it.
If you use the first approach, you can easily inject a mock or fake into the constructor when running your unit test so you can test all code paths.
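A minimal sketch of that point (assuming Mockito is on the test classpath and that AdapteeClass is mockable; the actual assertions are left as a placeholder):
import static org.mockito.Mockito.mock;

import org.junit.Test;

public class SampleAdapterTest {

    @Test
    public void canBeBuiltWithoutTouchingTheRealAdaptee() {
        // The real AdapteeClass (e.g. a remote proxy) is never created.
        AdapteeClass fake = mock(AdapteeClass.class);
        SampleAdapter adapter = new SampleAdapter(fake);
        // ... exercise the MyInterface methods against the fake here ...
    }
}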
Google has a guide on writing testable code which describes this in more detail but some important points are:
Warning signs of non-testable code:
new keyword in a constructor or at field declaration
Static method calls in a constructor or at field declaration
Anything more than field assignment in constructors
Object not fully initialized after the constructor finishes (watch out for initialize methods)
Control flow (conditional or looping logic) in a constructor
Code does complex object graph construction inside a constructor rather than using a factory or builder
Adding or using an initialization block
AdapteeClass can have one or more non-trivial constructors. In that case you'll need to duplicate all of them in your SampleAdapter constructors to have the same flexibility. Passing an already constructed object is simpler.
I think creating the Adaptee inside the Adapter is limiting. What if some day you want to adapt a pre-existing instance?
To be honest though, I'd do both if at all possible.
class SampleAdapter implements MyInterface
{
    private AdapteeClass mInstance;

    public SampleAdapter()
    {
        this(new AdapteeClass());
    }

    public SampleAdapter(AdapteeClass instance)
    {
        mInstance = instance;
    }
}
Let's assume you have an external hard drive with a regular USB port and you are trying to hook it up to a Mac which only has USB-C ports. Yes, you can buy a new drive that has a USB-C port, but what about the data on the old one?
It's the same with the adapter pattern. There are times when you have initialized AdapteeClass in tons of different flavors. When you do the conversion, you want to keep all that context.