Moving an ArrayList<XmlValueModel> from one Activity to another

I am sending an ArrayList from one Activity to another, but the second Activity shows a null pointer and a null adapter. How can I fix it? Thanks.
First Activity:
Bundle b = new Bundle();
b.putSerializable("array_list", listdata);
Intent movedata = new Intent(getApplicationContext(), Shopping.class);
movedata.putExtras(movedata); // note: this copies the intent's own (empty) extras; the Bundle b is never attached - putExtras(b) was likely intended
startActivity(movedata);
Second Activity:
try {
    Intent intent = this.getIntent();
    Bundle bundle = intent.getExtras();
    ArrayList<XmlValueModel> listdata = (ArrayList<XmlValueModel>) bundle.getSerializable("array_list");
    // ArrayAdapter<XmlValueModel> array = new ArrayAdapter<XmlValueModel>(getApplicationContext(), android.R.layout.simple_list_item_1, list);
    listview = (ListView) findViewById(R.id.ListOne);
    adapter = new CustomListAdapter(this, listdata);
    listview.setAdapter(adapter);
} catch (Exception e) {
    Log.e("Error:", e.getMessage());
}

Is your XmlValueModel serializable? XmlValueModel has to implement Serializable:
public class XmlValueModel implements Serializable {...}
In case XmlValueModel comes from a library, derive a new class from it and have that class implement Serializable.
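If the library class is non-final and has an accessible no-arg constructor (an assumption here), a minimal sketch of that derived class:
public class SerializableXmlValueModel extends XmlValueModel implements Serializable {
    // only adds the Serializable marker; state and behaviour are inherited
    private static final long serialVersionUID = 1L;
}
Note that Java serialization will not write fields declared by the non-serializable superclass, and it needs that accessible no-arg constructor to initialize them on deserialization.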

Try it this way; it may help you.
First, make the XmlValueModel class implement Serializable:
public class XmlValueModel implements Serializable {
    private static final long serialVersionUID = 1L;
}
In your FirstActivity, do it this way:
ArrayList<XmlValueModel> XmlValueModelArrayList = new ArrayList<XmlValueModel>();
Intent intent = new Intent(this, SecondActivity.class);
intent.putExtra("XmlValueModelArrayList", XmlValueModelArrayList);
startActivity(intent);
In your SecondActivity, do it this way:
ArrayList<XmlValueModel> XmlValueModelList;
XmlValueModelList = (ArrayList<XmlValueModel>) getIntent().getSerializableExtra("XmlValueModelArrayList");

Related

Unable to add mutator for an existing field of a class

I'm trying to add a mutator for an existing private final field. I can transform the field modifiers to remove the final specification and add an accessor method:
// accessor interface
public interface UniqueIdAccessor {
Serializable getUniqueId();
}
// mutator interface
public interface UniqueIdMutator {
void setUniqueId(Serializable uniqueId);
}
...
// fragment of Java agent implementation
return new AgentBuilder.Default()
.type(hasSuperType(named("org.junit.runner.Description")))
.transform(new Transformer() {
@Override
public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder, TypeDescription typeDescription,
ClassLoader classLoader, JavaModule module) {
return builder.field(named("fUniqueId")).transform(ForField.withModifiers(FieldManifestation.PLAIN))
.implement(UniqueIdAccessor.class).intercept(FieldAccessor.ofField("fUniqueId"))
// .implement(UniqueIdMutator.class).intercept(FieldAccessor.ofField("fUniqueId"))
.implement(Hooked.class);
}
})
.installOn(instrumentation);
...
Here's a method that uses reflection to check the modifiers of the target field and calls the accessor to get the value of the field.
private static void injectProxy(Description description) {
try {
Field bar = Description.class.getDeclaredField("fUniqueId");
System.out.println("isFinal: " + ((bar.getModifiers() & Modifier.FINAL) != 0));
} catch (NoSuchFieldException | SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Serializable uniqueId = ((UniqueIdAccessor) description).getUniqueId();
System.out.println("uniqueId: " + uniqueId);
}
// isFinal: false
// uniqueId: <description-unique-id>
... but if I uncomment the second "implement" expression to add the mutator, the transform blows up:
// isFinal: true
// java.lang.ClassCastException:
// class org.junit.runner.Description cannot be cast to class com.nordstrom.automation.junit.UniqueIdAccessor
// (org.junit.runner.Description and com.nordstrom.automation.junit.UniqueIdAccessor
// are in unnamed module of loader 'app')
I could set the field value with reflection, but that defeats the purpose of using Byte Buddy in the first place!
The problem with this approach is that the field accessor validates against the field's type and modifiers as they were prior to the modification. Byte Buddy prohibits the write since it does not know about the removed final modifier and therefore does not consider the mutation to be legal. As a result, the transformation fails in its entirety and you get the error you are seeing. (Register a listener to see this error.)
To avoid this, you can implement a custom Implementation using FieldAccess (without the "or", i.e. not FieldAccessor). You can have a look at the more convenient FieldAccessor to see how this is implemented; you only need to drop the validity checks.
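As an aside, registering such a listener is a one-liner; a minimal sketch using Byte Buddy's built-in stream-writing listener (its placement in the existing builder chain is the only assumption):
return new AgentBuilder.Default()
        // print transformation errors to System.err instead of failing silently
        .with(AgentBuilder.Listener.StreamWriting.toSystemError().withErrorsOnly())
        .type(hasSuperType(named("org.junit.runner.Description")))
        // ... rest of the transform as above
        .installOn(instrumentation);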
Thanks for pointing me in the right direction! I assemble the StackManipulation object that defines the mutator method with this:
final TypeDescription description = TypePool.Default.ofSystemLoader().describe("org.junit.runner.Description").resolve();
final Generic _void_ = TypeDescription.VOID.asGenericType();
final Generic serializable = TypePool.Default.ofSystemLoader().describe("java.io.Serializable").resolve().asGenericType();
final MethodDescription.Token setUniqueIdToken = new MethodDescription.Token("setUniqueId", Modifier.PUBLIC, _void_, Arrays.asList(serializable));
final MethodDescription setUniqueId = new MethodDescription.Latent(description, setUniqueIdToken);
final Token fUniqueIdToken = new FieldDescription.Token("fUniqueId", Modifier.PRIVATE, serializable);
final FieldDescription fUniqueId = new FieldDescription.Latent(description, fUniqueIdToken);
final StackManipulation setUniqueIdImpl = new StackManipulation.Compound(
MethodVariableAccess.loadThis(),
MethodVariableAccess.load(setUniqueId.getParameters().get(0)),
Assigner.DEFAULT.assign(serializable, serializable, Typing.STATIC),
FieldAccess.forField(fUniqueId).write(),
MethodReturn.VOID
);
... and I transform the target class with this:
return new AgentBuilder.Default()
.type(hasSuperType(named("org.junit.runner.Description")))
.transform(new Transformer() {
@Override
public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder, TypeDescription typeDescription,
ClassLoader classLoader, JavaModule module) {
return builder.field(named("fUniqueId")).transform(ForField.withModifiers(FieldManifestation.PLAIN))
.implement(AnnotationsAccessor.class).intercept(FieldAccessor.ofField("fAnnotations"))
.implement(UniqueIdAccessor.class).intercept(FieldAccessor.ofField("fUniqueId"))
.implement(UniqueIdMutator.class).intercept(new Implementation.Simple(setUniqueIdImpl));
}
})
.installOn(instrumentation);
Here are the definitions of the three interfaces used in the transform:
// annotations accessor interface
public interface AnnotationsAccessor {
Annotation[] annotations();
}
// unique ID accessor interface
public interface UniqueIdAccessor {
Serializable getUniqueId();
}
// unique ID mutator interface
public interface UniqueIdMutator {
void setUniqueId(Serializable uniqueId);
}
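With the agent installed, the formerly final field can then be read and written through plain casts; a short usage sketch (the new ID value here is purely illustrative):
// works because the transformed Description implements both interfaces
Serializable uniqueId = ((UniqueIdAccessor) description).getUniqueId();
((UniqueIdMutator) description).setUniqueId(uniqueId + "-clone");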

@JsonIdentityReference does not recognize equal values

I'm trying to serialize an object (Root) that contains some duplicated entries of MyObject. Since I only want to store each distinct object once, I'm using @JsonIdentityReference, which works pretty well.
However, I realized that it generates un-deserializable output if there are equal objects with different references. I wonder if there's a configuration in Jackson to change this behavior. Thanks!
@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
class Root {
private List<MyObject> allObjects;
private Map<String, MyObject> objectMap;
}
@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
@JsonIdentityReference
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id")
class MyObject {
private String id;
private int value;
}
public class Main {
public static void main(String[] args) throws JsonProcessingException {
// Constructing equal objects
val obj1 = new MyObject("a", 1);
val obj2 = new MyObject("a", 1);
assert obj1.equals(obj2);
val root = new Root(
Lists.newArrayList(obj1),
ImmutableMap.of(
"lorem", obj2
)
);
val objectMapper = new ObjectMapper();
val json = objectMapper.writeValueAsString(root);
// {"allObjects":[{"id":"a","value":1}],"objectMap":{"lorem":{"id":"a","value":1}}}
// Note here both obj1 and obj2 are expanded.
// Exception: Already had POJO for id
val deserialized = objectMapper.readValue(json, Root.class);
assert root.equals(deserialized);
}
}
I'm using Jackson 2.10.
Full stacktrace:
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]] (through reference chain: Root["objectMap"]->java.util.LinkedHashMap["lorem"]->MyObject["id"])
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:394)
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:353)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.wrapAndThrow(BeanDeserializerBase.java:1714)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:371)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithObjectId(BeanDeserializerBase.java:1257)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:157)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(MapDeserializer.java:527)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:364)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:29)
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:138)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4202)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3205)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3173)
at Main.main(Main.java:53)
Caused by: java.lang.IllegalStateException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]]
at com.fasterxml.jackson.annotation.SimpleObjectIdResolver.bindItem(SimpleObjectIdResolver.java:24)
at com.fasterxml.jackson.databind.deser.impl.ReadableObjectId.bindItem(ReadableObjectId.java:57)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeSetAndReturn(ObjectIdValueProperty.java:101)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeAndSet(ObjectIdValueProperty.java:83)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369)
... 14 more
As I mentioned earlier, this setup only works if obj1 == obj2, as two objects with the same ID should be identity-equal. In that case, the second object would also not get expanded during serialization (alwaysAsId = false only expands the first object).
However, if you want to have this setup and are fine with the serialization, you could use a custom Resolver for deserialization that stores a single instance per key:
@JsonIdentityReference(alwaysAsId = false)
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", resolver = CustomScopeResolver.class)
static class MyObject {
    private String id;
    // ...
}
class CustomScopeResolver implements ObjectIdResolver {
    Map<String, MyObject> data = new HashMap<>();
    @Override
    public void bindItem(final IdKey id, final Object pojo) {
        data.put(id.key.toString(), (MyObject) pojo);
    }
    @Override
    public Object resolveId(final IdKey id) {
        return data.get(id.key);
    }
    @Override
    public ObjectIdResolver newForDeserialization(final Object context) {
        return new CustomScopeResolver();
    }
    @Override
    public boolean canUseFor(final ObjectIdResolver resolverType) {
        return false;
    }
}
NEW EDIT: Apparently, it's very easy: just turn on objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true); so that the DefaultSerializerProvider uses a regular HashMap instead of an IdentityHashMap to manage the serialized beans.
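Applied to the original example (assuming MyObject keeps its @JsonIdentityInfo annotation), the round trip then works without any custom resolver:
final ObjectMapper objectMapper = new ObjectMapper();
// track serialized beans by equals() instead of identity, so obj1 and obj2
// resolve to the same object id and only the first occurrence is expanded
objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true);
final String json = objectMapper.writeValueAsString(root);
final Root deserialized = objectMapper.readValue(json, Root.class);
assert root.equals(deserialized);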
DEPRECATED: Update for serialization: it is possible to achieve this by adding a custom SerializerProvider:
class CustomEqualObjectsSerializerProvider extends DefaultSerializerProvider {
private final Collection<MyObject> data = new HashSet<>();
private final SerializerProvider src;
private final SerializationConfig config;
private final SerializerFactory f;
public CustomEqualObjectsSerializerProvider(
final SerializerProvider src,
final SerializationConfig config,
final SerializerFactory f) {
super(src, config, f);
this.src = src;
this.config = config;
this.f = f;
}
@Override
public DefaultSerializerProvider createInstance(final SerializationConfig config, final SerializerFactory jsf) {
return new CustomEqualObjectsSerializerProvider(src, this.config, f);
}
@Override
public WritableObjectId findObjectId(final Object forPojo, final ObjectIdGenerator<?> generatorType) {
// check if there is an equivalent pojo, use it if exists
final Optional<MyObject> equivalentObject = data.stream()
.filter(forPojo::equals)
.findFirst();
if (equivalentObject.isPresent()) {
return super.findObjectId(equivalentObject.get(), generatorType);
} else {
if (forPojo instanceof MyObject) {
data.add((MyObject) forPojo);
}
return super.findObjectId(forPojo, generatorType);
}
}
}
@Test
public void main() throws IOException {
// Constructing equal objects
final MyObject obj1 = new MyObject();
obj1.setId("a");
final MyObject obj2 = new MyObject();
obj2.setId("a");
assert obj1.equals(obj2);
final Root root = new Root();
root.setAllObjects(Collections.singletonList(obj1));
root.setObjectMap(Collections.singletonMap(
"lorem", obj2));
final ObjectMapper objectMapper = new ObjectMapper();
objectMapper.setSerializerProvider(
new CustomEqualObjectsSerializerProvider(
objectMapper.getSerializerProvider(),
objectMapper.getSerializationConfig(),
objectMapper.getSerializerFactory()));
final String json = objectMapper.writeValueAsString(root);
System.out.println(json); // second object is not expanded!
}

How to refresh my RecyclerView with another Room-DAO Query

I have a RecyclerView backed by the Android Architecture Components in my Fragment:
ViewModel --> Repository --> DAO with some custom queries and a getAllItems.
I want to use a filter FAB or a Spinner to call the getOrderItemList or getWhereItemList queries, but I don't know how to do it.
I have a repository filter for my SearchView, but that is a different thing; now I want to change the list order (alphabetical, year...) and build a WHERE condition from the many checkboxes I have in a Dialog (example: I check the "complete" and "Action" checkboxes and it creates the String whereCondition = "(status = 'complete' and genre like '%Action%')").
How can I call the getWhereItemList and getOrderItemList queries from my Fragment to change my RecyclerView content?
ItemDAO:
#Query("SELECT * from item_table ")
<List<Item>> getItemList();
#Query("SELECT * from item_table ORDER by :order DESC")
<List<Item>> getOrderItemList(String order);
#Query("SELECT * from item_table WHERE :whereCondition")
<List<Item>> getWhereItemList(String whereCondition);
My Fragment fills the RecyclerView with getAllItems:
private ItemViewModel myItemViewModel;
RecyclerView myRecyclerView = findViewById(R.id.recyclerview);
final ItemListAdapter myAdapter = new ItemListAdapter(this);
myRecyclerView.setAdapter(myAdapter);
myRecyclerView.setLayoutManager(new LinearLayoutManager(this));
myItemViewModel = ViewModelProviders.of(this).get(ItemViewModel.class);
myItemViewModel.getAllItems().observe(this, new Observer<List<Item>>() {
    @Override
    public void onChanged(@Nullable final List<Item> items) {
        myAdapter.setItems(items);
    }
});
ItemListAdapter:
private List<Item> myItems;
void setItems(List<Item> items){
myItems = items;
notifyDataSetChanged();
}
ItemViewModel:
private ItemRepository myRepository;
private LiveData<List<Item>> myAllItems;
public ItemViewModel (Application application) {
super(application);
myRepository = new ItemRepository(application);
myAllItems = myRepository.getAllItems();
}
LiveData<List<Item>> getAllItems() { return myAllItems; }
Thanks.
The idea is to have two LiveData instances:
one that keeps track of the current filter type. You may set its initial value.
one that emits List<Item>. This also should react to the other LiveData change and get new List<Item> if necessary.
You can use Transformations.switchMap to implement the second LiveData. It basically returns a LiveData instance that can switch to a different underlying source in response to another LiveData object.
ItemViewModel:
private ItemRepository myRepository;
/**
 * Keeps track of the current filter.
 * In this example the initial value represents the non-filtered list.
 */
private MutableLiveData<Filter> itemFilter = new MutableLiveData<>(new Filter(Filter.Type.ALL, null));
/**
* Emits list of items
*/
private LiveData<List<Item>> myItems = Transformations.switchMap(itemFilter, filter -> {
    // Every time itemFilter emits a new value, this lambda is invoked.
    // You are responsible for returning the LiveData instance that
    // matches the filter value.
    switch (filter.type) {
        case ORDER_BY:
            return myRepository.getOrderItemList(filter.query);
        case WHERE:
            return myRepository.getWhereItemList(filter.query);
        case ALL:
        default:
            return myRepository.getAllItems();
    }
});
public ItemViewModel (Application application) {
super(application);
myRepository = new ItemRepository(application);
}
public LiveData<List<Item>> getItems() { return myItems; }
/**
 * The View should call this method in order to switch to a different filter.
 */
public void changeFilter(Filter filter) {
    this.itemFilter.setValue(filter);
}
Define this custom filter class:
public class Filter {
public enum Type {
ALL,
ORDER_BY,
WHERE
}
final public Type type;
final public String query;
public Filter(Type type, String query) {
this.type = type;
this.query = query;
}
}
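On the Fragment side, observe getItems() once and push every change through changeFilter; a minimal sketch (the trigger wiring below is an assumption, not code from the question):
myItemViewModel.getItems().observe(this, items -> myAdapter.setItems(items));
// e.g. from the checkbox dialog's positive button:
String whereCondition = "(status = 'complete' and genre like '%Action%')";
myItemViewModel.changeFilter(new Filter(Filter.Type.WHERE, whereCondition));
// e.g. from the Spinner's onItemSelected:
myItemViewModel.changeFilter(new Filter(Filter.Type.ORDER_BY, "year"));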

Storm Kafkaspout KryoSerialization issue for java bean from kafka topic

Hi, I am new to Storm and Kafka.
I am using Storm 1.0.1 and Kafka 0.10.0.
We have a KafkaSpout that receives a Java bean from a Kafka topic.
I have spent several hours digging to find the right approach for this.
I found a few articles that were useful, but none of the approaches has worked for me so far.
Following is my code:
StormTopology:
public class StormTopology {
public static void main(String[] args) throws Exception {
//Topo test /zkroot test
if (args.length == 4) {
System.out.println("started");
BrokerHosts hosts = new ZkHosts("localhost:2181");
SpoutConfig kafkaConf1 = new SpoutConfig(hosts, args[1], args[2],
args[3]);
kafkaConf1.zkRoot = args[2];
kafkaConf1.useStartOffsetTimeIfOffsetOutOfRange = true;
kafkaConf1.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
kafkaConf1.scheme = new SchemeAsMultiScheme(new KryoScheme());
KafkaSpout kafkaSpout1 = new KafkaSpout(kafkaConf1);
System.out.println("started");
ShuffleBolt shuffleBolt = new ShuffleBolt(args[1]);
AnalysisBolt analysisBolt = new AnalysisBolt(args[1]);
TopologyBuilder topologyBuilder = new TopologyBuilder();
topologyBuilder.setSpout("kafkaspout", kafkaSpout1, 1);
//builder.setBolt("counterbolt2", countbolt2, 3).shuffleGrouping("kafkaspout");
//This is for field grouping in bolt we need two bolt for field grouping or it wont work
topologyBuilder.setBolt("shuffleBolt", shuffleBolt, 3).shuffleGrouping("kafkaspout");
topologyBuilder.setBolt("analysisBolt", analysisBolt, 5).fieldsGrouping("shuffleBolt", new Fields("trip"));
Config config = new Config();
config.registerSerialization(VehicleTrip.class, VehicleTripKyroSerializer.class);
config.setDebug(true);
config.setNumWorkers(1);
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(args[0], config, topologyBuilder.createTopology());
// StormSubmitter.submitTopology(args[0], config,
// builder.createTopology());
} else {
System.out.println("Insufficient Arguments - topologyName kafkaTopic ZKRoot ID");
}
}
}
I am serializing the data to Kafka using Kryo.
KafkaProducer:
public class StreamKafkaProducer {
private static Producer producer;
private final Properties props = new Properties();
private static final StreamKafkaProducer KAFKA_PRODUCER = new StreamKafkaProducer();
private StreamKafkaProducer(){
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "com.abc.serializer.MySerializer");
producer = new org.apache.kafka.clients.producer.KafkaProducer(props);
}
public static StreamKafkaProducer getStreamKafkaProducer(){
return KAFKA_PRODUCER;
}
public void produce(String topic, VehicleTrip vehicleTrip){
ProducerRecord<String,VehicleTrip> producerRecord = new ProducerRecord<>(topic,vehicleTrip);
producer.send(producerRecord);
//producer.close();
}
public static void closeProducer(){
producer.close();
}
}
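The question does not show com.abc.serializer.MySerializer, which the producer registers as its value.serializer. A purely hypothetical sketch of what such a class could look like, reusing the same Kryo serializer on the producer side (assuming the bean being sent is the Data bean used elsewhere; the question mixes Data and VehicleTrip):
public class MySerializer implements org.apache.kafka.common.serialization.Serializer<Data> {
    private final ThreadLocal<Kryo> kryos = ThreadLocal.withInitial(() -> {
        Kryo kryo = new Kryo();
        kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
        return kryo;
    });

    @Override
    public void configure(java.util.Map<String, ?> configs, boolean isKey) { }

    @Override
    public byte[] serialize(String topic, Data data) {
        Output output = new Output(4096, -1); // -1 lets the buffer grow as needed
        kryos.get().writeObject(output, data);
        output.close();
        return output.toBytes();
    }

    @Override
    public void close() { }
}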
Kryo serializer:
public class DataKyroSerializer extends Serializer<Data> implements Serializable {
    @Override
    public void write(Kryo kryo, Output output, Data data) {
        output.writeLong(data.getStartedOn().getTime());
        output.writeLong(data.getEndedOn().getTime());
    }

    @Override
    public Data read(Kryo kryo, Input input, Class<Data> aClass) {
        Data data = new Data();
        data.setStartedOn(new Date(input.readLong()));
        data.setEndedOn(new Date(input.readLong()));
        return data;
    }
}
I need to get the data back into the Data bean.
According to a few articles, I need to provide a custom Scheme and make it part of the topology, but so far I have had no luck.
Code for Bolt and Scheme
Scheme:
public class KryoScheme implements Scheme {
private ThreadLocal<Kryo> kryos = new ThreadLocal<Kryo>() {
protected Kryo initialValue() {
Kryo kryo = new Kryo();
kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
return kryo;
};
};
@Override
public List<Object> deserialize(ByteBuffer ser) {
return Utils.tuple(kryos.get().readObject(new ByteBufferInput(ser.array()), Data.class));
}
@Override
public Fields getOutputFields( ) {
return new Fields( "data" );
}
}
and bolt:
public class AnalysisBolt implements IBasicBolt {
private static final long serialVersionUID = 1L;
private String topicname = null;
public AnalysisBolt(String topicname) {
this.topicname = topicname;
}
public void prepare(Map stormConf, TopologyContext topologyContext) {
System.out.println("prepare");
}
public void execute(Tuple input, BasicOutputCollector collector) {
System.out.println("execute");
Fields fields = input.getFields();
try {
JSONObject eventJson = (JSONObject) JSONSerializer.toJSON((String) input
.getValueByField(fields.get(1)));
String StartTime = (String) eventJson.get("startedOn");
String EndTime = (String) eventJson.get("endedOn");
String Oid = (String) eventJson.get("_id");
int V_id = (Integer) eventJson.get("vehicleId");
//call method getEventForVehicleWithinTime(Long vehicleId, Date startTime, Date endTime)
System.out.println("==========="+Oid+"| "+V_id+"| "+StartTime+"| "+EndTime);
} catch (Exception e) {
e.printStackTrace();
}
}
// remaining IBasicBolt methods omitted
}
But if I submit the Storm topology, I am getting this error:
java.lang.IllegalStateException: Spout 'kafkaspout' contains a
non-serializable field of type com.abc.topology.KryoScheme$1, which
was instantiated prior to topology creation.
com.minda.iconnect.topology.KryoScheme$1 should be instantiated within
the prepare method of 'kafkaspout at the earliest.
I'd appreciate help debugging the issue and guidance toward the right path.
Thanks.
Your ThreadLocal is not Serializable. The preferable solution would be to make your serializer both Serializable and thread-safe. If this is not possible, then I see two alternatives, since a scheme has no prepare method as you would get in a bolt:
Declare it static, which is inherently transient.
Declare it transient and access it via a private get method, initializing the variable on first access (sketched below).
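A minimal sketch of the second alternative, applied to the KryoScheme above (only the lifecycle handling changes):
public class KryoScheme implements Scheme {
    // transient: skipped when the topology is serialized, rebuilt on the worker
    private transient ThreadLocal<Kryo> kryos;

    private ThreadLocal<Kryo> getKryos() {
        if (kryos == null) { // first access on this worker
            kryos = ThreadLocal.withInitial(() -> {
                Kryo kryo = new Kryo();
                kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
                return kryo;
            });
        }
        return kryos;
    }

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(getKryos().get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}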
Within the Storm lifecycle, the topology is instantiated and then serialized to byte format to be stored in ZooKeeper, prior to the topology being executed. Within this step, if a spout or bolt within the topology has an initialized unserializable property, serialization will fail.
If there is a need for a field that is unserializable, initialize it within the bolt or spout's prepare method, which is run after the topology is delivered to the worker.
Source: Best Practices for implementing Apache Storm

Wrong approach or Wrong OOP design?

Following is an isolated version of my code.
Interactable Interface.
public interface Interactable <E extends Interactable> {
List<Person> personsInteracting = new ArrayList<>();
List<Person> personsWaiting = new ArrayList<>();
long INTERACTION_TIME = 5 * 60;
default int getNumberOfPeopleInteracting () {
return personsInteracting.size();
}
default int getNumberOfPeopleWaiting () {
return personsWaiting.size();
}
boolean isMultipleActionsAllowed ();
boolean isFurtherActionsAllowed ();
public abstract boolean tryOccupiedBy (final Person person, final Interactions interaction)
throws InteractionNotPossibleException;
E getObject ();
EnumSet<Interactions> getInteractions ();
}
InteractiveObject Abstract Class
public abstract class InteractiveObject implements Interactable {
protected final String name;
protected int numberOfSimultaneousInteractions;
protected Interactions currentInteraction;
public InteractiveObject (final String name) {
this.name = name;
}
@Override
public boolean isMultipleActionsAllowed () {
return numberOfSimultaneousInteractions > 1;
}
@Override
public boolean isFurtherActionsAllowed () {
return personsInteracting.isEmpty() ||
(getNumberOfPeopleInteracting() > numberOfSimultaneousInteractions);
}
@Override
public boolean tryOccupiedBy (final Person person, final Interactions interaction)
throws InteractionNotPossibleException {
boolean isOccupied = false;
if (!isFurtherActionsAllowed()) {
throw new InteractionNotPossibleException(this + " is already in use by some other " +
"person.");
}
personsInteracting.add(person);
currentInteraction = interaction;
return isOccupied;
}
@Override
public String toString () {
return name;
}
public int getNumberOfSimultaneousInteractions () {
return numberOfSimultaneousInteractions;
}
}
Chair (one of the child classes)
public class Chair extends InteractiveObject {
private final EnumSet<Interactions> INTERACTIONS = EnumSet.copyOf(Arrays.asList(
new Interactions[] {Interactions.DRAG, Interactions.SIT}));
public Chair (final String objectName) {
super(objectName);
super.numberOfSimultaneousInteractions = 1;
}
@Override
public Interactable getObject () {
return this;
}
@Override
public EnumSet<Interactions> getInteractions () {
return INTERACTIONS;
}
}
Here is the piece of code that, when executed, exposes the problem this question is about.
final InteractiveObject chair1 = new Chair("Chair1");
final Person person1 = new Person("Person1");
final Room room = new Room("Room1", 2, 2);
room.personEnters(person1);
room.putObject(chair1);
person1.tryOccupying(chair1);
The above piece of code successfully occupies the chair object. Now,
final InteractiveObject chair2 = new Chair("Chair2");
final Person person2 = new Person("Person2");
final Room room2 = new Room("Room2", 2, 2);
room2.personEnters(person2);
room2.putObject(chair2);
person2.tryOccupying(chair2);
This piece of code doesn't let person2 occupy the chair: my code claims that one person is already interacting with chair2, whereas no one is interacting with it.
How I solved it:
I moved the personsInteracting list into InteractiveObject and the tryOccupiedBy function into each child class, and everything works fine.
Questions:
I put personsInteracting in the Interactable interface since I believe every future implementation of Interactable will need it, so developers won't have to implement it themselves. (But perhaps this idea is wrong.)
If the tryOccupiedBy function has the same implementation in every child class, what is the purpose of the whole OOP exercise?
I now know that the isolation was wrong and where to place the pieces to get the right results. But can someone kindly point out which OOP concept I did not understand and how this should be implemented in a better way?
The default keyword was not added to the Java language to do the kind of thing you seem to be trying to achieve. Data defined in an interface is intended to be constant: the modifiers 'public static final' are automatically applied to any field definition in an interface. If you create a default method in the interface, it must either be stateless or act directly only on statically available state. Default methods can, however, call the interface's abstract methods, whose implementations may modify instance state.
By placing the personsInteracting field in the interface, you made the same instance common to every object implementing that interface, and so your tryOccupying method was acting on purely global state.
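A small experiment makes the sharing visible (hypothetical, reusing the classes from the question):
InteractiveObject chair1 = new Chair("Chair1");
InteractiveObject chair2 = new Chair("Chair2");
// personsInteracting is implicitly public static final on the interface,
// so both chairs read and write the very same list instance
chair1.personsInteracting.add(new Person("Person1"));
System.out.println(chair2.getNumberOfPeopleInteracting()); // prints 1, not 0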
So the purpose of default methods in the Java language is to support adding new methods to interfaces in a backwards-compatible fashion, nothing more. You shouldn't use them as a generic form of code reuse - they were never intended for that, and you'll get (as you did) weird behaviour.
You didn't have to put tryOccupiedBy in the child classes, however, so you didn't need a load of duplicated code. You can still declare the method signature in the interface (which is what interfaces are generally supposed to do) and then implement the common method in your abstract base class. By putting the data fields in the base class, you make them instance fields, and so they are not shared between objects.
public interface Interactable <E extends Interactable> {
...
boolean tryOccupiedBy (final Person person, final Interactions interaction)
throws InteractionNotPossibleException;
...
}
public abstract class InteractiveObject implements Interactable {
private final List<Person> personsInteracting = new ArrayList<>();
private final List<Person> personsWaiting = new ArrayList<>();
...
@Override
public final boolean tryOccupiedBy (final Person person, final Interactions interaction)
throws InteractionNotPossibleException {
boolean isOccupied = false;
if (!isFurtherActionsAllowed()) {
throw new InteractionNotPossibleException(this + " is already in use by some other " +
"person.");
}
personsInteracting.add(person);
currentInteraction = interaction;
return isOccupied;
}
...
}