How can I XML serialize a synchronized ArrayList?

When I try to serialize and deserialize an ArrayList wrapped in Collections.synchronizedList using java.beans.XMLEncoder and java.beans.XMLDecoder, I get the following error:
java.lang.reflect.InvocationTargetException
Continuing ...
java.lang.Exception: XMLEncoder: discarding statement XMLEncoder.writeObject(Collections$SynchronizedRandomAccessList);
Continuing ...
Since the program I am working on is a multithreaded music library client/server application, I need the synchronization. With an ordinary ArrayList, the serialization/deserialization works fine. I really don't want to use Vector, since it contains a lot of legacy operations.
Here are my methods to serialize and deserialize:
/**
 * Serializes library into an XML file
 * @param xmlFileLocation - location of XML file
 */
public void saveLibrary(String xmlFileLocation) {
    FileOutputStream fos;
    try {
        fos = new FileOutputStream(xmlFileLocation);
        XMLEncoder encoder = new XMLEncoder(fos);
        encoder.writeObject(lib);
        encoder.close();
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
/**
 * Constructor for Library, deserializes XML file
 * @param xmlFileLocation - location of XML file
 */
@SuppressWarnings("unchecked")
public Library(String xmlFileLocation) {
    FileInputStream fis;
    try {
        fis = new FileInputStream(xmlFileLocation);
        XMLDecoder decoder = new XMLDecoder(fis);
        Object o = decoder.readObject();
        if (o instanceof List)
            setLib((List<MusicDescription>) o);
        decoder.close();
        fis.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
As I stated, I really don't want to use Vector, since it contains a lot of legacy operations.
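One common workaround (a sketch, not from the original post): XMLEncoder can only encode JavaBean-style classes with public no-argument constructors, which ArrayList is and the Collections$SynchronizedRandomAccessList wrapper is not. You can therefore encode a plain copy of the list and re-apply the synchronization wrapper after decoding. The helper class below is hypothetical; only MusicDescription comes from the question:

import java.beans.XMLDecoder;
import java.beans.XMLEncoder;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical helper: persist a plain ArrayList copy,
// then restore the synchronized wrapper on load.
public final class LibraryXml {

    public static void save(List<MusicDescription> lib, String path) throws IOException {
        XMLEncoder encoder = new XMLEncoder(
                new BufferedOutputStream(new FileOutputStream(path)));
        synchronized (lib) { // snapshot under the wrapper's own lock
            encoder.writeObject(new ArrayList<MusicDescription>(lib));
        }
        encoder.close();
    }

    @SuppressWarnings("unchecked")
    public static List<MusicDescription> load(String path) throws IOException {
        XMLDecoder decoder = new XMLDecoder(
                new BufferedInputStream(new FileInputStream(path)));
        Object o = decoder.readObject();
        decoder.close();
        // re-apply the wrapper that XMLEncoder could not serialize
        return Collections.synchronizedList((List<MusicDescription>) o);
    }
}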


Access JCas Annotation list

I am developing an Apache UIMA v2 application to annotate documents.
I developed the process() method properly, because I obtain the correct annotations (tested with debug and the UIMA CAS Visual Debugger).
My application consists of a simple instantiation of the JCas object and the processing of a document, i.e. a simple string in this case. Here's the code:
public class MainProgram {
    public static void main(String[] args) {
        try {
            XMLInputSource in = new XMLInputSource("desc/dictionaryDescriptor.xml");
            ResourceSpecifier specifier = UIMAFramework.getXMLParser().parseResourceSpecifier(in);
            AnalysisEngine ae = UIMAFramework.produceAnalysisEngine(specifier);
            JCas jcas = ae.newJCas();
            jcas.setDocumentText("prova di a@gmail.com, timido, word, excel. ");
            ae.process(jcas);
            processResults(jcas);
            ae.destroy();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (InvalidXMLException e1) {
            e1.printStackTrace();
        } catch (ResourceInitializationException e2) {
            e2.printStackTrace();
        } catch (AnalysisEngineProcessException e3) {
            e3.printStackTrace();
        }
    }

    public static void processResults(JCas jcas) {
        System.out.println("Done!");
        // TODO read annotations from jcas
    }
}
If I add a breakpoint inside the processResults() method, I can see the content of jcas and the list of annotations.
I want to access the SubTypes list in the AnnotationIndex object, without caring about the class type.
Here is an example using a specific type:
AnnotationIndex<Annotation> programIndex = jcas.getAnnotationIndex(Programma.type);
Iterator<Annotation> programIter = programIndex.iterator();
while (programIter.hasNext()) {
    Programma p = (Programma) programIter.next();
}
You can use JCasUtil to extract the annotations from the JCas:
JCasUtil.select(jCas, Annotation.class).stream()....
and with the getType() method of each annotation you can check the type of the annotation.
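For example (a minimal sketch, assuming uimafit's JCasUtil is on the classpath), you can iterate over every annotation in the index without caring about its concrete class:

// Iterate over all annotations, whatever their subtype.
for (Annotation a : JCasUtil.select(jcas, Annotation.class)) {
    System.out.println(a.getType().getName()
            + " [" + a.getBegin() + "," + a.getEnd() + "]: "
            + a.getCoveredText());
}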

Throwing an exception inside the Java 8 stream forEach

I am using a Java 8 stream, and I cannot throw an exception inside the forEach of the stream.
stream.forEach(m -> {
    try {
        if (isInitial) {
            isInitial = false;
            String outputName = new SimpleDateFormat(Constants.HMDBConstants.HMDB_SDF_FILE_NAME).format(new Date());
            if (location.endsWith(Constants.LOCATION_SEPARATOR)) {
                savedPath = location + outputName;
            } else {
                savedPath = location + Constants.LOCATION_SEPARATOR + outputName;
            }
            File output = new File(savedPath);
            FileWriter fileWriter = null;
            fileWriter = new FileWriter(output);
            writer = new SDFWriter(fileWriter);
        }
        writer.write(m);
    } catch (IOException e) {
        throw new ChemIDException(e.getMessage(), e);
    }
});
and this is my exception class:
public class ChemIDException extends Exception {
    public ChemIDException(String message, Exception e) {
        super(message, e);
    }
}
I am using loggers to log errors at an upper level, so I want to propagate the exception to the top. Thanks.
Try extending RuntimeException instead. The functional interface that forEach consumes (Consumer) does not declare any checked exceptions, so you need something that is throwable at runtime.
WARNING: THIS IS PROBABLY NOT A VERY GOOD IDEA
But it will probably work.
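A minimal sketch of that approach (the unchecked class name ChemIDRuntimeException is hypothetical):

// Unchecked wrapper, so it can escape the Consumer passed to forEach.
public class ChemIDRuntimeException extends RuntimeException {
    public ChemIDRuntimeException(String message, Exception cause) {
        super(message, cause);
    }
}

A caller can then catch it outside the stream and, if desired, re-wrap it as the original checked exception:

try {
    stream.forEach(m -> {
        try {
            writer.write(m);
        } catch (IOException e) {
            throw new ChemIDRuntimeException(e.getMessage(), e);
        }
    });
} catch (ChemIDRuntimeException e) {
    throw new ChemIDException(e.getMessage(), e);
}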
Why are you using forEach, a method designed to process every element, when all you want to do is process the first element? Instead of realizing that forEach is the wrong method for the job (or that there are more methods in the Stream API than forEach), you are kludging this with an isInitial flag.
Just consider:
Optional<String> o = stream.findFirst();
if (o.isPresent()) try {
    String outputName = new SimpleDateFormat(Constants.HMDBConstants.HMDB_SDF_FILE_NAME)
                            .format(new Date());
    if (location.endsWith(Constants.LOCATION_SEPARATOR)) {
        savedPath = location + outputName;
    } else {
        savedPath = location + Constants.LOCATION_SEPARATOR + outputName;
    }
    File output = new File(savedPath);
    FileWriter fileWriter = new FileWriter(output);
    writer = new SDFWriter(fileWriter);
    writer.write(o.get());
} catch (IOException e) {
    throw new ChemIDException(e.getMessage(), e);
}
which has no issues with exception handling. This example assumes that the Stream’s element type is String. Otherwise, you have to adapt the Optional<String> type.
If, however, your isInitial flag is supposed to change more than once during the stream processing, you are definitely using the wrong tool for your job. You should have read and understood the “Stateless behaviors” and “Side-effects” sections of the Stream API documentation, as well as the “Non-interference” section, before using Streams. Just converting loops to forEach invocations on a Stream doesn’t improve the code.

org.apache.commons.io.FileCleaningTracker does not delete temp files unless explicitly calling System.gc()?

I am working on an image upload feature for my web app, and am having a strange issue with the FileCleaningTracker from Apache Commons FileUpload. I have an ImageUploadService with an instance variable of type FileCleaningTracker, and an upload method that creates an instance of DiskFileItemFactory which references that FileCleaningTracker. After the upload method completes successfully, I set the FileCleaningTracker of the DiskFileItemFactory to null, so I would expect the DiskFileItemFactory to be garbage collected; the underlying subclass of PhantomReference in FileCleaningTracker would then be notified and delete the temp file the DiskFileItemFactory created.
But that does not happen until I null out the DiskFileItemFactory and call System.gc() at the end of the upload method (nulling the DiskFileItemFactory alone does not help). This seems very strange to me. Here is my code:
@Override
public void upload(final HttpServletRequest request) {
    ValidateUtils.checkNotNull(request, "upload request");
    final File tmp = new File(this.tempFolder);
    if (!tmp.exists()) {
        tmp.mkdir();
    }
    DiskFileItemFactory fileItemFactory = new DiskFileItemFactory(this.sizeThreshold, tmp);
    fileItemFactory.setFileCleaningTracker(this.fileCleaningTracker);
    ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
    List items;
    try {
        items = uploadHandler.parseRequest(request);
    } catch (final FileUploadException e) {
        throw new ImageUploadServiceException("Error parsing the http servlet request for image upload.", e);
    }
    final Iterator it = items.iterator();
    while (it.hasNext()) {
        final DiskFileItem item = (DiskFileItem) it.next();
        if (item.isFormField()) {
            // log message
        } else {
            final String fileName = item.getName();
            final File destination = this.createFileForUpload(fileName, this.uploadFolder);
            FileChannel outChannel;
            try {
                outChannel = new FileOutputStream(destination).getChannel();
            } catch (final FileNotFoundException e) {
                throw new ImageUploadServiceException(e);
            }
            FileChannel inChannel = null;
            try {
                inChannel = new FileInputStream(item.getStoreLocation()).getChannel();
                outChannel.transferFrom(inChannel, 0, item.getSize());
            } catch (final IOException e) {
                throw new ImageUploadServiceException(String.format("Error uploading image to '%s/%s'.", this.uploadFolder, destination.getName()), e);
            } finally {
                IOUtils.closeChannel(inChannel);
                IOUtils.closeChannel(outChannel);
            }
        }
    }
    fileItemFactory.setFileCleaningTracker(null);
}
With the above code, every upload creates a file in the temp folder, but the fileCleaningTracker never removes it, possibly because the DiskFileItemFactory instance is not garbage collected (I fail to see why it wouldn't be), or because it has been GCed but the PhantomReference in the fileCleaningTracker was not notified (how reliable is PhantomReference?).
I waited 10 minutes and the files were still there, so it shouldn't be because the GC has not run, and there are no exceptions.
Now if I add the following code, the temp files are removed every time after the upload:
fileItemFactory = null;
System.gc();
This looks very strange to me, as I would expect the fileItemFactory to be GCed without an explicit call to System.gc().
Any input will be appreciated.
Thank you.
I have the same problem. The temporary files are never removed, even after server shutdown: the GC process has not started, so FileCleaningTracker has had no chance to pull tracked files to delete from its ReferenceQueue, and all the files remain on the hard drive.
Due to the specific behavior of my application, I have to clean up after each upload (files might be very big). Instead of using the standard org.apache.commons.io.FileCleaningTracker, I decided to override that class with my own implementation:
/**
 * Cleaning tracker to clean files after each upload with special method invocation.
 * Not thread safe and must be used with 1 factory = 1 thread policy.
 */
public class DeleteFilesOnEndUploadCleaningTracker extends FileCleaningTracker {

    private List<String> filesToDelete = new ArrayList<String>();

    public void deleteTemporaryFiles() {
        for (String file : filesToDelete) {
            new File(file).delete();
        }
        filesToDelete.clear();
    }

    @Override
    public synchronized void exitWhenFinished() {
        deleteTemporaryFiles();
    }

    @Override
    public int getTrackCount() {
        return filesToDelete.size();
    }

    @Override
    public void track(File file, Object marker) {
        filesToDelete.add(file.getAbsolutePath());
    }

    @Override
    public void track(File file, Object marker, FileDeleteStrategy deleteStrategy) {
        filesToDelete.add(file.getAbsolutePath());
    }

    @Override
    public void track(String path, Object marker) {
        filesToDelete.add(path);
    }

    @Override
    public void track(String path, Object marker, FileDeleteStrategy deleteStrategy) {
        filesToDelete.add(path);
    }
}
If this is the right case for you, just inject an instance of the class above into your DiskFileItemFactory:
DeleteFilesOnEndUploadCleaningTracker tracker = new DeleteFilesOnEndUploadCleaningTracker();
fileItemFactory.setFileCleaningTracker(tracker);
And don't forget to invoke the cleaning method after your work with uploaded items is done:
tracker.deleteTemporaryFiles();
Forgot to mention: I use commons-fileupload version 1.2.2 and commons-io version 1.3.2.

Kafka Serialization of an object [duplicate]

This question already has answers here:
Writing Custom Kafka Serializer
(3 answers)
Closed 2 years ago.
I started playing with Kafka. I've set up a ZooKeeper configuration, and I managed to send and consume String messages.
Now I am trying to pass an Object (in Java), but for some reason, when parsing the Message in the consumer I have header issues. I tried several serialization options (using a Decoder/Encoder), and all of them return the same header issue.
Here is my code
The producer:
Properties props = new Properties();
props.put("zk.connect", "localhost:2181");
props.put("serializer.class", "com.inneractive.reporter.kafka.EventsDataSerializer");
ProducerConfig config = new ProducerConfig(props);
Producer<Long, EventDetails> producer = new Producer<Long, EventDetails>(config);
ProducerData<Long, EventDetails> data =
        new ProducerData<Long, EventDetails>("test3", 1L, Arrays.asList(new EventDetails()));
try {
    producer.send(data);
} finally {
    producer.close();
}
And the consumer:
Properties props = new Properties();
props.put("zk.connect", "localhost:2181");
props.put("zk.connectiontimeout.ms", "1000000");
props.put("groupid", "test_group");

// Create the connection to the cluster
ConsumerConfig consumerConfig = new ConsumerConfig(props);
ConsumerConnector consumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);

// create 4 partitions of the stream for topic "test3", to allow 4 threads to consume
Map<String, List<KafkaMessageStream<EventDetails>>> topicMessageStreams =
        consumerConnector.createMessageStreams(ImmutableMap.of("test3", 4), new EventsDataSerializer());
List<KafkaMessageStream<EventDetails>> streams = topicMessageStreams.get("test3");

// create list of 4 threads to consume from each of the partitions
ExecutorService executor = Executors.newFixedThreadPool(4);

// consume the messages in the threads
for (final KafkaMessageStream<EventDetails> stream : streams) {
    executor.submit(new Runnable() {
        public void run() {
            for (EventDetails event : stream) {
                System.err.println("********** Got message" + event.toString());
            }
        }
    });
}
and my Serializer:
public class EventsDataSerializer implements Encoder<EventDetails>, Decoder<EventDetails> {

    public Message toMessage(EventDetails eventDetails) {
        try {
            ObjectMapper mapper = new ObjectMapper(new SmileFactory());
            byte[] serialized = mapper.writeValueAsBytes(eventDetails);
            return new Message(serialized);
        } catch (IOException e) {
            e.printStackTrace();
            return null; // TODO
        }
    }

    public EventDetails toEvent(Message message) {
        ObjectMapper mapper = new ObjectMapper(new SmileFactory());
        try {
            // TODO handle error
            return mapper.readValue(message.payload().array(), EventDetails.class);
        } catch (IOException e) {
            e.printStackTrace();
            return null;
        }
    }
}
And this is the error I get:
org.codehaus.jackson.JsonParseException: Input does not start with Smile format header (first byte = 0x0) and parser has REQUIRE_HEADER enabled: can not parse
at [Source: N/A; line: -1, column: -1]
When I worked with MessagePack and with plain writing to an ObjectOutputStream, I got a similar header issue. I also tried to add the payload CRC32 to the message, but that didn't help either.
What am I doing wrong here?
Hm, I haven't run into the same header issue that you are encountering, but my project wasn't compiling correctly when I didn't provide a VerifiableProperties constructor in my encoder/decoder. It seems strange that the missing constructor would corrupt Jackson's deserialization, though.
Perhaps try splitting up your encoder and decoder, and include the VerifiableProperties constructor in both; you shouldn't need to implement Decoder[T] for serialization. I was able to successfully implement JSON de/serialization using ObjectMapper following the format in this post.
Good luck!
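For illustration, here is a sketch of the split encoder/decoder. This assumes the Kafka 0.8-style kafka.serializer API that VerifiableProperties belongs to; the question's code uses the older 0.7 interfaces, so the exact names may differ in your version:

// Encoder with the VerifiableProperties constructor Kafka instantiates reflectively.
public class EventDetailsEncoder implements kafka.serializer.Encoder<EventDetails> {
    private final ObjectMapper mapper = new ObjectMapper(new SmileFactory());

    public EventDetailsEncoder(kafka.utils.VerifiableProperties props) { }

    public byte[] toBytes(EventDetails event) {
        try {
            return mapper.writeValueAsBytes(event);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}

// Matching decoder, kept separate from the encoder.
public class EventDetailsDecoder implements kafka.serializer.Decoder<EventDetails> {
    private final ObjectMapper mapper = new ObjectMapper(new SmileFactory());

    public EventDetailsDecoder(kafka.utils.VerifiableProperties props) { }

    public EventDetails fromBytes(byte[] bytes) {
        try {
            return mapper.readValue(bytes, EventDetails.class);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}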
ByteBuffer's .array() method is not very reliable: it exposes the entire backing array (if one exists), not just the payload window, so what you get depends on the particular implementation. You might want to try:
ByteBuffer bb = message.payload();
byte[] b = new byte[bb.remaining()];
bb.get(b, 0, b.length);
return mapper.readValue(b, EventDetails.class);

Serialization via J2ME or BlackBerry APIs

Is it possible to serialize an object into a string or a byte array using either the J2ME or BlackBerry APIs?
Thanks.
The way I handle object serialization is by implementing my own infrastructure for everything. You don't have reflection in this API, but you do have Class.forName(), which is better than nothing. So here's what I do...
First, this is the interface that I have every serializable object implement:
public interface Serializable {
    void serialize(DataOutput output) throws IOException;
    void deserialize(DataInput input) throws IOException;
}
The serialize() method writes the object's fields to the DataOutput instance, while the deserialize() method sets the object's fields from the DataInput instance (these are both plain top-level interfaces used by the data-oriented I/O streams, which gives me more flexibility). Also, any class implementing this interface needs to have a default no-arguments constructor. Of course, if you want your serialized class to be robust against change, you may want to choose your underlying data formats accordingly (for example, I implemented a serializable hashtable as an underlying container to handle these cases).
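For instance, a class implementing this interface might look like the following (Track and its fields are made up for illustration):

public class Track implements Serializable {
    private String title = "";
    private int durationSeconds;

    public Track() { } // required default no-arguments constructor

    public void serialize(DataOutput output) throws IOException {
        output.writeUTF(title);
        output.writeInt(durationSeconds);
    }

    public void deserialize(DataInput input) throws IOException {
        // read fields back in the same order they were written
        title = input.readUTF();
        durationSeconds = input.readInt();
    }
}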
Now, to actually serialize a class implementing this interface, I have a method that looks something like this:
public static byte[] serializeClass(Serializable input) {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    DataOutputStream output = new DataOutputStream(buffer);
    try {
        output.writeUTF(input.getClass().getName());
        input.serialize(output);
    } catch (IOException ex) {
        // do nothing
    }
    return buffer.toByteArray();
}
And to deserialize:
public static Serializable deserializeClass(byte[] data) {
    DataInputStream input = new DataInputStream(new ByteArrayInputStream(data));
    Object deserializedObject;
    Serializable result = null;
    try {
        String classType = input.readUTF();
        deserializedObject = Class.forName(classType).newInstance();
        if (deserializedObject instanceof Serializable) {
            result = (Serializable) deserializedObject;
            result.deserialize(input);
        }
    } catch (IOException ex) {
        result = null;
    } catch (ClassNotFoundException ex) {
        result = null;
    } catch (InstantiationException ex) {
        result = null;
    } catch (IllegalAccessException ex) {
        result = null;
    }
    return result;
}
Java ME, unfortunately, doesn't have any built-in APIs for serialization, so you'll have to invent something yourself.
If your goal is to serialize an object or object graph for persisting to flash memory, you can use the PersistentStore class. Many of the native object types such as Boolean, Byte, Character, Integer, Long, Object, Short, String, Vector, Hashtable are implicitly persistable.
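For example (a sketch of the RIM API; the long store key is arbitrary and the one below is hypothetical, as is myHashtable):

// Obtain (or create) the store under a unique long key.
PersistentObject store = PersistentStore.getPersistentObject(0x2ba5f8081e4af654L);
synchronized (store) {
    store.setContents(myHashtable); // e.g. a Hashtable of implicitly persistable types
    store.commit();                 // flush the object graph to flash memory
}
// later, read the object graph back:
Hashtable data = (Hashtable) store.getContents();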
You are stuck with creating your own serialization process for your classes. It wouldn't be too difficult to create your own base class and then use some sort of reflection to automatically serialize your properties. For example:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
DataOutputStream outputStream = new DataOutputStream(baos);
try {
    // serialize your object: first push the player name...
    outputStream.writeUTF(this.name);
    // ...then the timestamp.
    outputStream.writeUTF(this.timestamp);
} catch (IOException ioe) {
    System.out.println(ioe);
    ioe.printStackTrace();
}
// Extract the byte array
byte[] b = baos.toByteArray();
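To deserialize (a sketch, assuming name and timestamp are Strings written as above), read the fields back in the same order they were written:

DataInputStream inputStream = new DataInputStream(new ByteArrayInputStream(b));
try {
    this.name = inputStream.readUTF();
    this.timestamp = inputStream.readUTF();
} catch (IOException ioe) {
    ioe.printStackTrace();
}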