How to implement ClassFileTransformer#transform with Byte Buddy?

Is there a way to use Byte Buddy to implement ClassFileTransformer#transform?
At the moment my implementation uses Javassist, but I want to replace it with Byte Buddy as it has better generics support.
So far my implementation looks like this:
public byte[] transform(ClassLoader loader, String className, Class<?> classBeingRedefined,
                        ProtectionDomain protectionDomain, byte[] classfileBuffer)
{
    // note: transform() receives the internal class name, e.g. "my/package/Foo"
    if (className.startsWith("my/package/"))
    {
        try {
            final CtClass ctClass = classPool.makeClass(new ByteArrayInputStream(classfileBuffer));
            try {
                /* class manipulation */
                return ctClass.toBytecode();
            } finally {
                // remove the class from the class pool once we are done with it
                ctClass.detach();
            }
        } catch (final Exception ex) {
            logger.error("failed to analyse/transform class {}", className, ex);
        }
    }
    return classfileBuffer;
}
Is something similar possible with Byte Buddy? Is there a way to feed Byte Buddy the byte code provided in the classfileBuffer parameter?
The ClassFileTransformer implementation is configured in the Spring load-time weaver, so I already have the "infrastructure" available. I would therefore rather not install a separate Byte Buddy agent to solve this problem.

Yes, look into AgentBuilder.Default. It offers a DSL for implementing Java agents. With it, you do not need to implement your own class file transformer; just specify the transformations you want to make.
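For illustration, here is a minimal, untested sketch of that DSL. The matcher and the transformation are placeholders, and depending on the Byte Buddy version the Transformer lambda takes three to five parameters. The relevant call is makeRaw(): it builds a plain java.lang.instrument.ClassFileTransformer without installing an agent, so the result can be registered with the existing Spring load-time-weaver infrastructure.
import java.lang.instrument.ClassFileTransformer;

import net.bytebuddy.agent.builder.AgentBuilder;
import net.bytebuddy.matcher.ElementMatchers;

// Minimal sketch: build a ClassFileTransformer from the AgentBuilder DSL
// without calling installOn(...), i.e. without installing an agent.
ClassFileTransformer transformer = new AgentBuilder.Default()
        .type(ElementMatchers.nameStartsWith("my.package."))
        .transform((builder, typeDescription, classLoader, module) ->
                builder /* chain .method(...).intercept(...) etc. here */)
        .makeRaw();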


Hazelcast, Kryo, JsonNode Serializer

I am implementing a Hazelcast application using a distributed map with JsonNode as the entry value. My JsonNodeSerializer looks as shown below:
private final ObjectReader jsonNodeReader;
private final ObjectWriter jsonNodeWriter;

@Override
public void write(ObjectDataOutput out, JsonNode jsonNode)
        throws IOException {
    out.write(jsonNodeWriter.writeValueAsBytes(jsonNode));
}

@Override
public JsonNode read(ObjectDataInput in)
        throws IOException {
    return jsonNodeReader.readTree(in);
}
However, I wanted to use Kryo instead of the JsonNodeReader/Writer to save some space and improve performance.
I tried using Kryo, but I am not able to read JsonNode/ObjectNode, as they do not have a no-args constructor.
@Override
public void write(ObjectDataOutput out, JsonNode jsonNode)
        throws IOException {
    Kryo kryo = KRYO_THREAD_LOCAL.get();
    Output output = new Output((OutputStream) out);
    kryo.writeObject(output, jsonNode);
    output.flush();
    //out.write(jsonNodeWriter.writeValueAsBytes(jsonNode));
}

@Trace(dispatcher = true)
@Override
public JsonNode read(ObjectDataInput in)
        throws IOException {
    InputStream inputStream = (InputStream) in;
    Input input = new Input(inputStream);
    Kryo kryo = KRYO_THREAD_LOCAL.get();
    return kryo.readObject(input, ObjectNode.class);
    // return jsonNodeReader.readTree(in);
}
I am not sure whether my JsonNodeReader/Writer approach is optimal or whether using Kryo would make my solution better.
My goal is to save space and improve performance.
Any suggestions to put me in the right direction are welcome.
Thanks!
I am not sure Kryo is actually able to write those JSON nodes. I think there are multiple possible options:
- You stay with Kryo, but then you should read and write the objects as separate values, so that you can recreate the JsonNode instances with constructor parameters.
- If you are going to write independent values anyway, you might want to write the values directly into ObjectDataOutput and read them using ObjectDataInput.
- From my point of view, though, the best way is to use Jackson. You might want to have a look at the CBOR data format, which is binary, very concise, and directly available for Jackson; in addition, you won't lose the schemaless, dynamic nature of JSON (https://github.com/FasterXML/jackson-dataformats-binary/tree/master/cbor).
In addition to the good points and suggestions by @noctarius, there's another binary JSON alternative aside from CBOR, called Smile, found in the same binary dataformats module:
https://github.com/FasterXML/jackson-dataformats-binary
In your case I do not think the use of Kryo makes sense if and when you are dealing with a JSON tree (or tree models in general): Kryo works best with POJOs, where it can take full advantage of exact knowledge of the structures. Tree models require the inclusion of names, which eliminates the size benefit that formats like Kryo, Avro, Protobuf, and Thrift otherwise have.
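To make the CBOR suggestion concrete, here is a minimal, untested sketch of a Hazelcast ByteArraySerializer that stores each JsonNode as CBOR bytes. It assumes jackson-dataformat-cbor is on the classpath; the class name and type id are placeholders (the id just has to be unique within the cluster).
import java.io.IOException;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.cbor.CBORFactory;
import com.hazelcast.nio.serialization.ByteArraySerializer;

public class JsonNodeCborSerializer implements ByteArraySerializer<JsonNode> {

    // an ObjectMapper built around a CBORFactory reads and writes CBOR instead of text JSON
    private final ObjectMapper cbor = new ObjectMapper(new CBORFactory());

    @Override
    public byte[] write(JsonNode node) throws IOException {
        return cbor.writeValueAsBytes(node);
    }

    @Override
    public JsonNode read(byte[] buffer) throws IOException {
        return cbor.readTree(buffer);
    }

    @Override
    public int getTypeId() {
        return 1001; // placeholder; must be unique within the cluster
    }

    @Override
    public void destroy() {
        // nothing to clean up
    }
}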

How can I serialize/deserialize java.util.stream.Stream using Jackson?

Assuming I have the following object:
public class DataObjectA {
    private Stream<DataObjectB> dataObjectBStream;
}
How can I serialize them using Jackson?
As others have pointed out, you can only iterate once over a stream. If that works for you, you can use this to serialize:
new ObjectMapper().writerFor(Iterator.class).writeValueAsString(dataObjectBStream.iterator())
If you're using a Jackson version prior to 2.5, use writerWithType() instead of writerFor().
See https://github.com/FasterXML/jackson-modules-java8/issues/3 for the open issue to add java.util.Stream support to Jackson. There's a preliminary version of the code included. (edit: this is now merged and supported in 2.9.0).
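As a hedged illustration of the iterator approach (DataObjectB comes from the question; everything else is standard Jackson API), note that the stream is consumed by the call:
import java.util.Iterator;
import java.util.stream.Stream;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

// Minimal sketch: serialize a stream by exposing it as an Iterator.
// writerFor(TypeReference) preserves the generic element type.
ObjectMapper mapper = new ObjectMapper();
Stream<DataObjectB> dataObjectBStream = Stream.of(new DataObjectB(), new DataObjectB());
String json = mapper
        .writerFor(new TypeReference<Iterator<DataObjectB>>() {})
        .writeValueAsString(dataObjectBStream.iterator());
// the stream is now consumed; serializing it a second time would fail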
Streaming support feels like it would work naturally/safely if the stream is the top-level object you are (de)serializing, e.g. returning a java.util.stream.Stream<T> from a JAX-RS resource, or reading a Stream from a JAX-RS client.
A Stream as a member variable of a (de)serialized object, as you have in your example, is trickier, because it is mutable and single-use:
private Stream<DataObjectB> dataObjectBStream;
Assuming it were supported, all of the caveats around storing references to streams would apply. You wouldn't be able to serialize the object more than once, and once you deserialized the wrapping object, presumably its stream member would retain a live connection back through the JAX-RS client and HTTP connection, which could create surprises.
You don't.
A Stream is a single-use chain of operations and was never meant to be persistent. Even storing it into an instance field, like in your question, is an indicator of a misunderstanding of its purpose. Once a terminal operation has been applied to the stream, it is useless, and streams can't be cloned. Thus, there is no point in remembering the unusable stream in a field.
Since the only operations offered by Stream are chaining more operations to the pipeline and finally evaluating it, there is no way of querying its state that would allow creating a stream with equivalent behavior. Therefore, no persistence framework can store it. The only thing a framework could do is traverse the resulting elements of the stream operation and store them, but that means effectively storing a kind of collection of objects rather than the Stream. Besides that, the single-use nature of a Stream also implies that a storage framework traversing the stream in order to store the elements would have the side effect of making the stream unusable at the same time.
If you want to store elements, resort to an ordinary Collection, e.g. by collecting the stream first, as in the sketch below.
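A minimal sketch, reusing the question's class names plus a hypothetical setter, of materializing the stream before Jackson ever sees it:
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class DataObjectA {
    private List<DataObjectB> dataObjectBs;

    // hypothetical setter: collect the stream into a List, which Jackson handles natively
    public void setDataObjectBs(Stream<DataObjectB> stream) {
        this.dataObjectBs = stream.collect(Collectors.toList());
    }
}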
On the other hand, if you really want to store behavior, you'll end up storing an object instance whose actual class implements the behavior. This still works with streams, as you can store an instance of a class that has a factory method producing the desired stream. Of course, you are not really storing the behavior but a symbolic reference to it, but this is always the case when you use an OO storage framework to store behavior rather than data.
I had the class below with two elements, one of them a Stream. I had to annotate the stream getter with @JsonSerialize and then override the serialize method; this produces a stream of JSON in my response API:
public class DataSetResultBean extends ResultBean
{
    private static final long serialVersionUID = 1L;

    private final List<ComponentBean> structure;
    private final Stream<DataPoint> datapoints;

    private static class DataPointSerializer extends JsonSerializer<Stream<DataPoint>>
    {
        @Override
        public void serialize(Stream<DataPoint> stream, JsonGenerator gen, SerializerProvider serializers) throws IOException, JsonProcessingException
        {
            gen.writeStartArray();
            try
            {
                stream.forEach(dp -> serializeSingle(gen, dp));
            }
            catch (UncheckedIOException e)
            {
                throw (IOException) e.getCause();
            }
            finally
            {
                stream.close();
            }
            gen.writeEndArray();
        }

        public synchronized void serializeSingle(JsonGenerator gen, DataPoint dp) throws UncheckedIOException
        {
            try
            {
                gen.writeStartObject();
                for (Entry<DataStructureComponent<?, ?, ?>, ScalarValue<?, ?, ?>> entry: dp.entrySet())
                {
                    gen.writeFieldName(entry.getKey().getName());
                    gen.writeRawValue(entry.getValue().toString());
                }
                gen.writeEndObject();
            }
            catch (IOException e)
            {
                throw new UncheckedIOException(e);
            }
        }
    }

    public DataSetResultBean(DataSet dataset)
    {
        super("DATASET");
        structure = dataset.getMetadata().stream().map(ComponentBean::new).collect(toList());
        datapoints = dataset.stream();
    }

    public List<ComponentBean> getStructure()
    {
        return structure;
    }

    @JsonSerialize(using = DataPointSerializer.class)
    public Stream<DataPoint> getDatapoints()
    {
        return datapoints;
    }
}

Serialization in Hadoop - Writable

This is the class that implements Writable:
public class Test implements Writable {
    List<AtomicWritable> atoms = new ArrayList<AtomicWritable>();

    public void write(DataOutput out) throws IOException {
        // write the element count first, then each element
        IntWritable size = new IntWritable(atoms.size());
        size.write(out);
        for (AtomicWritable atom : atoms)
            atom.write(out);
    }

    public void readFields(DataInput in) throws IOException {
        // read the element count, then rebuild the list in the same order
        atoms.clear();
        IntWritable size = new IntWritable();
        size.readFields(in);
        int n = size.get();
        while (n-- > 0) {
            AtomicWritable atom = new AtomicWritable();
            atom.readFields(in);
            atoms.add(atom);
        }
    }
}
I would really appreciate it if someone could help me understand how to invoke the write and readFields methods.
Basically, I am failing to understand how to construct a Test object in this case. Once the object has been written to a DataOutput, how do we restore it from a DataInput? This may sound silly, but I am a newbie to Hadoop and have been assigned a project that uses Hadoop. Please help.
Thanks!
Basically, I am failing to understand how to construct a Test object in this case.
Yup, you're missing the point. If you need to construct an instance of Test and populate atoms, then you need to add a constructor to Test:
public Test(ArrayList<AtomicWritable> atoms) {
    this.atoms = atoms;
}
or you need to use the default constructor and add a method or a setter that lets you add items to atoms or set its value. The latter is actually pretty common in the Hadoop framework: a default constructor plus a set method. Cf., e.g., Text.set.
You don't call readFields and write; the Hadoop framework does that for you when it needs to serialize and deserialize inputs and outputs to and from map and reduce.
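That said, for a unit test or a local experiment you can drive the round trip by hand. Below is a minimal sketch, assuming Test gets a default constructor and some way to populate atoms (the addAtom helper is hypothetical):
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;

// serialize a Test to a byte array, then restore it from those bytes;
// in a real job, Hadoop performs these calls itself
Test original = new Test();
original.addAtom(new AtomicWritable());   // hypothetical helper that adds to atoms

ByteArrayOutputStream bytes = new ByteArrayOutputStream();
original.write(new DataOutputStream(bytes));              // write to a DataOutput

Test restored = new Test();
restored.readFields(new DataInputStream(
        new ByteArrayInputStream(bytes.toByteArray())));  // read back from a DataInput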

Protobuf-net serializer for NEventStore 3+

Can anyone point me to a protobuf-net serializer for NEventStore 3.0?
I'm having trouble, I think mainly because the serialization in NEventStore 3 wraps the event body and headers in an EventMessage.
I'm not sure how to set up the custom serializer correctly.
This is entirely untested guesswork based on a very brief glance at GitHub, but it looks like you want to use the wire-up API to specify a custom serializer, for example:
var store = Wireup.Init()
    .UsingSqlPersistence("Name Of EventStore ConnectionString In Config File")
    .InitializeStorageEngine()
    .UsingCustomSerialization(mySerializer)
    // ... etc
where mySerializer is an instance of a type that implements the ISerialize interface. It looks like this should work:
class ProtobufSerializer : EventStore.Serialization.ISerialize
{
    public void Serialize<T>(Stream output, T graph)
    {
        ProtoBuf.Serializer.Serialize<T>(output, graph);
    }

    public T Deserialize<T>(Stream input)
    {
        return ProtoBuf.Serializer.Deserialize<T>(input);
    }
}
(so obviously mySerializer here would be a new ProtobufSerializer())

Java Serialization with RMI

I'm working on a project using Java RMI.
This is the class causing the problem:
public class FSFile implements Serializable
{
    public static final int READ = 0;
    public static final int WRITE = 1;

    private int flag;
    private String filename;
    private transient BufferedWriter writer;
    private transient BufferedReader reader;

    ...

    private void writeObject(ObjectOutputStream stream) throws IOException
    {
        stream.defaultWriteObject();
        stream.writeObject(writer);
        stream.writeObject(reader);
    }

    private void readObject(ObjectInputStream stream) throws IOException, ClassNotFoundException
    {
        stream.defaultReadObject();
        writer = (BufferedWriter) stream.readObject();
        reader = (BufferedReader) stream.readObject();
    }
}
Basically, I use RMI to send that FSFile object to another process locally (for now) and here's the error I get:
java.rmi.UnmarshalException: error unmarshalling return; nested exception is:
    java.io.WriteAbortedException: writing aborted; java.io.NotSerializableException:
    java.io.BufferedReader
To be more precise, there is one class named FileService which uses a function fetch() from a FileServer to get an FSFile in return. There is nothing special in the fetch() function: it just creates an FSFile and returns it. All communication between those classes is done via RMI.
How come I get an error like this?
You can't serialize readers and writers. It makes no sense. It's like trying to send a telephone over a telephone line. If you want to send a file, send the file.
And your code just calls writeObject on these objects as though they were Serializable. They aren't. Otherwise you could have made them non-transient and omitted the custom readObject and writeObject methods altogether. Just re-coding what the system would have done anyway doesn't change anything, and it certainly doesn't make classes Serializable that aren't.
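If both processes can see the same file system (true for the local setup described here, but not for RMI in general), one conventional fix is to serialize only the filename and re-open the streams lazily on the receiving side. A rough sketch; the reader() accessor is a hypothetical name:
import java.io.*;

public class FSFile implements Serializable
{
    private int flag;
    private String filename;
    private transient BufferedReader reader;  // never serialized
    private transient BufferedWriter writer;  // never serialized

    // no custom writeObject/readObject: only flag and filename travel over RMI

    private BufferedReader reader() throws IOException
    {
        if (reader == null)
            reader = new BufferedReader(new FileReader(filename));
        return reader;
    }

    // analogous lazy initialization for the writer
}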
If you don't want to re-implement file/stream sending via RMI, you could look into RMIIO; it handles such things in a concise and effective way.