When attaching an agent to a running process, the Byte Buddy transformer doesn't seem to take effect

The code of the program to be attached is as follows.
public class Foo {
}

public class TestEntry {
    public TestEntry() {
    }

    public static void main(String[] args) throws Exception {
        try {
            while (true) {
                System.out.println(new Foo().toString());
                Thread.sleep(1000);
            }
        } catch (Exception e) {
        }
    }
}
What I am attempting to do is make Foo.toString() return 'test' by using the following agent.
public class InjectionAgent {
    public InjectionAgent() {
    }

    public static void agentmain(String args, Instrumentation inst) throws Exception {
        System.out.println("agentmain Args:" + args);
        new AgentBuilder.Default()
                .type(ElementMatchers.named("Foo"))
                .transform(new AgentBuilder.Transformer() {
                    @Override
                    public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
                            ClassLoader arg2, JavaModule arg3) {
                        return arg0.method(ElementMatchers.named("toString"))
                                .intercept(FixedValue.value("test"));
                    }
                }).installOn(inst);
    }

    public static void premain(String args, Instrumentation inst) throws Exception {
        System.out.println("premain Args:" + args);
        new AgentBuilder.Default()
                .type(ElementMatchers.named("Foo"))
                .transform(new AgentBuilder.Transformer() {
                    @Override
                    public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
                            ClassLoader arg2, JavaModule arg3) {
                        return arg0.method(ElementMatchers.named("toString"))
                                .intercept(FixedValue.value("test"));
                    }
                }).installOn(inst);
    }
}
I noticed that it was successful when using the -javaagent way, whereas the attach way failed. Here is the code for attaching:
public class Injection {
    public Injection() {
    }

    public static void main(String[] args) throws AttachNotSupportedException, IOException,
            AgentLoadException, AgentInitializationException, InterruptedException {
        VirtualMachine vm = null;
        String agentjarpath = args[0];
        vm = VirtualMachine.attach(args[1]);
        vm.loadAgent(agentjarpath, "This is Args to the Agent.");
        vm.detach();
    }
}
I tried adding AgentBuilder.Listener.StreamWriting.toSystemOut() to the agent; after attaching, the output of TestEntry shows
[Byte Buddy] DISCOVERY Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
[Byte Buddy] TRANSFORM Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
[Byte Buddy] COMPLETE Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
Foo@7f31245a
Foo@6d6f6e28
Foo@135fbaa4
Foo@45ee12a7
Foo@330bedb4
==================================Update=====================================
I defined a public method 'Bar' in Foo like this
public class Foo {
    public String Bar() {
        return "Bar";
    }
}
and then I tried to make Foo.Bar() return "modified" in the following way:
public static void agentmain(String args, Instrumentation inst) throws Exception {
    System.out.println("agentmain Args:" + args);
    premain(args, inst);
    new AgentBuilder.Default()
            .with(RedefinitionStrategy.RETRANSFORMATION)
            .disableClassFormatChanges()
            .with(AgentBuilder.Listener.StreamWriting.toSystemOut())
            .type(ElementMatchers.named("Foo"))
            .transform(new AgentBuilder.Transformer() {
                @Override
                public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
                        ClassLoader arg2, JavaModule arg3) {
                    return arg0.visit(Advice.to(InjectionTemplate.class).on(ElementMatchers.named("Bar")));
                }
            })
            .installOn(inst);
}

static class InjectionTemplate {
    @Advice.OnMethodExit
    static void exit(@Advice.Return String self) {
        System.out.println(self.toString() + " " + self.getClass().toString());
        self = new String("modified");
    }
}
but I got this error:
java.lang.IllegalStateException: Cannot write to read-only parameter class java.lang.String at 1
Any suggestions?

It does not seem like you are using redefinition for your agent. You can activate it using:
new AgentBuilder.Default()
    .with(RedefinitionStrategy.RETRANSFORMATION)
    .disableClassFormatChanges();
The last part is required on most JVMs (with the notable exception of the dynamic code evolution VM, a custom build of HotSpot). It tells Byte Buddy not to add fields or methods, which most VMs do not support for already-loaded classes.
In this case, it is no longer possible to invoke the original implementation of a method, which is however not required for your FixedValue. Typically, users of Byte Buddy take advantage of Advice when creating an agent that applies dynamic transformations to classes.
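As an illustration of that suggestion, a minimal Advice-based sketch for the Bar() example from the update could look like the following. Note the readOnly = false on @Advice.Return: the return value can only be reassigned when the parameter is declared writable, which is also what the "Cannot write to read-only parameter" error in the update refers to. This is a sketch, not code from the question.

static class InjectionTemplate {
    @Advice.OnMethodExit
    static void exit(@Advice.Return(readOnly = false) String returned) {
        // Reassigning the writable @Advice.Return parameter replaces the method's return value.
        returned = "modified";
    }
}

public static void agentmain(String args, Instrumentation inst) {
    new AgentBuilder.Default()
            .with(RedefinitionStrategy.RETRANSFORMATION)
            .disableClassFormatChanges()
            .type(ElementMatchers.named("Foo"))
            .transform(new AgentBuilder.Transformer() {
                @Override
                public Builder<?> transform(Builder<?> builder, TypeDescription type,
                        ClassLoader classLoader, JavaModule module) {
                    // Advice applied as a visitor does not change the class format,
                    // so it works together with disableClassFormatChanges().
                    return builder.visit(Advice.to(InjectionTemplate.class)
                            .on(ElementMatchers.named("Bar")));
                }
            })
            .installOn(inst);
}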

Related

How to do failure tolerance for Flink to sink data to hdfs as gzip compression?

We want to write compressed data to HDFS using Flink's BucketingSink or StreamingFileSink. I have written my own Writer, which works fine if no failure occurs. However, when it encounters a failure and restarts from a checkpoint, it will generate a valid-length file (Hadoop < 2.7) or truncate the file. Unfortunately, gzip files are binary files with a trailer at the end of the file, so simple truncation does not work in my case. Any ideas for enabling exactly-once semantics for a compressed HDFS sink?
This is my writer's code:
public class HdfsCompressStringWriter extends StreamWriterBaseV2<JSONObject> {

    private static final long serialVersionUID = 2L;

    /**
     * The {@code CompressFSDataOutputStream} for the current part file.
     */
    private transient GZIPOutputStream compressionOutputStream;

    public HdfsCompressStringWriter() {}

    @Override
    public void open(FileSystem fs, Path path) throws IOException {
        super.open(fs, path);
        this.setSyncOnFlush(true);
        compressionOutputStream = new GZIPOutputStream(this.getStream(), true);
    }

    public void close() throws IOException {
        if (compressionOutputStream != null) {
            compressionOutputStream.close();
            compressionOutputStream = null;
        }
        resetStream();
    }

    @Override
    public void write(JSONObject element) throws IOException {
        if (element == null || !element.containsKey("body")) {
            return;
        }
        String content = element.getString("body") + "\n";
        compressionOutputStream.write(content.getBytes());
        compressionOutputStream.flush();
    }

    @Override
    public Writer<JSONObject> duplicate() {
        return new HdfsCompressStringWriter();
    }
}
I would recommend implementing a BulkWriter for the StreamingFileSink which compresses the elements via a GZIPOutputStream. The code could look like the following:
public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    env.setParallelism(1);
    env.enableCheckpointing(1000);

    final DataStream<Integer> input = env.addSource(new InfinitySource());

    final StreamingFileSink<Integer> streamingFileSink = StreamingFileSink
            .<Integer>forBulkFormat(new Path("output"), new GzipBulkWriterFactory<>())
            .build();
    input.addSink(streamingFileSink);

    env.execute();
}

private static class GzipBulkWriterFactory<T> implements BulkWriter.Factory<T> {
    @Override
    public BulkWriter<T> create(FSDataOutputStream fsDataOutputStream) throws IOException {
        final GZIPOutputStream gzipOutputStream = new GZIPOutputStream(fsDataOutputStream, true);
        return new GzipBulkWriter<>(new ObjectOutputStream(gzipOutputStream), gzipOutputStream);
    }
}

private static class GzipBulkWriter<T> implements BulkWriter<T> {

    private final GZIPOutputStream gzipOutputStream;
    private final ObjectOutputStream objectOutputStream;

    public GzipBulkWriter(ObjectOutputStream objectOutputStream, GZIPOutputStream gzipOutputStream) {
        this.gzipOutputStream = gzipOutputStream;
        this.objectOutputStream = objectOutputStream;
    }

    @Override
    public void addElement(T t) throws IOException {
        objectOutputStream.writeObject(t);
    }

    @Override
    public void flush() throws IOException {
        objectOutputStream.flush();
    }

    @Override
    public void finish() throws IOException {
        objectOutputStream.flush();
        gzipOutputStream.finish();
    }
}
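Since the original writer emits the JSON "body" field as text lines rather than Java-serialized objects, a line-oriented variant of the same idea could look roughly like this. This is a sketch; the GzipLineBulkWriter and GzipLineBulkWriterFactory names are made up for illustration, and finish() only finishes the gzip stream because the StreamingFileSink manages the underlying FSDataOutputStream itself.

private static class GzipLineBulkWriterFactory implements BulkWriter.Factory<JSONObject> {
    @Override
    public BulkWriter<JSONObject> create(FSDataOutputStream out) throws IOException {
        return new GzipLineBulkWriter(new GZIPOutputStream(out, true));
    }
}

private static class GzipLineBulkWriter implements BulkWriter<JSONObject> {

    private final GZIPOutputStream gzipOutputStream;

    GzipLineBulkWriter(GZIPOutputStream gzipOutputStream) {
        this.gzipOutputStream = gzipOutputStream;
    }

    @Override
    public void addElement(JSONObject element) throws IOException {
        // Write each element's "body" field as one gzip-compressed text line.
        if (element != null && element.containsKey("body")) {
            gzipOutputStream.write((element.getString("body") + "\n").getBytes(StandardCharsets.UTF_8));
        }
    }

    @Override
    public void flush() throws IOException {
        gzipOutputStream.flush();
    }

    @Override
    public void finish() throws IOException {
        // Finish (do not close) so the sink can finalize the part file itself.
        gzipOutputStream.finish();
    }
}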

MethodDecorator.Fody and getting parameter values

I was wondering if, with MethodDecorator, it's possible to have the passed parameters during OnException. That would be great, since if I can catch an exception I can also have the passed parameter values.
Consider this piece of code
static void Main(string[] args)
{
    Worker worker = new Worker();
    worker.DoWork(6);
}

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Constructor | AttributeTargets.Assembly | AttributeTargets.Module)]
public class LoggableAttribute : Attribute, IMethodDecorator
{
    public void OnEntry(System.Reflection.MethodBase method)
    {
        var args = method.GetParameters();
        var arguments = method.GetGenericArguments();
    }

    public void OnExit(System.Reflection.MethodBase method)
    {
    }

    public void OnException(System.Reflection.MethodBase method, Exception exception)
    {
    }
}
and
public class Worker
{
    [Loggable]
    public void DoWork(int i)
    {
    }
}
I wish to have the value 6 available in OnEntry and/or OnException.
Thanks
I know this is an old question, but in case someone stumbles upon this like I did, you can add an Init method and capture the argument values there.
e.g.:
public class LoggableAttribute : Attribute, IMethodDecorator
{
    private object[] arguments;

    public void Init(object instance, MethodBase method, object[] args)
    {
        this.arguments = args;
    }

    public void OnEntry()
    {
        // this.arguments[0] would be 6 when calling worker.DoWork(6);
    }
}
Check out the example on https://github.com/Fody/MethodDecorator

spring batch null job runner null pointer exception

Hi, I'm getting a null pointer exception while executing a sample Spring Batch job. The exception is thrown from the job launcher. This is my job launcher code. Thanks in advance.
public class NewJobRunner {

    public static Job job;
    public static JobLauncher jobLauncher;
    public static JobRepository jobRepository;

    public static void main(String args[]) {
        try {
            AbstractApplicationContext applicationContext = new ClassPathXmlApplicationContext("/resources/job-context.xml");
            jobLauncher.run(job, new JobParametersBuilder()
                    .toJobParameters()
            );
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void setJobLauncher(JobLauncher jobLauncher) {
        this.jobLauncher = jobLauncher;
    }

    public void setJobRepository(JobRepository jobRepository) {
        this.jobRepository = jobRepository;
    }

    public void setJob(Job job) {
        this.job = job;
    }
}
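Note that jobLauncher and job are static fields that are only assigned through instance setters which nothing calls, so they are still null when main invokes jobLauncher.run(...). A minimal sketch of looking the beans up in the application context instead could look like this (the bean names "jobLauncher" and "myJob" are assumptions about what job-context.xml defines):

// Sketch: fetch the beans from the context before launching; adjust the bean
// names to whatever job-context.xml actually declares.
AbstractApplicationContext applicationContext =
        new ClassPathXmlApplicationContext("/resources/job-context.xml");
JobLauncher jobLauncher = applicationContext.getBean("jobLauncher", JobLauncher.class);
Job job = applicationContext.getBean("myJob", Job.class);
jobLauncher.run(job, new JobParametersBuilder().toJobParameters());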

Initialize public static variable in Hadoop through arguments

I have a problem with changing public static variables in Hadoop.
I am trying to pass some values as arguments to the jar file from the command line.
Here is my code:
public class MyClass {

    public static long myvariable1 = 100;

    public static class Map extends Mapper<Object, Text, Text, Text> {
        public static long myvariabl2 = 200;

        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
        }
    }

    public static class Reduce extends Reducer<Text, Text, Text, Text> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
        }
    }

    public static void main(String[] args) throws Exception {
        col_no = Long.parseLong(args[0]);
        Map.myvariable1 = Long.parseLong(args[1]);
        Map.myvariable2 = Long.parseLong(args[1]);
        // other stuff here
    }
}
But it is not working; myvariable1 & myvariable2 always keep the values 100 & 200.
I use Hadoop 0.20.203 with Ubuntu 10.04.
What you can do to get the same behavior is to store your variables in the Configuration you use to launch the job.
public static class Map extends Mapper<Object, Text, Text, Text> {
    public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        String var2String = conf.get("myvariable2");
        long myvariable2 = Long.parseLong(var2String);
        // etc.
    }
}

public static void main(String[] args) throws Exception {
    col_no = Long.parseLong(args[0]);
    String myvariable1 = args[1];
    String myvariable2 = args[1];

    // add values to configuration
    Configuration conf = new Configuration();
    conf.set("myvariable1", myvariable1);
    conf.set("myvariable2", myvariable2);

    // other stuff here
}
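For the values to reach the mappers, the job has to be constructed with that same Configuration instance. A rough sketch of the remaining "other stuff" in main could be (the job name and the specific setters shown are illustrative only):

// Sketch: submit the job with the configured Configuration so the mappers
// can read the values via context.getConfiguration().
Job job = new Job(conf, "myjob");
job.setJarByClass(MyClass.class);
job.setMapperClass(Map.class);
job.setReducerClass(Reduce.class);
// ... set input/output paths and key/value classes as needed
System.exit(job.waitForCompletion(true) ? 0 : 1);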

Postsharp Newbie - Why is args.Instance null?

New to PostSharp. I'm trying out the NuGet version now, and I'm trying to understand why, in the AuthorizeActivityAttribute OnEntry method, the args.Instance value is null. I'm trying to implement authorisation that depends on the values of the object, e.g. a customer who's been archived can't have a credit limit raised. I'm implementing the rules within other classes specific to the rules.
public class Program
{
    static void Main(string[] args)
    {
        var c = new Customer();
        c.RaiseCreditLimit(100000);
        c.Error(00);
    }
}

public class Customer
{
    [AuthorizeActivity]
    public void RaiseCreditLimit(int newValue)
    {
    }

    [AuthorizeActivity]
    public void Error(int newValue)
    {
    }
}

[Serializable]
public class AuthorizeActivityAttribute : OnMethodBoundaryAspect
{
    public override void OnEntry(MethodExecutionArgs args)
    {
        //
        // Why is args.Instance null???????????
        //
        if (args.Method.Name == "RaiseCreditLimit")
        {
            Debug.WriteLine(args.Method.Name + " started");
        }
        else
        {
            throw new Exception("Crap");
        }
    }

    public override void OnExit(MethodExecutionArgs args)
    {
        Debug.WriteLine(args.Method.Name + " finished");
    }
}
The answer is that you're not using it in your aspect. It's an optimization: if you use it in the aspect, then it will be set. Change your aspect to consume the instance and it will be there.
public override void OnEntry(MethodExecutionArgs args)
{
    //
    // Why is args.Instance null???????????
    //
    if (args.Method.Name == "RaiseCreditLimit")
    {
        Debug.WriteLine(args.Instance.GetType().Name);
        Debug.WriteLine(args.Method.Name + " started");
    }
    else
    {
        throw new Exception("Crap");
    }
}
For more info, check out this article to see what else PostSharp does to optimize code: http://programmersunlimited.wordpress.com/2011/03/23/postsharp-weaving-community-vs-professional-reasons-to-get-a-professional-license/