Groovy compiler fails with unexpected token on readObject - serialization

My Gradle project has a mix of Java and Groovy classes. All source is under src/main/groovy. One of my Groovy classes contains a Map that I have created from reading a JSON string via JsonSlurper.parseText(). This class is marked Serializable.
To avoid a NotSerializableException, I have implemented my own writeObject() and readObject() methods, but my code is not compiling. I didn't find many Groovy examples, but various Java references and tutorials told me to use these signatures:
private void writeObject(java.io.ObjectOutputStream out)
throws IOException
private void readObject(java.io.ObjectInputStream in)
throws IOException, ClassNotFoundException
My class looks like this:
import groovy.json.JsonBuilder
import groovy.json.JsonSlurper
class GroovyJSONMap implements Serializable {
private static final long serialVersionUID = 20150902L
Map myJSON = [:]
GroovyJSONMap() {
//no op
}
GroovyJSONMap(String json) {
if (json) {
try {
setJSON(json)
} catch (any) {
println "WHOOPS! Not a JSON object...."
myJSON = ["invalid": true]
}
}
}
GroovyJSONMap(Map json) {
if (json) {
setJSON(json)
}
}
final void setJSON(String json) {
myJSON = new JsonSlurper().parseText(json)
}
String getJSON() {
new JsonBuilder(myJSON).toString()
}
@Override
String toString() {
getJSON()
}
private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
setJSON((String)in.readObject())
}
private void writeObject(ObjectOutputStream out) throws IOException {
out.writeObject(getJSON())
}
}
The compiler error:
:clean
:compileJava UP-TO-DATE
:compileGroovy
startup failed:
c:\path\to\src\main\groovy\GroovyJSONMap.groovy: 44: unexpected token: ObjectInputStream @ line 44, column 29.
private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
^
1 error
:compileGroovy FAILED
I have moved the readObject() method to various positions in the source, but it still does not compile. The compiler does not complain about writeObject(), only readObject(). Why is my code not compiling?

The compiler points to ObjectInputStream, but the problem is really at in.
The word in is a reserved word in Groovy and cannot be used as a variable or method name.
The solution is to rename in to any word that is not reserved in Groovy, such as stream (writeObject() is changed as well, for consistency):
private void readObject(ObjectInputStream stream) throws IOException, ClassNotFoundException {
setJSON((String)stream.readObject())
}
private void writeObject(ObjectOutputStream stream) throws IOException {
stream.writeObject(getJSON())
}
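For completeness, here is a quick round-trip check in plain Java (a sketch, assuming the renamed GroovyJSONMap class is compiled and on the classpath; the class name of the test and the sample JSON are purely illustrative):
import java.io.*;

public class GroovyJSONMapRoundTrip {
    public static void main(String[] args) throws Exception {
        GroovyJSONMap original = new GroovyJSONMap("{\"valid\": true}");

        // Serialize: ObjectOutputStream invokes the private writeObject(stream),
        // which stores only the JSON string, not the (possibly unserializable) Map.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bytes)) {
            oos.writeObject(original);
        }

        // Deserialize: ObjectInputStream invokes the private readObject(stream),
        // which re-parses the JSON string back into the Map.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            GroovyJSONMap copy = (GroovyJSONMap) ois.readObject();
            System.out.println(copy); // {"valid":true}
        }
    }
}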

Related

Getting noBaseStepListener error while using Serenity RestAssured

I am trying to implement a Rest Assured framework with Cucumber.
I am facing a weird scenario: I have defined all the step definitions for my feature file, yet I still get the error below when I run the feature file:
Step undefined
You can implement this step and 3 other step(s) using the snippet(s) below:
@Given("I create new service by using create service API data")
public void i_create_new_service_by_using_create_service_api_data() {
// Write code here that turns the phrase above into concrete actions
throw new io.cucumber.java.PendingException();
}
And when I run the same from the JUnit test runner, I get the error below:
INFO net.serenitybdd.rest.decorators.request.RequestSpecificationDecorated - No BaseStepListener, POST /services not registered.
In my framework I define a base package containing the base class file, which is as below:
public class TestBase {
public static Properties propertyConfig = new Properties();
public static FileInputStream fis;
public static Response response;
public static RequestSpecification requestSpecification;
public static void loadPreConfigs(){
try {
fis = new FileInputStream("./src/test/resources/ConfigurationURLs/config.properties");
try {
propertyConfig.load(fis);
} catch (IOException e) {
e.printStackTrace();
}
} catch (FileNotFoundException e) {
e.printStackTrace();
}
RestAssured.baseURI=propertyConfig.getProperty("BaseURI");
}
}
Then I have an ApiCall package which contains all the class files holding the request specifications and the REST API calls that store the respective responses.
The APICall file is given below:
public class PostRequestCall extends TestBase {
private static String productVal;
public static int getProductVal() {
return Integer.parseInt(productVal);
}
public static void setProductVal(String productVal) {
PostRequestCall.productVal= productVal;
}
public RequestSpecification definePostRequest(){
requestSpecification= SerenityRest.given();
requestSpecification.contentType(ContentType.JSON);
return requestSpecification;
}
public Response CreateService(String serviceName){
JSONObject jsonObject=new JSONObject();
jsonObject.put("name",serviceName);
response=definePostRequest().body(jsonObject).post(propertyConfig.getProperty("createService"));
return response;
}
}
Then I have the step files, the class files in which I define the Serenity steps, given below:
public class PostRequestSteps {
PostRequestCall postRequestCall=new PostRequestCall();
@Step
public RequestSpecification setPostSpecification(){
TestBase.requestSpecification=postRequestCall.definePostRequest();
return TestBase.requestSpecification;
}
@Step
public Response setPostRequestCall(String serviceName){
TestBase.response=postRequestCall.CreateService(serviceName);
return TestBase.response;
}
}
And I have defined a package which contains all the step definition classes; one such class is below:
public class PostRequest_StepDefinitions {
String serviceID;
@Steps
PostRequestSteps postRequestSteps=new PostRequestSteps();
@Before
public void setUp() {
TestBase.loadPreConfigs();
}
@Given("I create new service by using create service API data")
public void i_create_new_service_by_using_create_service_api_data() {
postRequestSteps.setPostSpecification();
}
@When("I provide valid name {string} for service creation")
public void i_provide_valid_name_for_service_creation(String serviceName) {
TestBase.response=postRequestSteps.setPostRequestCall(serviceName);
}
@And("I save the id of created service")
public void i_save_the_id_of_created_service() {
serviceID=TestBase.response.jsonPath().get("id").toString();
PostRequestCall.setProductVal(serviceID);
}
@Then("I validate status code {int}")
public void i_validate_status_code(int statusCode) {
Assert.assertEquals(TestBase.response.getStatusCode(),statusCode);
}
The JUnit runner file and feature files are below.

How to do failure tolerance for Flink to sink data to hdfs as gzip compression?

We want to write compressed data to HDFS with Flink's BucketingSink or StreamingFileSink. I have written my own Writer, which works fine if no failure occurs. However, when it encounters a failure and restarts from a checkpoint, it will generate a valid-length file (Hadoop < 2.7) or truncate the file. Unluckily, gzip files are binary files with a trailer at the end of the file, so simple truncation does not work in my case. Any ideas on how to enable exactly-once semantics for a compressed HDFS sink?
Here is my writer's code:
public class HdfsCompressStringWriter extends StreamWriterBaseV2<JSONObject> {
private static final long serialVersionUID = 2L;
/**
* The {@code CompressFSDataOutputStream} for the current part file.
*/
private transient GZIPOutputStream compressionOutputStream;
public HdfsCompressStringWriter() {}
@Override
public void open(FileSystem fs, Path path) throws IOException {
super.open(fs, path);
this.setSyncOnFlush(true);
compressionOutputStream = new GZIPOutputStream(this.getStream(), true);
}
public void close() throws IOException {
if (compressionOutputStream != null) {
compressionOutputStream.close();
compressionOutputStream = null;
}
resetStream();
}
@Override
public void write(JSONObject element) throws IOException {
if (element == null || !element.containsKey("body")) {
return;
}
String content = element.getString("body") + "\n";
compressionOutputStream.write(content.getBytes());
compressionOutputStream.flush();
}
@Override
public Writer<JSONObject> duplicate() {
return new HdfsCompressStringWriter();
}
}
I would recommend implementing a BulkWriter for the StreamingFileSink which compresses the elements via a GZIPOutputStream. The code could look like the following:
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(1);
env.enableCheckpointing(1000);
final DataStream<Integer> input = env.addSource(new InfinitySource());
final StreamingFileSink<Integer> streamingFileSink = StreamingFileSink.<Integer>forBulkFormat(new Path("output"), new GzipBulkWriterFactory<>()).build();
input.addSink(streamingFileSink);
env.execute();
}
private static class GzipBulkWriterFactory<T> implements BulkWriter.Factory<T> {
@Override
public BulkWriter<T> create(FSDataOutputStream fsDataOutputStream) throws IOException {
final GZIPOutputStream gzipOutputStream = new GZIPOutputStream(fsDataOutputStream, true);
return new GzipBulkWriter<>(new ObjectOutputStream(gzipOutputStream), gzipOutputStream);
}
}
private static class GzipBulkWriter<T> implements BulkWriter<T> {
private final GZIPOutputStream gzipOutputStream;
private final ObjectOutputStream objectOutputStream;
public GzipBulkWriter(ObjectOutputStream objectOutputStream, GZIPOutputStream gzipOutputStream) {
this.gzipOutputStream = gzipOutputStream;
this.objectOutputStream = objectOutputStream;
}
@Override
public void addElement(T t) throws IOException {
objectOutputStream.writeObject(t);
}
@Override
public void flush() throws IOException {
objectOutputStream.flush();
}
@Override
public void finish() throws IOException {
objectOutputStream.flush();
gzipOutputStream.finish();
}
}
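Note that this factory writes Java-serialized objects via ObjectOutputStream. If, like your original Writer, you want newline-delimited text records in the gzip file instead, a variant could look like this (a sketch under the assumption that the elements are Strings; the class name is made up for illustration):
private static class GzipStringBulkWriter implements BulkWriter<String> {

    private final GZIPOutputStream gzipOutputStream;

    GzipStringBulkWriter(GZIPOutputStream gzipOutputStream) {
        this.gzipOutputStream = gzipOutputStream;
    }

    @Override
    public void addElement(String element) throws IOException {
        // write each record as a UTF-8 encoded line instead of a serialized object
        gzipOutputStream.write((element + "\n").getBytes(java.nio.charset.StandardCharsets.UTF_8));
    }

    @Override
    public void flush() throws IOException {
        gzipOutputStream.flush();
    }

    @Override
    public void finish() throws IOException {
        // writes the gzip trailer; the sink closes the underlying stream itself
        gzipOutputStream.finish();
    }
}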

When attaching agent to running process, bytebuddy transformer doesn't seem to take effect

The code of the program to be attached is below.
public class Foo {
}
public class TestEntry {
public TestEntry() {
}
public static void main(String[] args) throws Exception {
try
{
while(true)
{
System.out.println(new Foo().toString());
Thread.sleep(1000);
}
}
catch(Exception e)
{}
}
}
What I attempt to do is to make Foo.toString() return 'test' by using the following agent.
public class InjectionAgent {
public InjectionAgent() {
}
public static void agentmain(String args, Instrumentation inst) throws Exception
{
System.out.println("agentmain Args:" + args);
new AgentBuilder.Default()
.type(ElementMatchers.named("Foo"))
.transform(new AgentBuilder.Transformer() {
@Override
public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
ClassLoader arg2, JavaModule arg3) {
return arg0.method(ElementMatchers.named("toString"))
.intercept(FixedValue.value("test"));
}
}).installOn(inst);
}
public static void premain(String args, Instrumentation inst) throws Exception
{
System.out.println("premain Args:" + args);
new AgentBuilder.Default()
.type(ElementMatchers.named("Foo"))
.transform(new AgentBuilder.Transformer() {
@Override
public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
ClassLoader arg2, JavaModule arg3) {
return arg0.method(ElementMatchers.named("toString"))
.intercept(FixedValue.value("test"));
}
}).installOn(inst);
}
}
I noticed that it was successful when using the -javaagent way, whereas the attach way failed. Here is the code for attaching:
public class Injection {
public Injection() {
}
public static void main(String[] args) throws AttachNotSupportedException, IOException, AgentLoadException, AgentInitializationException, InterruptedException {
VirtualMachine vm = null;
String agentjarpath = args[0];
vm = VirtualMachine.attach(args[1]);
vm.loadAgent(agentjarpath, "This is Args to the Agent.");
vm.detach();
}
}
I tried adding AgentBuilder.Listener.StreamWriting.toSystemOut() to the agent; after attaching, the output of TestEntry shows:
[Byte Buddy] DISCOVERY Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
[Byte Buddy] TRANSFORM Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
[Byte Buddy] COMPLETE Foo [sun.misc.Launcher$AppClassLoader@33909752, null, loaded=true]
Foo@7f31245a
Foo@6d6f6e28
Foo@135fbaa4
Foo@45ee12a7
Foo@330bedb4
==================================Update=====================================
I defined a public method 'Bar' in Foo like this:
public class Foo {
public String Bar()
{
return "Bar";
}
}
and then I was trying to make Foo.Bar() returns "modified" in the following way:
public static void agentmain(String args, Instrumentation inst) throws Exception
{
System.out.println("agentmain Args:" + args);
premain(args, inst);
new AgentBuilder.Default()
.with(RedefinitionStrategy.RETRANSFORMATION)
.disableClassFormatChanges()
.with(AgentBuilder.Listener.StreamWriting.toSystemOut())
.type(ElementMatchers.named("Foo"))
.transform(new AgentBuilder.Transformer() {
@Override
public Builder<?> transform(Builder<?> arg0, TypeDescription arg1,
ClassLoader arg2, JavaModule arg3) {
return arg0.visit(Advice.to(InjectionTemplate.class).on(ElementMatchers.named("Bar")));
}
})
.installOn(inst);
}
static class InjectionTemplate {
@Advice.OnMethodExit
static void exit(@Advice.Return String self) {
System.out.println(self.toString() + " " + self.getClass().toString());
self = new String("modified");
}
}
but I got this error:
java.lang.IllegalStateException: Cannot write to read-only parameter class java.lang.String at 1
any suggestions?
It does not seem like you are using redefinition for your agent. You can activate it using:
new AgentBuilder.Default()
.with(RedefinitionStrategy.RETRANSFORMATION)
.disableClassFormatChanges();
The last part is required on most JVMs (with the notable exception of the dynamic code evolution VM, a custom build of HotSpot). It tells Byte Buddy not to add fields or methods, which most VMs do not support.
In this case, it is no longer possible to invoke the original implementation of a method, which is however not required for your FixedValue. Typically, users of Byte Buddy take advantage of Advice when creating an agent that applies dynamic transformations of classes.
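As for the IllegalStateException in the update ("Cannot write to read-only parameter"): an @Advice.Return parameter is read-only by default, so assigning to it is rejected. To overwrite the return value, the parameter must be declared writable. A minimal sketch of the corrected advice template (only the annotation changes):
static class InjectionTemplate {
    @Advice.OnMethodExit
    static void exit(@Advice.Return(readOnly = false) String returned) {
        // assigning the parameter replaces the method's return value
        returned = "modified";
    }
}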

Mocking the static method with Mockito

I am trying to mock a static method using PowerMock.
Below is my code:
public class Helper{
public static User getLoggedInUser(HttpServletRequest request) throws NotFoundException {
String access = request.getHeader("Authorization");
if(access == null || access.isEmpty()) {
throw new NotFoundException("Access is null");
}
User user = new User();
return user;
}
}
And this is the controller method getUser, from which I am calling the static method:
@RequestMapping(value = "user/userInfo/{Id}", method = RequestMethod.GET, headers = "Accept=application/json")
public @ResponseBody
ResultDTO getUser(@PathVariable("Id") Integer Id, HttpServletRequest request) throws NotFoundException, UnauthorizedException {
Integer userID = -1;
User user = Helper.getLoggedInUser(request);
if(user != null){
userID = user.getUserId();
}
//do something
}
And this is my test class:
//@RunWith(PowerMockRunner.class)
//@PrepareForTest(Helper.class)
public class CustomerControllerNGTest {
@InjectMocks
private userController instance = new PaymentCustomerController();
public PaymentCustomerControllerNGTest() {
}
@BeforeClass
public void setUpClass() throws Exception {
}
@AfterClass
public static void tearDownClass() throws Exception {
}
@BeforeMethod
public void setUpMethod() throws Exception {
try{
MockitoAnnotations.initMocks(this);
}catch(Exception ex){
System.out.println(ex.getMessage());
}
try{
mockMvc = MockMvcBuilders.standaloneSetup(instance).build();
// mockMvc = MockMvcBuilders.webAppContextSetup(wac).build();
}catch(Exception ex){
System.out.println(ex.getMessage());
}
}
@AfterMethod
public void tearDownMethod() throws Exception {
}
@Test
public void testGetUserInfo() throws Exception {
User user = new User();
user.setUserId(1234);
HttpServletRequest request = mock(HttpServletRequest.class);
//this is for the static method
PowerMockito.mockStatic(Helper.class);
**PowerMockito.when(Helper.getLoggedInUser(request)).thenReturn(user);**
//do something
}
}
Now whenever I execute the test case and it reaches the line marked in bold, it goes inside the static method and throws the exception "Access is null"; rather than mocking the method, it executes it. Any idea?
I also tried uncommenting these lines:
//@RunWith(PowerMockRunner.class)
//@PrepareForTest(Helper.class)
but still same exception.
Thanks
Try to uncomment:
//@RunWith(PowerMockRunner.class)
//@PrepareForTest(Helper.class)
and use
Mockito.when(Helper.getLoggedInUser(request)).thenReturn(user);
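Put together, a minimal sketch of the test (assuming the JUnit 4 runner here; with TestNG you would use PowerMock's TestNG support instead, and the test class name is purely illustrative):
@RunWith(PowerMockRunner.class)
@PrepareForTest(Helper.class)
public class HelperTest {

    @Test
    public void testGetUser() throws Exception {
        User user = new User();
        user.setUserId(1234);
        HttpServletRequest request = Mockito.mock(HttpServletRequest.class);

        // Helper must be listed in @PrepareForTest for the static mock to take effect
        PowerMockito.mockStatic(Helper.class);
        Mockito.when(Helper.getLoggedInUser(request)).thenReturn(user);

        // calls to Helper.getLoggedInUser(request) now return the stub
        // instead of executing the real method
    }
}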
I wrote a blog post on this topic that contains links to working examples on GitHub. These use TestNG instead of JUnit, but this shouldn't matter.
EDIT
I would suggest always using the latest combination of Mockito and PowerMock available. Older combinations were often pretty buggy, with confusing errors. The current latest combination is Mockito 1.9.5-rc1+, PowerMock 1.5+. Pre-1.5 versions of PowerMock weren't Java 7 compliant.

Implementation of custom Writable in Hadoop?

I have defined a custom Writable class in Hadoop, but Hadoop gives me the following error message when running my program.
java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
at org.apache.hadoop.io.SortedMapWritable.readFields(SortedMapWritable.java:180)
at EquivalenceClsAggValue.readFields(EquivalenceClsAggValue.java:82)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
at org.apache.hadoop.mapred.Task$ValuesIterator.readNextValue(Task.java:1282)
at org.apache.hadoop.mapred.Task$ValuesIterator.next(Task.java:1222)
at org.apache.hadoop.mapred.Task$CombineValuesIterator.next(Task.java:1301)
at Mondrian$Combine.reduce(Mondrian.java:119)
at Mondrian$Combine.reduce(Mondrian.java:1)
at org.apache.hadoop.mapred.Task$OldCombinerRunner.combine(Task.java:1442)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1436)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1298)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.NullPointerException
at java.util.concurrent.ConcurrentHashMap.hash(ConcurrentHashMap.java:332)....
EquivalenceClsAggValue is the name of the Writable class I've defined and this is my class:
public class EquivalenceClsAggValue implements WritableComparable<EquivalenceClsAggValue>{
public ArrayList<SortedMapWritable> aggValues;
public EquivalenceClsAggValue(){
aggValues = new ArrayList<SortedMapWritable>();
}
@Override
public void readFields(DataInput arg0) throws IOException {
int size = arg0.readInt();
for (int i=0;i<size;i++){
SortedMapWritable tmp = new SortedMapWritable();
tmp.readFields(arg0);
aggValues.add(tmp);
}
}
@Override
public void write(DataOutput arg0) throws IOException {
//write the size first
arg0.write(aggValues.size());
//write each element
for (SortedMapWritable s:aggValues){
s.write(arg0);
}
}
I wonder what the source of the problem is.
Looks like an error in your write(DataOutput) method:
@Override
public void write(DataOutput arg0) throws IOException {
//write the size first
// arg0.write(aggValues.size()); // here you're writing an int as a byte
// try this instead:
arg0.writeInt(aggValues.size()); // actually write int as an int
//..
Look at the API docs for DataOutput.write(int) vs DataOutput.writeInt(int)
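The mismatch matters because write(int) emits only the low-order byte of its argument, while readFields() later calls readInt(), which consumes four bytes; the reader and writer then disagree on stream positions and deserialization reads garbage. A tiny standalone demonstration (a sketch, not part of the original code):
import java.io.*;

public class WriteVsWriteInt {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.write(300);     // writes a single byte: 300 & 0xFF == 44
        out.writeInt(300);  // writes four bytes: 00 00 01 2C
        System.out.println(bytes.size()); // 5, not 8 -- the two calls are not interchangeable
    }
}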
I'd also amend your creation of the SortedMapWritable tmp local variable in readFields to use ReflectionUtils.newInstance():
@Override
public void readFields(DataInput arg0) throws IOException {
int size = arg0.readInt();
for (int i=0;i<size;i++){
SortedMapWritable tmp = ReflectionUtils.newInstance(
SortedMapWritable.class, getConf());
tmp.readFields(arg0);
aggValues.add(tmp);
}
}
Note for this to work, you'll also need to amend your class signature to extend Configured (such that Hadoop will inject a Configuration object when your object is initially created):
public class EquivalenceClsAggValue
extends Configured
implements WritableComparable<EquivalenceClsAggValue> {
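Putting both fixes together, the corrected class could look like this (a consolidated sketch; the constructor and the compareTo() implementation, which WritableComparable requires but the question omits, are left as in your original code):
import java.io.*;
import java.util.ArrayList;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.SortedMapWritable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.util.ReflectionUtils;

public class EquivalenceClsAggValue extends Configured
        implements WritableComparable<EquivalenceClsAggValue> {

    public ArrayList<SortedMapWritable> aggValues = new ArrayList<SortedMapWritable>();

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(aggValues.size()); // write the size as a full int, not a byte
        for (SortedMapWritable s : aggValues) {
            s.write(out);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        aggValues.clear(); // defensive, not in the original: Hadoop may reuse Writable instances
        int size = in.readInt();
        for (int i = 0; i < size; i++) {
            SortedMapWritable tmp = ReflectionUtils.newInstance(
                    SortedMapWritable.class, getConf());
            tmp.readFields(in);
            aggValues.add(tmp);
        }
    }

    // compareTo(...) as in your original class
}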