Processing JSON through Pig Scripts - apache-pig

I have recently started working with JSON files and processing the data using Pig scripts. I am using Pig version 0.9.3. I came across PiggyBank, which I thought would be useful for loading and processing JSON files in Pig scripts.
I built piggybank.jar through Ant, then compiled the Java file and updated piggybank.jar. I was trying to run the given example JSON file.
I have written a simple Pig script and the corresponding JSON as follows:
REGISTER piggybank.jar;
a = LOAD 'file3.json' USING org.apache.pig.piggybank.storage.JsonLoader() AS (json:map[]);
b = FOREACH a GENERATE FLATTEN(json#'menu') AS menu;
c = FOREACH b GENERATE FLATTEN(menu#'popup') AS popup;
d = FOREACH c GENERATE FLATTEN(popup#'menuitem') AS menu;
e = FOREACH d GENERATE FLATTEN(menu#'value') AS val;
DUMP e;
file3.json:
{ "menu" : {
    "id" : "file",
    "value" : "File",
    "popup" : {
      "menuitem" : [
        { "value" : "New", "onclick" : "CreateNewDoc()" },
        { "value" : "Open", "onclick" : "OpenDoc()" },
        { "value" : "Close", "onclick" : "CloseDoc()" }
      ]
    }
}}
I get the following exception at runtime:
org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error while reading input - Could not json-decode string: { "menu" : {
at org.apache.pig.piggybank.storage.JsonLoader.parseStringToTuple(JsonLoader.java:127)
Pig log file:
Pig Stack Trace
---------------
ERROR 1066: Unable to open iterator for alias e
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias e
at org.apache.pig.PigServer.openIterator(PigServer.java:901)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:655)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:303)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:188)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:164)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:561)
at org.apache.pig.Main.main(Main.java:111)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
at org.apache.pig.PigServer.openIterator(PigServer.java:893)
... 12 more
================================================================================
Please correct me if I am wrong. Thanks

You can handle nested JSON loading with Twitter's Elephant Bird: https://github.com/kevinweil/elephant-bird
a = LOAD 'file3.json' USING com.twitter.elephantbird.pig.load.JsonLoader('-nestedLoad');
This parses the JSON into a Pig map (http://pig.apache.org/docs/r0.11.1/basic.html#map-schema); a JSON array is parsed into a DataBag of maps.
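For illustration, a complete script against the question's file3.json could look like the following untested sketch; the registered jar name and the chained map dereferences are assumptions, so adjust them to your Elephant Bird build:
REGISTER elephant-bird-pig.jar; -- jar name is illustrative
a = LOAD 'file3.json' USING com.twitter.elephantbird.pig.load.JsonLoader('-nestedLoad') AS (json:map[]);
-- 'menuitem' is a JSON array, so it arrives as a DataBag of maps; FLATTEN turns it into rows
b = FOREACH a GENERATE FLATTEN(json#'menu'#'popup'#'menuitem') AS menuitem;
c = FOREACH b GENERATE menuitem#'value' AS val;
DUMP c; -- expected: (New), (Open), (Close)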

Related

Cannot coerce String { class: java.lang.String }

I am trying to send an email from the flow, and I have stored all the email addresses in a YAML file like below:
# Email
email:
  toEmail: "abc.123#gg.org,def.456#gg.org"
  fromEmail: "ms-dev#gg.org"
  ccAddress: "abc123#gmail.com"
I am trying to use the above values in the send email connector like this:
<email:send doc:name="Send" doc:id="fd09c56f-eaed-44c4-ab06-aa0417f2fdbf" config-ref="Email_SMTP" subject="Error with SOW integration between D365 and Salesforce " fromAddress='#[p("email.fromEmail")]' toAddresses='#[p("email.toEmail") splitBy ","]' ccAddresses='#[p("email.ccAddress")]'>
  <email:body contentType="text/html">
    <email:content><![CDATA[#[vars.emailBody]]]></email:content>
  </email:body>
</email:send>
But when debugging I am getting the error below:
org.mule.runtime.core.internal.exception.OnErrorPropagateHandler:
********************************************************************************
Message : "Cannot coerce String { class: java.lang.String } ("abc123#gmail.com" as String {class: "java.lang.String"}) to Array" evaluating expression: "p("email.ccAddress")".
Element : salesforce-proc-SendEmail_Flow/processors/1 # salesforce-proc:salesforce-proc-implementation.xml:566 (Send)
Element DSL : <email:send doc:name="Send" doc:id="fd09c56f-eaed-44c4-ab06-aa0417f2fdbf" config-ref="Email_SMTP" subject="Error with SOW integration between D365 and Salesforce " fromAddress="#[p("email.fromEmail")]" toAddresses="#[p("email.toEmail") splitBy ","]" ccAddresses="#[p("email.ccAddress")]">
<email:body contentType="text/html">
<email:content><![CDATA[
#[vars.emailBody]
]]></email:content>
</email:body>
</email:send>
Error type : MULE:EXPRESSION
FlowStack : at salesforce-proc-SendEmail_Flow(salesforce-proc-SendEmail_Flow/processors/1 # salesforce-proc:salesforce-proc-implementation.xml:566 (Send))
at listener-flow(listener-flow/errorHandler/0/processors/2 # salesforce-proc:salesforce-proc-implementation.xml:547 (Flow Reference))
After fixing the ccAddresses error I get the error below:
caf9-11ec-b461-025041000001] org.mule.runtime.core.internal.exception.OnErrorPropagateHandler:
********************************************************************************
Message : Error while sending email: Exception reading response
Element : salesforce-proc-SendEmail_Flow/processors/1 # salesforce-proc:salesforce-proc-implementation.xml:566 (Send)
Element DSL : <email:send doc:name="Send" doc:id="fd09c56f-eaed-44c4-ab06-aa0417f2fdbf" config-ref="Email_SMTP" subject="Error with SOW integration between D365 and Salesforce " fromAddress="#[p("email.fromEmail")]" toAddresses="#[p("email.toEmail") splitBy ","]" ccAddresses="#[p("email.ccAddress") splitBy ","]">
<email:body contentType="text/html">
<email:content><![CDATA[
#[vars.emailBody]
]]></email:content>
</email:body>
</email:send>
Error type : EMAIL:SEND
FlowStack : at salesforce-proc-SendEmail_Flow(salesforce-proc-SendEmail_Flow/processors/1 # salesforce-proc:salesforce-proc-implementation.xml:566 (Send))
at listener-flow(listener-flow/errorHandler/0/processors/2 # salesforce-proc:salesforce-proc-implementation.xml:547 (Flow Reference))
Can anyone please suggest what I am missing here?
The problem is that ccAddresses expects an array. Because you are using an expression to configure that attribute, you need to explicitly convert the string value from the configuration file to an array, for example with splitBy as you already did for toAddresses.
Alternatively, if you will only ever use a single address, simply remove the expression and use a property placeholder (ccAddresses="${email.ccAddress}").
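In isolation the two variants look like this (attribute snippets only, reusing the property names from the question):
<!-- comma-separated property converted to an array with splitBy -->
ccAddresses='#[p("email.ccAddress") splitBy ","]'
<!-- single address: plain property placeholder, no expression needed -->
ccAddresses="${email.ccAddress}"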

Trying to retrieve all rows from local H2 database with Quarkus

I'm creating a Quarkus project in Kotlin. I'm trying to implement an API where I hit the "/users" endpoint, and it returns all the users I have in my local database.
Unfortunately, I'm getting an error. The stacktrace:
The stacktrace below has been reversed to show the root cause first.
org.h2.jdbc.JdbcSQLException: Column "USER0_.CREATEDAT" not found; SQL statement:
select user0_.id as id1_7_, user0_.createdAt as createda2_7_, user0_.email as email3_7_, user0_.fullName as fullname4_7_, user0_.updatedAt as updateda5_7_ from User user0_ [42122-197]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:357)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.expression.ExpressionColumn.optimize(ExpressionColumn.java:150)
at org.h2.expression.Alias.optimize(Alias.java:51)
at org.h2.command.dml.Select.prepare(Select.java:858)
at org.h2.command.Parser.prepareCommand(Parser.java:283)
at org.h2.engine.Session.prepareLocal(Session.java:611)
at org.h2.engine.Session.prepareCommand(Session.java:549)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1247)
at org.h2.jdbc.JdbcPreparedStatement.<init>(JdbcPreparedStatement.java:76)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:304)
at io.agroal.pool.wrapper.ConnectionWrapper.prepareStatement(ConnectionWrapper.java:659)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:149)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:176)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:151)
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:2103)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2040)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2018)
at org.hibernate.loader.Loader.doQuery(Loader.java:948)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:349)
at org.hibernate.loader.Loader.doList(Loader.java:2849)
at org.hibernate.loader.Loader.doList(Loader.java:2831)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2663)
at org.hibernate.loader.Loader.list(Loader.java:2658)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:506)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:400)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:219)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1414)
at org.hibernate.query.internal.AbstractProducedQuery.doList(AbstractProducedQuery.java:1625)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1593)
at org.hibernate.query.Query.getResultList(Query.java:165)
at io.quarkus.hibernate.orm.panache.common.runtime.CommonPanacheQueryImpl.list(CommonPanacheQueryImpl.java:239)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.PanacheQueryImpl.list(PanacheQueryImpl.java:154)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:24)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:10)
at io.quarkus.hibernate.orm.panache.common.runtime.AbstractJpaOperations.listAll(AbstractJpaOperations.java:289)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository.listAll(UserRepository.kt)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll$$superaccessor28(UserRepository_Subclass.zig:4915)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass$$function$$28.apply(UserRepository_Subclass$$function$$28.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll(UserRepository_Subclass.zig:4873)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_ClientProxy.listAll(UserRepository_ClientProxy.zig:1353)
at com.fortuneapp.backend.application.rest.UsersResource.getAllUsers(UserResource.kt:24)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers$$superaccessor2(UsersResource_Subclass.zig:354)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass$$function$$2.apply(UsersResource_Subclass$$function$$2.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers(UsersResource_Subclass.zig:312)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:170)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:130)
at org.jboss.resteasy.core.ResourceMethodInvoker.internalInvokeOnTarget(ResourceMethodInvoker.java:660)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTargetAfterFilter(ResourceMethodInvoker.java:524)
at org.jboss.resteasy.core.ResourceMethodInvoker.lambda$invokeOnTarget$2(ResourceMethodInvoker.java:474)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:476)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:434)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:408)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:69)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:492)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:261)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:161)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:164)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:247)
at io.quarkus.resteasy.runtime.standalone.RequestDispatcher.service(RequestDispatcher.java:73)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.dispatch(VertxRequestHandler.java:138)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.access$000(VertxRequestHandler.java:41)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler$1.run(VertxRequestHandler.java:93)
at io.quarkus.runtime.CleanableExecutor$CleaningRunnable.run(CleanableExecutor.java:231)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2415)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1452)
at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
at java.base/java.lang.Thread.run(Thread.java:834)
at org.jboss.threads.JBossThread.run(JBossThread.java:501)
Resulted in: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103)
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:186)
... 78 more
Resulted in: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:154)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1602)
... 62 more
Resulted in: org.jboss.resteasy.spi.UnhandledException: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.jboss.resteasy.core.ExceptionHandler.handleApplicationException(ExceptionHandler.java:106)
at org.jboss.resteasy.core.ExceptionHandler.handleException(ExceptionHandler.java:372)
at org.jboss.resteasy.core.SynchronousDispatcher.writeException(SynchronousDispatcher.java:218)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:519)
... 18 more
I've set up my local H2 database connection by adding this to my application.yaml file:
quarkus:
  datasource:
    db-kind: h2
    username: sa
    jdbc:
      url: "jdbc:h2:mem:default"
  flyway:
    migrate-at-start: true
Furthermore, I'm using https://quarkus.io/guides/hibernate-orm-panache, which is quite easy to use. I've created a User entity and a User repository in my project. I then use those in my User resource, where I define the API:
@Path("/users")
class UsersResource {

    @Inject
    lateinit var userRepository: UserRepository

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    fun getAllUsers(): GetAllUsersResponse =
        try {
            GetAllUsersSuccess(userRepository.listAll())
        } catch (e: NotFoundException) {
            GetAllUsersFailure(e)
        }
}
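The repository itself is not shown here; with the Kotlin Panache extension it is little more than the following sketch (a guess at the real class, which the stack trace suggests builds on the Panache Kotlin support):
import io.quarkus.hibernate.orm.panache.kotlin.PanacheRepository
import javax.enterprise.context.ApplicationScoped

// Minimal Panache repository; listAll() is inherited from PanacheRepository
@ApplicationScoped
class UserRepository : PanacheRepository<User>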
I've already implemented Flyway for my database migration, and that seems to go well. The migration also inserts a row into the table, and I can see that "1 row is affected", so I think Flyway has access to the H2 database.
Do any of you guys know what I'm missing?
Kind regards
I found the problem: I had defined my column names in snake_case, while my entity properties were defined in camelCase.
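For illustration, one way to reconcile the two (a sketch only; the snake_case column names are inferred from the generated SQL above, so treat them as assumptions) is to map each property to its column explicitly:
import java.time.Instant
import javax.persistence.Column
import javax.persistence.Entity
import io.quarkus.hibernate.orm.panache.kotlin.PanacheEntity

@Entity
class User : PanacheEntity() {
    lateinit var email: String

    // Map camelCase properties to the snake_case columns created by Flyway
    @Column(name = "full_name")
    lateinit var fullName: String

    @Column(name = "created_at")
    lateinit var createdAt: Instant

    @Column(name = "updated_at")
    lateinit var updatedAt: Instant
}
Alternatively, rename the columns in the Flyway migration to match the property names.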

SQLScriptPreparer NullPointerException

We have successfully executed the DatabaseTablesPreparer and initialized the tables in the DB, but when we try to init the indexes on the table with SQLScriptPreparer, we get the following exception:
ES1 dbinit [] [] com.intershop.platform.cartridge.internal.CartridgeImpl [] [] [] [] "main" Neither Ivy descriptor nor cartridge properties found for cartridge 'app_core_a1'!
ES1 dbinit [] [app_core_a1:Class1 DatabaseIndexesPreparer [hr/a1/core/dbinit/scripts/dbindex.ddl] Version:null] com.intershop.beehive.core.dbinit.preparer.database.DatabaseIndexesPreparer [] [] [] [] "main" [core] Exception java.lang.NullPointerException: null
at com.intershop.beehive.core.dbinit.preparer.database.SQLScriptPreparer.getCommand(SQLScriptPreparer.java:158)
at com.intershop.beehive.core.dbinit.preparer.database.SQLScriptPreparer.process(SQLScriptPreparer.java:353)
We had a similar problem with DatabaseTablesPreparer (Cartridge was null), and we solved it by adding a cartridge.properties file, but now we are getting the same error ("Neither Ivy descriptor nor cartridge properties found for cartridge 'app_core_a1'") even though the cartridge.properties file is defined.
These are the lines in the decompiled preparer code where the NullPointerException occurs:
getCartridge().getVersion() + (getCartridge().getBuild().isEmpty() ? "" : new StringBuilder().append(".").append(getCartridge().getBuild()).toString()) };
This is the preparer from dbinit.properties:
Class1 = com.intershop.beehive.core.dbinit.preparer.database.DatabaseIndexesPreparer \
hr/a1/core/dbinit/scripts/dbindex.ddl
And this is the dbinit command we are executing:
dbinit.bat --exec-id=app_core_a1:Class1
The DatabaseTablesPreparer from the same cartridge, defined in the same dbinit.properties, executes successfully.
The problem was fixed by publishing the cartridge. It seems the Ivy descriptor had been deleted and had to be republished.

Aerospike UDF module not found

I'm writing a stream UDF for Aerospike 3.6.2, and I'd like to put some code in a separate Lua module. I followed the example exactly and created a file mymodule.lua with the following contents:
local exports = {}

function exports.one()
    return 1
end

function exports.two()
    return 2
end

return exports
and put my UDF in a file testUdf.lua:
local MM = require('mymodule')

local function three()
    return MM.one() + MM.two()
end

function testUdf(stream)
    local type = three()
    local testFilter = function(record)
        return record.campaignType == type
    end
    return stream : filter(testFilter)
end
I register both modules and execute a query from the Java client:
LuaConfig.SourceDirectory = this.udfPath;

List<RegisterTask> tasks = new ArrayList<>();
for (String udfName : new String[] { "mymodule.lua", "testUdf.lua" }) {
    File udf = new File(this.udfPath, udfName);
    tasks.add(this.aerospike.register(null, udf.getPath(), udfName, Language.LUA));
}
tasks.stream().forEach(RegisterTask::waitTillComplete);

Statement stmt = new Statement();
stmt.setNamespace(getRawFactNamespace());
stmt.setSetName(SET_FACT);
stmt.setFilters(Filter.equal(BIN_PRODUCT_CODE, "UX"));
stmt.setBinNames(FACT_BINS);

int count = 0;
try (ResultSet rs = this.aerospike.queryAggregate(null, stmt, "testUdf", "testUdf")) {
    while (rs.next()) {
        count++;
    }
}
I see the Lua files, with the correct contents, on both of my test Aerospike servers in aerospike/var/udf/lua, which is where mod-lua.user-path points in aerospike.conf. I logged package.path and it includes the aerospike/var/udf/lua directory as well.
But when I invoke my UDF, I get the following error:
Exception in thread "main" com.aerospike.client.AerospikeException: org.luaj.vm2.LuaError: testUdf:1 module 'mymodule' not found: mymodule
no field package.preload['mymodule']
mymodule.lua
no class 'mymodule'
stack traceback:
testUdf:1: in main chunk
[Java]: in ?
at com.aerospike.client.query.QueryExecutor.checkForException(QueryExecutor.java:122)
at com.aerospike.client.query.ResultSet.next(ResultSet.java:78)
...
Caused by: org.luaj.vm2.LuaError: testUdf:1 module 'mymodule' not found: mymodule
no field package.preload['mymodule']
mymodule.lua
no class 'mymodule'
stack traceback:
testUdf:1: in main chunk
[Java]: in ?
at org.luaj.vm2.LuaValue.error(Unknown Source)
at org.luaj.vm2.lib.PackageLib$require.call(Unknown Source)
at org.luaj.vm2.LuaClosure.execute(Unknown Source)
at org.luaj.vm2.LuaClosure.onInvoke(Unknown Source)
at org.luaj.vm2.LuaClosure.invoke(Unknown Source)
at org.luaj.vm2.LuaValue.invoke(Unknown Source)
at com.aerospike.client.lua.LuaInstance.loadPackage(LuaInstance.java:113)
at com.aerospike.client.query.QueryAggregateExecutor.runThreads(QueryAggregateExecutor.java:92)
at com.aerospike.client.query.QueryAggregateExecutor.run(QueryAggregateExecutor.java:77)
What am I doing wrong?

Is there a bug in Play2 testing with FakeRequests and chunked responses (Enumerator)?

I ran into an issue with Play 2.3.7 when testing an Action that returns a chunked response using an enumerator:
def text = Action {
  Ok.chunked(Enumerator("abc"))
}
Using curl http://localhost:9000/text I get the expected result, abc, but the following test:
class ApplicationSpec extends Specification {
  "Application" should {
    "stream text" in new WithApplication {
      val request = route(FakeRequest(GET, "/text")).get
      contentAsString(request) mustEqual "abc"
    }
  }
}
fails with a comparison error:
[info] Application should
[info] x stream text
[error] '3
[error] abc
[error] 0
[error]
[error] ' is not equal to 'abc' (ApplicationSpec.scala:31)
Where do those extra characters come from? I suspect it might be an issue with FakeRequest and Enumerators. In a more complex case with concatenated Enumerators in the Action, such characters appear mixed in between the content generated by the Enumerators.
This is a known issue that has been fixed for the upcoming Play 2.4, but the fix is not available in 2.3.x. The extra characters come from the chunked transfer encoding: each chunk of the HTTP response body is prefixed with its length in hexadecimal, and a zero-length chunk terminates the body, so "abc" is transmitted as "3", "abc", "0", which are exactly the pieces visible in the comparison error above. The old Play test helper simply concatenates these pieces instead of weeding them out.
I've been using the following code to work around the problem on 2.3.x for now (thanks to marcuslinke's post in this GitHub issue):
import scala.concurrent._
import scala.concurrent.duration._
import play.api.mvc._
import play.api.libs.iteratee._
import akka.util.Timeout

def contentAsBytes(of: Future[Result])(implicit timeout: Timeout): Array[Byte] = {
  val result = Await.result(of, timeout.duration)
  val eBytes = result.header.headers.get(TRANSFER_ENCODING) match {
    case Some("chunked") => result.body &> Results.dechunk
    case _ => result.body
  }
  Await.result(eBytes |>>> Iteratee.consume[Array[Byte]](), timeout.duration)
}
I use it in tests (specs2) like this:
new String(contentAsBytes(result)) must equalTo("expected value")
For reference, here is the pull request that was merged into master.