Apache Pig: ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: can't look backwards more than one token in this stream

I wrote a UDF that returns a string, and here is a sample of the code:
split data into purchased IF ((boolean) (myudf(param))), failed OTHERWISE;
As an example, here is what that line looks like with the value my UDF returns:
split data into purchased IF ((boolean) (retcode == 'SUCCESS')), failed OTHERWISE;
Unfortunately, I get the following error:
Apache Pig: ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: can't look backwards more than one token in this stream
I also tried this:
split data into purchased IF ((boolean) '(retcode == 'SUCCESS')'), failed OTHERWISE;
I get this error:
2015-06-19 10:10:48,330 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: <line 11, column 85> Syntax error, unexpected symbol at or near '250.00'
I also tried this:
split data into purchased IF ((boolean) '(retcode == \'SUCCESS\')'), failed OTHERWISE;
I don't get any error, but I don't get the expected result back.
Any help with this would be great.

That error is thrown because ANTLR can't correctly parse that statement. Pig should give you a different error showing what the problem is, as it generally does, but it seems the parsing rules for the SPLIT statement don't take into account what happens when you try to cast the condition.
The problem here is solved simply by removing that cast to boolean:
split data into purchased IF (retcode == 'SUCCESS'), failed OTHERWISE;
That will work.
Why does the cast make it fail? That I don't know. My guess is that casting the output of an expression counts as two different expressions, since Pig has to resolve the inner expression first and apply the cast afterwards, and the syntax rule for the SPLIT operator does not allow this. Not 100% sure, though.
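If the UDF still has to drive the split, one workaround consistent with that guess is to evaluate everything up front in a FOREACH, so the SPLIT condition is a single plain expression. A sketch with hypothetical file, jar, and field names, and assuming the UDF can be changed to return a directly comparable value such as 'SUCCESS':
REGISTER 'myudfs.jar';                    -- hypothetical jar holding the UDF
DEFINE myudf com.example.MyUdf();         -- hypothetical class name
data    = LOAD 'input' USING PigStorage(',') AS (param:chararray, retcode:chararray);
-- evaluate the condition once; SPLIT then only tests a plain precomputed field
flagged = FOREACH data GENERATE *, (myudf(param) == 'SUCCESS' ? 1 : 0) AS ok:int;
SPLIT flagged INTO purchased IF ok == 1, failed OTHERWISE;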


ERROR: Limit set by ERRORS= option reached. Further errors for this INPUT function will not be printed

While using the code below, I'm getting the error message shown, although I still get the output table.
input(put(serv_to_DT_KEY,8.),yymmdd8.)
between datepart(D.throughdate)
and datepart(intnx('day',d.throughdate,31))
Error: INPUT function reported 'ERROR: Invalid date value' while processing WHERE clause.
ERROR: Limit set by ERRORS= option reached. Further errors for this INPUT function will not be printed.
Could you please help?
It's not a true error, exactly; it's a data error, which SAS will let by. What it's saying is that the value in serv_to_DT_KEY is not a valid yymmdd8. value for some records. (The rest of the message, about "Limit set by ...", is just SAS saying it will only report the first 20 or so individual data errors instead of showing you every single one, so your log isn't hopeless.)
To fix this you have several options:
If there is a data issue [i.e., all rows should have a valid date], then fix that.
If it's okay that some rows don't have a valid date (for example, the value might be missing), you can use the ?? modifier on the informat in INPUT to tell it to ignore those.
Like this:
input(put(serv_to_DT_KEY,8.),??yymmdd8.)
between datepart(D.throughdate)
and datepart(intnx('day',d.throughdate,31))
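In context, the fragment would sit in a query something like the following sketch, where the PROC SQL wrapper and the table names are assumptions (only the ?? on the informat matters):
proc sql;
  create table want as               /* output table name is an assumption */
  select *
    from have, dates d               /* placeholder names behind the original aliases */
   where input(put(serv_to_DT_KEY, 8.), ?? yymmdd8.)
         between datepart(d.throughdate)
             and datepart(intnx('day', d.throughdate, 31));
quit;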

'An invalid floating point operation occurred' using LOG function SQL Server

I am trying to perform an operation in SQL Server using LOG() and a few fields within my table. When I try to run the query, I get the error
'An invalid floating point operation occurred.'
I tried the solution provided in An invalid floating point operation occurred, but I was not able to get it to work. This is the original code that I used:
SELECT
MIN(ROUND((ROUND(Log(amt1 / ABS(rate * amt2 - amt1)),5) / (round(LOG(1 + rate),10))),0))
FROM table
WHERE amt2 - amt1 > 0
I expected the output to show a value. I can run this for one specific field, but as I expand it out to the whole data set the error occurs.
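For what it's worth, SQL Server raises this exact error when LOG() receives zero or a negative number, which can still happen here despite the WHERE filter (for example, when amt1 is not positive, or when rate <= -1). A defensive sketch, guarding each LOG() argument so offending rows yield NULL instead of an error (dbo.some_table is a placeholder for the real table name):
SELECT
    MIN(ROUND(
        ROUND(CASE WHEN amt1 / NULLIF(ABS(rate * amt2 - amt1), 0) > 0
                   THEN LOG(amt1 / NULLIF(ABS(rate * amt2 - amt1), 0))
              END, 5)                                       -- NULL instead of LOG(<=0)
        / NULLIF(ROUND(CASE WHEN 1 + rate > 0
                            THEN LOG(1 + rate) END, 10), 0) -- also avoids divide-by-zero
    , 0))
FROM dbo.some_table  -- placeholder table name
WHERE amt2 - amt1 > 0;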

CF Report Builder Calculated Field Error

I am working on an existing ColdFusion report and am getting an error when trying to create a calculated field.
When I use the following expression in the field:
IIF(calc.WI_TOT_AGT_CNT_MTD NEQ 0 AND CALC.WI_TOT_AGT_CNT_MTD NEQ '',
'(CALC.WI_TOT_SRVY_CNT_MTD / CALC.WI_TOT_AGT_CNT_MTD)',
DE('-'))
it runs fine. The problem arises when I update this expression to use a different calculated field:
IIF(calc.FL_AGT_CNT_TOTAL NEQ 0 AND CALC.WI_TOT_AGT_CNT_MTD NEQ '',
'(CALC.WI_TOT_SRVY_CNT_MTD / CALC.WI_TOT_AGT_CNT_MTD)',
DE('-'))
I get an error. I updated just one piece at a time to see if I could pinpoint what is causing the error. I can pass "Calc.FL_AGT_CNT_TOTAL" into the report and verify that it returns 0 as the value. I have verified that calc.FL_AGT_CNT_TOTAL is the same data type as calc.WI_TOT_SRVY_CNT_MTD.
The error I am getting is just a generic "An error has occured, please contact administrator", and I can't figure out where in this ColdFusion application the error is being redirected from. Any ideas as to what could be causing this calculation to fail? Thanks!
The calculated field was a float data type, but when calc.FL_AGT_CNT_TOTAL is 0 the IIF falls through to DE('-'), so it was trying to pass '-', which is a string.
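A sketch of one fix, assuming a numeric placeholder is acceptable for the zero case; if the '-' must be kept, the alternative is to change the calculated field's data type to string:
IIF(calc.FL_AGT_CNT_TOTAL NEQ 0 AND CALC.WI_TOT_AGT_CNT_MTD NEQ '',
'(CALC.WI_TOT_SRVY_CNT_MTD / CALC.WI_TOT_AGT_CNT_MTD)',
DE('0'))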

CSV file input not working together with set field value step in Pentaho Kettle

I have a very simple Pentaho Kettle transformation that causes a strange error. It consists of reading a field X from a CSV file, adding a field Y, setting Y=X, and finally writing it back to another CSV file.
Here you can see the steps and the configuration for them:
You can also download the ktr file from here. The input data is just this:
1
2
3
When I run this transformation, I get this error message:
ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Unexpected error
ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
Error writing line
Error writing field content to file
Y Number : There was a data type error: the data type of [B object [[B@b4136a] does not correspond to value meta [Number]
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeRowToFile(TextFileOutput.java:273)
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.processRow(TextFileOutput.java:195)
    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
    at java.lang.Thread.run(Unknown Source)
Caused by: org.pentaho.di.core.exception.KettleStepException:
Error writing field content to file
Y Number : There was a data type error: the data type of [B object [[B@b4136a] does not correspond to value meta [Number]
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeField(TextFileOutput.java:435)
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeRowToFile(TextFileOutput.java:249)
    ... 3 more
Caused by: org.pentaho.di.core.exception.KettleValueException:
Y Number : There was a data type error: the data type of [B object [[B@b4136a] does not correspond to value meta [Number]
    at org.pentaho.di.core.row.value.ValueMetaBase.getBinaryString(ValueMetaBase.java:2185)
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.formatField(TextFileOutput.java:290)
    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeField(TextFileOutput.java:392)
    ... 4 more
All of the above lines start with 2015/09/23 12:51:18 - Text file output.0 -, but I edited it out for brevity. I think the relevant, and confusing, part of the error message is this:
Y Number : There was a data type error: the data type of [B object [[B@b4136a] does not correspond to value meta [Number]
Some further notes:
If I bypass the Set field value step by using the lower hop instead, the transformation finishes without errors. This leads me to believe that it is the Set field value step that causes the problem.
If I replace the CSV file input with a Data Grid step containing the same data (1,2,3), everything works just fine.
If I replace the file output step with a dummy, the transformation finishes without errors. However, if I preview the dummy, it causes a similar error and the field Y has the value <null> on all three rows.
Before I created this MCVE I got the error on all sorts of seemingly random steps, even when there was no file output present. So I do not think this is related to the file output.
If I change the format from Number to Integer, nothing changes. But if I change it to String the transformation finishes without errors, and I get this output:
X;Y
1;[B@49e96951
2;[B@7b016abf
3;[B@1a0760b0
Is this a bug? Am I doing something wrong? How can I make this work?
It's because of lazy conversion. Turn it off. This is behaving exactly as designed, although admittedly the error and the user experience could be improved.
With lazy conversion enabled, fields are carried through the stream as raw byte arrays (that is the [B you see in the output above) and are only converted when strictly necessary, so it must not be used when a step needs to access the field's value, which is exactly what Set field value does. The default should probably be off rather than on.
If your field is going directly to a database, then use lazy conversion and it will be faster.
You can even have "partially lazy" streams, where you use lazy conversion for speed but then use a Select values step to "un-lazify" the fields you want to access, while the remainder stay lazy.
Cunning huh?

Extended errno in LDAP

Following are the lines of my code in C:
if (ldap_bind_s(ld, root_dn, root_pw, auth_method) != LDAP_SUCCESS)
    ldap_perror(ld, "ldap_bind"); /* to print the EXACT error like 525, 52e */
When executed
./a.out CN=username,OU=ABC,DC=example,DC=com wrong-password
ldap_bind: Invalid credentials (49)
additional info: 80090308: LdapErr: DSID-0C090334, comment:
AcceptSecurityContext error, data 52e, vece
Here the string 52e after "data" represents the extended error.
Can anyone please suggest how I can access this extended error directly? Currently I am parsing the string to extract this value. Normally when I print LDAP_OPT_ERROR_NUMBER it returns just 49 (INVALID_CREDENTIALS), but that is not sufficient for me. How do I get the code (only the code) of the extended error?
I even tried printing ld->ld_errno, but that is not allowed, since my code has no access to the layout of ld.
You have to parse the error string in order to extract the error number. You cannot read the error code from the LDAP structure directly, because it is an opaque data type: client code sees only an incomplete definition of the structure, so its members can be accessed only through the library's own routines.
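Since the structure is opaque, the supported route is to fetch the diagnostic string through the library and parse it. A minimal sketch of that approach, assuming OpenLDAP (the option name and the position of the "data " token are the assumptions here; older code may use the deprecated LDAP_OPT_ERROR_STRING instead of LDAP_OPT_DIAGNOSTIC_MESSAGE):
#include <stdlib.h>
#include <string.h>
#include <ldap.h>

static long extended_errno(LDAP *ld)
{
    char *diag = NULL;
    long  code = -1;

    /* Fetch the server-supplied string, e.g.
     * "80090308: LdapErr: DSID-0C090334, ..., data 52e, vece" */
    if (ldap_get_option(ld, LDAP_OPT_DIAGNOSTIC_MESSAGE, &diag) == LDAP_OPT_SUCCESS
            && diag != NULL) {
        char *p = strstr(diag, "data ");
        if (p != NULL)
            code = strtol(p + 5, NULL, 16); /* the code is hexadecimal */
        ldap_memfree(diag);
    }
    return code;
}
Called right after the failed bind, this would return 0x52e (1326, ERROR_LOGON_FAILURE) for the output shown above.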