OpenNI 1.5 changes list? - kinect

I am using OpenNI v1.5.2.23, but most samples I find are for 1.0.0.23, so I have to change the code every time to use the new features instead of the obsolete ones.
Examples:
OldCode: using xn;
NewCode: using OpenNI;
---
OldCode: depth.GetDepthMapPtr().ToPointer();
NewCode: depth.DepthMapPtr.ToPointer();
---
OldCode: depth.GetMapOutputMode();
NewCode: depth.MapOutputMode;
---
OldCode: new Context(@"..\..\data\openniconfig.xml");
NewCode: Context.CreateFromXmlFile(@"..\..\data\openniconfig.xml", out scriptNode);
---
OldCode: Bitmap((int)mapMode.nXRes, (int)mapMode.nYRes ...
NewCode: Bitmap((int)mapMode.XRes, (int)mapMode.YRes ...
---
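Put together, a minimal new-style setup looks roughly like this (a sketch assembled from the mappings above plus the FindExistingNode call used in the bundled .NET samples; verify the names against your wrapper version):
using OpenNI;  // was: using xn;

ScriptNode scriptNode;
Context context = Context.CreateFromXmlFile(@"..\..\data\openniconfig.xml", out scriptNode);
DepthGenerator depth = context.FindExistingNode(NodeType.Depth) as DepthGenerator;

MapOutputMode mapMode = depth.MapOutputMode;  // was: depth.GetMapOutputMode()
Bitmap bmp = new Bitmap((int)mapMode.XRes, (int)mapMode.YRes);  // was: nXRes/nYRes
IntPtr depthPtr = depth.DepthMapPtr;  // was: depth.GetDepthMapPtr()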
I look at the old samples and then dig through newer samples or documentation to find the right new method/class/property, but that is very time-consuming.
So the question is: has anyone made any sort of list of what has changed? (The changelog that comes with OpenNI didn't help me much.)

Sportsipy API request

I need to use the sportsipy API to get the schedule for all teams in a dataframe. This is what I have:
from sportsreference.nba.schedule import Schedule
league = ['MIL','CHO','LAL','LAC','SAC','ATL','MIA','DAL','POR',
'HOU','NOP','PHO','WAS','MEM','BOS','DEN','TOR','SAS',
'PHI','BRK','UTA','IND','OKC','ORL','MIN','DET',
'NYK','CLE','CHI','GSW']
for i in league:
    mil2019 = Schedule(i, year='2020')
    mil2019.dataframe_extended
The error I get is:
TypeError: unsupported operand type(s) for -: 'NoneType' and 'NoneType'
As mentioned in the comment above, I believe your import is wrong. Using package sportsipy 0.6.0 and following the docs (https://sportsipy.readthedocs.io/en/stable/), I was able to achieve your desired result using the following code:
from sportsipy.nba.schedule import Schedule
# MIL removed from league list as it is used to initiate league_schedule
league = ['CHO','LAL','LAC','SAC','ATL','MIA','DAL','POR',
'HOU','NOP','PHO','WAS','MEM','BOS','DEN','TOR','SAS',
'PHI','BRK','UTA','IND','OKC','ORL','MIN','DET',
'NYK','CLE','CHI','GSW']
league_schedule = Schedule('MIL', year="2020").dataframe
for team in league:
    league_schedule = league_schedule.append(Schedule(team, year="2020").dataframe)
(The resulting dataframe has dimensions [2286 rows x 15 columns].)
The same should work with dataframe_extended, but it takes a rather long time to get all that data. Maybe double-check whether you need all of it.
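For example (the same pattern as above, just swapping in dataframe_extended):
league_schedule = Schedule('MIL', year="2020").dataframe_extended
for team in league:
    league_schedule = league_schedule.append(Schedule(team, year="2020").dataframe_extended)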
In case I am wrong and the package in your question is correct, please add additional info to your question, such as where we can get that package.
It appears you are using the module from pip install sportsreference (from here), which is on v0.5.2, in which case that is a valid import, even though you mentioned you're using sportsipy, which caused some confusion for others. The latest version has refactored the package name to sportsipy.
If it wasn't a valid import, it would be throwing an import error on the very first line, so I'm not sure why folks are getting hung up on that.
You really should include the entire Python traceback, not just the final message, so we can determine exactly where in your code and the module's source code this exception is being raised. Also include the specific version of the library you're using, e.g. from pip freeze.
My initial thought is one of the requests somewhere for one of these teams is returning something unexpected and the library is not handling it properly, but without the full traceback that's just a theory.
It's probably a bug in v0.5.2. I would try using the latest version from git and see if you can reproduce the error. Something, somewhere isn't validating that things are what it expects before trying to use them. If I had the full traceback, I could tell you exactly where.
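For example (the repo name is an assumption based on the sportsipy docs linked above):
pip install git+https://github.com/roclark/sportsipy.git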
You could try catching the TypeError and passing on it, to see if skipping it allows everything else to continue working, but without knowing exactly where the error is coming from it's hard to say for sure at this point.
for i in league:
    try:
        mil2019 = Schedule(i, year='2020')
        mil2019.dataframe_extended
    except TypeError:
        pass
This won't fix the problem (it actually hides it), but if it's just one record from one game returning something unexpected, this would at least let you get the rest of the results, possibly. It's also possible the issue would create other problems later, depending on exactly what it is. Again, this is where the whole traceback would have been helpful.
I will say that trying your code for just one team works for me. For example:
from sportsreference.nba.schedule import Schedule
mil2019 = Schedule("MIL", year="2020")
print(mil2019.dataframe_extended.head(10))
Returns this:
away_assist_percentage ... winning_name
201910240HOU 67.4 ... Milwaukee Bucks
201910260MIL 71.7 ... Miami Heat
201910280MIL 42.2 ... Cleveland Cavaliers
201910300BOS 55.3 ... Milwaukee Bucks
201911010ORL 51.1 ... Milwaukee Bucks
201911020MIL 70.6 ... Toronto Raptors
201911040MIN 48.0 ... Milwaukee Bucks
201911060LAC 42.9 ... Milwaukee Bucks
201911080UTA 41.2 ... Milwaukee Bucks
201911100OKC 57.4 ... Milwaukee Bucks
[10 rows x 82 columns]
It takes forever just to get the games for one team. The library is not passing an existing requests.Session() around when calling PyQuery (even though PyQuery supports a session kwarg), so every request for every box score renegotiates a fresh TCP connection, which is absurd, but I digress:
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): www.basketball-reference.com:80
DEBUG:urllib3.connectionpool:http://www.basketball-reference.com:80 "GET /teams/MIL/2020_games.html HTTP/1.1" 301 183
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): www.basketball-reference.com:443
DEBUG:urllib3.connectionpool:https://www.basketball-reference.com:443 "GET /teams/MIL/2020_games.html HTTP/1.1" 200 34571
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): www.basketball-reference.com:443
DEBUG:urllib3.connectionpool:https://www.basketball-reference.com:443 "GET /boxscores/201910240HOU.html HTTP/1.1" 200 46549
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): www.basketball-reference.com:443
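For illustration, this is the kind of reuse that would avoid those reconnections (a sketch only; the session kwarg is taken from the PyQuery note above, so treat it as an assumption):
import requests
from pyquery import PyQuery

session = requests.Session()  # keep-alive: one TCP/TLS handshake, reused across requests
doc = PyQuery(url='https://www.basketball-reference.com/boxscores/201910240HOU.html',
              session=session)  # assumed kwarg, per the note above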
I would add some debugging to your code to establish which team your code is working on when this exception is raised. Try first with one team like I did and confirm it generally works, then iterate through the list of teams with logging enabled like:
import logging
from sportsreference.nba.schedule import Schedule
logging.basicConfig(level=logging.DEBUG)
league = ['CHO', 'LAL', 'LAC', 'SAC', 'ATL', 'MIA', 'DAL', 'POR',
'HOU', 'NOP', 'PHO', 'WAS', 'MEM', 'BOS', 'DEN', 'TOR', 'SAS',
'PHI', 'BRK', 'UTA', 'IND', 'OKC', 'ORL', 'MIN', 'DET',
'NYK', 'CLE', 'CHI', 'GSW']
for i in league:
    logging.info("Working on team: %s", i)
    mil2019 = Schedule(i, year="2020")
    print(mil2019.dataframe_extended)
This way you will know specifically which team and which request are responsible for the issue, and that will help you determine the root cause.

nutch 1.16 crawl example from NutchTutorial returns NoSuchMethodError on org.apache.commons.cli.OptionBuilder (Windows 10)

I have been trying to run a Nutch 1.16 crawler using the code example and instructions from https://cwiki.apache.org/confluence/display/NUTCH/NutchTutorial, but no matter what I do, I seem to get stuck when initiating the actual crawl.
I'm running it through Cygwin64 on a Windows 10 machine, using a binary installation (though I have also tried compiling one, with the same results). Initially, Nutch would throw an UnsatisfiedLinkError (NativeIO$Windows.access0), which I fixed by adding libraries from several other answers for the same issue. Upon doing so, I could at least start a server, but trying to crawl through Nutch itself would return a NoSuchMethodError no matter what I did. nutch-site.xml only contains the http.agent.name and plugin.includes options, both taken from the same example (see the sketch below).
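For reference, the shape of that nutch-site.xml (the agent name value here is a placeholder and the plugin.includes value is elided; both were copied from the tutorial):
<configuration>
  <property>
    <name>http.agent.name</name>
    <value>MyCrawler</value> <!-- placeholder; use whatever you set -->
  </property>
  <property>
    <name>plugin.includes</name>
    <value>...</value> <!-- value as given in the tutorial -->
  </property>
</configuration>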
The following is the error message (I also tried omitting seed.txt):
$ bin/nutch inject crawl/crawldb urls/seed.txt
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.OptionBuilder.withArgPattern(Ljava/lang/String;I)Lorg/apache/commons/cli/OptionBuilder;
at org.apache.hadoop.util.GenericOptionsParser.buildGeneralOptions(GenericOptionsParser.java:207)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:370)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:138)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
at org.apache.nutch.crawl.Injector.main(Injector.java:534)
The following is the list of libraries currently present in the lib directory:
activation-1.1.jar
amqp-client-5.2.0.jar
animal-sniffer-annotations-1.14.jar
antlr-runtime-3.5.2.jar
antlr4-4.5.1.jar
aopalliance-1.0.jar
apache-nutch-1.16.jar
apacheds-i18n-2.0.0-M15.jar
apacheds-kerberos-codec-2.0.0-M15.jar
api-asn1-api-1.0.0-M20.jar
api-util-1.0.0-M20.jar
args4j-2.0.16.jar
ascii-utf-themes-0.0.1.jar
asciitable-0.3.2.jar
asm-3.3.1.jar
asm-7.1.jar
avro-1.7.7.jar
bootstrap-3.0.3.jar
cglib-2.2.1-v20090111.jar
cglib-2.2.2.jar
char-translation-0.0.2.jar
checker-compat-qual-2.0.0.jar
closure-compiler-v20130603.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2-sources.jar
commons-cli-1.2.jar
commons-codec-1.11.jar
commons-collections-3.2.2.jar
commons-collections4-4.2.jar
commons-compress-1.18.jar
commons-configuration-1.6.jar
commons-daemon-1.0.13.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-jexl-2.1.1.jar
commons-lang-2.6.jar
commons-lang3-3.8.1.jar
commons-logging-1.1.3.jar
commons-math3-3.1.1.jar
commons-net-3.1.jar
crawler-commons-1.0.jar
curator-client-2.7.1.jar
curator-framework-2.7.1.jar
curator-recipes-2.7.1.jar
cxf-core-3.3.3.jar
cxf-rt-bindings-soap-3.3.3.jar
cxf-rt-bindings-xml-3.3.3.jar
cxf-rt-databinding-jaxb-3.3.3.jar
cxf-rt-frontend-jaxrs-3.3.3.jar
cxf-rt-frontend-jaxws-3.3.3.jar
cxf-rt-frontend-simple-3.3.3.jar
cxf-rt-security-3.3.3.jar
cxf-rt-transports-http-3.3.3.jar
cxf-rt-transports-http-jetty-3.3.3.jar
cxf-rt-ws-addr-3.3.3.jar
cxf-rt-ws-policy-3.3.3.jar
cxf-rt-wsdl-3.3.3.jar
dom4j-1.6.1.jar
ehcache-3.3.1.jar
elasticsearch-0.90.1.jar
error_prone_annotations-2.1.3.jar
FastInfoset-1.2.16.jar
geronimo-jcache_1.0_spec-1.0-alpha-1.jar
gora-hbase-0.3.jar
gson-2.2.4.jar
guava-25.0-jre.jar
guice-3.0.jar
guice-servlet-3.0.jar
h2-1.4.197.jar
hadoop-0.20.0-ant.jar
hadoop-0.20.0-core.jar
hadoop-0.20.0-examples.jar
hadoop-0.20.0-test.jar
hadoop-0.20.0-tools.jar
hadoop-annotations-2.9.2.jar
hadoop-auth-2.9.2.jar
hadoop-common-2.9.2.jar
hadoop-core-1.2.1.jar
hadoop-core_0.20.0.xml
hadoop-core_0.21.0.xml
hadoop-core_0.22.0.xml
hadoop-hdfs-2.9.2.jar
hadoop-hdfs-client-2.9.2.jar
hadoop-mapreduce-client-common-2.2.0.jar
hadoop-mapreduce-client-common-2.9.2.jar
hadoop-mapreduce-client-core-2.2.0.jar
hadoop-mapreduce-client-core-2.9.2.jar
hadoop-mapreduce-client-jobclient-2.2.0.jar
hadoop-mapreduce-client-jobclient-2.9.2.jar
hadoop-mapreduce-client-shuffle-2.2.0.jar
hadoop-mapreduce-client-shuffle-2.9.2.jar
hadoop-yarn-api-2.9.2.jar
hadoop-yarn-client-2.9.2.jar
hadoop-yarn-common-2.9.2.jar
hadoop-yarn-registry-2.9.2.jar
hadoop-yarn-server-common-2.9.2.jar
hadoop-yarn-server-nodemanager-2.9.2.jar
hbase-0.90.0-tests.jar
hbase-0.90.0.jar
hbase-0.92.1.jar
hbase-client-0.98.0-hadoop2.jar
hbase-common-0.98.0-hadoop2.jar
hbase-protocol-0.98.0-hadoop2.jar
HikariCP-java7-2.4.12.jar
htmlparser-1.6.jar
htrace-core-2.04.jar
htrace-core4-4.1.0-incubating.jar
httpclient-4.5.6.jar
httpcore-4.4.9.jar
httpcore-nio-4.4.9.jar
icu4j-61.1.jar
istack-commons-runtime-3.0.8.jar
j2objc-annotations-1.1.jar
jackson-annotations-2.9.9.jar
jackson-core-2.9.9.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.9.9.jar
jackson-dataformat-cbor-2.9.9.jar
jackson-jaxrs-1.9.13.jar
jackson-jaxrs-base-2.9.9.jar
jackson-jaxrs-json-provider-2.9.9.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-jaxb-annotations-2.9.9.jar
jackson-xc-1.9.13.jar
jakarta.activation-api-1.2.1.jar
jakarta.ws.rs-api-2.1.5.jar
jakarta.xml.bind-api-2.3.2.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
java-xmlbuilder-0.4.jar
javassist-3.12.1.GA.jar
javax.annotation-api-1.3.2.jar
javax.inject-1.jar
javax.persistence-2.2.0.jar
javax.servlet-api-3.1.0.jar
jaxb-api-2.2.2.jar
jaxb-impl-2.2.3-1.jar
jaxb-runtime-2.3.2.jar
jcip-annotations-1.0-1.jar
jersey-client-1.19.4.jar
jersey-core-1.9.jar
jersey-guice-1.9.jar
jersey-json-1.9.jar
jersey-server-1.9.jar
jets3t-0.9.0.jar
jettison-1.1.jar
jetty-6.1.26.jar
jetty-client-6.1.22.jar
jetty-continuation-9.4.19.v20190610.jar
jetty-http-9.4.19.v20190610.jar
jetty-io-9.4.19.v20190610.jar
jetty-security-9.4.19.v20190610.jar
jetty-server-9.4.19.v20190610.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
jetty-util-9.4.19.v20190610.jar
joda-time-2.3.jar
jquery-2.0.3-1.jar
jquery-selectors-0.0.3.jar
jquery-ui-1.10.2-1.jar
jquerypp-1.0.1.jar
jsch-0.1.54.jar
json-smart-1.3.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsp-api-2.1.jar
jsr305-3.0.0.jar
junit-3.8.1.jar
juniversalchardet-1.0.3.jar
leveldbjni-all-1.8.jar
log4j-1.2.17.jar
lucene-analyzers-common-4.3.0.jar
lucene-codecs-4.3.0.jar
lucene-core-4.3.0.jar
lucene-grouping-4.3.0.jar
lucene-highlighter-4.3.0.jar
lucene-join-4.3.0.jar
lucene-memory-4.3.0.jar
lucene-queries-4.3.0.jar
lucene-queryparser-4.3.0.jar
lucene-sandbox-4.3.0.jar
lucene-spatial-4.3.0.jar
lucene-suggest-4.3.0.jar
maven-parent-config-0.3.4.jar
metrics-core-3.0.1.jar
modernizr-2.6.2-1.jar
mssql-jdbc-6.2.1.jre7.jar
neethi-3.1.1.jar
netty-3.6.2.Final.jar
netty-all-4.0.23.Final.jar
nimbus-jose-jwt-4.41.1.jar
okhttp-2.7.5.jar
okio-1.6.0.jar
org.apache.commons.cli-1.2.0.jar
ormlite-core-5.1.jar
ormlite-jdbc-5.1.jar
oro-2.0.8.jar
paranamer-2.3.jar
protobuf-java-2.5.0.jar
reflections-0.9.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5.jar
skb-interfaces-0.0.1.jar
slf4j-api-1.7.26.jar
slf4j-log4j12-1.7.25.jar
snappy-java-1.0.5.jar
spatial4j-0.3.jar
spring-aop-4.0.9.RELEASE.jar
spring-beans-4.0.9.RELEASE.jar
spring-context-4.0.9.RELEASE.jar
spring-core-4.0.9.RELEASE.jar
spring-expression-4.0.9.RELEASE.jar
spring-web-4.0.9.RELEASE.jar
ST4-4.0.8.jar
stax-api-1.0-2.jar
stax-ex-1.8.1.jar
stax2-api-3.1.4.jar
t-digest-3.2.jar
tika-core-1.22.jar
txw2-2.3.2.jar
typeaheadjs-0.9.3.jar
warc-hadoop-0.1.0.jar
webarchive-commons-1.1.5.jar
wicket-bootstrap-core-0.9.2.jar
wicket-bootstrap-extensions-0.9.2.jar
wicket-core-6.17.0.jar
wicket-extensions-6.13.0.jar
wicket-ioc-6.17.0.jar
wicket-request-6.17.0.jar
wicket-spring-6.17.0.jar
wicket-util-6.17.0.jar
wicket-webjars-0.4.0.jar
woodstox-core-5.0.3.jar
wsdl4j-1.6.3.jar
xercesImpl-2.12.0.jar
xml-apis-1.4.01.jar
xml-resolver-1.2.jar
xmlenc-0.52.jar
xmlParserAPIs-2.6.2.jar
xmlschema-core-2.2.4.jar
zookeeper-3.4.6.jar
This is my java version:
java version "1.8.0_241"
Java(TM) SE Runtime Environment (build 1.8.0_241-b07)
Java HotSpot(TM) 64-Bit Server VM (build 25.241-b07, mixed mode)
I'd also like to point out that, despite what another answer may have said, Nutch 1.4 (or any other version of Nutch, for that matter) did NOT resolve the issue, at least on Windows.
EDIT: The following answer worked for me, but I left the original one below because it may still be useful to someone working with other versions of Nutch.
Again, thanks to Sebastian Nagel: to get around the NoSuchMethodError, just edit ivy\ivy.xml to reference a different version of the Hadoop libraries. In my case I installed Hadoop 3.1.3, and I also added the corresponding 3.1.3 versions of winutils.exe and hadoop.dll to the hadoop\bin directory referenced by HADOOP_HOME. Running bin/crawl now seems to work correctly.
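For reference, the change amounts to bumping the rev attribute on the Hadoop dependencies in ivy\ivy.xml to the version you installed. A sketch (the exact artifact list should be matched against your ivy.xml):
<dependency org="org.apache.hadoop" name="hadoop-common" rev="3.1.3" conf="*->default"/>
<dependency org="org.apache.hadoop" name="hadoop-hdfs" rev="3.1.3" conf="*->default"/>
<dependency org="org.apache.hadoop" name="hadoop-mapreduce-client-core" rev="3.1.3" conf="*->default"/>
<dependency org="org.apache.hadoop" name="hadoop-mapreduce-client-jobclient" rev="3.1.3" conf="*->default"/>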
Outdated answer: Okay, after working on the source code itself (courtesy of https://github.com/apache/commons-cli) at the suggestion of Sebastian Nagel, I was able to find the (very simple) implementation of the method (https://github.com/marcelmaatkamp/EntityExtractorUtils/blob/master/src/main/java/org/apache/commons/cli/OptionBuilder.java):
/**
 * The next Option created will have an argument pattern and
 * the number of pattern occurrences
 *
 * @param argPattern string representing a pattern regex
 * @param limit the number of pattern occurrences in the argument
 * @return the OptionBuilder instance
 */
public static OptionBuilder withArgPattern(String argPattern,
                                           int limit)
{
    OptionBuilder.argPattern = argPattern;
    OptionBuilder.limit = limit;
    return instance;  // assumption: returns the shared builder instance, as the other OptionBuilder methods do
}
Using Maven I was then able to compile the code into its own jar file, which I added to the lib folder of Apache Nutch.
This still did not completely resolve my problem, as there seem to be deprecated functions in use throughout the Nutch framework, which will probably mean more work under similar circumstances (for instance, right after adding the new jar I got a NoSuchMethodError for org.apache.hadoop.mapreduce.Job.getInstance).
I leave this answer here as a temporary solution for anyone who has gotten stuck on the same issue, but I surely wish there were an easier way to find out which methods appear in which jar file before exploring their entire structure, though there may be one I'm unaware of.

Kafka 1.0.0 - Serialized.with() uses default serde instead of the ones provided

We recently updated our Kafka version from 0.10 to 1.0, and I am updating the deprecated code
KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(Serdes.Long(), mySerde, sub_topic)
    .groupByKey(Serdes.Long(), mySerde)
    .reduce(myReducer, my_store);
to this
KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(sub_topic, Consumed.with(Serdes.Long(), mySerde))
    .groupByKey(Serialized.with(Serdes.Long(), mySerde))
    .reduce(myReducer, Materialized.as(my_store));
My stream throws an error while serializing in groupByKey. Serialized.with() does not use the key serde provided and defaults back to the ByteArray serde, which then encounters my key (a Long) and throws a cast error.
Has anyone else encountered this error in the 1.0.0 version of Kafka? The first snippet, with the outdated API, works fine, but updating the code to use Serialized.with() does not seem to work. Any help is greatly appreciated.
Can you share the stack trace? I actually think the issue is with reduce() -- you need to specify the Serdes via Materialized again.
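Something like this (a sketch against the 1.0 API; KeyValueStore is org.apache.kafka.streams.state.KeyValueStore and Bytes is org.apache.kafka.common.utils.Bytes):
KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(sub_topic, Consumed.with(Serdes.Long(), mySerde))
    .groupByKey(Serialized.with(Serdes.Long(), mySerde))
    .reduce(myReducer,
            Materialized.<Long, myClass, KeyValueStore<Bytes, byte[]>>as(my_store)
                .withKeySerde(Serdes.Long())  // the serdes must be repeated here
                .withValueSerde(mySerde));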
This is kind of a regression in the new API and was fixed recently in trunk: https://github.com/apache/kafka/pull/4919. Thus, the upcoming 2.0 release will contain the fix.

ALIZE/LIA_RAL Toolkit 3.0 Tutorial

I am trying to get started with the ALIZE/LIA_RAL toolkit. I tried to run the ALIZE 3.0 tutorial 02_i-vector_system_with_ALIZE3.0 on my system, but there might be some problem with the data: the data folder shipped with the i-vector system is empty, so I used the data you provided with the GMM-UBM tutorial. The IvTest_WCCN_Cosine.log file shows the following error:
[ Exception 0x77ce00 ]
message = "Matrix is not positive definite"
source file = DoubleSquareMatrix.cpp
line number = 177
Can you please help me fix this problem?

NullPointer with playOrm 1.4.1 when persisting entity

I have mapped entities in playORM, and my project was running fine with my entities mapped the way they were. However, after installing playORM 1.4.1, the latest version released in Maven, I got the null pointer below.
I want to find the error but have no clue where to start looking.
Any hint?
INFO: found meta=User locally
2012-11-09 17:32:22,918 com.alvazan.orm.layer9z.spi.db.cassandra.ColumnFamilyHelper waitForNodesToBeUpToDate
INFO: LOOP until all nodes have same schema version OR timeout in 300000 milliseconds
2012-11-09 17:32:22,939 com.alvazan.orm.layer9z.spi.db.cassandra.ColumnFamilyHelper tryToLoadColumnFamilyImpl
INFO: Well, we did NOT find any column family=User to load in cassandra(from virt=User)
2012-11-09 17:32:22,939 com.alvazan.orm.layer9z.spi.db.cassandra.ColumnFamilyHelper tryToLoadColumnFamilyVirt
INFO: Total time to LOAD column family meta from cassandra=21
java.lang.NullPointerException
at com.alvazan.orm.impl.meta.data.MetaEmbeddedSimple.translateToColumnImpl(MetaEmbeddedSimple.java:105)
at com.alvazan.orm.impl.meta.data.MetaEmbeddedSimple.translateToColumn(MetaEmbeddedSimple.java:93)
at com.alvazan.orm.impl.meta.data.MetaClassSingle.translateToRow(MetaClassSingle.java:82)
at com.alvazan.orm.layer0.base.BaseEntityManagerImpl.putImpl(BaseEntityManagerImpl.java:102)
at com.alvazan.orm.layer0.base.BaseEntityManagerImpl.put(BaseEntityManagerImpl.java:68)
at com.s1mbi0se.dmp.da.dao.UserDao.insertOrUpdateUser(UserDao.java:23)
at com.s1mbi0se.dmp.module.UserModule.persistData(UserModule.java:116)
at com.s1mbi0se.dmp.processor.mapred.SelectorReducer.reduce(SelectorReducer.java:60)
at com.s1mbi0se.dmp.processor.mapred.SelectorReducer.reduce(SelectorReducer.java:1)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:649)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:417)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:260)
17:32:22,946 WARN Thread-3 mapred.LocalJobRunner:298 - job_local_0001
java.lang.InterruptedException
at com.s1mbi0se.dmp.processor.mapred.SelectorReducer.reduce(SelectorReducer.java:63)
at com.s1mbi0se.dmp.processor.mapred.SelectorReducer.reduce(SelectorReducer.java:1)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:649)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:417)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:260)
2012-11-09 17:32:27,237 com.s1mbi0se.dmp.processor.main.DmpProcessorRunner run
EDIT: This is fixed in the master branch and soon to be released. 11/27/12
The log formatting seems a bit off but this is the important part
java.lang.NullPointerException at com.alvazan.orm.impl.meta.data.MetaEmbeddedSimple.translateToColumnImpl(MetaEmbeddedSimple.java:105)
line 105 finds this code...
for(T val : toBeAdded) {
    byte[] name = formTheName(val);
    Column c = new Column();
    c.setName(name);
    row.getColumns().add(c);
}
Specifically, line 105 is the first line, so toBeAdded is null for some reason... looking at who called this method.
Hmmm, it turns out ONE of your entities has a null list of something. We need to add code in here so that if your entity has a null list, we create an empty one instead. Can you file a ticket and link to this URL? We can fix this one easily.
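A minimal guard along those lines would look like this (a sketch of the intended fix, not the actual patch):
if (toBeAdded == null)
    toBeAdded = new ArrayList<T>();  // treat a missing list as empty
for (T val : toBeAdded) {
    // ... same loop as above
}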
NOTE: With every entity that has a field like
private List something;
I have a habit of 100% always defining it like this:
private List something = new ArrayList();
That avoids NullPointerExceptions all over the place, which is why I missed this one :( :( ... anyway, we will fix playOrm to allow null lists.
thanks,
Dean
This is fixed in release 1.4.2, which is available in the Maven repo.