Failed to import file - GraphDB

I have an RDF file that can be imported without any issues into another RDF store (Stardog) but keeps failing in GraphDB with this error:
15:58:18.900 [import-task-3-thread-1] ERROR c.o.f.i.MultipartFileImportRunnableTask - Could not import file
java.lang.NullPointerException: null
at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:936)
at org.eclipse.rdf4j.common.lang.service.ServiceRegistry.get(ServiceRegistry.java:95)
at org.eclipse.rdf4j.rio.Rio.createParser(Rio.java:100)
at org.eclipse.rdf4j.rio.Rio.createParser(Rio.java:118)
at org.eclipse.rdf4j.repository.util.RDFLoader.loadInputStreamOrReader(RDFLoader.java:279)
at org.eclipse.rdf4j.repository.util.RDFLoader.load(RDFLoader.java:197)
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.add(AbstractRepositoryConnection.java:329)
at com.ontotext.trree.monitorRepository.MonitorRepositoryConnection.add(MonitorRepositoryConnection.java:159)
at com.ontotext.trree.parallel.ParallelRDFLoader.add(ParallelRDFLoader.java:125)
at com.ontotext.forest.impex.ParallelAwareImporter.lambda$add$3(ParallelAwareImporter.java:48)
at com.ontotext.forest.impex.ParallelAwareImporter.wrapInBeginCommit(ParallelAwareImporter.java:66)
at com.ontotext.forest.impex.ParallelAwareImporter.add(ParallelAwareImporter.java:48)
at com.ontotext.forest.impex.MultipartFileImportRunnableTask.load(MultipartFileImportRunnableTask.java:38)
at com.ontotext.forest.impex.ImportRunnableTask.run(ImportRunnableTask.java:80)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
The file can be found here: http://boetik-artistik.be/humidity_by_city.owls
All referenced ontologies are resolvable from my machine.
Thanks for your help.
Kind regards,
Johan

I have just tried this out myself on GraphDB 8.3.1. I got a similar error when I allowed GraphDB to auto-detect the import format. However, when I selected the format as "RDF/XML", it imported without a problem.
The problem is with the file extension. It should be .rdf rather than .owls.
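If you are importing programmatically through the RDF4J client API rather than through the Workbench, the same idea applies: pass the format explicitly instead of relying on extension-based detection. Below is a minimal sketch, assuming an already-initialized Repository pointing at the GraphDB repository and a local copy of the file (both are assumptions, not from the original post):

import java.io.File;
import java.io.IOException;
import org.eclipse.rdf4j.repository.Repository;
import org.eclipse.rdf4j.repository.RepositoryConnection;
import org.eclipse.rdf4j.rio.RDFFormat;

public class ExplicitFormatImport {
    // "repository" is assumed to already point at the target GraphDB repository.
    static void importOwlsFile(Repository repository, File data) throws IOException {
        try (RepositoryConnection conn = repository.getConnection()) {
            // Pass RDFFormat.RDFXML explicitly so the parser is not chosen
            // from the ".owls" extension, which Rio does not recognise.
            conn.add(data, null, RDFFormat.RDFXML);
        }
    }
}

Renaming the file to .rdf, as suggested above, fixes the Workbench auto-detection for the same reason: the parser is selected from the file extension.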

Unable to produce data to Hazelcast in Apache Camel

I have the following route configured in Apache Camel:
from("direct:hazelCast")
.setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
.toF("hazelcast:map:testHazel", HazelcastConstants.MAP_PREFIX);
But when the above route is invoked, I'm getting the following error:
java.lang.NullPointerException: Null key is not allowed!
at com.hazelcast.map.impl.proxy.MapProxyImpl.put(MapProxyImpl.java:95)
at com.hazelcast.map.impl.proxy.MapProxyImpl.put(MapProxyImpl.java:89)
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.put(HazelcastMapProducer.java:125)
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.process(HazelcastMapProducer.java:60)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:141)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:460)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:62)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:141)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:460)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:109)
at org.apache.camel.processor.MulticastProcessor.doProcessParallel(MulticastProcessor.java:814)
at org.apache.camel.processor.MulticastProcessor.access$200(MulticastProcessor.java:84)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:314)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:299)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
The code that I used is almost identical to what is in the Camel docs: http://camel.apache.org/hazelcast-component.html
The following is the code snippet that I used to produce the data to Hazelcast in Camel:
from("direct:hazelCast")
.setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
.setHeader(HazelcastConstants.OBJECT_ID, constant("SOME BLA BLA"))
.split()
.tokenizeXML(<SOMEValidTag>).streaming()
.unmarshal(jaxb)
.convertBodyTo(<Valid>.class)
.marshal().json(JsonLibrary.Jackson)
.toF("hazelcast:%stestHazel", HazelcastConstants.MAP_PREFIX);
Note: we need to convert the body to a class that is serializable.
You need to set the object id (the HazelcastConstants.OBJECT_ID header); it is missing from your route.
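For reference, a minimal corrected version of the original route could look like the sketch below (the map name is taken from the question; the key value "someKey" is a placeholder, not from the original post):

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.hazelcast.HazelcastConstants;

public class HazelcastPutRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:hazelCast")
            .setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
            // The PUT operation takes the map key from this header; leaving it
            // unset is what causes "Null key is not allowed!".
            .setHeader(HazelcastConstants.OBJECT_ID, constant("someKey"))
            .toF("hazelcast:%stestHazel", HazelcastConstants.MAP_PREFIX);
    }
}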

SparkSQL error Table Not Found

I converted an RDD[myClass] to a DataFrame and then registered it as a SQL table:
my_rdd.toDF().registerTempTable("my_rdd")
The table can be queried, as demonstrated by the following command:
%sql
SELECT * from my_rdd limit 5
But the next step gives an error saying Table Not Found: my_rdd:
val my_df = sqlContext.sql("SELECT * from my_rdd limit 5")
I'm quite new to Spark and do not understand why this is happening. Can anyone help me out?
java.lang.RuntimeException: Table Not Found: my_rdd
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:58)
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:111)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:175)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse(Analyzer.scala:187)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse(Analyzer.scala:182)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:50)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:186)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:207)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:236)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:192)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:207)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:236)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:192)
at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:177)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:182)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:172)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:61)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:59)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:59)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:51)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.apply(RuleExecutor.scala:51)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:1071)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:1071)
at org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:1069)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:915)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:68)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:89)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:91)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:93)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:95)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:97)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:99)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:101)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:103)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:109)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:111)
at $iwC$$iwC$$iwC.<init>(<console>:113)
at $iwC$$iwC.<init>(<console>:115)
at $iwC.<init>(<console>:117)
at <init>(<console>:119)
at .<init>(<console>:123)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:556)
at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:532)
at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:525)
at org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:264)
at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Make sure to import the implicits._ from the same SQLContext. Temporary tables are kept in memory in one specific SQLContext.
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._
my_rdd.toDF().registerTempTable("my_rdd")
val my_df = sqlContext.sql("SELECT * from my_rdd LIMIT 5")
my_df.collect().foreach(println)
I found it easy to cause problems with temp tables if there is more than one open Zeppelin session, either in your own browser or from someone else using the same server. The variable sqlContext is shared across those sessions, and it's easy to overwrite its value.
As a solution to this problem, I copied the core-site.xml, hive-site.xml and hdfs-site.xml files into the conf directory.
I faced a similar problem. I was loading a table that was not present in the warehouse folder, whereas the Hive console was showing me the table name. You can check the detailed description of the table you are loading using describe formatted table_name. You don't need to copy any file to the spark/conf folder; it is already integrated.
I have met the same error, but in a different case, and solved it by using the same context. If you use a HiveContext, make sure you use it all the time: for example, if you first call sqlContext.sql("load data input XXX") and then call hiveContext.sql("select * from XXX"), you will hit this problem. Every context has its own lifecycle, so do not use two contexts with the same DataFrame.

OWL 2 Reasoners and custom datatypes not working

I am trying to run a reasoner in Protege; there are two reasoners available, FaCT++ and HermiT 1.3.7.
I tried to run both, but a window appears and immediately disappears again. It is difficult to see, so I used a screen recorder to capture it, but it does not contain any information.
There is no error message or log message that I can find.
The reasoner does not start.
When I try the option Export inferred axioms as ontology, I get the expected error message "No Reasoner initialized".
Any suggestions?
EDIT 1:
Error log caused by the newly defined datatype:
org.semanticweb.owlapi.reasoner.ReasonerInternalException: Unsupported datatype 'http://www.semanticweb.org/q49f318b/ontologies/2014/6/untitled-ontology-6#percentage'
at uk.ac.manchester.cs.factplusplus.FaCTPlusPlus.getBuiltInDataType(Native Method)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner.toDataTypePointer(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner$AxiomTranslator$DeclarationVisitorEx.visit(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner$AxiomTranslator$DeclarationVisitorEx.visit(Unknown Source)
at uk.ac.manchester.cs.owl.owlapi.OWLDatatypeImpl.accept(OWLDatatypeImpl.java:338)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner$AxiomTranslator.visit(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner$AxiomTranslator.visit(Unknown Source)
at uk.ac.manchester.cs.owl.owlapi.OWLDeclarationAxiomImpl.accept(OWLDeclarationAxiomImpl.java:128)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner.loadAxiom(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner.loadReasonerAxioms(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasoner.<init>(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasonerFactory.createReasoner(Unknown Source)
at uk.ac.manchester.cs.factplusplus.owlapiv3.FaCTPlusPlusReasonerFactory.createReasoner(Unknown Source)
at org.protege.editor.owl.model.inference.ReasonerUtilities.createReasoner(ReasonerUtilities.java:21)
at org.protege.editor.owl.model.inference.OWLReasonerManagerImpl$ClassificationRunner.ensureRunningReasonerInitialized(OWLReasonerManagerImpl.java:398)
at org.protege.editor.owl.model.inference.OWLReasonerManagerImpl$ClassificationRunner.run(OWLReasonerManagerImpl.java:354)
at java.lang.Thread.run(Thread.java:745)
Ontology in use:
http://pastebin.com/L0heDLBy
The stack trace confirms that FaCT++ does not allow custom datatypes to be specified; this is currently a known limitation. I don't know about HermiT, but there's a good chance the problem is the same.
Pellet used to have more flexible datatype treatment; you can try installing it and using it instead of HermiT or FaCT++.

apache.commons.configuration Unable to load the configuration from the URL

For a project I'm working on, I need to change some property values at run time and save them. Looking for a solution, I found Apache Commons Configuration.
I have looked at some other topics to solve my first problems, but now I get an error that says:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/sensor/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
Looking at this topic: Property file not reflecting the modified changes using Apache Commons Configuration
My code should be working, but for some reason I get an error.
JDBCWijnDAO is a class in the same package as the properties file. Of course I want to do more than just load the properties file, but at the moment creating the new PropertiesConfiguration throws an error, which I need to solve.
code:
URL resource = JDBCWijnDAO.class.getResource("DataProptester.properties");
PropertiesConfiguration config = new PropertiesConfiguration(resource.getPath());
Okay, I think I "know" what the problem is, but I don't really know how to fix it. In the trace there is a:
Caused by: java.io.FileNotFoundException: C:\Users\Sensor\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
I assume something is wrong with the full path.
The full error:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/Sander/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:86)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:323)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:261)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:238)
at org.apache.commons.configuration.AbstractFileConfiguration.<init>(AbstractFileConfiguration.java:158)
at org.apache.commons.configuration.PropertiesConfiguration.<init>(PropertiesConfiguration.java:253)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:49)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:29)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:69)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:28)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Node.fireEvent(Node.java:6863)
at javafx.scene.control.Button.fire(Button.java:179)
at com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:193)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:336)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:329)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:64)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:33)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3324)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3164)
at javafx.scene.Scene$MouseHandler.access$1900(Scene.java:3119)
at javafx.scene.Scene.impl_processMouseEvent(Scene.java:1559)
at javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2261)
at com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:228)
at com.sun.glass.ui.View.handleMouseEvent(View.java:528)
at com.sun.glass.ui.View.notifyMouse(View.java:922)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:29)
at com.sun.glass.ui.win.WinApplication$2$1.run(WinApplication.java:67)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: C:\Users\Sander\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:97)
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
at java.net.URL.openStream(URL.java:1035)
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:82)
Fixed: the problem was that the package name was not included. It must be:
URL resource = JDBCWijnDAO.class.getResource("/Database/DataProptester.properties");
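As a side note, when the properties file ends up packaged inside the jar (as the path with the "!" in the stack trace suggests), it can also help to hand Commons Configuration the URL itself instead of resource.getPath(), since a jar-internal resource is not a plain file path. Below is a sketch, assuming Commons Configuration 1.x as shown in the trace; the class name PropsLoader is a placeholder, and loading from inside a jar is read-only, so saving changes would still require a file outside the jar:

import java.net.URL;
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;

public class PropsLoader {
    static PropertiesConfiguration load() throws ConfigurationException {
        // Any class loaded from the same classpath can resolve the resource from the root,
        // including the package name, just like the fix above.
        URL resource = PropsLoader.class.getResource("/Database/DataProptester.properties");
        // Constructing from the URL avoids turning a jar-internal resource into a file path.
        return new PropertiesConfiguration(resource);
    }
}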

WSO2 Identity Server: Error while loading identity configurations after editing identity.xml

I'm trying to set up a WSO2 Identity Server. I've downloaded the version 4.0.0 binary. It started correctly, and I was able to use it with LDAP.
However, when I insert the correct server URL into the identity.xml file, I get an error.
I inserted the following into identity.xml:
<OpenIDServerUrl>https://server.vm.uni-freiburg.de:9443/openidserver</OpenIDServerUrl>
<OpenIDUserPattern>https://server.vm.uni-freiburg.de:9443/openid/</OpenIDUserPattern>
and when starting the WSO2 IS, the following error is thrown:
[2012-11-23 10:34:41,510] ERROR {org.wso2.carbon.identity.core.util.IdentityConfigParser} - Error while loading Identity Configurations
org.apache.axiom.om.OMException: com.ctc.wstx.exc.WstxIOException: Stream Closed
at org.apache.axiom.om.impl.builder.StAXOMBuilder.next(StAXOMBuilder.java:296)
at org.apache.axiom.om.impl.llom.OMElementImpl.getNextOMSibling(OMElementImpl.java:336)
at org.apache.axiom.om.impl.traverse.OMChildElementIterator.next(OMChildElementIterator.java:104)
at org.wso2.carbon.identity.core.util.IdentityConfigParser.readChildElements(IdentityConfigParser.java:154)
at org.wso2.carbon.identity.core.util.IdentityConfigParser.<init>(IdentityConfigParser.java:60)
at org.wso2.carbon.identity.core.util.IdentityConfigParser.getInstance(IdentityConfigParser.java:71)
at org.wso2.carbon.identity.core.util.IdentityUtil.populateProperties(IdentityUtil.java:58)
at org.wso2.carbon.identity.sso.saml.ui.internal.SAMLSSOUIBundleActivator.start(SAMLSSOUIBundleActivator.java:33)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:370)
at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1068)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:557)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:464)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:248)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:445)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:220)
at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:330)
Caused by: com.ctc.wstx.exc.WstxIOException: Stream Closed
at com.ctc.wstx.sr.StreamScanner.throwFromIOE(StreamScanner.java:708)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1086)
at org.apache.axiom.util.stax.wrapper.XMLStreamReaderWrapper.next(XMLStreamReaderWrapper.java:225)
at org.apache.axiom.om.impl.builder.StAXOMBuilder.parserNext(StAXOMBuilder.java:681)
at org.apache.axiom.om.impl.builder.StAXOMBuilder.next(StAXOMBuilder.java:214)
... 20 more
Caused by: java.io.IOException: Stream Closed
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:214)
at com.ctc.wstx.io.ISOLatinReader.read(ISOLatinReader.java:79)
at com.ctc.wstx.io.ReaderSource.readInto(ReaderSource.java:84)
at com.ctc.wstx.io.BranchingReaderSource.readInto(BranchingReaderSource.java:57)
at com.ctc.wstx.sr.StreamScanner.loadMoreFromCurrent(StreamScanner.java:1046)
at com.ctc.wstx.sr.StreamScanner.loadMoreFromCurrent(StreamScanner.java:1053)
at com.ctc.wstx.sr.StreamScanner.getNextCharFromCurrent(StreamScanner.java:811)
at com.ctc.wstx.sr.BasicStreamReader.readEndElem(BasicStreamReader.java:3206)
at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2832)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1019)
... 23 more
However, if I leave the default values:
<OpenIDServerUrl>https://localhost:9443/openidserver</OpenIDServerUrl>
<OpenIDUserPattern>https://localhost:9443/openid/</OpenIDUserPattern>
the server starts normally.
I've found a bug report at https://wso2.org/jira/browse/IDENTITY-407 which may be related to this problem but is not the same.
What am I doing wrong? Thanks in advance!