Fuseki: Not able to upload OWL file - SPARQL

While uploading an .owl file to the Fuseki server I'm getting an error saying:
Result: failed with message "SyntaxError: Unexpected token < in JSON at position 0"
and my .owl file is:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE rdf:RDF [
<!ENTITY xsd "http://www.w3.org/2001/XMLSchema#">
<!ENTITY rdf "http://www.w3.org/1999/02/22-rdf-syntax-ns#">
<!ENTITY rdfs "http://www.w3.org/2000/01/rdf-schema#">
<!ENTITY owl "http://www.w3.org/2002/07/owl#">
<!ENTITY ns_transport "file://www.ibm.com/WSRR/Transport#">
<!ENTITY wsrr "http://www.ibm.com/xmlns/prod/serviceregistry/6/1/model#">
]>
<rdf:RDF
xmlns:xsd="&xsd;"
xmlns:rdf="&rdf;"
xmlns:rdfs="&rdfs;"
xmlns:owl="&owl;"
xmlns:ns_transport="&ns_transport;"
xmlns:wsrr="&wsrr;"
>
<owl:Ontology rdf:about="&ns_transport;TransportOntology">
<owl:imports rdf:resource="http://www.ibm.com/xmlns/prod/serviceregistry/6/1/model"/>
<wsrr:prefix rdf:datatype="http://www.w3.org/2001/XMLSchema#string">transport</wsrr:prefix>
<rdfs:label>A transport classification system.</rdfs:label>
<rdfs:comment>Cars and buses and some superclasses.</rdfs:comment>
</owl:Ontology>
<owl:Class rdf:about="&ns_transport;Transport">
<rdfs:label>Transport</rdfs:label>
<rdfs:comment>Top-level root class for transport.</rdfs:comment>
</owl:Class>
<owl:Class rdf:about="&ns_transport;LandTransport">
<rdfs:subClassOf rdf:resource="&ns_transport;Transport"/>
<rdfs:label>Land Transport.</rdfs:label>
<rdfs:comment>Middle-level land transport class.</rdfs:comment>
</owl:Class>
<owl:Class rdf:about="&ns_transport;AirTransport">
<rdfs:subClassOf rdf:resource="&ns_transport;Transport"/>
<rdfs:label>Air Transport.</rdfs:label>
<rdfs:comment>Middle-level air transport class.</rdfs:comment>
</owl:Class>
<owl:Class rdf:about="&ns_transport;Bus">
<rdfs:subClassOf rdf:resource="&ns_transport;LandTransport"/>
<rdfs:label>Bus.</rdfs:label>
<rdfs:comment>Bottom-level bus class.</rdfs:comment>
</owl:Class>
<owl:Class rdf:about="&ns_transport;Car">
<rdfs:subClassOf rdf:resource="&ns_transport;LandTransport"/>
<rdfs:label>Car.</rdfs:label>
<rdfs:comment>Bottom-level car class.</rdfs:comment>
</owl:Class>
</rdf:RDF>

I had the same problem with apache-jena-fuseki-3.5.0, but this error has been fixed in apache-jena-fuseki-3.6.0.
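If upgrading is not an option: the "Unexpected token < in JSON" message typically just means the browser UI received an HTML error page where it expected JSON, so pushing the file over plain HTTP can sidestep the UI entirely. A minimal sketch using Python's requests package, assuming a Fuseki dataset named transport on the default port 3030 (both assumptions):
import requests

# Fuseki's Graph Store Protocol endpoint for a dataset named "transport";
# adjust the host, port, and dataset name to match your server.
url = "http://localhost:3030/transport/data"

with open("transport.owl", "rb") as f:
    resp = requests.post(
        url,
        data=f,
        headers={"Content-Type": "application/rdf+xml"},
    )
print(resp.status_code, resp.text)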


Apache Ignite Structured Logging

I am looking to enable structured logging for Ignite.
Ignite runs inside a Docker container.
I enabled the log4j2 module and added a log4j2 configuration file that tries to use <JsonTemplateLayout.../> as described here, but in the logs I get the message:
Console contains an invalid element or attribute "JsonTemplateLayout"
This is probably caused by the log4j-layout-template-json dependency not being available inside Ignite. Is there a way to add the dependency to Ignite, or is there another option for getting structured logging working?
Ignite configuration:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/util
http://www.springframework.org/schema/util/spring-util.xsd">
<bean class="org.apache.ignite.configuration.IgniteConfiguration">
...
<property name="gridLogger">
<bean class="org.apache.ignite.logger.log4j2.Log4J2Logger">
<constructor-arg type="java.lang.String" value="config/ignite-log4j2-custom.xml"/>
</bean>
</property>
</bean>
</beans>
log4j2 configuration:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration monitorInterval="60" status="debug">
<Appenders>
<Console name="CONSOLE" target="SYSTEM_OUT">
<!-- <PatternLayout pattern="[%d{ISO8601}][%-5p][%t][%c{1}]%notEmpty{[%markerSimpleName]} %m%n"/> -->
<ThresholdFilter level="ERROR" onMatch="DENY" onMismatch="ACCEPT"/>
<JsonTemplateLayout eventTemplateUri="classpath:EcsLayout.json"/>
</Console>
<Console name="CONSOLE_ERR" target="SYSTEM_ERR">
<!-- <PatternLayout pattern="[%d{ISO8601}][%-5p][%t][%c{1}]%notEmpty{[%markerSimpleName]} %m%n"/> -->
<JsonTemplateLayout eventTemplateUri="classpath:EcsLayout.json"/>
</Console>
<File name="CONSISTENCY" fileName="${sys:IGNITE_HOME}/work/log/consistency.log">
<PatternLayout>
<Pattern>"[%d{ISO8601}][%-5p][%t][%c{1}] %m%n"</Pattern>
</PatternLayout>
</File>
<Routing name="FILE">
<Routes pattern="$${sys:nodeId}">
<Route>
<RollingFile name="Rolling-${sys:nodeId}" fileName="${sys:IGNITE_HOME}/work/log/${sys:appId}-${sys:nodeId}.log"
filePattern="${sys:IGNITE_HOME}/work/log/${sys:appId}-${sys:nodeId}-%i-%d{yyyy-MM-dd}.log.gz">
<PatternLayout pattern="[%d{ISO8601}][%-5p][%t][%c{1}]%notEmpty{[%markerSimpleName]} %m%n"/>
<Policies>
<TimeBasedTriggeringPolicy interval="6" modulate="true" />
<SizeBasedTriggeringPolicy size="10 MB" />
</Policies>
</RollingFile>
</Route>
</Routes>
</Routing>
</Appenders>
<Loggers>
<!-- <Logger name="org.apache.ignite" level="INFO"/> -->
<!--
Uncomment to disable courtesy notices, such as SPI configuration
consistency warnings.
-->
<!--
<Logger name="org.apache.ignite.CourtesyConfigNotice" level=OFF/>
-->
<Logger name="org.springframework" level="WARN"/>
<Logger name="org.eclipse.jetty" level="WARN"/>
<Logger name="org.apache.ignite.internal.visor.consistency" additivity="false" level="INFO">
<AppenderRef ref="CONSISTENCY"/>
</Logger>
<!--
Avoid warnings about failed bind attempt when multiple nodes running on the same host.
-->
<Logger name="org.eclipse.jetty.util.log" level="ERROR"/>
<Logger name="org.eclipse.jetty.util.component" level="ERROR"/>
<Logger name="com.amazonaws" level="WARN"/>
<Root level="INFO">
<!-- Uncomment to enable logging to console. -->
<AppenderRef ref="CONSOLE" level="INFO"/>
<AppenderRef ref="CONSOLE_ERR" level="ERROR"/>
<AppenderRef ref="FILE" level="DEBUG"/>
</Root>
</Loggers>
</Configuration>
When adding the JAR to libs (as suggested by Stanislav below) I get a step further, but I also get an error (I'm not a Java person, so any hint is highly appreciated):
main ERROR An exception occurred processing Appender CONSOLE org.apache.logging.log4j.core.appender.AppenderLoggingException: java.lang.IllegalAccessError: class org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder tried to access method 'void org.apache.logging.log4j.core.layout.TextEncoderHelper.encodeText(java.nio.charset.CharsetEncoder, java.nio.CharBuffer, java.nio.ByteBuffer, java.lang.StringBuilder, org.apache.logging.log4j.core.layout.ByteBufferDestination)' (org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder and org.apache.logging.log4j.core.layout.TextEncoderHelper are in unnamed module of loader 'app')
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:165)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2017)
at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1983)
at org.apache.logging.log4j.spi.AbstractLogger.info(AbstractLogger.java:1275)
at org.apache.ignite.logger.log4j2.Log4J2Logger.info(Log4J2Logger.java:472)
at org.apache.ignite.logger.log4j2.Log4J2Logger.info(Log4J2Logger.java:464)
at org.apache.ignite.internal.GridLoggerProxy.info(GridLoggerProxy.java:137)
at org.apache.ignite.internal.plugin.IgniteLogInfoProviderImpl.ackConfiguration(IgniteLogInfoProviderImpl.java:222)
at org.apache.ignite.internal.plugin.IgniteLogInfoProviderImpl.ackKernalInited(IgniteLogInfoProviderImpl.java:98)
at org.apache.ignite.internal.IgniteKernal.start(IgniteKernal.java:902)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:1799)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1721)
at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1160)
at org.apache.ignite.internal.IgnitionEx.startConfigurations(IgnitionEx.java:1054)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:940)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:839)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:709)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:678)
at org.apache.ignite.Ignition.start(Ignition.java:353)
at org.apache.ignite.startup.cmdline.CommandLineStartup.main(CommandLineStartup.java:365)
Caused by: java.lang.IllegalAccessError: class org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder tried to access method 'void org.apache.logging.log4j.core.layout.TextEncoderHelper.encodeText(java.nio.charset.CharsetEncoder, java.nio.CharBuffer, java.nio.ByteBuffer, java.lang.StringBuilder, org.apache.logging.log4j.core.layout.ByteBufferDestination)' (org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder and org.apache.logging.log4j.core.layout.TextEncoderHelper are in unnamed module of loader 'app')
at org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder.encode(JsonTemplateLayout.java:241)
at org.apache.logging.log4j.layout.template.json.JsonTemplateLayout$StringBuilderEncoder.encode(JsonTemplateLayout.java:216)
at org.apache.logging.log4j.layout.template.json.JsonTemplateLayout.encode(JsonTemplateLayout.java:304)
at org.apache.logging.log4j.layout.template.json.JsonTemplateLayout.encode(JsonTemplateLayout.java:58)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.directEncodeEvent(AbstractOutputStreamAppender.java:197)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.tryAppend(AbstractOutputStreamAppender.java:190)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:181)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
... 31 more
Solution
As Stanislav Lukyanov suggested (see accepted answer), the solution was to simply download the JAR and place it under $IGNITE_HOME/libs. The error mentioned above was caused by a version mismatch. Having the following JARs with matching versions made it work:
log4j-api-2.17.1.jar (provided by the Ignite distribution)
log4j-core-2.17.1.jar (provided by the Ignite distribution)
ignite-log4j2-2.13.0.jar (provided by the Ignite distribution)
log4j-layout-template-json-2.17.1.jar (added; did not work with version 2.18.x)
If you run Ignite using Maven, you'll need to add the required dependency to your application POM, as described in the documentation:
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-layout-template-json</artifactId>
<version>2.18.0</version>
</dependency>
If you run Ignite using a ZIP distribution, you'll need to download the dependency as a JAR, e.g. from Maven Central, and add it to $IGNITE_HOME/libs.
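For the ZIP route, the JAR can be fetched with any HTTP client; a small sketch, assuming the artifact's standard Maven Central location and an Ignite installation under /opt/ignite (both assumptions to adjust):
import urllib.request

# Standard Maven Central layout: groupId path / artifactId / version / jar name.
version = "2.17.1"  # matches the log4j-core version bundled with Ignite above
url = ("https://repo1.maven.org/maven2/org/apache/logging/log4j/"
       f"log4j-layout-template-json/{version}/"
       f"log4j-layout-template-json-{version}.jar")
urllib.request.urlretrieve(
    url, f"/opt/ignite/libs/log4j-layout-template-json-{version}.jar")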

Select single ontology class and all of its axioms and annotations as a subset

Is there a SPARQL or other method to extract a single class and all of its associated axioms and annotations from an ontology? For example, assume one had a list of classes from one ontology that they wanted to add to another ontology.
For example, consider the following class from IDO: http://purl.obolibrary.org/obo/IDO_0000406 (see below). In this case, it might be easier to simply parse the contents of the ontology's OWL file using a regex pattern such as /<owl:Class .+?\s\s\s\s\s\n<\/owl:Class>/ to capture the details of every class, after which the classes of interest could simply be filtered out and appended to a new OWL file.
<owl:Class rdf:about="http://purl.obolibrary.org/obo/IDO_...">
...
</owl:Class>
...
<owl:Class rdf:about="http://purl.obolibrary.org/obo/IDO_0000406">
<rdfs:subClassOf rdf:resource="http://purl.obolibrary.org/obo/IDO_0000452"/>
<rdfs:subClassOf>
<owl:Class>
<owl:intersectionOf rdf:parseType="Collection">
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/RO_0000052"/>
<owl:someValuesFrom rdf:resource="http://purl.obolibrary.org/obo/OBI_0100026"/>
</owl:Restriction>
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/BFO_0000054"/>
<owl:allValuesFrom>
<owl:Class>
<owl:intersectionOf rdf:parseType="Collection">
<rdf:Description rdf:about="http://purl.obolibrary.org/obo/BFO_0000015"/>
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/BFO_0000051"/>
<owl:someValuesFrom rdf:resource="http://purl.obolibrary.org/obo/IDO_0000625"/>
</owl:Restriction>
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/BFO_0000051"/>
<owl:someValuesFrom rdf:resource="http://purl.obolibrary.org/obo/TRANS_0000000"/>
</owl:Restriction>
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/BFO_0000051"/>
<owl:someValuesFrom>
<owl:Class>
<owl:intersectionOf rdf:parseType="Collection">
<rdf:Description rdf:about="http://purl.obolibrary.org/obo/IDO_0000626"/>
<owl:Class>
<owl:complementOf>
<owl:Restriction>
<owl:onProperty rdf:resource="http://purl.obolibrary.org/obo/BFO_0000066"/>
<owl:someValuesFrom rdf:resource="http://purl.obolibrary.org/obo/IDO_0000457"/>
</owl:Restriction>
</owl:complementOf>
</owl:Class>
</owl:intersectionOf>
</owl:Class>
</owl:someValuesFrom>
</owl:Restriction>
</owl:intersectionOf>
</owl:Class>
</owl:allValuesFrom>
</owl:Restriction>
</owl:intersectionOf>
</owl:Class>
</rdfs:subClassOf>
<obo:IAO_0000115 xml:lang="en">An infectious disposition to become part of a disorder only in organisms whose defenses are compromised.</obo:IAO_0000115>
<obo:IAO_0000117>Albert Goldfain</obo:IAO_0000117>
<obo:IAO_0000117>Alexander Diehl</obo:IAO_0000117>
<obo:IAO_0000117>Lindsay Cowell</obo:IAO_0000117>
<obo:IAO_0000118 xml:lang="en">opportunitistic pathogenic disposition</obo:IAO_0000118>
<rdfs:comment>The disposition is realized in a process by which the bearer becomes part of a disorder in an immunocompromised host.</rdfs:comment>
<rdfs:comment xml:lang="en">This includes individuals who are immunocompromised or who have damaged barriers that normally protect against infection (e.g. skin).</rdfs:comment>
<rdfs:label xml:lang="en">opportunistic infectious disposition</rdfs:label>
</owl:Class>
EDIT: I tried using rdflib to achieve this by loading the source ontology as a Graph(). Using a for loop to iterate through statements in the loaded graph, it's easy enough to find the triples which are directly connected to a class, say, ido:IDO_0000406 -> obo:IAO_0000117 -> Alexander Diehl. These can then be added to the target ontology, which is also loaded as a graph. However, using the IDO_0000406 class example, it is clear that triples not on the first sub-level (i.e. anything within the clause <rdfs:subClassOf><owl:Class>...) will not come through as expected. For example:
from rdflib import Graph

classes = ['http://purl.obolibrary.org/obo/IDO_0000406',
           'http://purl.obolibrary.org/obo/IDO_0000407',
           'http://purl.obolibrary.org/obo/IDO_0000408',
           'http://purl.obolibrary.org/obo/IDO_0000409']
g_src = Graph()
g_src.parse('ido.owl')
g_tgt = Graph()
for stmt in g_src:  # stmt: (subject, predicate, object)
    if str(stmt[0]) in classes:
        # adds all first-level triples to the target graph
        g_tgt.add(stmt)
My thought is to approach non-first-level nodes in a recursive fashion; I will update if this is successful.
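For what it's worth, a minimal sketch of that recursive idea (the copy_subtree helper is hypothetical, not part of rdflib): whenever a copied triple's object is a blank node, descend into it and copy its triples too, which pulls along the nested owl:Restriction and owl:intersectionOf structures.
from rdflib import Graph, URIRef, BNode

def copy_subtree(g_src, g_tgt, subject, seen=None):
    """Copy all triples rooted at `subject` into g_tgt, following
    blank-node objects recursively (guarding against cycles)."""
    seen = set() if seen is None else seen
    if subject in seen:
        return
    seen.add(subject)
    for s, p, o in g_src.triples((subject, None, None)):
        g_tgt.add((s, p, o))
        if isinstance(o, BNode):
            copy_subtree(g_src, g_tgt, o, seen)

g_src = Graph()
g_src.parse('ido.owl')
g_tgt = Graph()
for cls in classes:  # `classes` is the list of class IRIs defined above
    copy_subtree(g_src, g_tgt, URIRef(cls))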
EDIT 2: It should be possible to extract the classes using ROBOT (http://robot.obolibrary.org/extract). For example:
robot extract --method STAR \
--input filtered.owl \
--term-file uberon_module.txt \
--output results/uberon_module.owl
where uberon_module.txt would include the list of classes to be extracted.

Is there a way to query using SPARQL for more than one value of object when the subject and predicate are the same?

For context, I am using GraphDB (SPARQL 1.1), and I have extended the What To Make ontology. I'd like to construct a query that selects ONLY the subjects whose objects for a particular predicate all come from a given set of values.
In my data (below), I want to take the individual &so;KitchenDrawer's so:contains values and match them against only those &wtm;Recipe individuals whose wtm:hasIngredient values all come from that set. The subject and predicate are the same, but there are alternating objects.
When I run the query (with &so;KitchenDrawer containing only &ind;Basil and &ind;Tomato), I would like to get only the subjects &ind;JustTomatoAndBasil and &ind;JustTomato, omitting &ind;Pesto as it has ingredients other than basil and tomato.
RDF Data:
<?xml version="1.0"?>
<!DOCTYPE rdf:RDF [
<!ENTITY rdf "http://www.w3.org/1999/02/22-rdf-syntax-ns#" >
<!ENTITY rdfs "http://www.w3.org/2000/01/rdf-schema#" >
<!ENTITY owl "http://www.w3.org/2002/07/owl#" >
<!ENTITY xsd "http://www.w3.org/2001/XMLSchema#" >
<!ENTITY dct "http://purl.org/dc/terms/" >
<!ENTITY obo "http://purl.obolibrary.org/obo/" >
<!ENTITY skos "http://www.w3.org/2004/02/skos/core#" >
<!ENTITY wtm "http://purl.org/heals/food/" >
<!ENTITY ind "http://purl.org/heals/ingredient/" >
<!ENTITY prov "http://www.w3.org/ns/prov#">
<!ENTITY sm "http://www.omg.org/techprocess/ab/SpecificationMetadata/">
<!ENTITY so "http://example.com/storage-ontology/">
]>
<rdf:RDF xml:base="http://purl.org/heals/ingredient/"
xmlns:so="http://example.com/storage-ontology/"
xmlns:wtm="http://purl.org/heals/food/"
xmlns:dct="http://purl.org/dc/terms/"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:owl="http://www.w3.org/2002/07/owl#"
xmlns:xml="http://www.w3.org/XML/1998/namespace"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#"
xmlns:skos="http://www.w3.org/2004/02/skos/core#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns:obo="http://purl.obolibrary.org/obo/"
xmlns:prov="http://www.w3.org/ns/prov#"
xmlns:ind="http://purl.org/heals/ingredient/"
xmlns:sm="https://www.omg.org/techprocess/ab/SpecificationMetadata/">
<owl:Ontology rdf:about="http://purl.org/heals/ingredient/">
<owl:imports rdf:resource="http://purl.org/heals/food/"/>
<owl:imports rdf:resource="http://www.omg.org/techprocess/ab/SpecificationMetadata/"/>
<rdfs:label>What to Make Individuals Extensions</rdfs:label>
<dct:license rdf:datatype="&xsd;anyURI">http://opensource.org/licenses/MIT</dct:license>
</owl:Ontology>
<owl:NamedIndividual rdf:about="&ind;JustTomato">
<rdf:type rdf:resource="&wtm;Recipe"/>
<wtm:hasIngredient rdf:resource="&ind;Tomato"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Lunch"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Dinner"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Snack"/>
<skos:scopeNote>recipe</skos:scopeNote>
</owl:NamedIndividual>
<owl:NamedIndividual rdf:about="&ind;JustTomatoAndBasil">
<rdf:type rdf:resource="&wtm;Recipe"/>
<wtm:hasIngredient rdf:resource="&ind;Tomato"/>
<wtm:hasIngredient rdf:resource="&ind;Basil"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Lunch"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Dinner"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Snack"/>
<skos:scopeNote>recipe</skos:scopeNote>
</owl:NamedIndividual>
<owl:NamedIndividual rdf:about="&ind;Pesto">
<rdf:type rdf:resource="&wtm;Recipe"/>
<wtm:hasIngredient rdf:resource="&ind;Basil"/>
<wtm:hasIngredient rdf:resource="&ind;PineNuts"/>
<wtm:hasIngredient rdf:resource="&ind;Parmesan"/>
<wtm:hasIngredient rdf:resource="&ind;Garlic"/>
<wtm:hasIngredient rdf:resource="&ind;OliveOil"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Lunch"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Dinner"/>
<wtm:isRecommendedForMeal rdf:resource="&wtm;Snack"/>
<wtm:serves rdf:datatype="&xsd;integer">4</wtm:serves>
<rdfs:label>basil pesto</rdfs:label>
<skos:definition>a traditional pesto made from parmesan cheese and basil</skos:definition>
<skos:scopeNote>recipe</skos:scopeNote>
</owl:NamedIndividual>
<owl:Class rdf:about="&so;StorageLocation">
</owl:Class>
<owl:Class rdf:about="&so;Drawer">
<rdfs:subClassOf rdf:resource="&so;StorageLocation"/>
</owl:Class>
<owl:ObjectProperty rdf:about="&so;contains">
</owl:ObjectProperty>
<owl:NamedIndividual rdf:about="&so;KitchenDrawer">
<rdf:type rdf:resource="&so;Rack"/>
<so:contains rdf:resource="&ind;Basil"></so:contains>
<so:contains rdf:resource="&ind;Tomato"></so:contains>
</owl:NamedIndividual>
</rdf:RDF>
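No accepted answer is recorded here, but the requirement (recipes whose ingredients all come from the drawer) matches the standard double FILTER NOT EXISTS pattern in SPARQL 1.1: keep a recipe unless it has some ingredient the drawer does not contain. A sketch, assuming the data above is saved as storage.rdf and tested locally with rdflib (the query body itself should run unchanged in GraphDB):
from rdflib import Graph

g = Graph()
g.parse("storage.rdf", format="xml")  # hypothetical filename for the RDF data above

q = """
PREFIX wtm: <http://purl.org/heals/food/>
PREFIX so:  <http://example.com/storage-ontology/>

SELECT ?recipe WHERE {
    ?recipe a wtm:Recipe .
    FILTER NOT EXISTS {
        ?recipe wtm:hasIngredient ?ing .
        FILTER NOT EXISTS { so:KitchenDrawer so:contains ?ing }
    }
}
"""
for row in g.query(q):
    print(row.recipe)  # expect JustTomato and JustTomatoAndBasil, not Pesto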

ActiveMQ Artemis not starting without SSL enabled configuration

I'm using ActiveMQ Artemis 2.18.0. First I configured the setup with SSL, and Artemis was starting without errors. Then I wanted to test my setup without SSL, so I removed all SSL-related settings from broker.xml and bootstrap.xml, and now when I try to run Artemis I get this:
2021-10-13 07:34:26,047 INFO [org.apache.activemq.artemis.core.server] AMQ221001: Apache ActiveMQ Artemis Message Broker version 2.18.0 [amq1, nodeID=bee15e5b-2bf7-11ec-887f-0800277c53f8]
2021-10-13 07:34:26,263 INFO [org.apache.activemq.hawtio.branding.PluginContextListener] Initialized activemq-branding plugin
2021-10-13 07:34:26,297 INFO [org.apache.activemq.hawtio.plugin.PluginContextListener] Initialized artemis-plugin plugin
2021-10-13 07:34:26,548 INFO [io.hawt.HawtioContextListener] Initialising hawtio services
2021-10-13 07:34:26,571 INFO [io.hawt.system.ConfigManager] Configuration will be discovered via system properties
2021-10-13 07:34:26,573 INFO [io.hawt.jmx.JmxTreeWatcher] Welcome to Hawtio 2.13.5
2021-10-13 07:34:26,580 INFO [io.hawt.web.auth.AuthenticationConfiguration] Starting hawtio authentication filter, JAAS realm: "activemq" authorized role(s): "amq" role principal classes: "org.apache.activemq.artemis.spi.core.security.jaas.RolePrincipal"
2021-10-13 07:34:26,595 INFO [io.hawt.web.proxy.ProxyServlet] Proxy servlet is disabled
2021-10-13 07:34:26,600 INFO [io.hawt.web.servlets.JolokiaConfiguredAgentServlet] Jolokia overridden property: [key=policyLocation, value=file:/home/vagrant/artemis-broker/etc/jolokia-access.xml]
java.lang.IllegalStateException: /home/vagrant/artemis-broker/etc/keystore.jks is not a valid keystore
at org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:50)
at org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1203)
at org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:322)
at org.eclipse.jetty.util.ssl.SslContextFactory.doStart(SslContextFactory.java:244)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
at org.eclipse.jetty.server.SslConnectionFactory.doStart(SslConnectionFactory.java:97)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:321)
at org.eclipse.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:81)
at org.eclipse.jetty.server.ServerConnector.doStart(ServerConnector.java:234)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.eclipse.jetty.server.Server.doStart(Server.java:401)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.apache.activemq.artemis.component.WebServerComponent.start(WebServerComponent.java:263)
at org.apache.activemq.artemis.core.server.impl.ActiveMQServerImpl.addExternalComponent(ActiveMQServerImpl.java:908)
at org.apache.activemq.artemis.cli.commands.Run.execute(Run.java:126)
at org.apache.activemq.artemis.cli.Artemis.internalExecute(Artemis.java:155)
at org.apache.activemq.artemis.cli.Artemis.execute(Artemis.java:103)
at org.apache.activemq.artemis.cli.Artemis.execute(Artemis.java:130)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.activemq.artemis.boot.Artemis.execute(Artemis.java:134)
at org.apache.activemq.artemis.boot.Artemis.main(Artemis.java:50)
2021-10-13 07:34:26,847 INFO [io.hawt.web.auth.AuthenticationFilter] Destroying hawtio authentication filter
2021-10-13 07:34:26,848 INFO [io.hawt.HawtioContextListener] Destroying hawtio services
2021-10-13 07:34:26,875 INFO [org.apache.activemq.hawtio.plugin.PluginContextListener] Destroyed artemis-plugin plugin
2021-10-13 07:34:26,878 INFO [org.apache.activemq.hawtio.branding.PluginContextListener] Destroyed activemq-branding plugin
2021-10-13 07:34:26,902 INFO [org.apache.activemq.artemis.core.server] AMQ221002: Apache ActiveMQ Artemis Message Broker version 2.18.0 [bee15e5b-2bf7-11ec-887f-0800277c53f8] stopped, uptime 11.619 seconds
I'm not sure which Jolokia property is overridden. Did I forget to do something else?
bootstrap.xml:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<broker xmlns="http://activemq.org/schema">
<jaas-security domain="activemq"/>
<server configuration="file:/home/vagrant/artemis-broker/etc//broker.xml"/>
<web bind="https://0.0.0.0:8161" path="web">
<app url="activemq-branding" war="activemq-branding.war"/>
<app url="artemis-plugin" war="artemis-plugin.war"/>
<app url="console" war="console.war"/>
</web>
</broker>
broker.xml:
<?xml version='1.0'?>
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<configuration xmlns="urn:activemq"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xi="http://www.w3.org/2001/XInclude"
xsi:schemaLocation="urn:activemq /schema/artemis-configuration.xsd">
<core xmlns="urn:activemq:core" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="urn:activemq:core ">
<name>amq1</name>
<persistence-enabled>true</persistence-enabled>
<!-- this could be ASYNCIO, MAPPED, NIO
ASYNCIO: Linux Libaio
MAPPED: mmap files
NIO: Plain Java Files
-->
<journal-type>ASYNCIO</journal-type>
<paging-directory>data/paging</paging-directory>
<bindings-directory>data/bindings</bindings-directory>
<journal-directory>data/journal</journal-directory>
<large-messages-directory>data/large-messages</large-messages-directory>
<journal-datasync>true</journal-datasync>
<journal-min-files>2</journal-min-files>
<journal-pool-files>10</journal-pool-files>
<journal-device-block-size>4096</journal-device-block-size>
<journal-file-size>10M</journal-file-size>
<!--
This value was determined through a calculation.
Your system could perform 31.25 writes per millisecond
on the current journal configuration.
That translates as a sync write every 32000 nanoseconds.
Note: If you specify 0 the system will perform writes directly to the disk.
We recommend this to be 0 if you are using journalType=MAPPED and journal-datasync=false.
-->
<journal-buffer-timeout>28000</journal-buffer-timeout>
<!--
When using ASYNCIO, this will determine the writing queue depth for libaio.
-->
<journal-max-io>4096</journal-max-io>
<max-disk-usage>100</max-disk-usage>
<!-- should the broker detect dead locks and other issues -->
<critical-analyzer>true</critical-analyzer>
<critical-analyzer-timeout>150000</critical-analyzer-timeout>
<critical-analyzer-check-period>60000</critical-analyzer-check-period>
<critical-analyzer-policy>HALT</critical-analyzer-policy>
<page-sync-timeout>1628000</page-sync-timeout>
<global-max-size>204Mb</global-max-size>
<connectors>
<connector name="amq1">tcp://amq1:61616</connector>
<connector name="amq2">tcp://amq2:61616</connector>
<connector name="amq3">tcp://amq3:61616</connector>
<connector name="amq4">tcp://amq4:61616</connector>
<connector name="amq5">tcp://amq5:61616</connector>
<connector name="amq6">tcp://amq6:61616</connector>
</connectors>
<acceptors>
<acceptor name="artemis">tcp://0.0.0.0:61616?tcpSendBufferSize=1048576;tcpReceiveBufferSize=1048576;amqpMinLargeMessageSize=102400;protocols=CORE,AMQP,STOMP,HORNETQ,MQTT,OPENWIRE;useEpoll=true;amqpCredits=1000;amqpLowCredits=300;amqpDuplicateDetection=true</acceptor>
<acceptor name="amqp">tcp://0.0.0.0:5672?tcpSendBufferSize=1048576;tcpReceiveBufferSize=1048576;protocols=AMQP;useEpoll=true;amqpCredits=1000;amqpLowCredits=300;amqpMinLargeMessageSize=102400;amqpDuplicateDetection=true</acceptor>
<acceptor name="stomp">tcp://0.0.0.0:61613?tcpSendBufferSize=1048576;tcpReceiveBufferSize=1048576;protocols=STOMP;useEpoll=true</acceptor>
<acceptor name="hornetq">tcp://0.0.0.0:5445?anycastPrefix=jms.queue.;multicastPrefix=jms.topic.;protocols=HORNETQ,STOMP;useEpoll=true</acceptor>
<acceptor name="mqtt">tcp://0.0.0.0:1883?tcpSendBufferSize=1048576;tcpReceiveBufferSize=1048576;protocols=MQTT;useEpoll=true</acceptor>
</acceptors>
<broadcast-groups>
<broadcast-group name="artemis-broadcast-group">
<group-address>231.7.7.7</group-address>
<group-port>9876</group-port>
<broadcast-period>2000</broadcast-period>
<connector-ref>amq1</connector-ref>
</broadcast-group>
</broadcast-groups>
<discovery-groups>
<discovery-group name="artemis-discovery-group">
<group-address>231.7.7.7</group-address>
<group-port>9876</group-port>
<refresh-timeout>10000</refresh-timeout>
</discovery-group>
</discovery-groups>
<cluster-user>admin</cluster-user>
<cluster-password>admin</cluster-password>
<cluster-connections>
<cluster-connection name="artemis-cluster">
<connector-ref>amq1</connector-ref>
<retry-interval>1000</retry-interval>
<retry-interval-multiplier>3</retry-interval-multiplier>
<max-retry-interval>5000</max-retry-interval>
<initial-connect-attempts>-1</initial-connect-attempts>
<reconnect-attempts>-1</reconnect-attempts>
<use-duplicate-detection>true</use-duplicate-detection>
<message-load-balancing>STRICT</message-load-balancing>
<max-hops>1</max-hops>
<discovery-group-ref discovery-group-name="artemis-discovery-group"/>
</cluster-connection>
</cluster-connections>
<!-- Other config -->
<ha-policy>
<replication>
<master>
<group-name>artemis-group-1</group-name>
<quorum-vote-wait>12</quorum-vote-wait>
<vote-on-replication-failure>true</vote-on-replication-failure>
<!--for auto failback -->
<check-for-live-server>true</check-for-live-server>
</master>
</replication>
</ha-policy>
<security-settings>
<security-setting match="#">
<permission type="createNonDurableQueue" roles="amq"/>
<permission type="deleteNonDurableQueue" roles="amq"/>
<permission type="createDurableQueue" roles="amq"/>
<permission type="deleteDurableQueue" roles="amq"/>
<permission type="createAddress" roles="amq"/>
<permission type="deleteAddress" roles="amq"/>
<permission type="consume" roles="amq"/>
<permission type="browse" roles="amq"/>
<permission type="send" roles="amq"/>
<!-- we need this otherwise ./artemis data imp wouldn't work -->
<permission type="manage" roles="amq"/>
</security-setting>
</security-settings>
<addresses>
<address name="exampleQueue">
<anycast>
<queue name="exampleQueue"/>
</anycast>
</address>
<address name="DLQ">
</address>
<address name="ExpiryQueue">
<anycast>
<queue name="ExpiryQueue" />
</anycast>
</address>
</addresses>
<address-settings>
<!-- if you define auto-create on certain queues, management has to be auto-create -->
<address-setting match="activemq.management#">
<dead-letter-address>DLQ</dead-letter-address>
<expiry-address>ExpiryQueue</expiry-address>
<redelivery-delay>0</redelivery-delay>
<!-- with -1 only the global-max-size is in use for limiting -->
<max-size-bytes>-1</max-size-bytes>
<message-counter-history-day-limit>10</message-counter-history-day-limit>
<address-full-policy>PAGE</address-full-policy>
<auto-create-queues>true</auto-create-queues>
<auto-create-addresses>true</auto-create-addresses>
<auto-create-jms-queues>true</auto-create-jms-queues>
<auto-create-jms-topics>true</auto-create-jms-topics>
</address-setting>
<!--default for catch all-->
<address-setting match="#">
<dead-letter-address>DLQ</dead-letter-address>
<expiry-address>ExpiryQueue</expiry-address>
<redelivery-delay>0</redelivery-delay>
<auto-create-dead-letter-resources>true</auto-create-dead-letter-resources>
<!-- with -1 only the global-max-size is in use for limiting -->
<max-size-bytes>-1</max-size-bytes>
<message-counter-history-day-limit>10</message-counter-history-day-limit>
<address-full-policy>PAGE</address-full-policy>
<auto-create-queues>true</auto-create-queues>
<auto-create-addresses>true</auto-create-addresses>
<auto-create-jms-queues>true</auto-create-jms-queues>
<auto-create-jms-topics>true</auto-create-jms-topics>
</address-setting>
<address-setting match="exampleQueue">
<dead-letter-address>DLQ</dead-letter-address>
<redelivery-delay>1000</redelivery-delay>
<max-delivery-attempts>3</max-delivery-attempts>
<max-size-bytes>-1</max-size-bytes>
<page-size-bytes>1048576</page-size-bytes>
<message-counter-history-day-limit>10</message-counter-history-day-limit>
<address-full-policy>PAGE</address-full-policy>
</address-setting>
</address-settings>
<!-- Uncomment the following if you want to use the Standard LoggingActiveMQServerPlugin pluging to log in events
<broker-plugins>
<broker-plugin class-name="org.apache.activemq.artemis.core.server.plugin.impl.LoggingActiveMQServerPlugin">
<property key="LOG_ALL_EVENTS" value="true"/>
<property key="LOG_CONNECTION_EVENTS" value="true"/>
<property key="LOG_SESSION_EVENTS" value="true"/>
<property key="LOG_CONSUMER_EVENTS" value="true"/>
<property key="LOG_DELIVERING_EVENTS" value="true"/>
<property key="LOG_SENDING_EVENTS" value="true"/>
<property key="LOG_INTERNAL_EVENTS" value="true"/>
</broker-plugin>
</broker-plugins>
-->
</core>
</configuration>
ActiveMQ Artemis is failing because the bind attribute of the web element is using the HTTPS protocol:
<web bind="https://0.0.0.0:8161" path="web">
To fix this issue, the bind attribute should use the HTTP protocol:
<web bind="http://0.0.0.0:8161" path="web">

How can I create an owl:intersectionOf two owl:Classes?

For a school exercise, I have an RDF file and an OWL file.
There is an owl:Class Lecturer and an owl:Class Researcher. The intersection of both should be a Professor. I have put my RDF and OWL file below.
The problem is: when I run my query, no resource is of type Professor, while in the RDF file we can see that Laura should be a Professor.
Reduced version of rdf file:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE rdf:RDF [
<!ENTITY humans "http://www.inria.fr/2007/09/11/humans.rdfs">
<!ENTITY xsd "http://www.w3.org/2001/XMLSchema#"> ]>
<rdf:RDF
xmlns:rdf ="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns:xsd ="&xsd;"
xmlns ="&humans;#"
xml:base ="&humans;-instances" >
<Person rdf:ID="Laura">
<name>Laura</name>
</Person>
<Lecturer rdf:about="#Laura"/>
<Researcher rdf:about="#Laura">
<name>Laura</name>
</Researcher>
</rdf:RDF>
Reduced version of owl file:
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns="http://www.w3.org/2000/01/rdf-schema#"
xml:base="http://www.inria.fr/2007/09/11/humans.rdfs"
xmlns:owl="http://www.w3.org/2002/07/owl#">
<owl:Class rdf:ID="Person">
</owl:Class>
<owl:Class rdf:ID="Lecturer">
<subClassOf rdf:resource="#Person"/>
</owl:Class>
<owl:Class rdf:ID="Researcher">
<subClassOf rdf:resource="#Person"/>
</owl:Class>
<owl:Class rdf:id="Professor">
<owl:intersectionOf rdf:parseType="Collection">
<owl:Class rdf:about="#Lecturer"/>
<owl:Class rdf:about="#Researcher"/>
</owl:intersectionOf>
</owl:Class>
</rdf:RDF>
The query I used was the default query:
select * where {
?x ?p ?y
}
But what I would actually expect to work is the following:
select * where {
?x a <http://www.inria.fr/2007/09/11/humans.rdfs#Professor>
}
I did look at this answer: Why do we need to use rdf:parseType="Collection" with owl:intersectionOf? But I don't understand how it applies to my specific problem.
I hope somebody can help. By the way, it's my first post here, so let me know if something's missing.
Paraphrased from comments by @stanislav-kralin:
Use the correct capitalization rdf:ID (not rdf:id) on the Professor class, and enable "OWL-Max" reasoning when loading your RDF into GraphDB.
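To verify the fix without GraphDB, the inference can also be materialized locally with rdflib plus the owlrl package; a sketch, assuming the two reduced files above are saved as humans-instances.rdf and humans.owl and the rdf:ID fix has been applied (all assumptions):
from rdflib import Graph
import owlrl

g = Graph()
g.parse("humans-instances.rdf", format="xml")  # hypothetical filenames
g.parse("humans.owl", format="xml")            # for the two files above

# Compute the OWL-RL closure, roughly what GraphDB's OWL rulesets do at
# load time; the owl:intersectionOf rule (cls-int1) should then classify
# anyone who is both a Lecturer and a Researcher as a Professor.
owlrl.DeductiveClosure(owlrl.OWLRL_Semantics).expand(g)

q = """
SELECT ?x WHERE {
    ?x a <http://www.inria.fr/2007/09/11/humans.rdfs#Professor>
}
"""
for row in g.query(q):
    print(row.x)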