I want to use GraphDB for an ontology that uses SWRL rules. Is it possible to add a set of SWRL rules to GraphDB?
If so, how can one do it?
I am trying to extend SPARQL by introducing new clauses in the syntax. Is there a way to do this using GraphDB?
I need a head start on this.
GraphDB uses the Eclipse RDF4J SPARQL parser, which is open source, so you can extend it easily enough. The parser is defined as a JavaCC grammar file (see sparql.jjt in the RDF4J GitHub repository), so the easiest way to extend it is to add your extensions to that grammar file and then recompile the parser using JavaCC.
However, extending the parser is the easy part: once you've parsed something, you still need to transform it into an algebra model that the underlying triplestore can actually do something with.
Is there a plug-in or other means to create and edit SPARQL/SPIN constraints and constructors in Protege?
As I understand it, to capture SPIN constraints in RDF, the SPARQL code for the ASK or CONSTRUCT queries needs to be parsed and encoded; it is not stored as an opaque string. Therefore, it would seem that some plugin with knowledge of SPARQL and SPIN is required.
I've loaded RDF from TopBraid Composer, including SPIN constraints, into Protégé 4.3.0, and it seems to see the constraints as annotations, but I cannot find all of the details, most importantly the underlying SPARQL code itself. I do see it when editing the RDF file in a text editor.
In the broad sense, I'm trying to find a way to create/edit SPIN constraints and constructors and load them into Sesame to have them operate on individuals instantiated from my classes. I posted another question about the path from TopBraid Composer into Sesame. I'm trying to keep my questions more specific since I'm a newbie on Stack Overflow.
BTW, no I don't want to use SWRL instead. I've had trouble expressing the constraints I need using SWRL. I've had success using SPARQL.
Thanks.
In some versions, TopBraid Composer will store SPIN constraints in RDF by default. Given that the query is stored as RDF triples, there should be no problem storing them in any RDF data store. Applying the SPIN constraints is a different issue: the system needs to know how to interpret the queries attached to the various SPIN properties.
Are you certain you cannot "see" them in Protégé or Sesame? The constraints are defined on the class using the property spin:constraint and should appear as a bnode. Make sure you also import http://spinrdf.org/spin, or at least define a property named spin:constraint. At the very least, the following should always find your constraints:
SELECT ?constraint ?class
WHERE {
  ?class <http://spinrdf.org/spin#constraint> ?constraint
}
...where ?constraint is bound to a bnode representing the constraint in RDF and ?class is the class the constraint is defined for.
Also, if you would rather store the constraints as SPARQL strings, go to Preferences > TopBraid Composer > SPIN and check one of the boxes under "Generate sp:text...". Then you can retrieve the query text via the following query:
SELECT ?query ?class
WHERE {
  ?class <http://spinrdf.org/spin#constraint> ?constraint .
  ?constraint <http://spinrdf.org/sp#text> ?query
}
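For concreteness, here is a rough Turtle sketch of what a stored constraint can look like with sp:text generation enabled. The class ex:Person and the age check are made up for illustration, and the real output from TopBraid Composer may also contain the fully parsed query structure alongside the string:

```ttl
@prefix spin: <http://spinrdf.org/spin#> .
@prefix sp:   <http://spinrdf.org/sp#> .
@prefix ex:   <http://example.org/ns#> .

# The constraint hangs off the class as a bnode via spin:constraint.
ex:Person
  spin:constraint [
    a sp:Ask ;
    # With "Generate sp:text" checked, the query is also kept as a string.
    sp:text """ASK WHERE { ?this ex:age ?age . FILTER (?age < 0) }"""
  ] .
```

Any SPARQL-aware tool can then read the query text directly from the sp:text literal, without needing to understand SPIN's parsed representation.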
I am using the SPARQL plugin in Protégé to query my ontology, and I found out that it only works on asserted statements, not inferred ones. How can I change this?
SPARQL is defined by several standards. SPARQL 1.1 Query, the main standard, relies only shallowly on RDF semantics: a typical SPARQL query engine infers nothing from RDF/RDFS terms like rdfs:subClassOf, rdfs:range, etc.
However, the SPARQL standards also include SPARQL 1.1 Entailment Regimes, which defines how SPARQL engines should answer queries when they do implement inference; implementing it is optional. To find out whether a given SPARQL query engine implements an entailment regime (such as RDFS or OWL DL), you may have to look at the engine's documentation, or there may be a SPARQL service description available in RDF. SPARQL 1.1 Service Description is yet another SPARQL standard: it provides an RDF vocabulary, and a standard way to interpret it, for discovering which features a SPARQL engine implements.
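As an example, if an endpoint publishes a service description, you can ask it which entailment regime its default dataset uses. This is a sketch: it assumes the endpoint exposes its service description as queryable triples, which not every deployment does:

```sparql
PREFIX sd: <http://www.w3.org/ns/sparql-service-description#>

# List each service and the entailment regime applied to its default dataset.
SELECT ?service ?regime
WHERE {
  ?service a sd:Service ;
           sd:defaultEntailmentRegime ?regime .
}
```

If this returns no bindings, fall back to the engine's documentation, since publishing a service description is itself optional.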
When using an endpoint hosted in Virtuoso (like the DBpedia endpoint), there is a predefined set of inference rules that can be used (accessible through the Inference rules link at the top right).
If I need to use one of these rule sets, I can include it in the query at the endpoint as follows:
define input:inference 'ldp'
However, when I try to include an external inference rule set that is not in the predefined list, it triggers an error. For example:
define input:inference <http://purl.org/goodrelations/v1>
Virtuoso 37000 Error SP031: SPARQL compiler: 'define input:inference refers to undefined inference rule set "http://purl.org/goodrelations/v1"
QUESTION:
Is it possible to include external rule sets from other vocabularies? And if yes, how?
The DBpedia instance (and any other Virtuoso instance, for that matter) includes a list of preloaded inference rules. Naturally, for a variety of reasons (security, fair use, etc.), we don't allow ad hoc loading of inference rule sets from external sources.
Note: An inference rule in Virtuoso is a mapping between a Rule and an Ontology (see Using British Royal Family Data Snippets — to demonstrate SPARQL Query Language-based Reasoning & Inference). It's the Rule Name that's used in the Inference Rule pragma of the query, which then indicates the following to the SPARQL processor:
the need to invoke an inference context;
the specific rules (again, mappings to an ontology where the relation semantics are defined) to be invoked.
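On a Virtuoso instance you administer yourself, such a mapping can be preloaded. Here is a sketch of the usual isql steps, using the GoodRelations vocabulary from the question; the rule set name 'gr_rules' is an arbitrary choice, not a predefined name:

```sql
-- Load the ontology into a named graph.
SPARQL LOAD <http://purl.org/goodrelations/v1>
       INTO GRAPH <http://purl.org/goodrelations/v1> ;

-- Map a rule set name to that graph.
rdfs_rule_set ('gr_rules', 'http://purl.org/goodrelations/v1') ;
```

After this, queries against that instance can use define input:inference 'gr_rules'. On a public endpoint like DBpedia, only the administrators can perform this step, which is why the pragma fails for rule sets not already in the list.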
Is there any Semantic Web reasoner (e.g. Pellet) that accepts (SWRL) rules on the fly,
or must the rules be hard-coded before starting the reasoner?
It is unclear what you mean by "on the fly". You can edit a reasoner's rule base, so it is not something you need to set in stone from the outset. But the reasoner will need to perform some bookkeeping if you add new rules, or any other axioms, before it can be used to answer queries. There's no way around that.