I am trying to retrieve data from a Neptune database using SPARQL queries. From a local Jupyter notebook, I connected over SSH to an EC2 instance that is in the same VPC as Neptune. But the query is not retrieving any data and stdout is empty, and I'm not sure where I am going wrong. Any help would be greatly appreciated. Please find the query below.
stdin, stdout, stderr = ssh.exec_command(
    'curl https://.cluster-.us-east1.neptune.amazonaws.com:8182/sparql \
    -d "query=PREFIX mol: <http://www.semanticweb.org/25#> \
    SELECT * WHERE {?s ?p ?o } LIMIT 1" \
    -H "Accept: text/csv"')
Thank you in Advance.
You may be running into issues with cURL, which does not handle multi-line input (such as a SPARQL query) well. It is best to use the following format:
curl 'https://cluster.cluster.us-east-2.neptune.amazonaws.com:8182/sparql' \
-H "Accept: text/csv" \
--data-urlencode query='
PREFIX mol: <http://www.semanticweb.org/25#>
SELECT * WHERE { ?s ?p ?o } LIMIT 1'
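Since the question sends this through paramiko's exec_command, here is a minimal sketch of embedding that multi-line curl call in the Python string passed over SSH. The endpoint below is a placeholder, and the variable names are just for illustration:

```python
# Placeholder endpoint; substitute your actual cluster endpoint.
endpoint = "https://your-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/sparql"

query = """
PREFIX mol: <http://www.semanticweb.org/25#>
SELECT * WHERE { ?s ?p ?o } LIMIT 1
"""

# --data-urlencode safely encodes the embedded newlines in the query,
# so the whole command can be sent as a single string.
command = (
    f"curl '{endpoint}' "
    '-H "Accept: text/csv" '
    f'--data-urlencode "query={query}"'
)

# With a connected paramiko client:
# stdin, stdout, stderr = ssh.exec_command(command)
# print(stdout.read().decode())
```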
Can anyone please let me know how to save the results of SPARQL queries in AWS Neptune to a CSV file? I am using a SageMaker notebook to connect to the database cluster. Please find the query below. Any help would be greatly appreciated.
%%sparql
PREFIX mo: <>
SELECT (strafter(str(?s), '#') AS ?sample)
WHERE {
  ?s mo:belongs_to_project ?o .
  FILTER(regex(str(?o), "PROJECT_00001"))
}
The results from a query can be saved to a Python variable using:
%%sparql --store-to result
Then, in another cell, you could write a little Python code that takes the result (which will be in JSON) and creates a CSV containing whatever data you need, using Python's csv module (https://docs.python.org/3/library/csv.html).
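As a minimal sketch of that conversion (assuming the stored result follows the standard SPARQL 1.1 JSON results layout with head.vars and results.bindings; the helper name and file path are just placeholders):

```python
import csv

def sparql_json_to_csv(result, path):
    """Write a SPARQL 1.1 JSON result set to a CSV file."""
    columns = result["head"]["vars"]                 # variable names become columns
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)                     # header row
        for binding in result["results"]["bindings"]:
            # Unbound variables are simply absent from a binding,
            # so fall back to an empty string.
            writer.writerow(
                [binding.get(v, {}).get("value", "") for v in columns]
            )
```

Here `result` would be the variable populated by `--store-to result` in the previous cell.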
UPDATED: if you just want the whole result as a CSV file, you can achieve that by running a curl command from a cell using the %%bash magic. Here is an example:
%%bash
curl -X POST --data-binary 'query=SELECT ?s ?p ?o WHERE {?s ?p ?o} LIMIT 10' -H "Accept: text/csv" https://{cluster endpoint}:8182/sparql > results.csv
How to create a REST query to get all roles assigned to a user?
As this is a many-to-many relation in the opposite direction, the regular $relatedTo operator does not seem to be enough...
I finally found the solution, which turned out to be much easier than I feared :) As I found similar questions on SO and GitHub, I hope it will help others.
The curl query to get all the roles directly assigned to a user is:
curl -X GET \
-H "X-Parse-Application-Id: ${APPLICATION_ID}" \
-H "X-Parse-REST-API-Key: ${REST_API_KEY}" \
https://parseapi.back4app.com/roles \
--data-urlencode \
'where={"users":{"__type":"Pointer","className":"_User","objectId":"<objectId>"}}'
I've seen many related posts, but nothing has helped me resolve my issue. I have a curl command in a bash script. The data element contains a SQL command. The SQL command includes a param that must be single-quoted. So it looks like this:
jsonData=$(curl --request POST \
--url $uribase/Redrock/Query \
--header 'content-type: application/json' \
--header 'X-NATIVE-CLIENT: 1' \
--data '{"Script":"'"Select Server.ID, Server.ComputerClass, Server.FQDN, Server.Name, Server.SessionType from Server WHERE Server.ComputerClass='Unix' COLLATE NOCASE"'"}')
I've tried so many variations on quoting the Server.ComputerClass value to no avail. I can't get it to resolve to ='Unix'. Even tried \u0027 unicode.
Any help appreciated.
Rather than trying to quote it correctly, feed it to curl from a here document via standard input. Use an array to allow you to organize the options more cleanly as well.
curl_opts=( --request POST
--url "$uribase/Redrock/Query"
--header 'content-type: application/json'
--header 'X-NATIVE-CLIENT: 1'
--data @-
)
The @- tells curl to read the data from a file named -, which is a curl-defined alias for standard input.
jsonData=$(curl "${curl_opts[@]}" <<EOF
{ "Script": "Select Server.ID, Server.ComputerClass, Server.FQDN, Server.Name, Server.SessionType from Server WHERE Server.ComputerClass='Unix' COLLATE NOCASE"}
EOF
)
Quoting gets confusing fast. Yes, using a file to store the statement will work; if you prefer not to do that, here is something you could try.
Create a function:
sql_request()
{
printf "'{"
printf '"Script"'
...
}
You get the general idea: break it down into as many pieces as needed. When you need a single quote, enclose it in double quotes; when you need a double quote, enclose it in single quotes.
You can then call that function as needed to get your request:
"$(sql_request)"
You can even easily feed arguments into this function to change portions of the string.
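As a hedged sketch of that idea (the function name and the COLLATE clause come from the question; the column list is shortened, and the alternating quoting is the only point being illustrated):

```shell
#!/bin/sh
# Build the JSON payload piece by piece: double quotes are emitted from
# single-quoted printf pieces, and single quotes from double-quoted pieces.
sql_request()
{
    printf '{"Script":"'
    printf 'Select Server.ID, Server.Name from Server '
    printf 'WHERE Server.ComputerClass='
    printf "'Unix'"
    printf ' COLLATE NOCASE"}'
}

sql_request
```

Running it prints the payload with the inner single quotes intact, ready to pass to curl's --data option.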
I'm trying to issue a complicated query against a Fuseki server that I'm running locally through a browser, but it keeps crashing. Is it possible to do it through a Python script? If so, how?
You can use any suitable command line tool, for example curl:
curl http://localhost:3030/your_service/sparql --data 'query=ASK { ?s ?p ?o . }'
If you want to use Python specifically, you can use SPARQLWrapper, or just the Requests package.
Example using Requests:
import requests
response = requests.post('http://localhost:3030/your_service/sparql',
data={'query': 'ASK { ?s ?p ?o . }'})
print(response.json())
./s-query --service=http://localhost:3030/myDataset/query --query=/home/matthias/EIS/EDSA/27/18.05/queryFile.rq
The above command also works.
Follow the ideas from the SOH - SPARQL over HTTP page, i.e.
SOH SPARQL Query
s-query --service=endpointURL 'query string'
s-query --service=endpointURL --query=queryFile.rq
I'm currently trying to get Parse's geo-query system to work. In my data browser, I have an installation with a key "location" of type geo-point that has a geo-point with latitude 30.27263636013176 and longitude -97.75766807716373 set in it.
However, if I try to query with the following code, I always get "results":[].
curl -X GET \
-H "X-Parse-Application-Id: myAppKey" \
-H "X-Parse-REST-API-Key: myAPIKey" \
-G \
--data-urlencode 'limit=10' \
--data-urlencode 'where={
"location": {
"$nearSphere": {
"__type": "GeoPoint",
"latitude": 30,
"longitude": -97
}
}
}' \
https://api.parse.com/1/classes/PlaceObject
Note that the query is running successfully; there are no errors. The problem is that the installation I have should come up, but it doesn't.
If I change the latitude and longitude in the query to exactly the latitude and longitude shown in the data browser, I still get empty results. What is the reason for this? Is it not possible to query for device installations near a point?
The Installation class can't be queried from the client, for good reason. Too much sensitive information is stored in the Installation class to allow querying from a client.
Either move the location property to another class you can query, or query it in a Cloud Function.
You can query it in Cloud Code if you use Parse.Cloud.useMasterKey();, though I strongly recommend using a different class to store the location.
Should the URL be https://api.parse.com/1/classes/Installation instead of https://api.parse.com/1/classes/PlaceObject?