I want to change the data in my triple store using SPARQL. Are DELETE and INSERT the only way?
Related
Is there a convenient way (Python, web UI, or CLI) for inserting a new column into an existing BigQuery table (that already has 100 columns or so) and updating the schema accordingly?
Say I want to insert it after column 49. If I do this via a query, I will have to type every single column name, will I not?
Update: the suggested answer does not make it clear how this applies to BigQuery. Furthermore, the documentation does not seem to cover the
ALTER TABLE `tablename` ADD `column_name1` TEXT NOT NULL AFTER `column_name2`;
syntax. A test confirmed that the AFTER clause does not work in BigQuery.
I don't think it is possible to perform this action in a simple way. A couple of workarounds come to mind:
Create a view that puts the new column where you want it.
Create a new table from a query result with the column in the desired position (see the sketch below).
On the other hand, I don't quite see why this is useful: the only scenario I can think of for this requirement is if you are using SELECT *, which is discouraged by the BigQuery best practices. If that is not the case, please share your use case so we can understand it better.
Since this is not a current feature of BigQuery, you can file a feature request asking for it.
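As a minimal sketch of the second workaround (the dataset, table, and column names `mydataset.mytable`, `col_a`, `col_b`, `col_c`, and `column_name1` are all hypothetical), you can rebuild the table from a query that places the new column wherever you want it:

    -- Recreate the table with the new column positioned after col_b.
    CREATE OR REPLACE TABLE `mydataset.mytable_reordered` AS
    SELECT
      col_a,
      col_b,
      CAST(NULL AS STRING) AS column_name1,  -- the new, empty column
      col_c
    FROM `mydataset.mytable`;

You can then drop the original table and use the new one in its place (or keep it as a view, per the first workaround).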
Is it possible to update multiple rows with one query? With insert I can pass an array of objects, where each key refers to a column. Is there anything like that for an update query?
I have an array of objects (id, value), and I want to update every row whose id matches an object's id with that object's value.
There is a way to do it with a single PostgreSQL query, and it was answered here. But it requires some magic and ugly .raw code to use it with knex, so I'd recommend using multiple update statements in one transaction and synchronizing them with Promise.all([updates]).
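For reference, the single-statement PostgreSQL approach mentioned above looks roughly like this (my_table and its columns are hypothetical); this is the kind of SQL you would have to feed to knex's .raw:

    -- Update many rows at once from a list of (id, value) pairs.
    UPDATE my_table AS t
    SET value = v.value
    FROM (VALUES
      (1, 'first'),
      (2, 'second'),
      (3, 'third')
    ) AS v(id, value)
    WHERE t.id = v.id;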
I have a database and I want to convert it to RDF.
For now I just want to convert the data from one table to RDF (later I will do the same with the whole database).
I have access to that table from SQL Developer; it is an Oracle SQL table.
It has many columns, but I am mainly interested in
the ID column and the name column.
So the data looks like this:
ID name
1 oneNameSomething
2 anotherDicName
but the table has 34000 rows.
Is it possible to do something like:
prefix:idValueFromTable a SpecificClass .
prefix:idValueFromTable prefix:hasName prefix:NameFromTable
I know I can build a tool to do that, but a quick search on the internet (https://www.w3.org/wiki/ConverterToRdf#SQL) suggests there are already tools for this. I just don't know which one works in my case, so I thought I'd ask here first.
This page doesn't have the tool: https://www.w3.org/TR/r2rml/#overview
For a "one-off" translation, you could just execute an SQL query, and then get the resulting rows, and write output as Turtle or N-Triples. That would probably be the quickest way to get some RDF.
If you need a more principled approach, and for more than just one table (and with cross-referencing between tables), I'd look into some of the tools for mapping relational databases to RDF, such as D2RQ. The mapping schemes are relatively flexible, and you can get RDF back without a whole lot of setup.
To get the data back out, you could use a SPARQL CONSTRUCT query if you want to control precisely what you get, or (as I just learned from your comment) you can use the dump-rdf tool to get an RDF dump of the database.
Is it possible to first save all the data in a data structure and then write that data structure to a database table in one go?
I have a process in which I have to write rows to a table very often. I want to collect these rows in a data structure and then save the whole structure to an Access table only once.
Sure, you can create your own data structures by using
the Type statement or
by creating Class Modules.
The former is simpler, the latter more flexible.
Unfortunately, there is no built-in way to serialize your data structure into a table row, so you will have to write that code yourself, either by using an INSERT statement or a parameterized append query.
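To illustrate the parameterized append query route (the table, field, and parameter names here are made up), you would save something like this as an Access query and execute it once per element of your in-memory structure; Access SQL doesn't allow comments, so the explanation stays out here:

    PARAMETERS pId Long, pName Text(255);
    INSERT INTO MyTable (ID, ItemName)
    VALUES (pId, pName);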
We receive bulk data from our customers in a spreadsheet. For now I have loaded it into a temporary table. I tried to normalize the data into a parent table and a child table, where each parent has 4 or 5 child rows. Is there a way to insert all the parents and their children using queries? I don't want to write an application to do that.
If you're specifically looking to write only queries to perform this task, check out the following article, which outlines a way to set up a linked server to an Excel doc.
http://www.sql-server-helper.com/tips/read-import-excel-file-p01.aspx
If you get this set up correctly (which can be a pain), it's as easy as querying the data directly.
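Once the linked server is configured as the article describes, the worksheet behaves like a table; the linked server name XL_CUSTOMERS, the sheet name, and the target table below are hypothetical:

    -- Inspect the spreadsheet data directly.
    SELECT * FROM XL_CUSTOMERS...[Sheet1$];

    -- Or stage it straight into your parent table with INSERT ... SELECT.
    INSERT INTO dbo.ParentTable (CustomerName, OrderDate)
    SELECT CustomerName, OrderDate
    FROM XL_CUSTOMERS...[Sheet1$];

From there the parent and child rows can be populated with ordinary INSERT ... SELECT statements against the staged data.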