Is it possible to generate a JSON object using the column names as keys automatically?
I have a table with many columns and I need to dump it into a JSON object.
I know I can do this using the JSON_OBJECT function, but I was looking for a more condensed syntax that would let me avoid spelling out every column name:
SELECT JSON_OBJECT("col_a", m.col_a, "col_b", m.col_b, "col_c", m.col_c, ...)
FROM largetable AS m
Something like this?
SELECT JSON_OBJECT(m.*)
FROM largetable AS m
I'm using MariaDB version 10.8.2
JSON objects make sense in other languages like JavaScript, C#... There are libraries to convert the result of a MariaDB query into a list of JSON objects in most languages.
Also, it is good practice to make the database engine do as little work as possible when performing queries, and to process the result in the application.
This is of course not possible, since the parser would not accept an odd number of parameters for the JSON_OBJECT function.
You can't do this in pure SQL within a single statement, since you need to retrieve the column names from information_schema first:
SELECT @statement := CONCAT("SELECT JSON_OBJECT(", GROUP_CONCAT(CONCAT("\"", column_name, "\", ", column_name)), ") FROM mytable")
FROM information_schema.columns
WHERE table_name = "mytable" AND table_schema = "test";
PREPARE my_statement FROM @statement;
EXECUTE my_statement;
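For a table with columns col_a and col_b, the statement generated above would look like this:

SELECT JSON_OBJECT("col_a", col_a, "col_b", col_b) FROM mytable;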
Much easier and faster is to convert the result in your application, for example in Python:
import mariadb, json

# A cursor created with dictionary=True returns each row as a dict,
# which json.dumps() serialises using the column names as keys.
conn = mariadb.connect(db="test")
cursor = conn.cursor(dictionary=True)
cursor.execute("select * from mytable")
json_row = json.dumps(cursor.fetchone())
I am evaluating whether we can migrate from plain JDBC to jOOQ for our project. Most of it looks promising, but I am currently wondering about one specific flow: nested rows. Let me explain.
Say you have the following two tables:
class(id, name)
student(id, name, class_id)
(We assume that a student can only be part of one class.)
Let's create response types for these tables. I will be using them in the queries below.
create type type_student as (id integer, name text);
create type type_class as (id integer, name text, students type_student[]);
Now let's fetch all classes with their students by using nested rows:
select row(class.id, class.name, array(
    select row(student.id, student.name)::type_student
    from student
    where student.class_id = class.id
))::type_class
from class
A useful variant is to use nested rows only inside the array:
select class.id, class.name, array(
    select row(student.id, student.name)::type_student
    from student
    where student.class_id = class.id
) as students
from class
I am wondering whether jOOQ has an elegant approach to parsing such results containing nested rows?
Your usage of the word "parse" could mean several things, and I'll answer them all in case anyone finds this question looking for "jOOQ" / "parse" / "row".
Does the org.jooq.Parser support row value expressions?
Not yet (as of jOOQ 3.10 and 3.11). jOOQ ships with a SQL parser that parses (almost) anything that can be represented using the jOOQ API. This has various benefits, including:
Being able to reverse engineer DDL scripts for the code generator
Translating SQL between dialects (see an online version here: https://www.jooq.org/translate)
Unfortunately, it cannot parse row value expressions in the projection yet, i.e. in the SELECT clause.
Does the jOOQ API support ("parse") row value expressions?
Yes, you can use them via the various DSL.row() constructors, mainly for predicates, but also for projections by wrapping them in a Field using DSL.rowField(). As of jOOQ 3.11, this is still a bit experimental, as there are many edge cases in PostgreSQL itself related to what is allowed and what isn't. But in principle, queries like yours should be possible.
Does jOOQ support parsing the serialised version of a PostgreSQL record?
PostgreSQL supports these anonymous record types, as well as named "composite" types. And arrays thereof. And nesting of arrays and composite types. jOOQ can serialise and deserialise these types if type information is available to jOOQ, i.e. if you're using the code generator. For instance, if your query is stored as a view:
create view test as
select row(class.id, class.name, array(
    select row(student.id, student.name)::type_student
    from student
    where student.class_id = class.id
))::type_class
from class
Then, the code generator will produce the appropriate types, including:
TypeStudentRecord
TypeClassRecord
These can be serialised as expected. In principle, this would also be possible without the code generator, but you'd have to create the above types yourself, manually, so why not just use the code generator.
Yes, it does: https://www.jooq.org/doc/latest/manual/sql-building/table-expressions/nested-selects/
Field<Object> records =
    create.select(student.id, student.name)
          .from(student)
          .where(student.class_id.eq(class.id))
          .asField("students");

create.select(class.id, class.name, records)
      .from(class)
      .fetch();
The above example might not work directly as I have not tried it, but I just wanted to give the general idea.
Note that the records field is not executed on its own. When fetch is called on the second statement, jOOQ should create one SQL statement internally.
I have a Postgres table with jsonb array elements and I'm trying to write SQL queries to extract the matching elements. I have this raw SQL query running from the Postgres command-line interface:
select * from movies where director #> any (array ['70', '45']::jsonb[])
This returns the results I'm looking for (all records from the movies table where the director jsonb array contains any of the elements of the input array).
In the code, the value for ['70', '45'] would be a dynamic variable, i.e. fixArr, and the length of the array is unknown.
I'm trying to build this into my Bookshelf code but haven't been able to find any examples that address the complexity of this use case. I've tried the following approaches, but none of them work:
models.Movies.where('director', '#> any', '(array' + JSON.stringify(fixArr) + '::jsonb[])').fetchAll()
ERROR: The operator "#> any" is not permitted
db.knex.raw('select * from movies where director #> any(array'+[JSON.stringify(fixArr)]+'::jsonb[])')
ERROR: column "45" does not exist
models.Movies.query('where', 'director', '#>', 'any (array', JSON.stringify(fixArr) + '::jsonb[])').fetchAll()
ERROR: invalid input syntax for type json
Can anyone help with this?
As you have noticed, neither knex nor bookshelf brings any support for making jsonb queries easier. As far as I know, the only knex-based ORM that supports jsonb queries nicely is Objection.js.
In your case, I suppose a better operator for finding whether a jsonb column contains any of the given values would be ?|, so the query would be like:
// Quote each id and escape the ? so knex doesn't treat it as a binding placeholder
const idsAsString = ids.map(val => `'${val}'`).join(',');
db.knex.raw(`select * from movies where director \\?| array[${idsAsString}]`);
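For reference, the plain SQL this produces uses the ?| operator, which is true when the jsonb value contains any of the given strings as top-level keys or array elements:

select * from movies where director ?| array['70', '45'];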
More info on how to deal with jsonb queries and indexing with knex can be found here: https://www.vincit.fi/en/blog/objection-js-postgresql-power-json-queries/
No, you're just running into the limitations of that particular query builder and ORM.
The easiest way is using bookshelf.Model.query and knex.raw (whereRaw, etc.). Alias with AS and subclass your Bookshelf model to add these aliased attributes if you care about such things.
If you want things to look clean and abstracted through Bookshelf, you'll just need to denormalize the JSONB into flat tables. This might be the best approach if your JSONB is mostly flat and simple.
If you end up using lots of JSONB (it can be quite performant with appropriate indexes), then the Bookshelf ORM is wasted effort. The knex query builder is still worth using insofar as it handles escaping, quoting, etc.
I have this JSON stored in the DB:
Column name: json
- '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example"}'
- '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example2"}'
I want to make a search using the LIKE operator to find all categories containing the word "Category":
At the moment I'm doing it this way, but it only matches the complete phrase:
select * from table where json like '%"category":"Category Example"%';
How can I build a query that returns all rows whose category contains the word "Category"?
Updated:
I'm using MySQL
Thanks
While undeclared, this looks like a Postgres question.
There are hardly any JSON-processing tools in the current version 9.2.
But a whole set of tools will ship with the upcoming Postgres 9.3, currently in beta.
I interpret your question as:
Find all rows where the json column contains one or more fields named 'category' holding a value that contains the string 'Category'.
One or more? I'm not sure whether Postgres enforces uniqueness; I don't have a 9.3 installation at hand.
With Postgres 9.3, your query could look like this:
SELECT *
FROM tbl
WHERE json->>'category' LIKE '%Category%'
The ->> operator means "Get JSON object field as text".
Use ILIKE for a case insensitive search.
More in this related answer:
How do I query using fields inside the new PostgreSQL JSON datatype?
Can you use a library? The "common schema" library offers a function that does just what you need:
http://common-schema.googlecode.com/svn/trunk/common_schema/doc/html/extract_json_value.html
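Assuming common_schema is installed, usage would presumably look something like the sketch below (check the linked docs for the exact path syntax):

select * from mytable
where common_schema.extract_json_value(json, '/category') like '%Category%';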
Maybe I asked a really bad question, because I could do the search using a regexp.
I found this solution. Maybe it is not the fastest way, but it does what I need:
select * from table where json regexp '"category":"([^"]*)Category([^"]*)"';
Thanks
I hope this helps.
select * from table where json @> '{"category":"Category Example"}';
I'm building an abstract gem. I need a SQL query that looks like this:
SELECT * FROM my_table WHERE * LIKE '%my_search%'
Is that possible?
Edit:
I don't care about query performance because it's a feature of an admin panel which is used once a month. I also don't know what columns the table has because it's so abstract. Sure, I could use some Rails ActiveRecord functions to find all the columns, but I hoped to avoid adding that logic and to just use the *. It's going to be a gem, and I can't know which DB is going to be used with it. Maybe there is a sexy Rails function that helps me out here.
As I understand the question, you are basically trying to build a SQL statement that checks a condition across all columns in the table. A dirty hack, but this generates the required SQL:
columns = MyTable.column_names
condition_string = columns.map { |col| "#{col} LIKE ?" }.join(' OR ')
MyTable.all(:conditions => [condition_string, *(['%my_search%'] * columns.size)])
However, this is not tested. It might work.
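For a hypothetical table with columns id and name, the generated query would be along these lines:

SELECT * FROM my_tables WHERE (id LIKE '%my_search%' OR name LIKE '%my_search%')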
* LIKE '...' isn't valid according to the SQL standard, and it isn't supported by any RDBMS I'm aware of. You could try using a function like CONCAT to build the left argument of LIKE, though performance won't be good. As for SELECT *, it's generally something to be avoided.
No, SQL does not support that syntax.
To search all columns you need to use procedures or dynamic SQL. Here's another SO question which may help:
SQL: search for a string in every varchar column in a database
EDIT: Sorry, the question I linked to is looking for a field name, not the data, but it might help you write some dynamic SQL to build the query you need.
You didn't say which database you are using, as there might be a vendor specific solution.
It's only an idea, but I think it's worth testing!
Depending on your DB, you can get all the columns of a table. In MSSQL, for example, you can use something like:
select name from syscolumns where id=object_id('Tablename')
Under Oracle, I guess it's like:
select column_name from USER_TAB_COLUMNS where TABLE_NAME = 'Tablename'
and then you would go through these columns using a procedure, and maybe a cursor, so that for each column you can check whether the data you're searching for is in there:
if ((select count(*) from Tablename where Colname = 'searchingdata') > 0)
then keep the results in a separate table (ColnameWhereFound, RecNrWhereFound).
Datatypes may be an issue if you try to compare strings with numbers. But note that under SQL Server, for instance, the syscolumns table contains a column called "usertype" holding a number that seems to refer to the datatype stored in the column, like 2 for a string type and 7 for int; I guess Oracle has something similar too.
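For illustration, here is a minimal dynamic-SQL sketch for SQL Server ('Tablename' and the search string are placeholders) that builds one query covering all character columns, sidestepping the datatype issue:

DECLARE @sql NVARCHAR(MAX) = '';

-- Append "col LIKE '%searchingdata%'" for every character column
SELECT @sql = @sql + CASE WHEN @sql = '' THEN '' ELSE ' OR ' END
            + QUOTENAME(COLUMN_NAME) + ' LIKE ''%searchingdata%'''
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Tablename'
  AND DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar');

SET @sql = 'SELECT * FROM Tablename WHERE ' + @sql;
EXEC sp_executesql @sql;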
Hope this helps.
This is hopefully just a simple question involving performance optimization of queries in SQL Server 2008.
I've worked for companies that use stored procs a lot for their ETL processes as well as for some of their websites. I've seen the scenario where they need to retrieve specific records based on a finite set of key values. I've seen it handled in 3 different ways, illustrated via pseudo-code below.
Dynamic SQL that concatenates a string and executes it:
EXEC('SELECT * FROM TableX WHERE xId IN (' + @Parameter + ')')
Using a user-defined function to split a delimited string into a table (a sketch of such a UDF follows the three snippets):
SELECT * FROM TableY INNER JOIN SPLIT(@Parameter) ON yID = splitId
Using XML as the parameter instead of a delimited varchar value:
SELECT * FROM TableZ JOIN @Parameter.nodes(xpath) AS x (y) ON ...
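For reference, snippet 2 assumes a split UDF along these lines; this is a minimal sketch for SQL Server, where the function name matches the example and the comma delimiter is an assumption:

CREATE FUNCTION dbo.SPLIT (@list VARCHAR(MAX))
RETURNS @result TABLE (splitId INT)
AS
BEGIN
    -- Walk the comma-delimited list, inserting one row per value
    DECLARE @pos INT = CHARINDEX(',', @list);
    WHILE @pos > 0
    BEGIN
        INSERT @result VALUES (CAST(LEFT(@list, @pos - 1) AS INT));
        SET @list = SUBSTRING(@list, @pos + 1, LEN(@list));
        SET @pos = CHARINDEX(',', @list);
    END
    IF LEN(@list) > 0
        INSERT @result VALUES (CAST(@list AS INT));
    RETURN;
END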
While I know creating the dynamic SQL in the first snippet is a bad idea for a large number of reasons, my curiosity comes from the last 2 examples. Is it more efficient to do the due diligence in my code to pass such lists via XML, as in snippet 3, or is it better to just delimit the values and use a UDF to take care of it?
There is now a 4th option: table-valued parameters, whereby you can actually pass a table of values in to a sproc as a parameter and then use that as you normally would a table variable. I'd prefer this approach over the XML (or CSV) parsing approach.
I can't quote performance figures for all the different approaches, but that's the one I'd be trying. I'd recommend doing some real performance tests on them.
Edit:
A little more on TVPs. In order to pass the values in to your sproc, you just define a SqlParameter (SqlDbType.Structured); the value of this can be set to any IEnumerable, DataTable or DbDataReader source. So presumably, you already have the list of values in a list or array of some sort; you don't need to do anything to transform it into XML or CSV.
I think this also makes the sproc clearer, simpler and more maintainable, providing a more natural way to achieve the end result. One of the main points is that SQL performs best at set-based activities, not looping or string manipulation.
That's not to say it will perform great with a large set of values passed in. But with smaller sets (up to ~1000) it should be fine.
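For illustration, a minimal T-SQL sketch of how a TVP is declared and consumed on the server side (the type and procedure names are hypothetical):

-- Define a reusable table type for the list of key values
CREATE TYPE dbo.IdList AS TABLE (Id INT PRIMARY KEY);
GO

-- TVPs must be declared READONLY; join against them like a table variable
CREATE PROCEDURE dbo.GetRecords
    @Ids dbo.IdList READONLY
AS
BEGIN
    SELECT t.*
    FROM TableX AS t
    INNER JOIN @Ids AS i ON t.xId = i.Id;
END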
UDF invocation is a little bit more costly than splitting the XML using the built-in nodes() function.
However, this only needs to be done once per query, so the performance difference will be negligible.
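For reference, a minimal sketch of splitting an XML parameter with the built-in nodes() method (the element names are assumptions):

DECLARE @Parameter XML = '<ids><id>1</id><id>2</id></ids>';

-- nodes() shreds the XML into rows; value() extracts each id as INT
SELECT x.y.value('.', 'INT') AS xId
FROM @Parameter.nodes('/ids/id') AS x(y);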