SQL LIKE operator to find words in stored JSON

I have this JSON stored in DB:
Column name: json
- '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example"}'
- '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example2"}'
I want to search with the LIKE operator to find all categories containing the word "Category".
At the moment I'm doing it this way, but it only matches the complete phrase:
select * from table where json like '%"category":"Category Example"%';
How can I build a query that returns all rows whose category contains the word "Category"?
Updated:
I'm using MySQL
Thanks

While the RDBMS was undeclared at first, this read like a Postgres question.
There are hardly any JSON-processing tools in the current version 9.2.
But a whole set of tools will ship with the upcoming Postgres 9.3, currently in beta.
I interpret your question as:
Find all rows where the json column contains one or more fields named 'category' holding a value that contains the string 'Category'.
One or more? I'm not sure whether Postgres enforces key uniqueness; I don't have a 9.3 installation at hand.
With Postgres 9.3, your query could look like this:
SELECT *
FROM tbl
WHERE json->>'category' LIKE '%Category%'
->> .. "Get JSON object field as text"
Use ILIKE for a case insensitive search.
More in this related answer:
How do I query using fields inside the new PostgreSQL JSON datatype?
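The same shape of query can be sketched with Python's bundled sqlite3 module, whose JSON1 function json_extract() plays the role of Postgres's ->> (this assumes a SQLite build with JSON1, which modern builds include; the table name and sample rows are taken from the question, with one extra non-matching row added for contrast):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl (json TEXT)")
conn.executemany("INSERT INTO tbl VALUES (?)", [
    ('{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example"}',),
    ('{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example2"}',),
    ('{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Other"}',),
])

# Extract the "category" field as text, then pattern-match it -- the same
# shape as  WHERE json->>'category' LIKE '%Category%'  in Postgres 9.3.
rows = conn.execute(
    "SELECT json FROM tbl WHERE json_extract(json, '$.category') LIKE '%Category%'"
).fetchall()
print(len(rows))  # 2: both 'Category ...' rows match, 'Other' does not
```

The point is the same as in the answer: extract the field first, then apply LIKE to the extracted value rather than to the raw JSON string.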

Can you use a library? The "common schema" library offers a function that does just what you need:
http://common-schema.googlecode.com/svn/trunk/common_schema/doc/html/extract_json_value.html

Maybe I asked a poorly worded question, because I was able to do the search using REGEXP.
I found this solution. It may not be the fastest way, but it does what I need:
select * from table where json regexp '"category":"([^"]*)Category([^"]*)"';
Thanks
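For what it's worth, the REGEXP pattern from this answer can be checked outside the database with Python's re module, run against the sample strings from the question (plus one non-matching row added for contrast):

```python
import re

# The pattern from the answer: a "category" key whose value contains 'Category'.
pattern = re.compile(r'"category":"[^"]*Category[^"]*"')

rows = [
    '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example"}',
    '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Category Example2"}',
    '{"brand":"1","year":"2008","model":"2","price":"2001212","category":"Other"}',
]

matches = [row for row in rows if pattern.search(row)]
print(len(matches))  # 2
```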

I hope this helps. In Postgres, the jsonb containment operator @> matches rows whose JSON contains the given key/value pair (note that, like the LIKE version in the question, this matches the exact value, not a word within it):
select * from table where json::jsonb @> '{"category":"Category Example"}';

Related

Generate a json object using column names as keys

Is it possible to generate a JSON object using the column names as keys automatically?
I have a table with many columns and I need to dump it into a JSON object.
I know I can do this using the JSON_OBJECT function, but I was looking for a more condensed syntax that would let me do it without specifying the name of every column:
SELECT JSON_OBJECT("col_a", m.col_a, "col_b", m.col_b, "col_c", m.col_c, ...)
FROM largetable AS m
Something like this?
SELECT JSON_OBJECT(m.*)
FROM largetable AS m
I'm using MariaDB version 10.8.2
JSON objects make sense in other languages like JavaScript or C#, and there are libraries in most languages to convert the result of a MariaDB query into a list of JSON objects.
Also, it is good practice to make the database engine do as little work as possible and to process the result in the application.
This is of course not possible: JSON_OBJECT requires explicit key/value pairs, so the parser will not accept m.* as its argument.
And you can't do it in pure SQL within a single statement, since you need to retrieve the column names from information_schema first:
SELECT @statement := CONCAT("SELECT JSON_OBJECT(", GROUP_CONCAT(CONCAT("'", column_name, "', ", column_name)), ") FROM mytable") FROM information_schema.columns WHERE table_name = "mytable" AND table_schema = "test";
PREPARE my_statement FROM @statement;
EXECUTE my_statement;
Much easier and faster is to convert the result in your application, for example in Python:
import mariadb, json
conn= mariadb.connect(db="test")
cursor= conn.cursor(dictionary=True)
cursor.execute("select * from mytable")
json_row= json.dumps(cursor.fetchone())
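The same application-side conversion works with Python's bundled sqlite3 module; sqlite3 has no dictionary=True flag, so sqlite3.Row plus dict() plays that role (the table and sample row here are made up for illustration):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row          # rows become addressable by column name
conn.execute("CREATE TABLE mytable (col_a TEXT, col_b INTEGER)")
conn.execute("INSERT INTO mytable VALUES ('hello', 42)")

row = conn.execute("SELECT * FROM mytable").fetchone()
json_row = json.dumps(dict(row))        # column names become the JSON keys
print(json_row)  # {"col_a": "hello", "col_b": 42}
```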

Can you use DOES NOT CONTAIN in SQL to replace not like?

I have a table called logs.lognotes, and I want a faster way to find customers who do not have a specific word or phrase in the note. I know I can use "not like", but my question is: can you use DOES NOT CONTAIN to replace NOT LIKE, in the same way you can use:
SELECT *
FROM table
WHERE CONTAINS (column, 'searchword')
Yes, you should be able to apply NOT to any boolean expression, as mentioned in the SQL Server docs. It would look something like this:
SELECT *
FROM table
WHERE NOT CONTAINS (column, 'searchword')
to search for records that do not contain 'searchword' in the column. And, according to
Performance of like '%Query%' vs full text search CONTAINS query
this method should be faster than using LIKE with wildcards.
You can also simply use this:
select * from tablename where not(columnname like '%value%')
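A quick sanity check of the two negation spellings, run against SQLite (where CONTAINS itself doesn't exist, but LIKE negation behaves the same way as in other engines; table and rows are made up): both NOT (col LIKE ...) and col NOT LIKE ... return the same rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (note TEXT)")
conn.executemany("INSERT INTO logs VALUES (?)", [
    ("customer happy",),
    ("refund issued",),
    ("please searchword here",),
])

# The two negation spellings are equivalent.
a = conn.execute("SELECT note FROM logs WHERE NOT (note LIKE '%searchword%')").fetchall()
b = conn.execute("SELECT note FROM logs WHERE note NOT LIKE '%searchword%'").fetchall()
print(a == b, len(a))  # True 2
```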

Ignore special characters in the WHERE clause

I have a table named artists with a record with the value 'Miró' in the name column. When I do this request:
SELECT "artists".* FROM "artists" WHERE name = 'Miró'
I have one result, so it works.
Now, when I do this request (without the special ó) :
SELECT "artists".* FROM "artists" WHERE name = 'Miro'
I don't find anything. I want to ignore the special char. Is there a way to do it?
I have postgres 9.1.9.
For more targeted pattern matching, you can use the function unaccent(), provided by the additional module unaccent:
SELECT * FROM artists WHERE unaccent(name) = 'Miro';
To make this fast, create a functional index. You have to overcome the obstacle that the function is only STABLE, not IMMUTABLE. I wrote a comprehensive answer with instructions (including installation) and links recently:
Does PostgreSQL support "accent insensitive" collations?
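If installing the unaccent module is not an option, the same idea can be sketched application-side with Python's unicodedata: decompose the string to NFD, then drop the combining marks.

```python
import unicodedata

def unaccent(s: str) -> str:
    """Strip diacritics: decompose to NFD, then drop combining marks."""
    decomposed = unicodedata.normalize("NFD", s)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(unaccent("Miró"))  # Miro
```

Unlike the indexed unaccent() approach, filtering in the application means pulling candidate rows out of the database first, so it only suits small result sets.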
You could try using LIKE instead...
SELECT "artists".* FROM "artists" WHERE name like 'Mir%'

Search in every column

I'm building an abstract gem. I need a SQL query that looks like this:
SELECT * FROM my_table WHERE * LIKE '%my_search%'
is that possible?
edit:
I don't care about query performance because it's a feature of an admin panel that is used about once a month. I also don't know which columns the table has, because it's so abstract. Sure, I could use some Rails ActiveRecord functions to find all the columns, but I hoped to avoid adding that logic and to just use *. It's going to be a gem, and I can't know which DB it will be used with. Maybe there is a handy Rails function that helps me out here.
As I understand the question, you are trying to build a SQL statement that checks a condition across all columns of the table. A dirty hack, but this generates the required SQL:
condition_string = MyTable.column_names.map { |c| "#{c} LIKE ?" }.join(' OR ')
MyTable.all(:conditions => [condition_string, *(['%my_search%'] * MyTable.column_names.size)])
Each ? placeholder needs its own bound value, one per column. This is not tested, but it should work.
* LIKE '...' isn't valid according to the SQL standard and isn't supported by any RDBMS I'm aware of. You could use a function like CONCAT to build the left-hand argument of LIKE, though performance won't be good. As for SELECT *, it's generally something to be avoided.
No, SQL does not support that syntax.
To search all columns you need to use procedures or dynamic SQL. Here's another SO question which may help:
SQL: search for a string in every varchar column in a database
EDIT: Sorry, the question I linked to is looking for a field name, not the data, but it might help you write some dynamic SQL to build the query you need.
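As a sketch of the dynamic-SQL idea, here is the two-step approach against SQLite: discover the column names first (PRAGMA table_info here; information_schema or syscolumns in other engines), then build one LIKE condition per column. The table and data are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (a TEXT, b TEXT, c TEXT)")
conn.execute("INSERT INTO my_table VALUES ('x', 'a my_search hit', 'y')")
conn.execute("INSERT INTO my_table VALUES ('x', 'y', 'z')")

# Step 1: discover the column names.
cols = [info[1] for info in conn.execute("PRAGMA table_info(my_table)")]

# Step 2: one LIKE condition per column, with one bound value each.
where = " OR ".join(f"{col} LIKE ?" for col in cols)
params = ["%my_search%"] * len(cols)

rows = conn.execute(f"SELECT * FROM my_table WHERE {where}", params).fetchall()
print(len(rows))  # 1
```

Only the column names are interpolated into the SQL string; the search value itself stays in a bound parameter, which keeps the dynamic query safe from injection through the search term.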
You didn't say which database you are using, as there might be a vendor specific solution.
It's only an idea, but I think it's worth testing!
Depending on your DB you can get all columns of a table. In MSSQL, for example, you can use something like:
select name from syscolumns where id = object_id('Tablename')
Under Oracle I guess it's:
select column_name from USER_TAB_COLUMNS where TABLE_NAME = 'Tablename'
Then you would go through these columns using a procedure, and maybe a cursor, so you can check for each column whether the data you are searching for is in there:
if ((select count(*) from Tablename where Colname = 'searchingdata') > 0)
then keep the results in a separate table (ColnameWhereFound, RecNrWhereFound).
Data types may be an issue if you try to compare strings with numbers, but note that under SQL Server the syscolumns table contains a column called "usertype" holding a number that refers to the data type stored in the column; I guess Oracle has something similar.
Hope this helps.

How to search for obtaining the "best" result

Suppose someone enters this search (in a form):
Nicole Kidman films
Which SQL can I use to find the "best" results?
I suppose something like this:
SELECT * FROM myTable WHERE ( Field='%Nicole Kidman Films%' OR Field='%Nicole%' OR Field='%Kidman%' OR Field='%Films%' )
My question is: how do I get the most relevant result?
Thank you very much!
Full-Text Search:
SELECT * FROM myTable WHERE MATCH(Field) AGAINST('Nicole Kidman Films')
This query will return rows in order of relevancy, as defined by the full-text search algorithm.
Note: This is for MySQL, other DBMS have similar functionality.
What you're looking for is often called "full-text search" or "natural-language search". Unfortunately it's not standard SQL. Here's a tutorial on how to do it in MySQL: http://devzone.zend.com/article/1304
You should be able to find examples for other database engines.
In SQL, the equals sign doesn't match wildcards - your query should really be:
SELECT *
FROM myTable
WHERE Field LIKE '%Nicole Kidman Films%'
OR Field LIKE '%Nicole%'
OR Field LIKE '%Kidman%'
OR Field LIKE '%Films%'
But wildcarding the left side won't use an index, if one exists.
A better approach is to use full-text searching, which most databases provide natively; there are also third-party engines like Sphinx. Each has its own algorithm to assign a rank/score to the search criteria, in order to display what it deems most relevant.
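A minimal relevance-ranked example, sketched with SQLite's FTS5 in place of MySQL's MATCH ... AGAINST (this assumes an FTS5-enabled SQLite build, which modern builds include; bm25() supplies the score, lower meaning more relevant, and the sample documents are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(field)")
conn.executemany("INSERT INTO docs VALUES (?)", [
    ("Nicole Kidman films and awards",),
    ("Films released in 2008",),
    ("Cooking recipes",),
])

# OR the terms together; ORDER BY bm25() puts the best match first.
rows = conn.execute(
    "SELECT field FROM docs WHERE docs MATCH ? ORDER BY bm25(docs)",
    ("Nicole OR Kidman OR films",),
).fetchall()
print(rows[0][0])  # the row matching all three terms ranks first
```

Note how the non-matching row is filtered out entirely, and the row matching all three terms outranks the row matching only one, which is exactly the behavior the OR-of-LIKEs query cannot provide.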