Problem comparing data in MariaDB using a SELECT with a WHERE clause - sql

I don't quite know how to explain this, but I'll try. In a table in my database I have a record with many fields.
The username field, for example, contains the value 'any-user-test', but if I run a SELECT with the condition username = 'any-user-test' in the WHERE clause, the result does not contain the record.
However, if I compare using username LIKE '%any-user-test', the record is returned.
As further proof, using:
WHERE CONVERT(username USING ASCII) = 'any-user-test'
the record is returned too.
The database is MariaDB on an Ubuntu server, using encryption and CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci.
Any idea how to identify the problem?
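Since LIKE '%any-user-test' matches but plain equality does not, the stored value most likely has invisible leading characters (e.g. a UTF-8 BOM or a zero-width space). A diagnostic sketch, assuming the table is called `users`:

```sql
-- Inspect the raw bytes of the stored value; any prefix beyond the
-- expected text (e.g. EFBBBF for a UTF-8 BOM) reveals hidden characters.
-- The table name `users` is an assumption for illustration.
SELECT username,
       HEX(username)         AS raw_bytes,
       CHAR_LENGTH(username) AS chars,
       LENGTH(username)      AS bytes
FROM users
WHERE username LIKE '%any-user-test';
```

If `bytes` is larger than `chars` times the expected byte width, or the hex dump starts with unexpected bytes, the value was stored with hidden characters and the fix is to clean the data on insert.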

Related

Get rid of just the UPPER form of a word in SQL

I want to get rid of a specific word that exists in both an UPPER-case and a Capitalized form, e.g. CONDIMENTS and Condiments. Both forms exist, and I want only the first (i.e. "CONDIMENTS") to be deleted. If I use CASE WHEN X IN ('CONDIMENTS') THEN NULL, both values disappear. Any ideas?
The RDBMS is Sybase SQL Anywhere.
Essentially I want to make my IN operator case-sensitive. Thanks in advance!
I can't remember it clearly -- I used ASA 12 about 3 years ago and hit a similar issue. Here's what I can recall:
An ASA instance can't change its case-sensitivity after creation -- there is a "CASE {RESPECT|IGNORE}" clause only in the CREATE DATABASE command, and once the database has been created this option can't be changed; otherwise the indexes, especially those over string columns, would be corrupted.
So you can only check this database property with:
SELECT db_property('CaseSensitive'); -- if it returns 'Off', your database was created case-insensitive.
If you really need the database to be case-sensitive, you have to recreate the ASA database and export/import the data from the original one...
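If recreating the database isn't an option, one per-query workaround is to force a byte-wise comparison by casting to a binary type, which ignores the collation. This is only a sketch -- the table and column names are placeholders, and whether CAST to BINARY behaves this way (including padding) should be verified on your ASA version:

```sql
-- Case-sensitive match by comparing raw bytes instead of collated strings.
-- Table name `products` and column `category` are assumptions for illustration.
DELETE FROM products
WHERE CAST(category AS BINARY(128)) = CAST('CONDIMENTS' AS BINARY(128));
```

Note that binary comparisons can be slow on large tables, since they typically can't use an index built with the case-insensitive collation.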

PostgreSQL database considering the script as ANSI instead of UTF8

I executed a script that updates a column in a database, and it worked well.
The script contains an update statement like the one below. It updates display_name to a value containing an apostrophe (inverted comma).
UPDATE table1
SET display_name = 'I''m Kumar'
WHERE internal_name = 'IK';
When I executed the same script against another database, it updated the display name with a special character in place of the apostrophe. It seems the script is being interpreted as ANSI-encoded rather than UTF-8.
Please help me understand why this is happening. Is there any setting at the database level to change?
Yes, and that setting is client_encoding.
The default value is specified in the server configuration, and the client has to override it if desired:
SET client_encoding = 'UTF8';
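To see what your session is currently using before overriding it, a minimal sketch (run in psql or any client session):

```sql
-- Show the encoding the client session currently assumes for incoming text.
SHOW client_encoding;

-- Override it for this session so literals in the script are read as UTF-8.
SET client_encoding = 'UTF8';

-- The server-side default comes from the database itself and cannot be
-- changed after the database is created.
SHOW server_encoding;
```

If the two databases show different client_encoding defaults, that explains why the same script behaved differently in each.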

How to make SQL server return only the exact matches?

I am using an MS SQL database with Sequelize for my Node.js application. I want only exact matches for my query string, but I am also getting the same strings in different cases.
For example, I want to fetch the record where some column = 'work'. If I pass 'WoRK' or something like that, it should return nothing, but it still matches 'work' in the database.
How can I change that? Can anyone help me out?
SQL Server's string comparisons are case-insensitive by default (unless the column is defined with a case-sensitive collation).
You can handle this in the query by applying a collation to the equality, for example:
WHERE some_column = 'work' COLLATE Latin1_General_CS_AS
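If you need a strict byte-for-byte match (also distinguishing accented characters and other variants that CS_AS still treats as equal), a binary collation can be used instead. A sketch with placeholder table and column names:

```sql
-- BIN2 compares Unicode code points directly, giving the strictest match.
-- `my_table` and `some_column` are assumptions for illustration.
SELECT *
FROM my_table
WHERE some_column = 'work' COLLATE Latin1_General_BIN2;
```

Either variant can be issued from Sequelize via a literal in the where clause, since the COLLATE clause is applied per-expression and does not change the column definition.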

SSMS 2014 - DB Collation for unicode/ multiple languages

In SSMS 2014, I have a DB with collation set to Chinese_Simplified_Stroke_Order_100_CI_AI, in which I create a table for a regular process with about 50 columns.
It basically contains sales data for some products. A few of these columns contain integer values and others contain English text.
Two of these columns, however, contain Chinese text. A sample script that I use to update one of them is below:
ALTER TABLE table_xyz
ALTER COLUMN comments NVARCHAR(MAX)
COLLATE Chinese_Simplified_Stroke_Order_100_CI_AI;

UPDATE table_xyz
SET comments =
    CONCAT('以下的邮件是专为您的最终客户所准备。', Account_Name);
This has been working fine till now.
This has worked fine until now. I have a similar table in a second DB with collation set to Japanese_CI_AS_KS_WS, and accordingly the comments in this table are in Japanese. A sample update statement is below:
ALTER TABLE table_abc
ALTER COLUMN comments NVARCHAR(MAX) COLLATE Japanese_CI_AS_KS_WS;

UPDATE table_abc
SET comments =
    CONCAT('次の電子メールは顧客のためのテンプレートです', Account_Name);
Now I have been tasked with transferring these tables to an existing DB whose default collation is SQL_Latin1_General_CP1_CI_AS. The problem is that whenever I run the above updates in this new DB, all I get in the output is '???????'.
I have tried searching for solutions and have observed the below:
Many suggestions include converting the datatype to unicode.
A few people at my workplace suggested changing the collation of the column.
Use UTF-8 as default character set
As far as I know, the first two are already taken care of when I run the ALTER TABLE statement. The third point applies to MySQL, not SQL Server.
Also, if I import the tables from their respective DBs directly, along with the data, the column values are displayed correctly (in Chinese and Japanese text). However, when I truncate the tables and try to load the data, I face the problem. Importing this way is not viable anyway, since the end objective is to keep all tables in a single DB and purge the remaining DBs.
I hope I've made the problem statement clear.
You need to prefix string literals containing Unicode characters with N.
The code in your first example should look like this:
CONCAT(N'以下的邮件是专为您的最终客户所准备。', Account_Name)
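Without the N prefix, the literal is a varchar and is converted through the database's default code page before the assignment; characters that don't exist in that code page (here, Latin1) become '?', which is exactly the symptom you're seeing. A quick illustration that can be run against any SQL Server database with a Latin1 default collation:

```sql
-- varchar literal: characters outside the DB's code page are lost as '?'.
-- nvarchar literal (N prefix): characters are preserved as Unicode.
SELECT '以下的邮件'  AS without_n,
       N'以下的邮件' AS with_n;
```

This also means the column's collation was never the issue: the damage happens to the literal itself, before it ever reaches the NVARCHAR column.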

SQL PWDENCRYPT & PWDCOMPARE 2008 vs 2012 Return Different Results [duplicate]

I have a web application done in ASP.NET MVC 4. It has users, that are stored in SQL Server database in tables webpages_UserProfile and webpages_Membership, etc.
I have another application, and what I need to do is to query the table webpages_Membership, where password of users are stored encrypted, and compare them to a plain text password.
So I tried doing something like
SELECT *
FROM webpages_Membership
WHERE PWDCOMPARE('mypasswordsend', Password) = 1
But it doesn't work. I know the column is an nvarchar(128).
How can I compare it?
Let's look at the second argument to PwdCompare (emphasis mine):
password_hash
Is the encryption hash of a password. password_hash is *varbinary(128)*.
So, if your column is storing the password in plain text, or is storing a string representation of the binary hash, it's not going to work. You should either change the column to the correct type, or you will need to convert it first. For example, consider this script:
SELECT PWDENCRYPT(N'mypassword');
Yields:
0x0200D422C0365A196E308777C96CBEF3854818601DDB516CADA98DBDF6A5F23922DC0FADD29B806121EA1A26AED86F57FCCB4DDF98F0EFBF44CA6BA864E9E58A818785FDDEDF
If we try to compare that value as a string, we get 0:
SELECT PWDCOMPARE(N'mypassword', N'0x0200D422C0365A196E308777C96CBEF3854818601DDB516CADA98DBDF6A5F23922DC0FADD29B806121EA1A26AED86F57FCCB4DDF98F0EFBF44CA6BA864E9E58A818785FDDEDF');
If we try to compare it as a varbinary value, we get 1:
SELECT PWDCOMPARE(N'mypassword', 0x0200D422C0365A196E308777C96CBEF3854818601DDB516CADA98DBDF6A5F23922DC0FADD29B806121EA1A26AED86F57FCCB4DDF98F0EFBF44CA6BA864E9E58A818785FDDEDF);
If you can't fix the table, then you can perform this expensive explicit conversion in your query every time (note that the trailing style argument 1 is important -- it tells CONVERT to parse the '0x...' string as hex):
SELECT PWDCOMPARE(N'mypassword',
    CONVERT(VARBINARY(128),
        N'0x0200D422C0365A196E308777C96CBEF3854818601DDB516CADA98DBDF6A5F23922DC0FADD29B806121EA1A26AED86F57FCCB4DDF98F0EFBF44CA6BA864E9E58A818785FDDEDF',
        1));
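Applied to the original query, the same conversion would look like this -- a sketch assuming the Password column holds the hash as an N'0x...' string, which is what the symptoms suggest:

```sql
SELECT *
FROM webpages_Membership
-- Convert the stored hex string back to varbinary before comparing;
-- style 1 makes CONVERT interpret the '0x...' prefix as hex.
WHERE PWDCOMPARE(N'mypasswordsend',
                 CONVERT(VARBINARY(128), Password, 1)) = 1;
```

Since the conversion is applied to the column rather than a constant, SQL Server has to evaluate it for every row, so fixing the column type remains the better long-term option.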