I've got an SQLite DB where the data looks purely numeric (integers) but the column is typed TEXT. I would like to retype it as INTEGER if possible.
What query can check if every cell of a certain column can be successfully cast to INT?
SELECT * FROM table WHERE INT(column) != NULL
Alternatively, I would like to check that the cells are numeric (don't contain any letters/symbols):
SELECT * FROM table WHERE column NOT LIKE "%a-z%"
As a side note, I wanted to do this to reduce the size of the DB, but since Sqlite uses dynamic typing (per cell typing) would this have ANY effect on the size of the DB?
You have to check whether all values can be converted into an integer:
SELECT *
FROM MyTable
WHERE CAST(MyColumn AS INTEGER) IS NOT MyColumn;
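If that query returns no rows, every value survives the round trip through CAST and the column can be retyped. SQLite has no ALTER COLUMN, so the usual route is to rebuild the table; a minimal sketch, assuming a single-column table named MyTable:
-- rebuild: SQLite cannot change a column's declared type in place
CREATE TABLE MyTable_new (MyColumn INTEGER);
INSERT INTO MyTable_new SELECT CAST(MyColumn AS INTEGER) FROM MyTable;
DROP TABLE MyTable;
ALTER TABLE MyTable_new RENAME TO MyTable;
As for the size question: SQLite stores integers in 1 to 8 bytes depending on magnitude, whereas TEXT needs one byte per digit, so the conversion can indeed shrink the database (run VACUUM afterwards to reclaim the space).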
I have a table with one column of type text which contains a JSON string. I would like to run a query that selects a bunch of rows (around 50) and, for each of these rows, updates a single value in the JSON saved in the text field. So let's say it currently looks like this:
{"amount":"45","level":1}
I want to update the amount value for every one of these rows, to, for example, "level" * 5.
I can't figure out a way to do this in one query, since it does not seem possible to alter a single value inside a text field like this. Or am I missing something? Otherwise I will just have to alter it manually for every single row I need to change, which would be a pain.
You need to first cast the value to a proper jsonb value, then you can manipulate it using JSON functions.
update the_table
set the_column = jsonb_set(the_column::jsonb, '{amount}', to_jsonb(((the_column::jsonb ->> 'level')::int * 5)::text))::text
where ....
The expression (the_column::jsonb ->> 'level')::int * 5 extracts the current value of level, converts it into an integer, and multiplies it by 5. The to_jsonb() around it is necessary because jsonb_set() requires a jsonb value as its new-value parameter; the inner ::text cast keeps amount a JSON string, matching the existing data.
The '{amount}' parameter tells jsonb_set() to put the new value (see above) into the (top-level) key amount.
And finally the whole jsonb value is converted back to a text value.
If you really store JSON in that column, you should think about converting that column to the jsonb data type to avoid all that casting back and forth.
Or maybe think about a properly normalized model, where this would be as simple as set amount = level * 5.
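If you do convert the column, it is a one-time change; a minimal sketch, assuming the table and column names used above:
-- convert the text column to jsonb in place (PostgreSQL)
alter table the_table
    alter column the_column type jsonb
    using the_column::jsonb;
After that, the jsonb_set() call needs none of the ::jsonb / ::text casts.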
I have a subquery that returns a varchar column. In some cases this column contains only numeric values, and in those cases I need to cast it to bigint. I've tried to use a CAST(CASE ...) construction, but CASE is an expression that returns a single result and, regardless of the path taken, it always needs to result in the same data type (or one implicitly convertible to it). Is there any tricky way to change a column's data type depending on a condition in PostgreSQL, or not? Google can't help me.
SELECT
    prefix,
    module,
    postfix,
    id,
    created_date
FROM
    (SELECT
         s."prefix",
         coalesce(m."replica", to_char(CAST((m."id_type" * 10 ^ 12) AS bigint) + m."id", 'FM0000000000000000')) "module",
         s."postfix",
         s."id",
         s."created_date"
     FROM some_subquery
There is really no way to do what you want.
A SQL query returns a fixed set of columns, with the names and types being fixed. So, a priori what you want to do does not fit well within SQL.
You could work around this by inventing your own type that is either a big integer or a string, or by storing the value as JSON. But those are workarounds. The SQL query itself really returns one "type" for each column; that is how SQL works.
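What you can do is return a second column that is bigint where the text is numeric and NULL elsewhere; a minimal sketch, assuming the module column from the query above (module_as_bigint is a name invented for this example):
SELECT module,
       CASE WHEN module ~ '^[0-9]+$'   -- only digits?
            THEN module::bigint        -- safe to cast
       END AS module_as_bigint         -- missing ELSE yields NULL
FROM ...
Both CASE branches then share one type (bigint; the implicit ELSE is NULL), which satisfies the rule quoted in the question.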
I have a table and a view. The table has two columns, of data types nvarchar and money. I have been populating the table by selecting from the view, like below.
Insert into MyTable
Select * from MyView
Recently, this insert fails with the error "String or binary data would be truncated." However, when I modified my select statement to something like
Select * from Myview WHERE Column is not null
OR
Select * from Myview WHERE Column > 0
The above work, with a warning saying Warning: Null value is eliminated by an aggregate or other SET operation. It occurred to me that maybe one of the null-value records contains something that's not null. My table column is of money type and accepts null. I presumed the error may be due to something that's not of the money data type. The number of records is huge. Is there any way I can filter and return those alien records?
I also learnt that I can eliminate the error by turning the ANSI_WARNINGS setting ON and OFF. My concern is, wouldn't that result in loss of data? Any help would be appreciated.
String or binary data would be truncated happens because some of the data coming from MyView is larger than the column size in MyTable.
Use
Select Max(Len(FieldName)) From MyView
to check the maximum length of the nvarchar data the view actually delivers, and compare it with the column size in MyTable.
Or you can use Left to truncate when inserting the data, something like this:
Insert into MyTable
Select Left(FieldName,50), Column1 from MyView
Note the 50 should be the size of the nvarchar field in MyTable
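To find the offending rows instead of silently truncating them, filter the view on length; a sketch, assuming the target column is nvarchar(50):
-- rows from the view that will not fit into MyTable
Select *
From MyView
Where Len(FieldName) > 50
These are exactly the records that trigger the truncation error.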
String or binary data would be truncated is a very common error. It usually happens when we try to insert data into a string (varchar, nvarchar, char, nchar) column that is longer than the size of the column. So you need to check the data size with respect to the column width, identify which column is creating the problem, and fix it.
Here is another thread of the same problem as yours in stackoverflow.
string or binary data would be truncated
Hope this will help.
It looks like the data in some column of MyView exceeds the size limit of the corresponding column in MyTable.
I have an imported database which contains lots of float fields.
When I imported it, all the float fields were converted to strings, and I need to convert them back.
I tried ALTER TABLE to change the data type, but I keep getting an error (even when the rows are empty or null).
EDIT: The server is currently down, so I can't get the error message at the moment. My DBMS is SQL Server.
You can do so if all the data really is numeric.
There may be some values which are NULL -- no problem there.
There may be some values which are blank ('') but not NULL -- this is what causes the problem.
First, run a select query to retrieve all the rows which are blank, say:
Select * from table1 where col=''
Now update these rows with NULL:
update table1 set col=null where col=''
After this, I think you can easily convert the column to float.
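Putting it together, a sketch for SQL Server with the table1/col names used above:
-- blank out empty strings, then retype the column
UPDATE table1 SET col = NULL WHERE col = '';
ALTER TABLE table1 ALTER COLUMN col FLOAT;
If the ALTER still fails, some other non-numeric values remain; SELECT * FROM table1 WHERE ISNUMERIC(col) = 0 AND col IS NOT NULL is a quick (if imperfect) way to list them.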
References: error converting data type varchar column to float.
You can first add a new field to your table with data type float and then update it using the convert function:
UPDATE #tbl
SET num = CONVERT(FLOAT, text)

SELECT *
FROM #tbl
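The "add a new field" step itself would look like this (same hypothetical #tbl/num/text names):
-- add the target float column first
ALTER TABLE #tbl ADD num FLOAT
On SQL Server 2012 and later, TRY_CONVERT(FLOAT, text) is a safer alternative to CONVERT here: it returns NULL instead of raising an error for values that cannot be converted.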
I have a table with an ID field of INT Type.
I am doing some data validation and noticed that
SELECT * from mytable where id=94
and
SELECT * from mytable where id='94'
works perfectly, but if I use
SELECT * from mytable where id='94dssdfdfgsdfg2'
it gives me the same result! How is this possible?
It would be possible if the internal implementation of MySQL's string-to-int conversion dropped all characters from the string after the first non-numeric one when parsing it, which is exactly what MySQL does.
What you've witnessed is called "implicit data conversion".
Implicit, the opposite of "explicit", means that the data type is automatically converted to the data type of the column being compared, when possible. In this case, MYTABLE.id is an INTeger data type, so MySQL converts the single-quoted value (a string data type in SQL) to an INT.
Because of the conversion, the string is parsed from the leftmost position and cut off at the first non-numeric character, so '94dssdfdfgsdfg2' becomes 94 and matches id = 94.
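You can watch the conversion happen; a quick sketch in MySQL:
SELECT CAST('94dssdfdfgsdfg2' AS SIGNED);  -- 94, with a truncation warning
SELECT '94dssdfdfgsdfg2' + 0;              -- also 94: + forces the same implicit conversion
If you want the comparison to be strict, compare the INT column to an integer literal rather than a quoted string.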