Pentaho error (Insert / update ETL Out.0 - Incorrect integer value: 'N' for column 'test_column' at row 1) - pentaho

I have a transformation that flattens all the data into one JSON field.
There is another column in my table, test_column, defined as tinyint(1) DEFAULT '0'.
Under the Insert / update step, it shows this error:
Insert / update ETL Out.0 - Incorrect integer value: 'N' for column 'test_column' at row 1
I need help with this, as the error prevents the data from being inserted into my table, and so far I can't find any related issues on Google.

tinyint(1) can hold a 1-byte integer, and 'N' is not a valid integer value.
Check and replace the value you assign to test_column.
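For illustration, here is a minimal MySQL sketch of the mismatch and one way around it (my_table, staging_src, and flag are made-up names, not from the question); in strict mode the string insert fails just like the Pentaho step does:
-- tinyint(1) stores a small integer, so a flag character is rejected:
INSERT INTO my_table (test_column) VALUES (1);    -- fine
INSERT INTO my_table (test_column) VALUES ('N');  -- Incorrect integer value: 'N'
-- One fix is to map the flag to 0/1 before it reaches the target column:
INSERT INTO my_table (test_column)
SELECT CASE src.flag WHEN 'Y' THEN 1 ELSE 0 END
FROM staging_src AS src;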

I just found the solution to my problem: I needed to use the Transform Columns step.

Related

Adding “null” by default using derived transformation

I have a SQL table whose column data type is int, and it doesn't accept null values.
So what I want to do is add “null” using a derived column.
What I did was use a derived column transformation, add a new column, and use the expression (DT_WSTR,10) “null”.
I then used a data conversion and changed the data type to DT_I4, but the data conversion fails upon execution.
Is there any other way to do this?
You can't do what you're trying to do. An INTEGER NOT NULL column will throw an error if you try to insert a text value into it, as you've seen.
There are really only two options.
Insert a zero for any NULL values that come through.
Insert a dummy value that's out of the range of values for the column, such as 999999 or the minimum or maximum values for an integer data type.
Or, of course, as Gordon suggested in the comments, drop the NOT NULL constraint on the column and insert the NULL values.
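A minimal T-SQL sketch of the first option, replacing NULLs with zero during the load (the table and column names here are illustrative):
-- Replace NULLs with 0 so the INT NOT NULL target accepts the rows:
INSERT INTO target_table (int_col)
SELECT ISNULL(src.int_col, 0)
FROM source_table AS src;
-- COALESCE(src.int_col, 0) is the portable equivalent.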

Incorrect value inserted into SQL server table when inserted value is greater than column length

Situation:
I have an existing legacy table [dbo].[Values] which sits on SQL Server 2017.
In this table I have 3 columns.
int tableid (Primary Key)
char (8) code
char (7) description
Code and description are both custom data types, but they are just char(8) and char(7) with no additional logic.
Action
If I insert into this table a value for the code column that is longer than 8 characters, I get * inserted into that column.
No error or warning is given.
I have looked at triggers, constraints, policies, the table's creation script, and the custom data types. I cannot find anywhere any logic that says: if truncated, set value = *.
Question
What part of SQL Server would modify values before they are saved into the table?
You would see this if you are also using the incorrect data type and getting an implicit cast from int.
CREATE TABLE #T(C CHAR(8));
INSERT INTO #T VALUES (1111111111);  -- int literal, implicitly converted to CHAR(8)
SELECT *
FROM #T;
-- the column now contains '*'
This is documented behaviour here.
Solution: use a string.
INSERT INTO #T VALUES ('1111111111');
/*String or binary data would be truncated.*/
This is legacy behaviour that is unlikely to change.
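As a quick follow-up check (using the table and column names from the question), any rows already hit by this can be found with:
SELECT *
FROM [dbo].[Values]
WHERE code = '*';  -- trailing CHAR(8) padding is ignored in the comparison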

postgresql INSERTs NULL values from SELECT COS(another field) query

If I run
SELECT
(cos(radians(spa.spa_zenithangle)))
FROM generic.spa;
I get a sensible range of results from -1 to 1. But if I run this insert statement, all the resulting values in the spa.spa_cos_zenith field are NULLs:
INSERT INTO generic.spa
(spa_cos_zenith)
SELECT
(cos(radians(spa.spa_zenithangle)))
FROM generic.spa;
The table definition is:
CREATE TABLE generic.spa (
spaid INTEGER DEFAULT nextval('generic.spa_id_seq'::regclass) NOT NULL,
measurementdatetime TIMESTAMP WITHOUT TIME ZONE,
spa_zenithangle NUMERIC(7,3),
spa_cos_zenith DOUBLE PRECISION,
CONSTRAINT spa_pk PRIMARY KEY(spaid)
)
WITH (oids = false);
Does anyone know why the COS function returns results fine, but they can't be inserted into another field?
I suspect you want update, not insert:
UPDATE generic.spa
SET spa_cos_zenith = cos(radians(spa.spa_zenithangle));
INSERT inserts new rows, so you are duplicating the rows. The only column populated in the new rows is the COS() value. Nothing changes in the old rows.
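To make that concrete, here is a minimal sketch with a toy two-column table (the names are illustrative, not the question's schema):
CREATE TABLE demo (angle numeric, cos_angle double precision);
INSERT INTO demo (angle) VALUES (0), (60);
-- Adds two NEW rows in which only cos_angle is set (their angle is NULL),
-- while the two original rows keep cos_angle = NULL:
INSERT INTO demo (cos_angle)
SELECT cos(radians(angle)) FROM demo;
-- Fills cos_angle in the existing rows instead:
UPDATE demo SET cos_angle = cos(radians(angle));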

Insertion SQL and NOT NULL values

I've created a table schema and specified that, for some attributes, values cannot be null. For one column of this table, values are to be imported from a column of some other table, but the problem I am facing is that when I use an insert statement to copy values from that column of the other table into the column of this newly created table, the other attributes complain, because they have a constraint on them saying their values cannot be NULL on insertion!
How do I cope with this?
One solution is that, just for the time being, I could state that null values can be accommodated for the other attributes, so that I can successfully import values from the column of the other table, and then later put the condition back on the remaining attributes that values are not to be NULL. But how do I do this?
You need to convert NULL to some DEFAULT value while importing.
I am not sure which DB engine you are using; in MySQL:
Use something like IFNULL(column_name, "").
Reference
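A minimal sketch of that idea in MySQL (the table and column names are illustrative); for a numeric NOT NULL target, a numeric fallback such as 0 is usually a better fit than an empty string:
-- Replace NULLs with a default while copying between tables:
INSERT INTO new_table (some_col)
SELECT IFNULL(old_table.some_col, 0)
FROM old_table;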
You may simply be looking for the default clause. When you define a column, you can specify:
intcol int not null default 0
If the column is not specified for an insert, then it will default to 0. In some databases, if a NULL value is supplied, it will also get the default value.
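For example, a minimal sketch (the table and the second column are illustrative):
CREATE TABLE t (intcol int NOT NULL DEFAULT 0, name varchar(40));
-- intcol is omitted from the insert, so the new row gets intcol = 0:
INSERT INTO t (name) VALUES ('abc');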

SQL Not Empty instead of Not NULL

I am using PostgreSQL. I have a column that is:
NOT NULL
However, when I want to insert a row with an empty string like:
''
it doesn't give me an error and accepts it. How can I require that the inserted value is not empty (neither empty nor null)?
PS: My column defined as:
"ads" character varying(60) NOT NULL
Add a constraint to the column definition. For example, something like:
ads character varying(60) NOT NULL CHECK (ads <> '')
For more, see http://www.postgresql.org/docs/current/static/ddl-constraints.html
As found in the current PostgreSQL documentation, you can do the following to achieve what you want:
CREATE TABLE distributors (
did integer PRIMARY KEY DEFAULT nextval('serial'),
name varchar(40) NOT NULL CHECK (name <> '')
);
From the documentation:
CHECK ( expression )
The CHECK clause specifies an expression producing a Boolean result
which new or updated rows must satisfy for an insert or update
operation to succeed. Expressions evaluating to TRUE or UNKNOWN
succeed. Should any row of an insert or update operation produce a
FALSE result an error exception is raised and the insert or update
does not alter the database. A check constraint specified as a column
constraint should reference that column's value only, while an
expression appearing in a table constraint may reference multiple
columns.
Currently, CHECK expressions cannot contain subqueries nor refer to variables other than columns of the current row.
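As a quick usage check against the distributors example above, both of these inserts would now be rejected:
INSERT INTO distributors (did, name) VALUES (1, '');    -- violates the CHECK constraint
INSERT INTO distributors (did, name) VALUES (2, NULL);  -- violates NOT NULL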