Boolean type with PostgreSQL in Yii - yii

Simple Postgres table:
CREATE TABLE public.test (
id INTEGER NOT NULL,
val BOOLEAN NOT NULL,
CONSTRAINT test_pkey PRIMARY KEY(id)
);
Do this:
Yii::app()->db->createCommand()->insert('test', array(
'id'=>1,
'val'=>true,
));
Everything works fine:
Executing SQL: INSERT INTO "test" ("id", "val") VALUES (:id, :val). Bound with :id=1, :val=true
But when I do this:
Yii::app()->db->createCommand()->insert('test', array(
'id'=>1,
'val'=>false,
));
I'm getting the error:
SQLSTATE[22P02]: Invalid text representation: 7 ERROR: invalid input syntax for type boolean: ""
LINE 1: INSERT INTO "test" ("id", "val") VALUES ('1', '')
^. The SQL statement executed was: INSERT INTO "test" ("id", "val") VALUES (:id, :val). Bound with :id=1, :val=false
What am I doing wrong?

In your 'db' component, set the emulatePrepare property to false (or leave it unset). Setting it to true is what usually triggers this error: with emulation enabled, the PHP boolean false is interpolated as an empty string, which is exactly what the failing statement shows.
Postgres supports prepared statements natively, so there is no need to emulate them.
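For reference, PostgreSQL only accepts a fixed set of textual boolean representations, and the empty string that the emulated prepare sends for false is not one of them. A quick illustration you can run in psql:
-- these all parse as valid booleans
SELECT 'true'::boolean, 'false'::boolean, 't'::boolean, 'f'::boolean, '1'::boolean, '0'::boolean;
-- this reproduces the error from the question
SELECT ''::boolean; -- ERROR: invalid input syntax for type boolean: ""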

Related

PostgreSQL - how to insert into array of composite type

CREATE TYPE myenum AS ENUM ('title', 'link', 'text');
CREATE TYPE struct AS (
type myenum,
content varchar(300)
);
CREATE TABLE "mytable" (
id serial PRIMARY KEY,
array struct[]
);
INSERT INTO "mytable" VALUES (
DEFAULT,
'{"(\"title\",\"my title\")","(\"link\",\"www.google.com\")"}'
);
INSERT INTO "mytable" VALUES (
DEFAULT,
ARRAY['(\"title\",\"my title\")', '(\"link\",\"www.google.com\")']
);
INSERT INTO "mytable" VALUES (
DEFAULT,
ARRAY[('title','my title'), ('link','www.google.com')]
);
I want to insert some data. I have tried many forms, but none of them inserts successfully,
and these are the error messages:
error: malformed array literal: "{"("title","my title")","("link","www.google.com")"}"
error: malformed array literal: "{("title","my title"), ("link","www.google.com")}"
error: column "placeholder_array" is of type placeholder_struct[] but expression is of type text[]
error: column "placeholder_array" is of type placeholder_struct[] but expression is of type record[]
I need help!
This is so hard and messy for me, thank you very much!
1) Don't call your column array; that's a keyword.
2) Use ::struct to cast your tuples to that datatype.
CREATE TYPE myenum AS ENUM ('title', 'link', 'text');
CREATE TYPE struct AS (
type myenum,
content varchar(300)
);
CREATE TABLE mytable (
id serial PRIMARY KEY,
val_array struct[]
);
INSERT INTO mytable VALUES (
DEFAULT,
ARRAY[('title','my title')::struct, ('link','www.google.com')::struct]
);
db<>fiddle demo
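For completeness, the same row can also be inserted with a plain array literal; since none of the field values here contain commas, parentheses, or quotes, no backslash escaping is needed. A sketch against the same struct/mytable definitions as above:
INSERT INTO mytable VALUES (
DEFAULT,
'{"(title,my title)","(link,www.google.com)"}'::struct[]
);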

Apache NiFi: failed INSERT statement caused by NOT NULL CONSTRAINT

I have a JSON
{
"username" : "ChokkiAST",
"login_date" : "2021-01-15",
"active": true
}
I want to INSERT this data into database table with structure:
CREATE TABLE web.accounts_activity (
username text NOT NULL,
login_date date NOT NULL,
active bool NULL DEFAULT false,
id SERIAL NOT NULL,
CONSTRAINT accounts_activity_username_key UNIQUE (username),
CONSTRAINT cabinet_account_pkey PRIMARY KEY (id)
);
My JSON doesn't contain an id field because the database should auto-generate its values (I am transferring this from Spring JPA code). I tried PutDatabaseRecord with an INSERT statement and got the error: CONSTRAINT ERROR: value NULL at column "id". I tried with an UPSERT statement - same error.
Also I tried to use PutSQL processor with following SQL script (values from attributes):
INSERT INTO web.accounts_activity (username, login_date, active, id)
VALUES (${username}, ${login.date}, ${is.active}, DEFAULT);
And I got the error:
ERROR: current transaction is aborted, commands ignored until end of transaction block
So, how exactly do I insert this id with Apache NiFi?
Remove id completely from the lists of columns and values. PostgreSQL will auto-generate a value; that is how serial works.
INSERT INTO web.accounts_activity (username, login_date, active)
VALUES (${username}, ${login.date}, ${is.active});
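If you want to verify outside NiFi that the serial column gets filled in, a quick manual check (with literal values standing in for the attribute placeholders) might look like this:
INSERT INTO web.accounts_activity (username, login_date, active)
VALUES ('ChokkiAST', '2021-01-15', true)
RETURNING id; -- the id comes from the serial column's sequence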

Migrating data from old table to new table Postgres with extra column

Old and new table structures: (shown as images in the original post; the new table adds created_by and created_date columns)
Query:
INSERT INTO hotel (id, name, hotel_type, active, parent_hotel_id)
SELECT id, name, hotel_type, active, parent_hotel_id
FROM dblink('demopostgres', 'SELECT id, name, hotel_type, active, parent_hotel_id FROM hotel')
AS data(id bigint, name character varying, hotel_type character varying, active boolean, parent_hotel_id bigint);
Following error occurs:
ERROR: null value in column "created_by" violates not-null constraint
DETAIL: Failing row contains (1, Test Hotel, THREE_STAR, t, null,
null, null, null, null, null). SQL state: 23502
I tried to insert the other required columns as well.
Note: created_by is jsonb.
created_by = '{
"id": 1,
"email": "tes#localhost",
"login": "test",
"lastName": "Test",
"firstName": "Test",
"displayName": "test"
}'
created_date = '2020-02-22 16:09:08.346'
How can I pass default values for created_by and created_date column while moving data from the old table?
There are several choices.
First, the INSERT is failing because the column is NOT NULL. You could use ALTER TABLE (https://www.postgresql.org/docs/12/sql-altertable.html) to drop that constraint for the import, update the columns with values, and then reinstate NOT NULL:
ALTER [ COLUMN ] column_name { SET | DROP } NOT NULL
Second, as @XraySensei said, you could add DEFAULT values to the columns using ALTER TABLE:
ALTER TABLE [ IF EXISTS ] [ ONLY ] name [ * ]
action [, ... ]
ALTER [ COLUMN ] column_name SET DEFAULT expression
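Applied to the table from the question, that could look roughly like this (assuming created_by is jsonb and created_date is a timestamp; the default shown for created_by is only a placeholder value):
ALTER TABLE hotel
ALTER COLUMN created_by SET DEFAULT '{"login": "migration"}'::jsonb, -- placeholder default
ALTER COLUMN created_date SET DEFAULT now();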
The third option is to embed the defaults into the query itself:
create table orig_test(id integer NOT NULL, fld_1 varchar, fld_2 integer NOT NULL);
insert into orig_test(id, fld_1, fld_2) values (1, 'test', 4);
insert into orig_test(id, fld_1, fld_2) values (2, 'test', 7);
-- assumed definition of default_test: like orig_test plus a NOT NULL date column fld_3
create table default_test(id integer NOT NULL, fld_1 varchar, fld_2 integer NOT NULL, fld_3 date NOT NULL);
insert into default_test (id, fld_1, fld_2) select id, fld_1, fld_2 from orig_test ;
ERROR: null value in column "fld_3" violates not-null constraint
DETAIL: Failing row contains (1, test, 4, null).
insert into default_test (id, fld_1, fld_2, fld_3) select id, fld_1, fld_2, '06/14/2020' AS fld_3 from orig_test ;
INSERT 0 2
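Applied to the migration query from the question, embedding the defaults could look like this (created_by is shortened from the example in the question, and the column types are assumed to be jsonb and timestamp):
INSERT INTO hotel (id, name, hotel_type, active, parent_hotel_id, created_by, created_date)
SELECT id, name, hotel_type, active, parent_hotel_id,
'{"id": 1, "login": "test", "displayName": "test"}'::jsonb AS created_by, -- shortened example value
'2020-02-22 16:09:08.346'::timestamp AS created_date
FROM dblink('demopostgres', 'SELECT id, name, hotel_type, active, parent_hotel_id FROM hotel')
AS data(id bigint, name character varying, hotel_type character varying, active boolean, parent_hotel_id bigint);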

SQL - Inserting into postgresql table produces error on semi-colon

I'm trying to insert some test data into a table to check the functionality of a web servlet. However, when using pgAdmin 4 to do the insert, I am running into an issue I'm not sure how to rectify. The last value (an image byte stream) should be null for this test data. Here is my insert statement:
INSERT INTO schema.tablename("Test Title", "Test Content", "OldWhovian", "2016-07-29 09:13:00", "1469808871694", "null");
I get back:
ERROR: syntax error at or near ";"
LINE 1: ...ldWhovian", "2016-07-29 09:13:00", "1469808871694", "null");
^
********** Error **********
ERROR: syntax error at or near ";"
SQL state: 42601
Character: 122
I've tried removing the semi-colon just for kicks, and it instead errors on the closing parenthesis. Is it an issue related to the null? I tried it without putting quotation marks around the null and I get the same error, but on the null instead of the semi-colon. Any help is appreciated; I am new to DBA/DBD-related activities.
Using PostgreSQL 9.6.
Your statement is missing the VALUES keyword and a column list, and the values are wrapped in double quotes, which PostgreSQL treats as identifiers rather than string literals. An INSERT statement has a first part where you specify which columns you want to insert into and a second part where you specify the values you want to insert.
INSERT INTO table_name (column1, column2) VALUES (value1, value2);
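For the statement from the question, a corrected version might look like this (the column names below are hypothetical, since the table definition is not shown):
-- hypothetical column names; replace them with the real ones from your table
INSERT INTO schema.tablename (title, content, author, created_at, external_id, image)
VALUES ('Test Title', 'Test Content', 'OldWhovian', '2016-07-29 09:13:00', '1469808871694', NULL);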
You can omit the column list only if you supply values for all columns in the second part. If you have a table with seven columns, you can omit the first part only if you supply seven values in the second.
INSERT INTO table_name VALUES (value1, value2, value3, ...);
Example:
drop table if exists my_table;
create table my_table (
id int not null,
username varchar(10) not null,
nickname varchar(10),
identification_number varchar(10),
created timestamptz default current_timestamp
);
INSERT INTO my_table (id, username) VALUES (1, 'user01');
This inserts into the columns id and username only. The created column has a default value specified, so when you do not supply a value the default is used instead. nickname and identification_number accept NULL values, so when no value is supplied, NULL is used.
INSERT INTO my_table VALUES (2, 'user02', NULL, NULL, current_timestamp);
That is the same as the previous insert, but here the first part is omitted, so you must supply values for all columns. If you did not, you would get an error.
If you want to insert multiple rows you can use several statements.
INSERT INTO my_table (id, username, identification_number) VALUES (3, 'user03', 'BD5678');
INSERT INTO my_table (id, username, created) VALUES (4, 'user04', '2016-07-30 09:26:57');
Or you can use the PostgreSQL multi-row VALUES shorthand for such inserts.
INSERT INTO my_table (id, username, nickname, identification_number) VALUES
(5, 'user05', 'fifth', 'SX59445'),
(6, 'user06', NULL, NULL),
(7, 'user07', NULL, 'AG1123');
At the beginning I wrote that you can omit the first part (where you specify the columns) only if you supply values for all columns in the second part. That is not completely true. In the special case where every column is nullable or has a DEFAULT value, you can omit both parts entirely and use DEFAULT VALUES:
create sequence my_seq start 101;
create table my_table2 (
id int not null default nextval('my_seq'),
username varchar(10) not null default 'default',
nickname varchar(10),
identification_number varchar(10),
created timestamptz default current_timestamp
);
INSERT INTO my_table2 DEFAULT VALUES;
INSERT INTO my_table2 DEFAULT VALUES;
INSERT INTO my_table2 DEFAULT VALUES;
Result:
101 default NULL NULL 2016-07-30 10:28:27.797+02
102 default NULL NULL 2016-07-30 10:28:27.797+02
103 default NULL NULL 2016-07-30 10:28:27.797+02
When you do not specify values, the defaults are used, or NULL where no default exists. In the example above the id column takes its default from the sequence, username has the default string 'default', nickname and identification_number are NULL, and created defaults to the current timestamp.
More information:
PostgreSQL INSERT

sql insert fails on postgresql database

I am trying to run the following insert statement using pgadmin3:
INSERT INTO device
VALUES
(12345,
'asdf',
'OY8YuDFLYdv',
'2',
'myname',
'2013-04-24 11:30:08',
Null,Null)
But I keep getting the following error message:
ERROR: invalid input syntax for integer: "asdf"
LINE 4: 'asdf',
^
********** Error **********
ERROR: invalid input syntax for integer: "asdf"
SQL state: 22P02
Character: 42
Here's the table definition:
CREATE TABLE device
(
device_id integer NOT NULL DEFAULT nextval('device_device_id_seq'::regclass),
userid integer NOT NULL,
description character varying(255),
password character varying(255) NOT NULL,
user_id integer NOT NULL,
createdname character varying(255),
createddatetime timestamp without time zone,
updatedname character varying(255),
updateddatetime timestamp without time zone,
CONSTRAINT device_pkey PRIMARY KEY (device_id )
)
WITH (
OIDS=FALSE
);
ALTER TABLE device
OWNER TO appadmin;
Can you tell me where I'm going wrong? I've tried changing the single quotes to double quotes but that didn't help.
I don't want to have to list all the column names in the INSERT if I don't have to.
Thanks.
Apparently you're expecting the INSERT to skip device_id since it is the primary key and has a default that comes from a sequence. That's not going to happen so PostgreSQL thinks you mean this:
insert into device (device_id, userid, ...)
values (12345, 'asdf', ...);
If you insist on not listing your columns explicitly (and making the people that get to maintain your code suffer needlessly) then you can specify DEFAULT in the VALUES to tell PostgreSQL to use the PK's default value; from the fine manual:
INSERT INTO table_name [ ( column_name [, ...] ) ]
{ DEFAULT VALUES | VALUES ( { expression | DEFAULT } [, ...] ) [, ...] | query }
[ RETURNING * | output_expression [ [ AS ] output_name ] [, ...] ]
[...]
DEFAULT
The corresponding column will be filled with its default value.
For example:
INSERT INTO device
VALUES
(DEFAULT,
12345,
'asdf',
...
But really, you should just specify the columns to make the SQL easier to understand and more robust when the schema changes.
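For reference, a column-list version of the statement from the question could look like this (a sketch based on the table definition above; device_id is then filled from its sequence):
INSERT INTO device
(userid, description, password, user_id, createdname, createddatetime, updatedname, updateddatetime)
VALUES
(12345, 'asdf', 'OY8YuDFLYdv', 2, 'myname', '2013-04-24 11:30:08', NULL, NULL);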