I have created a table in PostgreSQL using this:
CREATE TABLE TEST (MULTIPROCESS VARCHAR(20), HTTP_REFERER VARCHAR(50));
I am trying to insert a JSON array into the table, like below:
INSERT INTO TEST
SELECT MULTIPROCESS, HTTP_REFERER
FROM json_populate_record(
NULL::TEST,
'[{"multiprocess":true,"http_referer": "http://localhost:9000/"}, {"multiprocess": false,"http_referer": "http://localhost:9002/"}]'
);
It throws this error:
[Error Code: 0, SQL State: 22023] ERROR: cannot call json_populate_record on an array
How can I insert the JSON array data into the table as below?
MULTIPROCESS HTTP_REFERER
true http://localhost:9000/
false http://localhost:9002/
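For the record, the array counterpart of json_populate_record is json_populate_recordset, which expands a JSON array of objects into a set of rows. A minimal sketch against the TEST table above (the JSON booleans are converted to their text form to fit the VARCHAR column):
INSERT INTO TEST
SELECT MULTIPROCESS, HTTP_REFERER
FROM json_populate_recordset(
NULL::TEST,
'[{"multiprocess":true,"http_referer": "http://localhost:9000/"}, {"multiprocess": false,"http_referer": "http://localhost:9002/"}]'
);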
For example, I have a column with type int.
The raw data source has integer values, but the null values, instead of being empty (''), are 'NIL'.
How would I handle those values when trying to Bulk Insert into MSSQL?
My code is:
create table test (nid INT);
bulk insert test from '#FILEPATH' with (format = 'CSV', firstrow = 2);
The first 5 rows of my .csv file look like:
1
2
3
NIL
7
You can replace the NIL with '' (an empty string) directly in your data source file, or insert the data into a staging table and transform it:
BULK INSERT staging_sample_data
FROM '\\data\sample_data.dat';
INSERT INTO [sample_data]
SELECT NULLIF(ColA, 'nil'), NULLIF(ColB, 'nil'),...
Of course, if your field is, for example, numeric, the staging table should have a string field; then you can do as Larnu suggests: TRY_CONVERT(INT, ColA).
Note: if there are default constraints, you may need to check how to keep NULLs (see the KEEPNULLS option of BULK INSERT).
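Putting it together, a minimal sketch of the staging approach for the single-column test table from the question (staging_test and nid_raw are illustrative names):
create table staging_test (nid_raw VARCHAR(20));
bulk insert staging_test from '#FILEPATH' with (format = 'CSV', firstrow = 2);
-- 'NIL' becomes NULL via NULLIF; TRY_CONVERT guards against any other non-numeric junk
insert into test (nid)
select TRY_CONVERT(INT, NULLIF(nid_raw, 'NIL'))
from staging_test;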
I have a hive table created using the following query:
create table arraytbl (id string, model string, cost int, colors array <string>,size array <float>)
row format delimited fields terminated by ',' collection items terminated by '#';
Now, while trying to insert a row:
insert into arraytbl values
("AA","AAA",5600,colors("red","blue","green"),size(5.6,4.3));
I get the following error:
FAILED: SemanticException [Error 10293]: Unable to create temp file for insert values Expression of type TOK_FUNCTION not supported in insert/values
How can I resolve this issue?
The syntax for entering values into complex data types is a bit awkward, though that is just my personal opinion.
You need a dummy table to insert values into a Hive table with complex data types:
insert into arraytbl
select "AA", "AAA", 5600,
       array("red", "blue", "green"),
       array(CAST(5.6 AS FLOAT), CAST(4.3 AS FLOAT))  -- CAST needed: floating point literals default to DOUBLE in Hive
from (select 'a') x;  -- the one-row dummy table
And this is how it looks after the insert:
hive> select * from arraytbl;
OK
AA AAA 5600 ["red","blue","green"] [5.6,4.3]
I'm wondering how to ensure that the data inserted into a json or jsonb column is an object, not an array (or an array of objects).
Example:
-- ok
insert into users (settings) values ('{ "theme": "cobalt" }')
-- ok
insert into users (settings) values ('{}')
-- error!
insert into users (settings) values ('[]')
-- error!
insert into users (settings) values ('[{}]')
Thanks!
You could do something like:
t=# create table so16(j jsonb check (left(ltrim(j::text), 1) <> '['));
CREATE TABLE
t=# insert into so16 values('{"b":[1,2,3]}');
INSERT 0 1
t=# insert into so16 values('[1,2,3]');
ERROR: new row for relation "so16" violates check constraint "so16_j_check"
DETAIL: Failing row contains ([1, 2, 3]).
t=# insert into so16 values(' [1,2,3]');
ERROR: new row for relation "so16" violates check constraint "so16_j_check"
DETAIL: Failing row contains ([1, 2, 3]).
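If you are on PostgreSQL 9.4 or later, a cleaner sketch of the same idea uses jsonb_typeof, which returns 'object', 'array', 'string', etc., so the check does not depend on the text rendering of the value:
create table so16(j jsonb check (jsonb_typeof(j) = 'object'));
Arrays (and scalars) then fail the constraint in the same way as above.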
This seems like a trivial question. And it is. But I have googled for over a day now, and still no answer:
I wish to do a bulk insert where, for a column whose datatype is varchar(100), I wish to insert an empty string. Not NULL, but empty. For example, for the table:
create table temp(columnName varchar(100))
I wish to insert an empty string as the value:
BULK INSERT sandbox..temp FROM
'file.txt' WITH ( FIELDTERMINATOR = '|#', ROWTERMINATOR = '|:' );
And the file contents would be row1|:row2|:|:|:. So it contains 4 rows, where the last two rows are intended to be empty strings. But they get inserted as NULL.
This question is not the same as the marked duplicate: in a column, I wish to have the capacity to insert both NULL and the empty string. The answers provided do only one of them, but not both.
Well, instead of inserting the empty string explicitly like this, why not give your table column a default value of an empty string, and in your bulk insert simply don't pass any values for those columns? Something like:
create table temp(columnName varchar(100) default '')
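With that default in place, the BULK INSERT itself can stay as in the question; a minimal sketch (note that by default, i.e. without the KEEPNULLS option, empty fields in the data file receive the column default rather than NULL):
BULK INSERT sandbox..temp FROM
'file.txt' WITH ( FIELDTERMINATOR = '|#', ROWTERMINATOR = '|:' );
-- empty fields now load as '' via the column default; add KEEPNULLS
-- only if you want empty fields loaded as NULL instead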
I have created a custom Postgres type with:
CREATE TYPE new_type AS (new_date timestamp, some_int bigint);
I have a table that stores arrays of new_type, like:
CREATE TABLE new_table (
table_id uuid primary key,
new_type_list new_type[] not null
)
and I insert data into this table with something like this:
INSERT INTO new_table VALUES (
'*inApplicationGeneratedRandomUUID*',
ARRAY[[NOW()::timestamp, '146252'::bigint],
[NOW()::timestamp, '526685'::bigint]]::new_type[]
)
and I get this error:
ERROR: cannot cast type timestamp without time zone to new_type
What am I missing?
I've also tried the array syntax that uses {}, but with no better result.
The easiest way would probably be:
INSERT INTO new_table VALUES (
'9fd92c53-d0d8-4aba-8925-1bd648d565f2'::uuid,
ARRAY[ row(now(), 146252)::new_type,
row(now(), 526685)::new_type
] );
Note that you have to cast the row type to ::new_type.
As an alternative, you could also write:
INSERT INTO new_table VALUES (
'9fd92c53-d0d8-4aba-7925-1ad648d565f2'::uuid,
ARRAY['("now", 146252)'::new_type,
'("now", 526685)'::new_type
] );
Check the PostgreSQL documentation about Composite Value Input.
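To check that the composite values round-tripped, a quick sketch that expands the array back into columns (the (t).field syntax selects a field from a composite value):
SELECT table_id, (t).new_date, (t).some_int
FROM new_table, unnest(new_type_list) AS t;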