Insert ARRAY in PostgreSQL table with apostrophes in values - sql

I have a Python script with SQL insert statements that insert parsed data from a file into a PostgreSQL table. When the data has apostrophes, execution fails. In detail:
<name>RYAZAN'</name>
or
<name>CHYUL'BYU</name>
I found a solution for when it is a plain string -> adding an extra ' to the apostrophe so it transforms to 'RYAZAN''' or 'CHYUL''BYU' in the INSERT statement.
But when my values are in a list (from the Python script), like city = ["RYAZAN''", "CHYUL''BYU"], Python automatically renders them with double quotes instead of single quotes. As a result, when trying to insert
INSERT INTO City (uuid, name) VALUES (uuid_generate_v4(), unnest(array{city}))
SQL fails with error
ERROR: column "RYAZAN''.." does not exist
because SQL treats the double-quoted value as a column identifier. Is there a way to insert an ARRAY with values that contain apostrophes?

Try using $$ dollar-quoted strings
CREATE TEMP TABLE t (f TEXT);
INSERT INTO t VALUES ($$<name>CHYUL'BYU</name>$$),
($$<name>RYAZAN'</name>$$);
SELECT * FROM t;
f
------------------------
<name>CHYUL'BYU</name>
<name>RYAZAN'</name>
(2 rows)
There are also many other ways to escape quotes
INSERT INTO t
VALUES (E'\'foo'),('''bar'''),
('"the answer is" -> ''42'''),($$It's "only"" $1.99$$);
SELECT * FROM t;
f
-------------------------
'foo
'bar'
"the answer is" -> '42'
It's "only"" $1.99
(4 rows)
Demo: db<>fiddle
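Applied to the pattern in the question, a minimal sketch (assuming the uuid_generate_v4() function from the question is available; SELECT is used rather than VALUES so that unnest() can expand into multiple rows):
-- Dollar quoting keeps the embedded apostrophes intact, so no doubling is needed.
INSERT INTO City (uuid, name)
SELECT uuid_generate_v4(), unnest(ARRAY[$$RYAZAN'$$, $$CHYUL'BYU$$]);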

Related

PostgreSQL 12 `quote_literal` function explanation

I want to understand how the quote_literal() function works.
This is my table:
CREATE TABLE temp_emp (
id integer,
name text
);
INSERT INTO TEMP_EMP (id, name) VALUES (1, 'Super Pavel');
When I do:
SELECT * FROM "public".temp_emp WHERE name like '%Pavel%';
I have 1 row in result.
However, when I do:
SELECT * FROM "public".temp_emp WHERE name like quote_literal('%Pavel%');
I have 0 rows in result.
At the same time:
SELECT * FROM quote_literal('%Pavel%');
returns '%Pavel%'.
Could anyone explain why like '%Pavel%' and like quote_literal('%Pavel%') give different results?
The purpose of quote_literal() -- as explained in the documentation -- is to quote values for dynamic SQL. Dynamic SQL means that you are putting SQL into a string.
If you run this on different values, you will see that it includes the single quotes:
select str, '"' || quote_literal(str) || '"'
from (values ('abc'), ('abc def'), ('abc '' def')) v(str);
This returns:
abc "'abc'"
abc def "'abc def'"
abc ' def "'abc '' def'"
In particular, the single quotes are inside the string -- the double quotes are there too, but just to illustrate the boundaries of the string.
Clearly, although your data might have 'Pavel' embedded in them, none of your rows have 'Pavel' with single quotes.
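By contrast, quote_literal() earns its keep when the query itself is assembled as a string and run with EXECUTE. A minimal PL/pgSQL sketch, assuming the temp_emp table from the question:
DO $do$
DECLARE
    pattern text := '%Pavel%';
    matches integer;
BEGIN
    -- quote_literal() wraps the value in single quotes so it can be spliced
    -- safely into the dynamically built SQL string.
    EXECUTE 'SELECT count(*) FROM temp_emp WHERE name LIKE ' || quote_literal(pattern)
        INTO matches;
    RAISE NOTICE 'rows matching %: %', pattern, matches;
END
$do$;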

Insert a column with single quote or Apostrophe in Oracle

I am trying to insert into Table Users from Person table.
However, the first_name column in the person table contains an apostrophe in some names (e.g. Rus'sell), which is preventing a successful insertion. How do I fix this?
INSERT INTO USERS VALUES (SELECT FIRST_NAME,.........FROM PERSON);
First of all, your insert statement is syntactically incorrect. It will raise ORA-00936: missing expression. The correct syntax to insert multiple records from a source table is:
INSERT INTO table_name SELECT columns_list FROM source_table;
The VALUES keyword is used to insert a single record into a table, using the following syntax:
INSERT INTO table_name(columns_list) VALUES (expressions_list);
If you already have the values stored in another table, then a simple INSERT INTO..SELECT FROM should work without any issues. However, if you are trying to INSERT INTO..VALUES with strings containing single quotation marks, then the best way is to use the quoted string literal technique. The syntax is q'[...]', where the "[" and "]" characters can be any of the following, as long as they do not already appear in the string:
!
[ ]
{ }
( )
< >
You don't have to worry about the single-quotation marks within the string.
create table t(name varchar2(100));
insert into t values (q'[Rus'sell]');
insert into t values (q'[There's a ' quote and here's some more ' ' ']');
select * from t;
NAME
-----------------------------------------------
Rus'sell
There's a ' quote and here's some more ' ' '
I don't think your question is showing the complete details, because I can execute the following statements without any problem:
create table person( first_name varchar2(100));
create table users( first_name varchar2(100));
insert into person values ('Rus''sell');
insert into users select first_name from person;
Apologies for any obscurity in the question. The query I was working with was a long insert query with multiple joins.
To sum up, it was a stored proc where I was doing an insert, for which the data is given by a long select query with multiple joins. One of the columns is the FIRST_NAME column, which had some values with an apostrophe in them (Rus'sell, Sa'm).
The INSERT statement values were being generated as below, which was causing an 'ORA-00917: missing comma' error.
INSERT INTO TABLE_NAME values (314159,0,'Rus'sell','Parks','...........)
I fixed this by replacing each single quote in that column with two single quotes in the select, before giving it to the insert statement, which solved the issue.
REPLACE(FIRST_NAME,'''','''''') AS FIRST_NAME
Hope it helps.
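For illustration, a hedged sketch of how that REPLACE can sit in a statement generator (the column list is simplified to first_name; adjust it to the real query):
-- Hypothetical generator: doubling the quotes so names like Rus'sell
-- no longer break the generated INSERT text.
SELECT 'INSERT INTO users (first_name) VALUES ('''
       || REPLACE(first_name, '''', '''''')
       || ''');' AS generated_stmt
FROM   person;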

Getting Error 10293 while inserting a row into a Hive table having an array as one of the fields

I have a hive table created using the following query:
create table arraytbl (id string, model string, cost int, colors array <string>,size array <float>)
row format delimited fields terminated by ',' collection items terminated by '#';
Now, while trying to insert a row:
insert into mobilephones values
("AA","AAA",5600,colors("red","blue","green"),size(5.6,4.3));
I get the following error:
FAILED: SemanticException [Error 10293]: Unable to create temp file for insert values Expression of type TOK_FUNCTION not supported in insert/values
How can I resolve this issue?
The syntax to enter values into a complex datatype is a bit weird, though that is just my personal opinion.
You need a dummy table to insert values into a Hive table with a complex datatype.
insert into arraytbl select "AA","AAA",5600, array("red","blue","green"), array(CAST(5.6 AS FLOAT),CAST(4.3 AS FLOAT)) from (select 'a') x;
And this is how it looks after insert.
hive> select * from arraytbl;
OK
AA AAA 5600 ["red","blue","green"] [5.6,4.3]
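As a quick check that the array columns round-trip, a hedged sketch of reading individual elements back (Hive array indexes are 0-based):
-- colors[0] is "red", colors[2] is "green" for the row inserted above.
select id, colors[0] as first_color, colors[2] as third_color
from arraytbl;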

syntax for inserting hstore arrays in PostgreSQL

New to Postgres, just wondering what the syntax would look like. For example, I have the following table:
CREATE TABLE test
(
field1 hstore[],
field2 text[],
field3 hstore
)
...
For inserting arrays, the syntax is like
INSERT INTO test (field2) VALUES (' {"abc","def"} ');
and for inserting hstore, the syntax is like
INSERT INTO test (field3) VALUES (' "a"=>1.0, "b"=>2.4 ');
but... for insertions into 'field1', what do I do? Something like the below gives me errors:
INSERT INTO test (field1)
VALUES (`{'"a"=>1.0, "b"=>2.0', '"a"=>3.0, "b"=>4.0' }`)
Any fixes? Thanks!
==EDIT==
Just figured it out.
INSERT INTO test (field1)
VALUES ('{"a=>1.0, b=>2.0", "a=>3.0, b=>4.0"}' )
The answer below helps as well, but in this particular case, a string (instead of an Array structure) works better with my existing code.
I think you'll have a much easier time with the array constructor syntax:
The ARRAY constructor syntax can also be used:
INSERT INTO sal_emp
VALUES ('Bill',
ARRAY[10000, 10000, 10000, 10000],
ARRAY[['meeting', 'lunch'], ['training', 'presentation']]);
Something like this:
INSERT INTO test (field1)
VALUES (array['"a"=>1.0, "b"=>2.0'::hstore, '"a"=>3.0, "b"=>4.0'::hstore]);
You only need the ::hstore cast on the first element in the array but it doesn't hurt to cast them all.
I tend to use the array constructor syntax exclusively because all the string parsing and quoting gives me a headache.
If you can't use the array constructor syntax, you can ask PostgreSQL itself how to do it:
=> select array['"a"=>1.0, "b"=>2.0'::hstore, '"a"=>3.0, "b"=>4.0'::hstore];
array
---------------------------------------------------------------------
{"\"a\"=>\"1.0\", \"b\"=>\"2.0\"","\"a\"=>\"3.0\", \"b\"=>\"4.0\""}
Note that the individual hstores are wrapped in double quotes:
"\"a\"=>\"1.0\", \"b\"=>\"2.0\""
and that they use backslash-escaped double quotes for their internal structure. That gives us:
INSERT INTO test (field1)
VALUES ('{"\"a\"=>\"1.0\", \"b\"=>\"2.0\"","\"a\"=>\"3.0\", \"b\"=>\"4.0\""}');
I'd still try to use the array constructor syntax; all those nested quotes and escapes are nasty.
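As a quick sanity check after the insert, a sketch of reading an element back out of the hstore array (assuming the test table above; PostgreSQL arrays are 1-based):
-- field1[1] is the first hstore in the array; -> pulls a value out by key.
SELECT field1[1] -> 'a' AS first_a,
       field1[2] -> 'b' AS second_b
FROM test;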

How to do an insert with multiple rows in Informix SQL?

I want to insert multiple rows with a single insert statement.
The following code inserts one row, and works fine:
create temp table mytmptable
(external_id char(10),
int_id integer,
cost_amount decimal(10,2)
) with no log;
insert into mytmptable values
('7662', 232, 297.26);
select * from mytmptable;
I've tried changing the insert to this, but it gives a syntax error:
insert into mytmptable values
('7662', 232, 297.26),
('7662', 232, 297.26);
Is there a way to get it working, or do I need to run many inserts instead?
You could always do something like this:
insert into mytmptable
select *
from (
select '7662', 232, 297.26 from table(set{1})
union all
select '7662', 232, 297.26 from table(set{1})
)
Pretty sure that's standard SQL and would work on Informix (the derived table is necessary for Informix to accept UNION ALL in INSERT .. SELECT statements).
As you found, you can't use multiple lists of values in a single INSERT statement with Informix.
The simplest solution is to use multiple INSERT statements each with a single list of values.
If you're using an API such as ESQL/C and you are concerned about performance, then you can create an INSERT cursor and use that repeatedly. This saves up the inserts until a buffer is full, or you flush or close the cursor:
$ PREPARE p FROM "INSERT INTO mytmptable VALUES(?, ?, ?)";
$ DECLARE c CURSOR FOR p;
$ OPEN c;
while (...there's more data to process...)
{
$PUT c USING :v1, :v2, :v3;
}
$ CLOSE c;
The variables v1, v2, v3 are host variables to hold the string and numbers to be inserted.
(You can optionally use $ FLUSH c; in the loop if you wish.) Because this buffers the values, it is pretty efficient. Of course, you could also simply use $ EXECUTE p USING :v1, :v2, :v3; in the loop; that foregoes the per-row preparation of the statement, too.
If you don't mind writing verbose SQL, you can use the UNION technique suggested by Matt Hamilton, but you will need a FROM clause in each SELECT with Informix. You might specify:
FROM "informix".systables WHERE tabid = 1, or
FROM sysmaster:"informix".sysdual, or
use some other technique to ensure that the SELECT has a FROM clause but only generates one row of data.
In my databases, I have either a table dual with a single row in it, or a synonym dual that is a synonym for sysmaster:"informix".sysdual. You can get away without the "informix". part of those statements if the database is 'normal'; the owner name is crucial if your database is an Informix MODE ANSI database.
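Putting those pieces together, a hedged sketch of the UNION technique with explicit FROM clauses, still wrapped in a derived table as in the earlier answer (not verified against every Informix version):
INSERT INTO mytmptable
SELECT *
FROM (
    SELECT '7662', 232, 297.26 FROM sysmaster:"informix".sysdual
    UNION ALL
    SELECT '7663', 233, 297.27 FROM sysmaster:"informix".sysdual
);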
In some versions of Informix you can build a virtual table using the TABLE keyword followed by a value of one of the COLLECTION data types, such as a LIST collection. In your case, use a LIST of values of Unnamed Row type using the ROW(...) constructor syntax.
Creating a TABLE from COLLECTION value
http://www.ibm.com/support/knowledgecenter/SSGU8G_11.50.0/com.ibm.sqls.doc/ids_sqs_1375.htm
ROW(...) construction syntax, for literals of Unnamed Row data type
http://www.ibm.com/support/knowledgecenter/SSGU8G_11.50.0/com.ibm.sqlr.doc/ids_sqr_136.htm
Example:
select *
from TABLE(LIST{
ROW('7662', 232, 297.26),
ROW('7662', 232, 297.26)
}) T(external_id, int_id, cost_amount)
into temp mytmptable with no log
In the above, the data types are implied by the value, but when needed you can explicitly cast each value to the desired data type in the row constructor, like so:
ROW('7662'::char(10), 232::integer, 297.26::decimal(10,2))
You can also insert multiple rows by storing the values in an external file and executing the following statement in dbaccess:
LOAD FROM "externalfile" INSERT INTO mytmptable;
However, the values would have to be DELIMITED by a pipe "|" symbol, or whatever you set the DBDELIMITER environment variable to be.
If you're using the pipe delimiter, the data in your external file would look like:
7662|232|297.26|
7663|233|297.27|
...
NOTE that the data in the external file must be properly formatted, or convertible, so that it can be inserted into each mytmptable column's datatype.
Here is a simple solution for bulk insert, with the SELECT part doing the rest:
INSERT INTO cccmte_pp
( cmte, pref, nro, eje, id_tri, id_cuo, fecha, vto1, vto2, id_tit, id_suj, id_bie, id_gru )
SELECT * FROM TABLE (MULTISET {
row('RC', 4, 10, 2020, 1, 5, MDY(05,20,2020), MDY(05,20,2020),MDY(05,27,2020),101, 1, 96, 1 ),
row('RC', 4, 11, 2020, 1, 5, MDY(05,20,2020), MDY(05,20,2020),MDY(05,27,2020),101, 1, 96, 1 ) })
AS t( cmte, pref, nro, eje, id_tri, id_cuo, fecha, vto1, vto2, id_tit, id_suj, id_bie, id_gru )
;