PostgreSQL: inserting NULL works using COPY but fails using INSERT

I have a bigint column named mycolumn. I execute SQL scripts using the psql command-line tool.
Using COPY command:
COPY public.mytable (myothercol, mycolumn) FROM stdin;
1 \N
\.
This works. But the following does not work:
EXECUTE 'insert into public.mytable (myothercol, mycolumn) values ($1, $2);' USING 1, NULL;
This gives me error:
column "mycolumn" is of type bigint but expression is of type text
Why does insert not work for null value, whereas COPY works?

You'd best tell PostgreSQL to convert the parameter to bigint explicitly:
EXECUTE 'insert into public.mytable (myothercol, mycolumn) values ($1, $2::bigint);' USING 1, NULL;
The problem is that PostgreSQL cannot infer a data type for a bare NULL parameter, so it guesses text, which does not match the bigint column. COPY does not have to guess: the \N in the input is read directly as a NULL of the target column's type.
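For illustration, the same typing rule shows up outside PL/pgSQL: with a SQL-level prepared statement the parameter types are declared up front, so an untyped NULL is no problem (a minimal sketch, assuming the table from the question exists):
-- SQL-level PREPARE/EXECUTE, not the PL/pgSQL EXECUTE used in the question
PREPARE ins (int, bigint) AS
insert into public.mytable (myothercol, mycolumn) values ($1, $2);
EXECUTE ins (1, NULL); -- works: $2 is declared bigint, so the NULL carries that type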

What is the right way to handle string-typed null values in SQL's BULK INSERT?

For example, I have a column with type int.
The raw data source has integer values, but the null values, instead of being empty (''), are 'NIL'.
How would I handle those values when trying to Bulk Insert into MSSQL?
My code is
create table test (nid INT);
bulk insert test from #FILEPATH with (format = 'CSV', firstrow = 2);
the first 5 rows of my .csv file looks like
1
2
3
NIL
7
You can replace the NIL with '' (an empty string) directly in your data source file, or insert the data into a staging table and transform it:
BULK INSERT staging_sample_data
FROM '\\data\sample_data.dat';
INSERT INTO [sample_data]
SELECT NULLIF(ColA, 'nil'), NULLIF(ColB, 'nil'),...
Of course, if your target field is, for example, numeric, the corresponding staging-table field should be a string. Then you can do as Larnu suggests: TRY_CONVERT(INT, ColA).
Note: if there are default constraints, you may need to check how to keep NULLs (see the KEEPNULLS option of BULK INSERT).
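A minimal sketch of the staging approach for the single-column example above; the staging table name and file path are assumptions, the rest follows the question's test table:
-- Staging table holds the raw text so 'NIL' loads without a conversion error
CREATE TABLE staging_test (nid VARCHAR(20));
BULK INSERT staging_test
FROM 'C:\data\test.csv' -- hypothetical path
WITH (FORMAT = 'CSV', FIRSTROW = 2);
-- 'NIL' becomes NULL; TRY_CONVERT also turns any other non-numeric text into NULL
INSERT INTO test (nid)
SELECT TRY_CONVERT(INT, NULLIF(nid, 'NIL'))
FROM staging_test;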

Failed to execute query. Error: String or binary data would be truncated in table 'dbo.user_info', column 'uid'

I have problem inserting values in my SQL server database on Azure, I am getting the following error:
Failed to execute query. Error: String or binary data would be truncated in table 'dummy_app.dbo.user_info', column 'uid'. Truncated value: 'u'.
The statement has been terminated.
I don't understand where I am going wrong; I just created the server and am trying to experiment, but I can't fix this.
if not exists (select * from sysobjects where name='user_info' and xtype='U')
create table user_info (
uid varchar unique,
name varchar,
email varchar
)
go;
INSERT INTO dbo.user_info(uid, name, email) VALUES('uids', 'name', 'email') go;
Creating the table works fine, the only thing that doesn't work is the second command INSERT
I suspect the reason is that you haven't defined a length for varchar, so it defaults to a length of 1. Therefore your value gets truncated.
Set a varchar length to something like varchar(200) and you should be good to go.
This looks like your CREATE TABLE statement doesn't include a length for varchar, so you'd have to specify a length such as varchar(50), since the default is 1. Refer to the official MS docs at the link below, in the remarks section.
docs.microsoft.com
Also, here is the syntax for CREATE TABLE in Azure SQL, which might be helpful as well.
Syntax of Azure CREATE TABLE
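A corrected sketch of the script from the question, with explicit lengths (the specific lengths 50/100/200 are arbitrary assumptions) and GO on its own line:
if not exists (select * from sysobjects where name='user_info' and xtype='U')
create table user_info (
uid varchar(50) unique, -- explicit length instead of the default of 1
name varchar(100),
email varchar(200)
)
go
INSERT INTO dbo.user_info(uid, name, email) VALUES('uids', 'name', 'email')
go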

Trigger to convert an empty string to NULL before it posts to a SQL Server decimal column

I've got a front-end table that essentially matches our SQL Server database table t_myTable. The columns I'm having problems with are those with numeric data types in the database. They are set to allow NULL, but when the user deletes the numeric value on the front end and tries to send a blank value, it doesn't post to the database. I suspect this is because the value is sent back as an empty string "", which does not translate to the nullable numeric data type.
Is there a trigger I can create to convert these empty strings into null on insert and update to the database? Or, perhaps a trigger would already happen too late in the process and I need to handle this on the front end or API portion instead?
We'll call my table t_myTable and the column myNumericColumn.
I could also be wrong and perhaps this 'empty string' issue is not the source of my problem. But I suspect that it is.
As @DaleBurrell noted, the proper place to handle data validation is in the application layer. You can wrap each of the potentially problematic values in a NULLIF function, which will convert the value to a NULL if an empty string is passed to it.
The syntax would be along these lines:
SELECT
...
,NULLIF(ColumnName, '') AS ColumnName
select nullif(Column1, '') from tablename
SQL Server doesn't allow converting an empty string to the numeric data type, so a trigger is useless in this case, even an INSTEAD OF one: SQL Server will check the conversion before inserting.
SELECT CAST('' AS numeric(18,2)) -- Error converting data type varchar to numeric
CREATE TABLE tab1 (col1 numeric(18,2) NULL);
INSERT INTO tab1 (col1) VALUES(''); -- Error converting data type varchar to numeric
Since you didn't mention this error, the client must be passing something other than ''. The problem can be tracked down with SQL Profiler: run it and see exactly which SQL statement is being executed to insert data into the table.
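A minimal sketch of the NULLIF approach applied at insert time, using the table and column names from the question; the @incomingValue parameter is a hypothetical stand-in for whatever the front end sends:
DECLARE @incomingValue varchar(50) = ''; -- hypothetical stand-in for the value sent by the front end
INSERT INTO t_myTable (myNumericColumn)
VALUES (NULLIF(@incomingValue, '')); -- '' becomes NULL; non-empty numeric strings convert implicitly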

Unable to delete right-to-left language columns using a stored procedure

I'm using a stored procedure to delete a row from an MS SQL database, based on a column that is nvarchar(100) and contains Persian text.
When I want to insert into this column, I use the N prefix before the literal to be able to perform the insert operation:
insert into materialPrice values( N'persian word',1000,100,0,0,0,0)
The problem is that when I want to delete the same record, the stored procedure does not work:
create proc spRemoveMaterial
    @materialName nvarchar(100)
as
begin
    delete from materialPrice where materialName = @materialName
end
I've tried to use N before @materialName, but it returned a syntax error. How can this be done?
The N prefix is a marker that causes the string literal to be represented in Unicode, implying that you are inserting into a Unicode column.
You should be able to convert the variable to Unicode with cast. Something like:
cast(@materialName as nvarchar(100))
With the correct type (nchar or nvarchar) and length to match the column.
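For illustration, when calling the procedure the N prefix only belongs on the literal being passed in; inside the procedure the parameter is already nvarchar (a minimal sketch using the value from the question):
EXEC spRemoveMaterial @materialName = N'persian word';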
The problem was with the database collation; the following code fixed it:
ALTER database MaterialDB COLLATE Persian_100_CI_AS
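For reference, the current collation of the database can be checked before altering it (database name as in the answer above):
SELECT DATABASEPROPERTYEX('MaterialDB', 'Collation');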

SQLSTATE[22P02]: Invalid text representation

I'm using Postgresql and PHP 5.3.x with PDO to access the DB.
I have this SQL query (a stripped-down version), with a placeholder for PDO to fill in:
INSERT INTO t_articles (a_article_id) VALUES (?) RETURNING a_id
I want a_article_id to be either a number, like 5, or else it should be the result of the subquery:
((SELECT max(a_article_id) FROM t_articles) + 1)
However, PDO says:
SQLSTATE[22P02]: Invalid text representation: 7 ERROR: invalid input syntax for integer: "(SELECT max(a_article_id) FROM t_articles) + 1"
I've also tried to set the subquery as the column's default value, but apparently that is not allowed:
ERROR: cannot use subquery in DEFAULT expression
How can I insert the result of this subquery (or what else can be done to achieve the same result)?
You'd have to use INSERT...SELECT for that:
insert into t_articles (a_article_id)
select max(a_article_id) + 1
from t_articles
returning a_id
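One caveat worth noting (an addition, not part of the original answer): if t_articles is empty, max(a_article_id) is NULL, so a COALESCE guard keeps the insert working:
insert into t_articles (a_article_id)
select coalesce(max(a_article_id), 0) + 1
from t_articles
returning a_id;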
Or if you don't need contiguous values for a_article_id, use a sequence for it:
Create a sequence, we'll call it article_id_sequence.
-- Get the current max(a_article_id)+1 to use instead of FIRST_VAL below
create sequence article_id_sequence
start FIRST_VAL
owned by t_articles.a_article_id;
Set the default value for t_articles.a_article_id to nextval('article_id_sequence').
alter table t_articles
alter column a_article_id
set default nextval('article_id_sequence');
Use the default value when inserting:
insert into t_articles (a_article_id)
values (default)
returning a_id;
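And when the caller does have an explicit id (the "5" from the question), it can simply be passed instead of DEFAULT; note that an explicitly supplied id does not advance the sequence:
insert into t_articles (a_article_id)
values (5)
returning a_id;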