Error converting varchar to bigint - sql

My column's data type is varchar, and I want to insert the value from a textbox, e.g. 'smh85670s'.
It throws a conversion error. As far as I know varchar can accept both characters and numbers, so why does it keep throwing this error?
If I insert the value '123456', the table accepts it.
Please guide me. What data type should I use?

Assuming you are using a stored procedure (which contains an insert query) or firing an insert query directly at the database, you are presumably sending all the data as parameters, say @param1, @param2, ...
Your insert query will be something like
INSERT INTO Sometable ( Amount, textbox,... )
SELECT @param1, @param2 ,...
Just add a cast in this query to make it work:
INSERT INTO Sometable ( Amount, textbox,... )
SELECT @param1, CAST(@param2 AS varchar),...
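To make the root cause concrete: the error typically means the textbox value ends up bound to a parameter (or column) declared as bigint. A minimal sketch, reusing the placeholder names above and assuming the textbox column is varchar:

CREATE TABLE Sometable (Amount bigint, textbox varchar(20));

DECLARE @param1 bigint = 123456;
DECLARE @param2 varchar(20) = 'smh85670s';  -- declared as varchar, so no bigint conversion is attempted

INSERT INTO Sometable (Amount, textbox)
SELECT @param1, @param2;

Had @param2 (or the textbox column) been declared as bigint, SQL Server would try to convert 'smh85670s' to bigint and raise exactly the error from the question.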

Related

Why this simple stored procedure isn't working

Here is the deal: I am receiving an array from C# and I want to insert it into the following table, which has only two columns, idUser (int) and idRegion (int).
The stored procedure needs to receive the array and insert it into the table, but somehow it isn't working; it tells me that it cannot convert @idRegion to an int. I tried to use CAST and CONVERT to convert it into an int, but it isn't working.
The SELECT ... FROM works OK, but not the INSERT.
Here is the stored procedure (@idUser needs to be the same for all inserted rows):
@idUser int,
@idRegion nvarchar(MAX)
AS
BEGIN
INSERT INTO [UsersRegion] (idUser, IdRegion)
VALUES (@idUser, @idRegion)
SELECT @idUser, CAST(value AS int) FROM STRING_SPLIT(@idRegion, ',')
END
I get this error when running it:
Conversion failed when converting the nvarchar value '1,2,3,4' to data type int.
If you are sending multiple values in @idRegion, then after you split them you may have more than one row to insert. Therefore, do it like this:
INSERT INTO [UsersRegion] (idUser, IdRegion)
SELECT @idUser, value FROM STRING_SPLIT(@idRegion, ',')
If the target table's IdRegion column is of type int, you need to cast like this:
SELECT @idUser, CAST(value AS int) FROM STRING_SPLIT(@idRegion, ',')
The code above will insert the same @idUser for every record but a different IdRegion value for each split item. More on INSERT INTO ... SELECT FROM.
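Putting the pieces together, the procedure from the question might look like this (STRING_SPLIT requires SQL Server 2016 or later; the procedure name here is made up for illustration):

CREATE PROCEDURE InsertUserRegions   -- hypothetical name
    @idUser int,
    @idRegion nvarchar(MAX)          -- e.g. '1,2,3,4'
AS
BEGIN
    INSERT INTO [UsersRegion] (idUser, IdRegion)
    SELECT @idUser, CAST(value AS int)
    FROM STRING_SPLIT(@idRegion, ',');
END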
Your INSERT statement uses IdRegion while everywhere else it is lowercase idRegion.
However, assuming this is how the actual table column is named and is not a typo...
Your problem is most likely the line that reads:
@idRegion nvarchar(MAX)
which declares the @idRegion parameter as a string, while you have stated in the question that it's meant to be an int.
This would explain the casting error.
If you cannot pass it into the procedure as an int from the C# code, your only other option is to parse it into an int inside the procedure, as you have said.
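For what it's worth, parsing the whole string as one int only works when it contains a single number; with the comma-separated value from the question it reproduces the same error, which is why splitting it first (as in the other answer) is needed. A quick illustration using TRY_CAST (SQL Server 2012+), which returns NULL instead of raising the error:

DECLARE @idRegion nvarchar(MAX) = N'1,2,3,4';
SELECT CAST(N'7' AS int);           -- fine: a single numeric value converts
SELECT TRY_CAST(@idRegion AS int);  -- NULL: '1,2,3,4' is not a valid int; a plain CAST raises the conversion error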

SQL Query Finding From Table DataType Declaration

I have some serial keys to find in a SQL database, such as "A-B-C", "D-E-F", "G-H-I", "J-K-L", and they are stored in tblTemp using the ntext data type. The keys may be stored in three columns, colA, colB and colC (sometimes a key is in one column and the rest are NULL). Sometimes two serial keys appear in one column separated by ";" (e.g. A-B-C;D-E-F). So I wrote the following SQL query:
DECLARE @sa TABLE(var1 nvarchar(MAX));
INSERT INTO @sa(var1) VALUES (N'A-B-C');
INSERT INTO @sa(var1) VALUES (N'D-E-F');
INSERT INTO @sa(var1) VALUES (N'G-H-I');
INSERT INTO @sa(var1) VALUES (N'J-K-I');
SELECT * FROM tblTemp
WHERE colA IN (SELECT var1 FROM @sa);
So I got the following error message:
The data types ntext and nvarchar(max) are incompatible in the equal to operator.
I still need to search colB and colC as well. How should I write the query for this kind of situation?
All suggestions are welcome.
CAST/CONVERT the ntext column to NVARCHAR(MAX) in your query so that the types are compatible (ntext itself cannot be used with the equality operator, in either direction):

SELECT *
FROM tblTemp
WHERE CAST(colA AS nvarchar(MAX)) IN (SELECT var1 FROM @sa);
You have to convert/cast your search term to an appropriate data type, in this case text.
Try this:
DECLARE @sa TABLE(var1 nvarchar(MAX));
INSERT INTO @sa(var1) VALUES (N'A-B-C');
INSERT INTO @sa(var1) VALUES (N'D-E-F');
INSERT INTO @sa(var1) VALUES (N'G-H-I');
INSERT INTO @sa(var1) VALUES (N'J-K-I');
SELECT *
FROM tblTemp t
WHERE EXISTS (SELECT 1
              FROM @sa s
              WHERE t.colA LIKE CAST('%' + s.var1 + '%' AS text)
                 OR t.colB LIKE CAST('%' + s.var1 + '%' AS text)
                 OR t.colC LIKE CAST('%' + s.var1 + '%' AS text)
             );
Since all suggestions are welcome:
How about changing the data type on tblTemp to NVARCHAR(MAX)?
NTEXT has been deprecated since NVARCHAR(MAX) was introduced in SQL Server 2005.
ALTER TABLE tblTemp ALTER COLUMN colA NVARCHAR(MAX)
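If changing the table is acceptable, the same ALTER can be applied to colB and colC as well, after which the original IN query runs without any casting. A sketch, assuming the other two columns are also ntext and @sa is declared as in the question:

ALTER TABLE tblTemp ALTER COLUMN colB NVARCHAR(MAX);
ALTER TABLE tblTemp ALTER COLUMN colC NVARCHAR(MAX);

SELECT *
FROM tblTemp
WHERE colA IN (SELECT var1 FROM @sa)
   OR colB IN (SELECT var1 FROM @sa)
   OR colC IN (SELECT var1 FROM @sa);

Note that IN only matches exact values; for columns holding two ";"-separated keys, the LIKE approach from the previous answer is still needed.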

How to get the column schema of an arbitrary query in SQL Server

I want to get just the schema of an arbitrary SQL query in SQL Server.
For example:
CREATE TABLE Tab1(Tab1ID int, col1 varchar(10))
CREATE TABLE Tab2(Tab2ID int, col2 varchar(20))
SQL Query -
SELECT col1, col2
FROM Tab1
INNER JOIN Tab2 ON Tab1ID = Tab2ID
Here the result will have this schema:
col1 varchar(10), col2 varchar(20)
I want to know what the schema of the result will be.
PS: I only have read access on the server where I am executing this query.
Is there any way to do this?
By schema you mean the data types, right? You always know the resulting data types when you operate on the table(s). In your case both columns are of a varchar data type, so the result will be character data, which you may convert into any character data type such as varchar, nvarchar, char, ntext, etc.
The basic point is that the tables were designed with known data types, so when you execute the query below you already know which data type each column will come back with.
SELECT col1, col2
FROM Tab1
INNER JOIN Tab2 ON Tab1ID = Tab2ID
This becomes an issue when you use a dynamic query, where you add columns at run time as required and then execute it, which may or may not raise an error due to a data type mismatch. For example:
declare @sqlstring nvarchar(max)
set @sqlstring = 'declare @t table (name varchar(50))
insert into @t values(''Minh''),(''Tinh''),(''Justin'')
Select * from @t where Name like ''%in%'''
EXEC sp_executesql @sqlstring
I had a similar problem before, but I had to create tables (70+) for each query one by one.
If there is a pattern, write a tool to generate the CREATE TABLE statements.
If not, and it's a one-time job, just create them manually.
If it's not a one-time job, you might ask yourself why you would store temporary data in a permanent table instead of a temp table.
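If you do have somewhere to create objects (temp tables only need access to tempdb), one way to automate the "generate a CREATE TABLE" idea is to let SELECT ... INTO capture the query's result schema in an empty table and then read it back from the catalog views. A sketch using the example tables above:

SELECT TOP (0) col1, col2
INTO #query_schema
FROM Tab1
INNER JOIN Tab2 ON Tab1ID = Tab2ID;

SELECT c.name, t.name AS data_type, c.max_length
FROM tempdb.sys.columns AS c
JOIN tempdb.sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('tempdb..#query_schema');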

SQL union between NULL and VARCHAR error

I have two views with identical columns. One of the columns on the first view is generated 'on the fly' and set as NULL and the same column on the other view has values stored as varchar. I have a stored proc that looks like this:
ALTER PROCEDURE [dbo].[mi_GetLearners]
(@centrename nvarchar(200))
AS
SELECT [centrename]
,[Name]
,[Username] --generated on the fly as NULL
FROM [DB1].[dbo].[vw_Learners]
WHERE [centrename] = @centrename
UNION
SELECT [centrename]
,[Name]
,[Username] --values stored as varchar
FROM [Linked_Server].[DB2].[dbo].[vw_Learners]
WHERE [centrename] = @centrename
DB1 is on SQL Server 2008 R2
DB2 is on SQL server 2005
When I run the stored proc I get the following error:
Msg 245, Level 16, State 1, Line 1 Conversion failed when converting
the varchar value 'someusername' to data type int.
Why is it trying to convert the value to the int data type when the other column is set to NULL? If I instead change the second column from NULL to ' ', the stored proc works fine... I'm really baffled as to why a union between a varchar column and a NULL column generated in a select statement would throw such an error... any ideas?
EDIT: I'm looking for an explanation rather than a solution...
EDIT 2: Running the following code:
CREATE VIEW vw_myview
AS
SELECT NULL AS MyColumn
EXECUTE sp_help vw_myview
Returns:
Column_name   Type
MyColumn      int
The problem is that NULL is (to within some wiggle room) a member of every data type.
When any SELECT query is being run, the contents of each column must be of one type, and only one type. When there are a mixture of values in a column (including NULLs), the type can obviously be determined by examining the types of the non-NULL values, and appropriate conversions are performed, as necessary.
But, when all rows contain NULL for a particular column, and the NULL hasn't been cast to a particular type, then there's no type information to use. So, SQL Server, somewhat arbitrarily, decides that the type of this column is int.
create view V1
as
select 'abc' as Col1,null as Col2
go
create view V2
as
select 'abc' as Col1,CAST(null as varchar(100)) as Col2
V1 has columns of type varchar(3) and int.
V2 has columns of type varchar(3) and varchar(100).
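Running the same sp_help check as in the question's second edit against these two views confirms it:

EXEC sp_help 'V1';   -- Col2 is reported as int
EXEC sp_help 'V2';   -- Col2 is reported as varchar(100)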
I'd expect the type of the field to be determined by the first SELECT in the union. You may try changing the order of your two selects or change to CAST(NULL AS VARCHAR(...)).
I think you need to cast to the same datatype:
---
AS
SELECT [centrename]
,[Name]
,CAST(NULL AS VARCHAR(MAX)) AS [Username]
FROM [DB1].[dbo].[vw_Learners]
WHERE [centrename] = @centrename
UNION
---

Double columns in a SQL View

We have a legacy interface that inserts into table T1, which has the columns BODY_TEXT (varchar(max)) and BODY_BIN (varbinary(max)).
It currently inserts into just one of the columns and leaves the other one NULL.
Now we have implemented a new interface: table T2, which has only a BODY (varbinary(max)) column.
I need to create a view V1 that should replace T1, meaning
CREATE VIEW V1 AS
SELECT
T2.UNIQUE_ID AS UNIQUE_ID,
etc…
Now I don't know how to treat the T2.BODY column… I need to do something like
T2.BODY AS (whichever of BODY_BIN, BODY_TEXT is not null). It must also handle varchar(max) vs. varbinary(max).
I tried COALESCE, i.e. T2.BODY AS COALESCE(BODY_BIN, BODY_TEXT), but it doesn't work.
Nor does
COALESCE(BODY_BIN, BODY_TEXT) AS BODY
T2.BODY AS BODY
Again: in the legacy table we had T1 with two columns, BODY_BIN and BODY_TEXT. The user inserted one value and left the other one NULL, since a body is either binary or textual but not both. The new interface has a table T2 with only one column, BODY (varbinary(max)), and I was asked to delete table T1 and create a view with the same name. That is, to preserve backward compatibility they should still be able to perform "insert into T1 values X,Y" (X is DATA_BIN or NULL, and Y is DATA_TEXT or NULL), but the content (taken from either X or Y) should be translated into ONE column in the T2 table: BODY.
I have no idea how to pull this one up.
Can you help me?
Thanks,
Nili
varbinary to varchar (note the order) will cast implicitly. So this works because ISNULL takes the data type of its first argument:
ISNULL(varchar, varbinary)
COALESCE fails because it takes the highest-precedence data type (which is varbinary), and that implicit cast is not allowed. ISNULL(varbinary, varchar) would fail too.
You need an explicit CAST:
DECLARE @foo TABLE (ID int IDENTITY (1,1), charmax varchar(MAX) NULL, binmax varbinary(MAX) NULL)
INSERT @foo (charmax, binmax) VALUES ('text', NULL)
INSERT @foo (charmax, binmax) VALUES (NULL, 0x303131)
INSERT @foo (charmax, binmax) VALUES ('Moretext', NULL)
INSERT @foo (charmax, binmax) VALUES (NULL, 0x414243454647)
SELECT ISNULL(binmax, CONVERT(varbinary(MAX), charmax))
FROM @foo
or
SELECT COALESCE(binmax, CONVERT(varbinary(MAX), charmax))
FROM @foo
Edit: I understand the question now... maybe
DECLARE @foo2 TABLE (ID int IDENTITY (1,1), BODY varbinary(MAX) NULL)
INSERT @foo2 (BODY) VALUES (CAST('text' AS varbinary(MAX)))
INSERT @foo2 (BODY) VALUES (0x303132)
INSERT @foo2 (BODY) VALUES (CAST('Moretext' AS varbinary(MAX)))
INSERT @foo2 (BODY) VALUES (0x414243454647)
SELECT
BODY AS BODY_BIN,
CAST(BODY AS varchar(MAX)) AS BODY_TEXT
FROM
@foo2
Edit2: something like this (not tested) to maintain the same write interface. Normally, I'd only maintain a read interface hence the confusion...
CREATE VIEW OldFoo
AS
SELECT
ID,
BODY AS BODY_BIN,
CAST(BODY AS varchar(MAX)) AS BODY_TEXT
FROM
newFoo
GO
CREATE TRIGGER TR_OldFoo_Insert ON OldFoo INSTEAD OF INSERT  -- trigger name is arbitrary
AS
SET NOCOUNT ON
INSERT newFoo (BODY)
SELECT ISNULL(BODY_BIN, CONVERT(varbinary(MAX), BODY_TEXT))
FROM INSERTED
GO
First, this is a bad design. Joining on a varchar(max) or varbinary(max) field is a bad idea since they can't be indexed. Prepare for table scans!
You have inconsistent data types in the same column, which is a problem.
Try:
CAST(COALESCE(BODY_BIN, CAST(BODY_TEXT AS varbinary(max))) AS varchar(max))