How do I disable errors on string truncation in SQL Server? - sql-server-2000

How can I tell SQL Server not to raise an error if I insert or update a string longer than the size of the field - I would like silent truncation in this instance.

What you need to do is set ANSI_WARNINGS to OFF.
You can do that by calling
set ANSI_WARNINGS OFF
I have also written a practical example:
create table bla(id varchar(2))
go
insert bla values ('123') --fails
set ANSI_WARNINGS OFF
insert bla values ('123') --succeeds, id is silently truncated to '12'
Do remember to turn the ANSI warnings back ON when you are done.
You can do so by calling:
set ANSI_WARNINGS ON

Try casting the value to the exact type and length of the column before inserting it. That might do the trick. Casting (and converting) are much more flexible. :)
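For example, a minimal sketch reusing the bla(id varchar(2)) table from the example above (the cast target must match the column's declared length): an explicit CAST truncates the string silently, so no error is raised even with ANSI_WARNINGS left ON.
insert bla values (cast('123' as varchar(2))) --succeeds even with ANSI_WARNINGS ON; id gets '12'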

Related

Syntax issue on the source

I'm migrating a SQL Server 2008 database to SQL Server 2019. I used Microsoft Data Migration Assistant to search for any breaking changes, issues or syntax errors.
I am getting errors for some of my procedures:
Object [dbo].[PROCEDURE1] has syntax errors. Must declare the variable or parameter "@SINI". Error number 70590. For more details, please see: Line 9, Column 16.
This is my procedure:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[PROCEDURE1]
@Refer AS varchar,
@Ret Decimal OUTPUT
AS
DECLARE @SIni AS Decimal
SET @SIni = (SELECT Ini FROM Table1 WHERE Refer = @Refer)
SET @Ret = @SINI
You probably have a server with a case-sensitive collation. As explained in the documentation, identifiers for variables, GOTO labels, temporary stored procedures, and temporary tables use the default collation of the server instance. You can check this with the following simple statement:
SELECT SERVERPROPERTY('collation');
But, to fix the error, use the correct case-sensitive variable name:
...
SET @Ret = @SIni
...
As an additional note, declare your data types with the appropriate length (as @Larnu commented). The length attribute is optional, and when it is omitted in a parameter declaration SQL Server assigns a length of 1, so the @Refer parameter has data type varchar(1).
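A corrected version of the procedure might look like the sketch below; the lengths and precisions are assumptions, so use whatever matches the Table1 columns. The point is the consistent variable casing and the explicit sizes:
ALTER PROCEDURE [dbo].[PROCEDURE1]
@Refer AS varchar(50),       -- assumed length; match the Refer column
@Ret Decimal(18, 2) OUTPUT   -- assumed precision/scale; match the Ini column
AS
DECLARE @SIni AS Decimal(18, 2)
SET @SIni = (SELECT Ini FROM Table1 WHERE Refer = @Refer)
SET @Ret = @SIni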

SQL Server: turn off ANSI_WARNINGS in a stored procedure

I need to know how to turn off ANSI warnings on my stored procedure please. I keep getting the error
String or binary data would be truncated.
However, I would rather have this turned off, as I expect the truncation and want to allow it.
I added the statement
SET ANSI_WARNINGS OFF
GO
right before the stored procedure; however, doing this does not seem to suppress the error at all.
As for why I have this truncation error to begin with: one of my stored procs executes dynamic SQL to retrieve values (see the SQLFiddle showing the code), and I had to set the length of all of my fields to the maximum (NVarchar(3072)). When my query is executed, however, I need them back at the right size when returning them to the client.
Would appreciate info on how to best deal with this please. Thanks in advance.
I agree with @marc_s -- fix the problem, not the symptom, especially if your intent is to truncate. What will another developer think when he comes along and a proc is throwing these errors and a non-standard flag was used to suppress the issue?
Write code that makes your intent to truncate clear.
Identifying your Problem
The fiddle doesn't display the behavior you describe, so I'm still a little confused as to the issue.
Also, your SQL fiddle is way too dense for a question like this. If I don't answer your question below, work to isolate the problem to the simplest use case possible. Don't just dump 500 lines of your app into a window.
Note: the maximum NVarchar length is either 4000 in SQL Server 7 & 2000 or 2 GB (nvarchar(max)) in SQL 2005 and later. I have no idea where you came up with 3072.
My Test
If you're truncating at the SPROC parameter level, the ANSI_WARNINGS flag is ignored, as this MSDN page warns. If it's happening inside your procedure, I created a little test proc that demonstrates the ANSI flag allowing truncation:
CREATE Proc DoSomething (@longThing varchar(50)) AS
DECLARE @T1 TABLE ( shortThing VARCHAR(20) );
SET ANSI_WARNINGS OFF
Print ' I don''t even whimper when truncating'
INSERT INTO @T1 (ShortThing) VALUES ( @longThing);
SET ANSI_WARNINGS ON
Print ' I yell when truncated'
INSERT INTO @T1 (ShortThing) VALUES ( @longThing);
Then calling it as follows works as expected:
exec DoSomething 'Text string longer than 20 characters'
FIXING THE PROBLEM
Nevertheless, why not just code so your intent to (potentially) truncate data is clear? You can avoid the warning rather than turn it off. I would do one of the following:
Make your procedure parameters long enough to accommodate the input.
If you need to shorten string data, use Substring() to trim it.
Use CAST or CONVERT to format the data to your requirement. This page (the section headed "Implicit Conversions" should help) details how CAST & CONVERT work.
My simple example above can be modified as follows to avoid the need to set any flag.
CREATE Proc DoSomethingBETTER (@longThing varchar(50)) AS
SET ANSI_WARNINGS ON
DECLARE @T1 TABLE ( shortThing VARCHAR(20) );
--try one of these 3 options...
INSERT INTO @T1 (ShortThing) VALUES ( Convert(varchar(20), @longThing));
INSERT INTO @T1 (ShortThing) VALUES ( Substring(@longThing, 1, 20));
INSERT INTO @T1 (ShortThing) VALUES ( Cast(@longThing as varchar(20)) );
Print('Ansi warnings can be on when truncating data');
An Aside - Clustered Guids
Looking at your fiddle I noticed that you use a uniqueidentifier as the key in your clustered indexes. In almost every scenario this is a pretty inefficient option. The randomness of GUIDs means your data is constantly being fragmented & re-shuffled.
Hopefully you can convert to an int identity, use newsequentialid(), or use COMB GUIDs as described in Jimmy Nilsson's article.
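For example, a minimal sketch (the table and column names here are made up) of keeping a GUID key but defaulting it to NEWSEQUENTIALID(), so new rows append in order instead of fragmenting the clustered index:
CREATE TABLE dbo.Orders
(
    OrderId uniqueidentifier NOT NULL
        CONSTRAINT DF_Orders_OrderId DEFAULT NEWSEQUENTIALID(), -- sequential GUIDs avoid random page splits
    OrderDate datetime NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderId)
);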

How can I ignore 'Arithmetic Overflow' related errors from within a data view?

I have a complex data view that recursively links and summarizes information.
Each night a scheduled task runs a stored procedure that selects all of the data from the data view, and inserts it into a table so that users can query and analyze the data much more quickly than running a select statement on the data view.
The parent table consists of a few hundred thousand records and the result set from the export is well over 1,000,000 records in size.
For most nights the exportation process works without any trouble, however, if a user enters an incorrect value within our master ERP system, it will crash the nightly process because one of the decimal fields will contain a value that doesn't fit within some of the conversions that I have to make on the data. Debugging and finding the specific, errant field can be very hard and time consuming.
With that said, I've read about the two SQL settings NUMERIC_ROUNDABORT and ARITHABORT. These sound like the perfect options for solving my problem; however, I can't seem to get them to work with either my data view or my stored procedure.
My stored procedure is nothing more than a TRUNCATE and INSERT statement. I appended...
SET NUMERIC_ROUNDABORT OFF
SET ARITHABORT OFF
... to the beginning of the SP and that didn't help. I assume this is because the error is technically taking place from within the code associated with the data view.
Next, I tried adding two extended properties to the Data View, hoping that that would work. It didn't.
Is there a way that I can set these SQL properties to ignore rounding errors so that I can export my data from my data view?
I know for most of us, as SO answerers, our first inclination is to ask for code. In this case, however, the code is both extremely complex and proprietary. I know fixing the definitions that cause the occasional overflow is the most ideal solution, but in this circumstance, it is much more efficient to just ignore these type of errors because they happen on such a rare basis and are so difficult to troubleshoot.
What can I do to ignore this behavior?
UPDATE
By chance, I believe I might have found the root cause of the issue; however, I have no idea why this would be occurring. It just doesn't make sense.
Throughout my table view, I have various fields that are calculated. Since these fields need to fit into fields within the table that are defined as decimal(12, 5), I always wrap the view field expressions in a CAST(... AS DECIMAL(12, 5)) clause.
By chance, I stumbled upon an oddity. I decided to see how SSMS "saw" my data view. In the SSMS Object Explorer, I expanded the Views -> [My View] -> Columns section and I saw that one of the fields was defined as decimal(13, 5).
I assumed that I must have made a mistake in one of my casting statements, but after searching throughout the code for the table view, there is no definition for a decimal(13, 5) field?! My only guess is that the definition SSMS sees for the view field must be derived from the resulting data. However, I have no clue how this could happen since I cast each field to decimal(12, 5).
I would like to know why this is happening but, again, my original question still stands. How and what SET statement can I define on a table view that will ignore all of these arithmetic overflows and write a null value into the fields with errant data?
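As a troubleshooting aid, a hedged sketch (dbo.MyView is a placeholder for your view's name) that asks the catalog views what data types SQL Server derived for the view's columns, which makes the decimal(13, 5) column easier to spot. One possible source of the extra digit is an expression that escaped the outer CAST: under SQL Server's precision rules, adding two decimal(12, 5) values, for example, yields decimal(13, 5).
SELECT c.name, t.name AS type_name, c.precision, c.scale
FROM sys.columns AS c
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.MyView'); -- placeholder view name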
FINAL COMMENTS
I've marked HeavenCore's response as the answer because it does address my question but it hasn't solved my underlying problem.
After a bit of troubleshooting and attempts at trying to get my export to work, I'm going to have to try a different approach. I still can't get the export to work, even if I set the NUMERIC_ROUNDABORT and ARITHABORT properties to OFF.
I think ARITHABORT is your friend here.
For instance, using SET ARITHABORT OFF & SET ANSI_WARNINGS OFF will NULL the values it fails to cast (instead of throwing exceptions).
Here is a quick example:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tbl_OverflowExample](
[Value] [decimal](12, 2) NULL
) ON [PRIMARY]
GO
INSERT [dbo].[tbl_OverflowExample] ([Value]) VALUES (CAST(9999999999.00 AS Decimal(12, 2)))
GO
INSERT [dbo].[tbl_OverflowExample] ([Value]) VALUES (CAST(1.10 AS Decimal(12, 2)))
GO
--#### Select data without any casting - works
SELECT VALUE
FROM dbo.tbl_OverflowExample
--#### With ARITHABORT and ANSI warnings disabled - Returns NULL for 9999999999.00 but 1.10 as expected
SET ARITHABORT OFF;
SET ANSI_WARNINGS OFF;
SELECT CONVERT(DECIMAL(3, 2), VALUE)
FROM dbo.tbl_OverflowExample
GO
--#### With defaults - Fails with overflow exception
SET ARITHABORT ON;
SET ANSI_WARNINGS ON;
SELECT CONVERT(DECIMAL(3, 2), VALUE)
FROM dbo.tbl_OverflowExample
Personally though - I'd prefer to debug the view and employ some CASE ... END expressions to return NULL if the underlying value is larger than the target data type allows; this would ensure the view works regardless of the connection options.
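A hedged sketch of that CASE approach, using the example table above (the range check matches what fits in decimal(3, 2)); with this, ARITHABORT and ANSI_WARNINGS can stay at their defaults:
SELECT CASE
           WHEN ABS([Value]) < 10 THEN CONVERT(DECIMAL(3, 2), [Value]) -- fits the target type
           ELSE NULL                                                   -- out-of-range values become NULL
       END AS [Value]
FROM dbo.tbl_OverflowExample;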
EDIT: Corrected some factual errors

Exceeding maximum number of variable declarations in script on SQL Server 2005

An application I am currently working on will generate an SQL script to populate a database. A single transaction in the script looks like this (note I have changed table/variable names ;-)
USE MyDatabase
BEGIN TRANSACTION
SET XACT_ABORT ON
DECLARE @Foo int
SET IDENTITY_INSERT Table1 ON
INSERT INTO Table1 [...]
SET IDENTITY_INSERT Table1 OFF
SET IDENTITY_INSERT Table2 ON
INSERT INTO Table2 [...]
SET IDENTITY_INSERT Table2 OFF
INSERT INTO Table3 [...]
-- Here I reference @Foo
SET @Foo = dbo.SomeStoredProcedure()
-- Use @Foo in some query
COMMIT TRANSACTION
GO
SET NOCOUNT ON
The script contains n of these transactions, which are then executed on SQL Server 2005 to populate the database with n records.
The problem I am seeing is with the declaration of the @Foo variable shown above. When running the script, once we have reached 65535 records, I get the following error:
The variable name '@Foo' has already been declared.
Variable names must be unique within a query batch or stored procedure.
I think this is a misleading error message, because everything is fine until I hit 65535, and the significance of this number (2^16-1) leads me to believe I am hitting some sort of script limitation.
I have tried defining the @Foo variable once, at the top of the script, and re-using it within each transaction. But this doesn't work as it appears each transaction has its own scope.
Would creating an extra level of scope (i.e. an inner transaction) and declaring the variable within the deeper scope help address this issue?
Any other recommendations about the best way to fix this issue?
Looks like you've missed the GO delimiter, since I have scripts with many more lines. Check your scripting solution.
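A minimal sketch of the batch scoping this relies on (assuming your generator really does emit GO between transactions): each GO-terminated batch has its own variable scope, so the same DECLARE can be repeated in every batch without a conflict.
DECLARE @Foo int
SET @Foo = 1
GO
DECLARE @Foo int --a fresh declaration in a new batch; no "already been declared" error
SET @Foo = 2
GO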
Try the GO delimiter, and also change the data type of the dynamic variable. You are going out of its range; an INT is a cyclic data type in SQL Server, and it will come back to the same value when the cycle of -65536 to 65535 completes.

NHibernate INSERT TO SQL Calculated field SET NOCOUNT

I need to suppress messages output from a SQL function, such as "1 row affected". I can't use SET NOCOUNT as it's invalid in a function.
Anyone know a way to do this?
Thanks.
EDIT
I was trying to limit the background information in an attempt to boil the problem down to its essence, but I'll expand. I'm using MSSQL 2005 and NHibernate to insert a record into a SQL table. On the table I have a computed column that runs the function, which is what reports back "1 row affected".
I didn't really want to edit the NHibernate part of the process, but it may be unavoidable.
A function that returns "(1 row affected)" will be part of a bigger query in a batch. It makes no sense to have SET NOCOUNT ON in the function.
You need to do this:
SET NOCOUNT ON;
SELECT * FROM MyUDFTVF();
Note a stored procedure is simply a wrapper for this
CREATE PROC Whatever
AS
SET NOCOUNT ON;
SELECT * FROM MyUDFTVF();
GO
SET NOCOUNT ON is normally needed to stop triggers etc. breaking client code: why do you need it here?
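For reference, a hedged sketch (the trigger and table names are made up) of where SET NOCOUNT ON usually goes in a trigger, so the extra row counts don't confuse client libraries:
CREATE TRIGGER trg_MyTable_Audit ON dbo.MyTable -- placeholder table name
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON; -- suppress the trigger's own "rows affected" messages
    INSERT INTO dbo.MyTableAudit (Id)           -- placeholder audit table
    SELECT Id FROM inserted;
END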
The nocount setting is not available in functions.
Stored procedures allow you to set nocount. So converting the function to a stored procedure would solve the problem.
Otherwise, the calling code will have to set nocount. That shouldn't be hard, but might be tedious if the function is used in many places.
P.S. If you post the reason why suppressing the count messages is required, perhaps we can offer some more solutions.