VB.NET and SQL INSERT INTO; data is truncated

I use VB.NET (Visual Studio Express 2012) to read a file stream into SQL Server Express. The database and table are created fine, and most records load without error using .ExecuteNonQuery with INSERT INTO, but for some records I get the error:
String or binary data would be truncated.
Originally this error was legitimate: the column was only 20 characters wide, and the data in the failing records was 22-25 characters. I have since changed the table so the column is 30 characters, but the error stays the same. I dropped the database and recreated it, but the problem persists.
Does VB keep info on field length somewhere?

Maybe some spaces are present before or after your string. You can use the Trim() function and then try the insert; Trim removes the extra spaces placed before and after your string.
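For example, a minimal sketch of a parameterized insert with trimming (the table name Customers and the column Name are made up here; connStr and rawValue come from your own code):

    Imports System.Data.SqlClient

    Module InsertExample
        Sub InsertTrimmed(connStr As String, rawValue As String)
            Using conn As New SqlConnection(connStr)
                conn.Open()
                Using cmd As New SqlCommand(
                    "INSERT INTO Customers (Name) VALUES (@name)", conn)
                    ' Trim removes leading/trailing whitespace that can push
                    ' the value past the column's declared length.
                    cmd.Parameters.AddWithValue("@name", rawValue.Trim())
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module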

Related

ASP.NET VB SqlBulkCopy String or binary data would be truncated

I'm trying to use SqlBulkCopy to upload a data table into SQL Server.
I'm getting this error:
String or binary data would be truncated.
I'm using
sqlBulkcopy.WriteToServer(dtCustom)
The SQL Server column is defined as nvarchar(50)
I changed the column to nvarchar(150) and got the same problem.
This value is causing the problems with the import:
Enter the Location of Inspection
The value is 32 characters long.
If I remove "on" then it works and imports.
So what's the deal with a SQL Server table column defined as nvarchar(50) and 32 characters for data going into it?
Any ideas?
The columns have to be mapped. Even though the DataTable has the same column names as the SQL table, in the ASP.NET VB code you still have to set up column mappings: without them, SqlBulkCopy matches columns by ordinal position, so a long value can land in a narrower column than you expect.
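A minimal sketch of what that looks like (the destination table dbo.Inspections is made up; dtCustom is the DataTable from the question):

    Imports System.Data
    Imports System.Data.SqlClient

    Module BulkCopyExample
        Sub Upload(connStr As String, dtCustom As DataTable)
            Using bulk As New SqlBulkCopy(connStr)
                bulk.DestinationTableName = "dbo.Inspections"
                ' Without explicit mappings SqlBulkCopy matches columns by
                ' ordinal position, so a long value can end up in a narrower
                ' column and trigger the truncation error.
                For Each col As DataColumn In dtCustom.Columns
                    bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)
                Next
                bulk.WriteToServer(dtCustom)
            End Using
        End Sub
    End Module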

Can't convert String to Numeric/Decimal in SSIS

I have five or six OLE DB Sources with a String [DT_STR] column, length 500, code page 1252 (Latin).
The values in the column look like 0,08 or 0,10 and so on; as you can see, the decimal separator is a comma.
All of the sources are identical except one, which uses a POINT as the decimal separator. For that source, the conversion works when I set the data type in the Advanced Editor of the OLE DB Source. For one of the comma-separated sources it also works when I set the data type there. BUT the weird thing is that it isn't working for the other sources, although they are the same (separated with a comma).
I tested Numeric(18,2) and Decimal(2).
Another attempt to solve the problem with a Data Conversion task and/or a Derived Column task also failed.
I'm using SQL Server 2008 R2
I'm slowly starting to think SSIS is fooling me :)
Has anyone an idea?
EDIT: two screenshots were attached here, one of the source where the conversion works and one where it fails.
I would not set the data type in the Advanced Editor of the OLE DB Source. I would convert the data in the SQL query of the OLE DB Source, or in a Script Transformation, e.g. using Decimal.TryParse to populate a new column.
SSIS is unbelievably fussy over data types, and trying to mess with its internals is not productive.
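A minimal sketch of the Script Transformation approach, assuming an input column RawPrice and a new nullable numeric output column Price (both names are made up; SSIS generates the Row properties from your actual columns):

    Imports System.Globalization

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            Dim value As Decimal
            ' Parse with a culture whose decimal separator is a comma;
            ' rows that still fail to parse are passed through as NULL.
            If Decimal.TryParse(Row.RawPrice.Trim(), NumberStyles.Number,
                                CultureInfo.GetCultureInfo("de-DE"), value) Then
                Row.Price = value
            Else
                Row.Price_IsNull = True
            End If
        End Sub
    End Class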
Check whether there are any blank spaces among the values; SSIS may be throwing the error while trying to convert a blank space to a number. A blank space is not the same as an empty value.
Redirect the error rows and output the data to a file. Then you can examine the data that is being rejected by SSIS and determine why it's causing the error.
Reasons for the error:
1) NULLs are not handled properly, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL values but the destination does not accept NULL data, which produces the error above.
2) The data types of source and destination do not match; for example, the source column holds varchar data while the destination column has an int data type. This can easily generate the error above. Certain data types convert to one another automatically without raising an error, but incompatible data types generate the error "The value could not be converted because of a potential loss of data."
The issue arises when there is an unhandled space or NULL. I worked around it using the conditional (ternary) operator in a Derived Column, which checks the length (note that SSIS expressions before SQL Server 2017 have no TRIM function, only LTRIM and RTRIM):
LEN(LTRIM(RTRIM([Column Name]))) >= 1 ? (DT_NUMERIC,38,8)[Column Name] : 0

SQL error message when querying table containing one varbinary (blob) field

We are using Delphi XE3 together with a TSQLDataSet and a TClientDataset to read a table into memory from SQL server 2012.
The table contains various fields, one of them a blob ("varbinary(max)") where we store the contents of a text file.
My problem is that we get the error message "connection is busy with results for another command" when we call Open on the ClientDataset.
The CommandText is a simple "select * from tablename".
This happens only if there is more than one row in the table, and only if the blob field contains data (<> NULL).
Everything works fine if we add a second varbinary field to the table; the second field does not have to contain any data.
This is driving me crazy, please help.
EDIT: As a workaround we have simply added a "dummy" varbinary field to the table. Because of this strange behavior, we have come to the conclusion that this has to be a bug in the TClientDataset component. We tried the same in an older version of Delphi (XE2 SP3), with the same result.

Issues with Chr(0) in SQL INSERT script

We currently use the SQL Publishing Wizard to back up our database schemas and data. However, some of our database tables hold hashed passwords that contain the null character (Chr(0)). When SQL Publishing Wizard generates the insert-data scripts, the null character causes errors when we run the resulting SQL: it appears to ignore ALL TEXT after the first instance of this character in a script. We recently tried RedGate SQL Compare and found that it has the same issue with this character. I have confirmed it is ASCII character code 0 by running the ASCII() SQL function against the offending record.
A sample of the error we are getting is:
Unclosed quotation mark after the character string '??`????{??0???
The fun part is, I can't really paste a sample Insert statement because of course everything that appears after the CHR(0) is being omitted when pasting!
Change the definition of the column to VARBINARY. The data you store in there doesn't seem to be appropriate for VARCHAR to start with.
This will ripple through the code that uses the column, as you'll get a byte[] CLR type back in the client, and you should change your insert/update code accordingly. But after all, a password hash is a byte[], not a string.
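A minimal sketch of what the updated client code might look like in VB.NET (the Users table and PasswordHash column are made up for illustration):

    Imports System.Data
    Imports System.Data.SqlClient

    Module HashInsertExample
        Sub SaveHash(connStr As String, userId As Integer, hash As Byte())
            Using conn As New SqlConnection(connStr)
                conn.Open()
                Using cmd As New SqlCommand(
                    "UPDATE Users SET PasswordHash = @hash WHERE Id = @id", conn)
                    ' A typed VarBinary parameter round-trips every byte,
                    ' including Chr(0), with no string escaping involved.
                    cmd.Parameters.Add("@hash", SqlDbType.VarBinary, -1).Value = hash
                    cmd.Parameters.AddWithValue("@id", userId)
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module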

Why is maximum length of varchar less than 8,000 bytes?

So I have a stored procedure in a SQL Server 2005 database which retrieves data from a table, formats the data as a string, and puts it into a varchar(max) output variable.
However, I notice that although len(s) reports the string to be > 8,000, the actual string I receive (via the SQL Server output window) is always truncated to < 8,000 bytes.
Does anybody know what the cause of this might be? Many thanks.
The output window itself is truncating your data, most likely. The variable itself holds the data but the window is showing only the first X characters.
If you were to read that output variable from, for instance, a .NET application, you'd see the full value.
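For instance, a minimal sketch in VB.NET (the procedure name dbo.GetReport and the parameter @result are made up):

    Imports System.Data
    Imports System.Data.SqlClient

    Module OutputParamExample
        Function ReadFullValue(connStr As String) As String
            Using conn As New SqlConnection(connStr)
                conn.Open()
                Using cmd As New SqlCommand("dbo.GetReport", conn)
                    cmd.CommandType = CommandType.StoredProcedure
                    ' Size -1 maps to varchar(max), so nothing is cut off.
                    Dim p = cmd.Parameters.Add("@result", SqlDbType.VarChar, -1)
                    p.Direction = ParameterDirection.Output
                    cmd.ExecuteNonQuery()
                    Return CStr(p.Value)  ' the full string, not just 8,000 bytes
                End Using
            End Using
        End Function
    End Module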
Are you talking about SQL Server Management Studio? If so, there are options to control how many characters are returned. (I only have 2008 in front of me, but the settings are in Tools | Options | Query Results | SQL Server | Results to Grid | Maximum Characters Retrieved, and Results to Text | Maximum number of characters displayed in each column.)
The data is all there, but management studio isn't displaying all of the data.
In cases like this, I've used MS Access to link to the table and read the data. It's sad that you have to use Access to view the data instead of Management Studio or Query Analyzer, but that seems to be the case.
"However, I notice that although len(s) reports the string to be > 8,000"
I have fallen for the SQL Studio issue too :) But isn't the maximum length of varchar 8,000 bytes, or 4,000 for nvarchar (Unicode)?
Any chance the column data type is actually text or ntext and you're converting to varchar?