"String or binary data would be truncated." error after publishing on IIS7 - sql

Could you help me with the following error, which occurs in my ASP.NET WebForms application after publishing it to an IIS 7 web server?
Error:
The following error message appears: "String or binary data would be truncated. The statement has been terminated."
Additional details:
I have a table in an MS SQL database that stores files uploaded by users.
The file field is defined as varbinary(max).
When I run the web app from the Visual Studio development server on my workstation, using the same database, there is no error.
But after publishing to the IIS 7 web server I get this error.
(The source code, the file being uploaded, and the database are all the same.)
I tried setting the exact length on the SqlParameter, but the result is the same:
Dim fileDataParam As New SqlParameter("FileData", SqlDbType.VarBinary, fileData.Length)
fileDataParam.Value = fileData
params.Add(fileDataParam)
Could you give me some advice on what might cause this error?
Could you recommend which IIS or MS SQL settings I should check?
Update:
I ran SQL Server Profiler in both cases.
The SQL queries are identical; in the first case (on IIS) the error occurs, in the second case (on my PC) it does not.
Solved
I have solved the problem. It was indeed string truncation, but in another field, [ServerName]: a short value was passed on my PC and a long one on the server side.

I would suggest you accept #ps2goat's comment as the answer:
The data you are inserting into the column is too large for the column size. Set the column to varbinary(MAX).
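For the record, a quick way to find and fix an offending column (a sketch; UploadedFiles is a hypothetical table name, ServerName is the field identified above):

-- Hypothetical table name; ServerName is the field identified above.
-- Check the defined maximum length of the column:
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'UploadedFiles'
  AND COLUMN_NAME = 'ServerName';

-- Widen the column so longer values no longer get truncated:
ALTER TABLE UploadedFiles ALTER COLUMN ServerName nvarchar(255);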

Related

SQL Server 2017 database downgrade to SQL Server 2014 (the target is in a Virtual Machine)

I've been trying to migrate a really big database to an earlier SQL Server version in several ways. I started with a .bak file, but found that it is not compatible: the target has to be the same SQL Server version or newer.
Then I used the Generate Scripts task to create a .sql file with the full schema and data, but the file was 24 GB! Even at that size I managed to execute it with sqlcmd, but it never finished successfully; it threw several kinds of errors, like:
Msg 156, Level 15, State 1:
Incorrect syntax near the keyword '...'
Msg 105, Level 15, State 1
Unclosed quotation mark after the character string
Then I found this answer with two solutions: https://stackoverflow.com/a/27623706/3192041. The first one still threw the second error, but the second one worked! Everything was now running smoothly, until I got another error...
This error:
Sqlcmd: Error: Internal error at ReadText (Reason: An attempt was made
to move the file pointer before the beginning of the file).
So now the issue seems to be with the sqlcmd command itself?
Should I keep trying to migrate the database with the generated script? Is there a better way to do this and make the database compatible with an earlier SQL Server version?
Things to clarify
I first generated a script with schema only, but when I tried to generate a separate script for the data alone, SSMS threw an error, so I found no easy way to export all the data by this route. I know you can export the data table by table, but the database has more than 200 tables, so that is not viable.
Also, the script takes more than an hour to run, and probably much longer if the process were to finish correctly.
Finally, I also tried a .bacpac file, but I couldn't even create one because of a bunch of errors about Windows users, external object references, and more...
The best way to solve this is by creating a .dacpac file. From SSMS 2012 up to the latest versions, a dacpac can also include the data of all your tables.
To solve the incompatibility issue, you can use the AllowIncompatiblePlatform property to allow deployment to a different version of SQL Server when publishing to the target server.
First you need to extract with SqlPackage.exe from the DAC bin folder of your SQL Server installation; in my case that is C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin.
Then run the command with the Extract action:
SqlPackage /Action:Extract /SourceDatabaseName:"<database-name>" /SourceServerName:"<server-name>" /SourceUser:"sa" /SourcePassword:"<password>" /TargetFile:"<dacpac-file-path>" /p:ExtractAllTableData=True
Then, on the other server, run this command from the corresponding SqlPackage.exe bin folder:
SqlPackage /Action:Publish /SourceFile:"<dacpac file path>\filename.dacpac" /TargetDatabaseName:"<database name>" /TargetServerName:"<ServerName>" /TargetUser:"<username>" /TargetPassword:"<password>(if needed)" /p:AllowIncompatiblePlatform=true /p:CreateNewDatabase=true
If you want the database created from scratch, include:
/p:CreateNewDatabase=true
I hope this helps anyone facing this problem with big databases and imports from a newer SQL Server version.
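(After publishing, you can sanity-check that the database landed on the older server; a minimal sketch:)

-- Confirm the database exists on the target server and note its compatibility level:
SELECT name, compatibility_level FROM sys.databases WHERE name = N'<database name>';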

SSIS Oracle connection string as project parameter is not being replaced at runtime

We move data from Oracle 11 to SQL Server 2014 using the SSIS project deployment model, with the Attunity 3.0 connector.
The connection string for the Oracle data source is a project parameter and is also stored in a table in SQL.
We use a custom stored procedure that does the following (a sketch follows below):
Gets this connection string from the SQL table
Sets the project parameters (via [SSISDB].[catalog].[set_execution_parameter_value])
Executes the packages (via [SSISDB].[catalog].[start_execution])
We query [SSISDB].[internal].[execution_parameter_values] to check that the parameter values are being replaced at runtime with the connection string stored in the backend.
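(For reference, a minimal sketch of that execute-with-override pattern; the folder, project, package, parameter, and config-table names here are hypothetical:)

-- Hypothetical folder/project/package/parameter/config-table names.
DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
    @folder_name = N'ETL',
    @project_name = N'OracleToSql',
    @package_name = N'LoadData.dtsx',
    @use32bitruntime = 0,
    @execution_id = @execution_id OUTPUT;

-- Overwrite the project parameter with the connection string kept in the config table:
DECLARE @conn NVARCHAR(4000) =
    (SELECT ConnectionString FROM dbo.EtlConfig WHERE Name = N'OracleSource');
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
    @execution_id,
    @object_type = 20,  -- 20 = project parameter, 30 = package parameter
    @parameter_name = N'OracleConnectionString',
    @parameter_value = @conn;

EXEC [SSISDB].[catalog].[start_execution] @execution_id;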
What's interesting is that, even though the Oracle connection string is replaced at runtime, the package still tries to use the connection string it was compiled with (the project parameter default). We do not have the same issue when connecting to a SQL source in a similar fashion.
Do you have any suggestions? Is it a known issue?
Found the solution. It turns out that the Oracle connection string we stored in the table did not prefix the server name with "SERVER = ". The connection string started directly with, e.g., 'x1abc01.something.com:1234/x1abc01;ORACLEHOME=;ORACLEHOME64=;WINAUTH=0;'. We changed it to 'SERVER = x1abc01.something.com:1234/x1abc01;ORACLEHOME=;ORACLEHOME64=;WINAUTH=0;' and it started working. We tested this by deploying the SSIS solution with one connection string and overriding it with a different one from the database, and the overridden value now persists.
However, it's still bizarre that when the runtime connection string has an invalid value, no error is reported and SSIS quietly falls back to the design-time value in the project parameter.

SSRS Parameter issue (SQL Server 2008 R2)

I'm having an issue with a report I created using a shared data source and several shared datasets. The report has two parameters, one dependent on the other. This was working fine until just recently; now any report I create with a parameter fails. I've tried creating a simple empty report with an embedded dataset:
SELECT ClientId, Name FROM Client
Then I have the parameter use this query to populate its available values (Value = ClientId, Label = Name).
When I preview in BIDS (VS-2008) it works just fine.
When I deploy and run the report I get the following error:
An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError)
Arithmetic overflow error converting expression to data type int. The statement has been terminated.
There are no expressions, and I've deleted all the old datasets that used to have one. It seems like SSRS has something cached, though I have cleared the Internet cache, deleted the datasets from SSRS, and gone into the database and deleted everything I could. Has anyone else experienced this issue?
Thanks

How to prevent 'query timeout expired'? (SQLNCLI11 error '80040e31')

I have a connection to a MS SQL Server 2012 database in classic ASP (VBScript). This is my connection string:
Provider=SQL Server Native Client 11.0;Server=localhost;
Database=databank;Uid=myuser;Pwd=mypassword;
When I execute this SQL command:
UPDATE [info] SET [stamp]='2014-03-18 01:00:02',
[data]='12533 characters goes here',
[saved]='2014-03-18 01:00:00',
[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,
[updated]=1,[findable]=0
WHERE [ID]=193246;
I get the following error:
Microsoft SQL Server Native Client 11.0
error '80040e31'
Query timeout expired
/functions.asp, line 476
The SQL query is pretty long; the data field is updated with 12533 characters. The ID column is indexed, so finding the post with ID 193246 should be fast.
When I execute the exact same SQL statement (copied and pasted) in SQL Server Management Studio, it completes successfully in no time. No problem whatsoever, so there isn't a problem with the SQL itself. I've even tried using an ADODB.Recordset object and updating through that (no hand-written SQL), but I still get the same timeout error.
If I go to Tools > Options > Query Execution in Management Studio, I see that the execution timeout is set to 0 (infinite). Under Tools > Options > Designers, the transaction timeout is set to 30 seconds, which should be plenty since the script and the database are on the same computer ("localhost" is in the connection string).
What is going on here? Why can I execute the SQL in the Management Studio but not in my ASP code?
Edit: I tried raising the 30-second timeout in the Designers tab to 600 seconds just to make sure, but I still get the same error (it happens after 30 seconds of page loading, by the way).
Here is the code that I use to execute the SQL on the ASP page:
Set Conn = Server.CreateObject("ADODB.Connection")
Conn.Open "Provider=SQL Server Native Client 11.0;" & _
    "Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;"
Conn.Execute "UPDATE [info] SET [stamp]='2014-03-18 01:00:02'," & _
    "[data]='12533 characters goes here',[saved]='2014-03-18 01:00:00'," & _
    "[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,[updated]=1,[findable]=0 " & _
    "WHERE [ID]=193246;"
Edit 2: Using Conn.CommandTimeout = 0 to give the query infinite execution time does nothing; it just makes the query run forever. I waited 25 minutes and it was still executing.
I then tried splitting the SQL into two statements: the long data update in one and the other updates in the other. It still wouldn't update the long data field; I just got a timeout.
I tried this with two additional connection strings:
Driver={SQL Server};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Driver={SQL Server Native Client 11.0};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Neither worked. I even tried changing the data to 12533 A's just to see if the actual data was causing the problem. Nope, same problem.
Then I found out something interesting: I tried executing the short SQL first, before the long update of the data field. It ALSO got a query timeout exception...
But why? It updates so little (the whole SQL statement is less than 200 characters). I will investigate further.
Edit 3: I thought it might have something to do with the login, but I didn't find anything that looked wrong. I even tried changing the connection string to use the sa account, but even that didn't work; I'm still getting "Query timeout expired".
This is driving me mad. There is no solution, no workaround and worst of all no ideas!
Edit 4: I went to Tools > Options > Designers in Management Studio and ticked off "Prevent saving changes that require table re-creation". It did nothing.
I tried changing the "data" column's data type from "nvarchar(MAX)" to the inferior "ntext" type (I'm getting desperate). It didn't work.
Tried executing the smallest change on the post I could think of:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=193246;
That should set a bit column to false. It didn't work. I tried executing the exact same query in Management Studio and it worked flawlessly.
Throw me some ideas if you have got them because I'm running out for real now.
Edit 5: Have now also tried the following connection string:
Provider=SQLOLEDB.1;Password=mypassword;Persist Security Info=True;User ID=myuser;Initial Catalog=databank;Data Source=localhost
It didn't work. I only tried setting confirmed to false, but I still got a timeout.
Edit 6: Have now attempted to update a different post in the same table:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=1;
It also gave the timeout error, so now we know it isn't specific to one post.
I am able to update posts in other tables in the same "databank" database via ASP. I can also update tables in other databases on localhost.
Could something be broken in the [info] table? I used the MS Access wizard to automatically move the data from Access to MS SQL Server 2012. It created columns of data type "ntext", and I manually changed them to "nvarchar(MAX)" since ntext is deprecated. Could something have broken? The change did require the table to be re-created.
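(For reference, that conversion can also be done in place with T-SQL instead of the table designer; a sketch using the table and column names from above:)

-- Convert the deprecated ntext column to nvarchar(MAX) in place:
ALTER TABLE [info] ALTER COLUMN [data] nvarchar(MAX);
-- Optionally rewrite the values so existing data is stored in the new format:
UPDATE [info] SET [data] = [data];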
I have to get some sleep but I will be sure to check back tomorrow if anybody has responded to me. Please do, even if you only have something encouraging to say.
Edit 7: Quick edit before bed. I also tried defining the provider as "SQLNCLI11" in the connection string (using the DLL name instead of the actual provider name). It makes no difference; the connection is created just fine, but the timeout still happens.
Also I'm not using MS SQL Server 2012 Express (as far as I know, "Express" wasn't mentioned anywhere during installation). It's the full thing.
If it helps, here's the "Help" > "About..." info that is given by the Management Studio:
Microsoft SQL Server Management Studio: 11.0.2100.60
Microsoft Analysis Services Client Tools: 11.0.2100.60
Microsoft Data Access Components (MDAC): 6.3.9600.16384
Microsoft MSXML: 3.0 5.0 6.0
Microsoft Internet Explorer: 9.11.9600.16521
Microsoft .NET Framework: 4.0.30319.34011
Operating System: 6.3.9600
Edit 8 (also known as the "programmers never sleep" edit):
After trying some things, I eventually tried closing the database connection and reopening it right before executing the SQL statements. Suddenly it worked. What the...?
My code was inside a subroutine, and it turns out that outside of it, the post I was trying to update was already open! So the reason for the timeout was that the post, or the whole table, was locked by the very same connection that tried to update it. The connection was waiting for a lock that would never be released.
I hate it when it turns out to be so simple after trying so hard.
The post had been opened outside the subroutine by this simple code:
Set RecSet = Conn.Execute("SELECT etc")
I just added the following before calling the subroutine:
RecSet.Close
Set RecSet = Nothing
The reason this never crossed my mind is simply that it was allowed in MS Access, but now that I have switched to MS SQL Server it isn't so forgiving (or sloppy, rather). A RecSet created by Conn.Execute() had never locked a post in the database before, but now all of a sudden it did. Not too strange, since the connection string and the actual database had changed.
I hope this post saves someone else some headache if you are migrating from MS Access to MS SQL Server. Though I can't imagine there are that many Access users left in the world nowadays.
It turns out that the post (or rather the whole table) was locked by the very same connection I tried to update the post with.
I had an open recordset on the post, created by:
Set RecSet = Conn.Execute()
This type of recordset is supposed to be read-only, and when I was using MS Access as the database it did not lock anything. But apparently this type of recordset does lock something on MS SQL Server 2012, because when I added these lines of code before executing the UPDATE statement...
RecSet.Close
Set RecSet = Nothing
...everything worked just fine.
So the bottom line is: be careful with open recordsets. Even if they are read-only, they can block your table from being updated.
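(In similar situations, a quick way to confirm this kind of blocking is to look at the waiting requests on the server while the UPDATE hangs; a minimal sketch for SQL Server 2005 and later:)

-- While the UPDATE is hanging, list blocked sessions and what is blocking them:
SELECT session_id, blocking_session_id, wait_type, wait_time, command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;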

Most efficient way to output SQL table to XML file

I have a server that needs to dump queries and tables from a SQL database to disk in XML format. This needs to be a scheduled task.
I'm currently using BCP via a scheduled batch file > SQL script > xp_cmdshell > bcp, but this error
SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0][SQL Server]Warning: Server data (172885 bytes) exceeds host-file field length (65535 bytes) for field (1). Use prefix length, termination string, or a larger host-file field size. Truncation cannot occur
for BCP output files.
is troubling me in the log files. I have found no solution online yet, and I do not quite understand what the 'host-file field' refers to; the original table has no column with a value as large as 172885 bytes. The output files are very large, and so far it seems as though the data is all being written, but there appears to be some garbage at the end of all the XML files.
Performance is important, but reliability matters most in this situation.
I have tried recreating the error locally but have been unsuccessful. The server runs Windows Server 2008 R2.
Any explanation or analysis of the error and its meaning, as well as a recommendation for a simple scheduled way to dump the SQL tables/queries to XML files, would be appreciated.
You should check out the FOR XML PATH syntax introduced in SQL Server 2005:
SQL Server: simple example of creating XML file with T-SQL
What's new in FOR XML in SQL Server 2005
With this, you can easily create fairly nifty XML output, including hierarchies, attributes and more.
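For example, a minimal sketch (table and column names are hypothetical, echoing the Client table from the SSRS question above; City is invented for illustration):

-- Emits one <client> element per row, wrapped in a <clients> root,
-- with ClientId as an attribute and City nested under <address>:
SELECT
    ClientId AS '@id',
    Name     AS 'name',
    City     AS 'address/city'
FROM dbo.Client
FOR XML PATH('client'), ROOT('clients');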