How to prevent 'query timeout expired'? (SQLNCLI11 error '80040e31')

I have a connection to an MS SQL Server 2012 database in classic ASP (VBScript). This is my connection string:
Provider=SQL Server Native Client 11.0;Server=localhost;
Database=databank;Uid=myuser;Pwd=mypassword;
When I execute this SQL command:
UPDATE [info] SET [stamp]='2014-03-18 01:00:02',
[data]='12533 characters goes here',
[saved]='2014-03-18 01:00:00',
[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,
[updated]=1,[findable]=0
WHERE [ID]=193246;
I get the following error:
Microsoft SQL Server Native Client 11.0
error '80040e31'
Query timeout expired
/functions.asp, line 476
The SQL query is pretty long; the data field is updated with 12533 characters. The ID column is indexed, so finding the post with ID 193246 should be fast.
When I execute the exact same SQL expression (copied and pasted) in SQL Server Management Studio it completes successfully in no time. No problem whatsoever. So there isn't a problem with the SQL itself. I've even tried using an ADODB.Recordset object and updating via that (no hand-written SQL), but I still get the same timeout error.
If I go to Tools > Options > Query Execution in the Management Studio I see that the execution time-out is set to 0 (infinite). Under Tools > Options > Designers I see that the transaction time-out is set to 30 seconds, which should be plenty since the script and the database are on the same computer ("localhost" is in the connection string).
What is going on here? Why can I execute the SQL in the Management Studio but not in my ASP code?
Edit: Tried setting the 30-second timeout in the Designers tab to 600 seconds just to make sure, but I still get the same error (it happens after 30 seconds of page loading, by the way).
Here is the code that I use to execute the SQL on the ASP page:
Set Conn = Server.CreateObject("ADODB.Connection")
Conn.Open "Provider=SQL Server Native Client 11.0;" & _
          "Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;"
Conn.Execute "UPDATE [info] SET [stamp]='2014-03-18 01:00:02'," & _
             "[data]='12533 characters goes here',[saved]='2014-03-18 01:00:00'," & _
             "[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,[updated]=1,[findable]=0 " & _
             "WHERE [ID]=193246;"
Edit 2: Using Conn.CommandTimeout = 0 to give the query infinite execution time does nothing; it just makes the query run forever. I waited 25 minutes and it was still executing.
I then tried to separate the SQL into two statements: the long data update in one and the other updates in the other. It still wouldn't update the long data field; I just got a timeout.
I tried this with two additional connection strings:
Driver={SQL Server};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Driver={SQL Server Native Client 11.0};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Neither worked. I even tried changing the data to 12533 A's just to see if the actual data was causing the problem. Nope, same problem.
Then I found out something interesting: I tried to execute the short SQL first, before the long update of the data field. It ALSO got a query timeout exception...
But why? It has so little to update (the whole SQL statement is less than 200 characters). I will investigate further.
Edit 3: I thought it might have something to do with the login, but I didn't find anything that looked wrong. I even tried changing the connection string to use the sa account, but even that didn't work; I'm still getting "Query timeout expired".
This is driving me mad. There is no solution, no workaround and, worst of all, no ideas!
Edit 4: Went to Tools > Options > Designers in the Management Studio and unticked "Prevent saving changes that require table re-creation". It did nothing.
Tried changing the "data" column data type from "nvarchar(MAX)" to the inferior "ntext" type (I'm getting desperate). It didn't work.
Tried executing the smallest change on the post I could think of:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=193246;
That would set a bit column to false. Didn't work. I tried executing the exact same query in the Management Studio and it worked flawlessly.
Throw me some ideas if you've got them, because I'm running out for real now.
Edit 5: Have now also tried the following connection string:
Provider=SQLOLEDB.1;Password=mypassword;Persist Security Info=True;User ID=myuser;Initial Catalog=databank;Data Source=localhost
Didn't work. I only tried to set confirmed to false, but I still got a timeout.
Edit 6: Have now attempted to update a different post in the same table:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=1;
It also gave the timeout error. So now we know it isn't post-specific.
I am able to update posts in other tables in the same "databank" database via ASP. I can also update tables in other databases on localhost.
Could there be something broken with the [info] table? I used the MS Access wizard to automatically move data from Access to MS SQL Server 2012; it created columns of data type "ntext" and I manually changed them to "nvarchar(MAX)" since ntext is deprecated. Could something have broken? It did require me to re-create the table when I changed the data type.
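A quick way to double-check what that conversion actually produced is to look at the column definitions in the system catalog. A sketch, assuming the table lives in the dbo schema:
-- Inspect the current definition of the converted columns (assumes dbo.info).
SELECT c.name,
       TYPE_NAME(c.user_type_id) AS data_type,
       c.max_length              -- -1 means (MAX)
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID(N'dbo.info');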
I have to get some sleep but I will be sure to check back tomorrow if anybody has responded to me. Please do, even if you only have something encouraging to say.
Edit 7: Quick edit before bed. I tried defining the provider as "SQLNCLI11" in the connection string as well (using the DLL name instead of the actual provider name). It makes no difference; the connection is created just fine, but the timeout still happens.
Also I'm not using MS SQL Server 2012 Express (as far as I know, "Express" wasn't mentioned anywhere during installation). It's the full thing.
If it helps, here's the "Help" > "About..." info that is given by the Management Studio:
Microsoft SQL Server Management Studio: 11.0.2100.60
Microsoft Analysis Services Client Tools: 11.0.2100.60
Microsoft Data Access Components (MDAC): 6.3.9600.16384
Microsoft MSXML: 3.0 5.0 6.0
Microsoft Internet Explorer: 9.11.9600.16521
Microsoft .NET Framework: 4.0.30319.34011
Operating System: 6.3.9600
Edit 8 (also known as the "programmers never sleep" edit):
After trying some things I eventually tried closing the database connection and reopening it right before executing the SQL statements. All of a sudden it worked. What the...?
My code was inside a subroutine, and it turns out that outside of it the post I was trying to update was already open! So the reason for the timeout was that the post (or the whole table) was locked by the very same connection that was trying to update it. The connection (or CPU thread) was waiting for a lock that would never be released.
Hate it when it turns out to be so simple after trying so hard.
The post had been opened outside the subroutine by this simple code:
Set RecSet = Conn.Execute("SELECT etc")
I just added the following before calling the subroutine:
RecSet.Close
Set RecSet = Nothing
The reason this never crossed my mind is simply that this was allowed in MS Access, but now that I have switched to MS SQL Server it isn't so kind (or sloppy, rather). The RecSet created by Conn.Execute() had never locked a post in the database before, but now all of a sudden it did. Not too strange, since both the connection string and the actual database had changed.
I hope this post saves someone else some headache if you are migrating from MS Access to MS SQL Server. Though I can't imagine there are that many Access users left in the world nowadays.

It turns out that the post (or rather the whole table) was locked by the very same connection that I was trying to update the post with.
I had an open recordset on the post, created by:
Set RecSet = Conn.Execute()
This type of recordset is supposed to be read-only, and when I was using MS Access as the database it did not lock anything. But apparently this type of recordset does lock something on MS SQL Server 2012, because when I added these lines of code before executing the UPDATE statement...
RecSet.Close
Set RecSet = Nothing
...everything worked just fine.
So the bottom line is: be careful with open recordsets - even if they are read-only, they can block your table from being updated.
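If you run into the same symptom, one way to confirm that the hanging UPDATE is blocked (rather than just slow) is to look for blocking sessions while the page is spinning. A sketch using the standard DMVs, with no assumptions about your schema:
-- Show requests that are currently blocked and which session is blocking them.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,                -- milliseconds spent waiting so far
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
If the blocking session turns out to be another connection opened by your own ASP page, it is the same self-blocking situation described above.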

Related

SQL Server & RStudio - SQL Connection Almost Working

I've been running into an issue in R Studio with a SQL connection.
We've had an on-prem SQL Server that's been upgraded over the years, and the colleague who set it up is no longer with the organization.
We also have an Azure server running SQL Server that was set up much more recently, before they departed.
We have a GUI program we're currently developing. One of the early steps is a SQL login connection for the user: the variable (db_user) is declared and changes with their login, and the password is passed correctly through system variables defined in .Renviron, as described on RStudio's reference site.
Our initial connection string looks like this, and this is the line of code that starts the connection and where I believe the issue may lie first:
db_conn_onprem <- DBI::dbConnect(odbc::odbc(),
                                 Driver   = "SQL Server",
                                 Server   = Sys.getenv("server"),
                                 Database = Sys.getenv("database"),
                                 UID      = Sys.getenv("db_user"),
                                 PWD      = Sys.getenv("PWD"))
When the Azure connection succeeds, it connects as dbo@Azure\Azure vs. the on-prem's guest@Server\Server.
(I can't post in-line screenshots yet)
On-Prem Connection Screenshot: https://i.ibb.co/PmbGt5y/RStudio-SQL.png
Azure Connection Screenshot: https://i.ibb.co/WFY3FqZ/azure1.png
I feel this is something dbo-related since that's where the connection drops.
(variable names anonymized)
Now for the issue:
Whenever we attempt to run a series of queries, our on-prem errors out with this:
Error: nanodbc/nanodbc.cpp:1655: 42000: [Microsoft][SQL Server][SQL Server]Cannot execute as the server principal because the principal "db_user" does not exist, this type of principal cannot be impersonated, or you do not have permission.
<SQL> 'EXECUTE AS LOGIN = 'db_user' SELECT name FROM master.sys.sysdatabases WHERE dbid > 4 AND HAS_DBACCESS(name) = 1 ORDER BY name ASC'
However, running the exact same procedure on the SQL Server in Azure, with essentially no special configuration, succeeds.
Here's the SQL Code we run:
EXECUTE AS LOGIN = 'db_user' SELECT name
FROM master.sys.sysdatabases
WHERE dbid > 4
AND HAS_DBACCESS(name) = 1
ORDER BY name ASC
I feel like I've exhausted my resources on this. At first I thought it was the initial R code or possibly the SQL drivers; however, I don't believe that's the issue, since the driver pulls a list of names into the RStudio Connections pane but bounces back the error when attempting to complete the query.
When I search for references to this error, I see
Cannot execute as the server principal because the principal "dbo" does not exist, this type of principal cannot be impersonated, or you do not have permission.
listed as the most commonly related error to the one I'm experiencing. I've tried a number of those fixes (from blank DB ownership to unrelated solutions), but I've mostly hit a wall here.
Any assistance would be greatly appreciated.
I feel this is something dbo-related since that's where the connection drops, but I have no clue where to go from here.
Yep.
This
EXECUTE AS LOGIN = 'db_user'
requires IMPERSONATE permission on the login, which is exactly what the error message is telling you. It's unclear why you want to impersonate that login instead of simply connecting as that login to begin with.
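If the impersonation really is needed, the missing permission can be granted on the on-prem instance like this (a sketch; [app_login] is a placeholder for whatever login the R session actually connects with):
-- Run as a sysadmin (or a login with sufficient permissions).
-- [app_login] is a placeholder; [db_user] is the login being impersonated.
GRANT IMPERSONATE ON LOGIN::[db_user] TO [app_login];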

"String or binary data would be truncated." error after publishing on IIS7

Could you help me with the following error, which occurs in my ASP.NET WebForms application after publishing it on an IIS 7 web server?
Error:
There is the following error message: "String or binary data would be truncated. The statement has been terminated."
Additional details:
I have a table in MS SQL database to keep files uploaded by users.
The field is set as varbinary(max)
I run my web app from the Visual Studio development server using the same DB from my workstation and there is no error.
But after publishing on the IIS 7 web server I get this error.
(The source code, the file to upload, and the DB are the same.)
I tried setting the exact length on the SqlParameter, but the result is the same.
Dim fileDataParam As New SqlParameter("FileData", SqlDbType.VarBinary, fileData.Length)
fileDataParam.Value = fileData
params.Add(fileDataParam)
Please could you give me some advice on what can cause this error?
Could you recommend which IIS or MSSQL settings I should check?
Update:
I ran SQL Server Profiler in both cases.
The SQL queries are the same; in the first case (on IIS) the error occurs, in the second case (on my PC) it does not.
Solved
I have solved the problem. It was indeed the truncation of a string, but in another field, [ServerName] (it was passed a short value on my PC and a long value on the server side).
I would suggest you accept @ps2goat's comment as the answer.
The data you are inserting into the column is too large for the column size. Set the column to varbinary(MAX).
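If you need to track down which column is actually too small, comparing the column definitions against the incoming values usually does it. A sketch (UploadedFiles is a hypothetical table name; FileData matches the parameter in the question):
-- List the maximum lengths of the target table's columns
-- (CHARACTER_MAXIMUM_LENGTH = -1 means (MAX)).
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'UploadedFiles';   -- hypothetical table name
-- Widen the offending column if it really is too small.
ALTER TABLE dbo.UploadedFiles ALTER COLUMN FileData varbinary(MAX);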

Connect to remote sql database using SQL SERVER

I finished a program using VB.NET 2008, SQL Server 2005 and LINQ to SQL. I want to use the program on 2 or more PCs and have them access one database.
I'm using this connection string:
db = New connectionString("server=192.168.1.3;database=DBNAME;user=DBUSER;password=DBPASS;integrated security=true")
The problem here is that I get this message:
Expiration of the waiting period. The waiting time has elapsed prior to completion of the operation or the server is not responding.
NB: The message is translated from French to English; it corresponds to the standard "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." error.
This error usually occurs because of one of two issues:
1 - SQL Server is unreachable (due to TCP/IP or firewall problems, as Steve mentioned above)
2 - Badly written queries that take too long, so the SQL timeout expires; in this case kindly see the link and the sketch below
https://www.simple-talk.com/sql/performance/how-to-identify-slow-running-queries-with-sql-profiler/
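As a quick starting point for case 2, the plan cache can show which statements take the longest on average. A sketch against the standard DMVs (available in SQL Server 2005 and later; no assumptions about your schema):
-- Top 10 cached statements by average elapsed time (microseconds).
SELECT TOP (10)
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time_us,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_time_us DESC;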

VBA Timeout DoCmd.RunSQL Insert

I'm having a problem with an MS Access application that throws an ODBC connection timeout error on a DoCmd.RunSQL insert into an MS SQL Server linked table.
I've tried using:
Dim Mydb As Database
Set Mydb = CurrentDb
Mydb.QueryTimeout = 900
per the closest MSDN article I could find, but it did not work. I can insert into that SQL DB with less than 3 seconds of query run time from SQL Server Management Studio, but from Access it gives this timeout.
Has anyone else run into this issue and/or found a remedy?
I would suggest creating a pass-through query for this. With the pass-through query you can set the timeout option on the property sheet. It is listed as
ODBC Timeout
If you set this to 0, it will wait until the query returns records. The other great thing about a pass-through query is that SQL Server does the actual work and then returns all of the records back to Access, so it runs more efficiently.
When you open the query in design view, there is an ODBC Timeout property (right-click in a blank area -> Properties).
Have you tried setting it to 0 (infinite) or to a higher value?
It works for me!

SQL 2005 Linked Server Query Periodically Failing

We have a database running on SQL 2005. One of the stored procedures looks up a user's email address from Active Directory using a linked server. The call to the linked server occurs in a database function.
I'm able to call it successfully from my ASP.NET application the first time, but periodically after that it fails with the following error:
{"The requested operation could not be performed because OLE DB provider \"ADsDSOObject\" for linked server \"ADSI\" does not support the required transaction interface."}
It appears that the amount of time between calls to the function affects whether the linked server query will work correctly. I am not using any transactions. When I try calling the function in a quick makeshift SQL script, it runs fine every time (even when tested in quick succession).
Is there some sort of transaction being left open that naturally dies if I don't try calling the procedure again? I'm at a loss here.
Here is the simple call in the stored procedure:
DECLARE @email varchar(50)
SELECT @email = LEFT(mail, 50)
FROM OPENQUERY (
    ADSI,
    'SELECT mail, sAMAccountName FROM ''LDAP://DC=Katz,DC=COM'' WHERE objectCategory = ''Person'' AND objectClass = ''User'''
)
WHERE sAMAccountName = CAST(@LoginName AS varchar(35))
RETURN @email
I've worked with SQL Server linked servers often, though rarely with LDAP queries... but I got curious and read the Microsoft support page linked in Ric Tokyo's previous post. Towards the bottom it reads:
It is typical for a directory server to enforce a server limitation on the number of objects that will be returned for a given query. This is to prevent denial-of-service attacks and network overloading. To properly query the directory server, large queries should be broken up into many smaller ones. One way to do this is through a process called paging. While paging is available through ADSI's OLEDB provider, there is currently no way available to perform it from a SQL distributed query. This means that the total number of objects that can be returned for a query is the server limit. In the Windows 2000 Active Directory, the default server limit is 1,000 objects.
I'm thinking that the reason it fails (or not) depending on whether you call it from the app or from a "quick make-shift SQL script" (as you put it) might be related to the security context under which the operation is executing. Depending on how the linked server connection was set up, the operation could be executed under a variety of credentials depending on how you initiate the query.
I don't know, but that's my best guess. I'd look at the linked server configuration, in particular the settings for which credentials are used as the security context under which operations executed across the linked server run.
Rather than querying Active Directory through a linked server, you might be better off caching your AD data in a SQL database and then querying that instead. You could use Integration Services, creating an OLE DB connection using the "OLE DB Provider for Microsoft Directory Services" and a DataReader source with a query like:
SELECT physicalDeliveryOfficeName, department, company, title, displayName, SN,
givenName, sAMAccountName, manager, mail, telephoneNumber, mobile
FROM 'LDAP://DC=SOMECO,DC=COM'
WHERE objectClass='User' and objectCategory = 'Person'
order by mail
Using this method you will still run into the 1,000-row limit for results from an AD query (note that it is NOT advisable to try to increase this limit in AD; it is there to prevent the domain controller from becoming overloaded). Sometimes it's possible to use a combination of queries to return the full data set, e.g. names A - L and M - Z.
Alternatively, you could use the CSVDE command-line utility in Windows Server to export your directory information to a CSV file and then import it into a SQL database (see http://computerperformance.co.uk/Logon/Logon_CSVDE_Export.htm for more info on exporting AD data with CSVDE).
Please read the support page from Microsoft.
I suspect that it might be the cached query plan, given your statement that "When I try calling the function in a quick make-shift SQL script, it runs fine every time (even when tested in quick succession)."
Could you try executing your stored procedure like so:
EXEC usp_MyProcedure WITH RECOMPILE
This question appears at the top of the first Google results page when searching for the error string, but has no valid answer.
This error happens intermittently when the isolation level is not specified in the .NET code or in the stored procedure.
This error also happens in SQL Server 2008.
The fix is to force SET TRANSACTION ISOLATION LEVEL READ (UN)COMMITTED, because any higher isolation level is not supported by Active Directory and SQL Server is otherwise trying to use SERIALIZABLE.
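A minimal sketch of that fix, assuming a procedure wrapping the OPENQUERY call (the procedure name is hypothetical; the linked server and LDAP path follow the question):
-- Illustrative wrapper; the key line is the explicit isolation level.
CREATE PROCEDURE dbo.usp_GetUserEmail   -- hypothetical name
    @LoginName varchar(35)
AS
BEGIN
    -- Force an isolation level the ADSI provider supports; anything
    -- higher (e.g. SERIALIZABLE) triggers the transaction-interface error.
    SET TRANSACTION ISOLATION LEVEL READ COMMITTED;

    SELECT LEFT(mail, 50) AS email
    FROM OPENQUERY(
        ADSI,
        'SELECT mail, sAMAccountName FROM ''LDAP://DC=Katz,DC=COM''
         WHERE objectCategory = ''Person'' AND objectClass = ''User'''
    )
    WHERE sAMAccountName = @LoginName;
END
Note that SET statements are not allowed inside a user-defined function, so if the OPENQUERY call currently lives in a function, the isolation level has to be set by the caller or the code moved into a procedure like the one above.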
Now, as this error is intermittent: why is ADO.NET or SQL Server switching its default isolation level to SERIALIZABLE sometimes and sometimes not? What triggers this switching?