When to close the result set (Basic ODBC question) - sql

I am working on a small project for a local firm, and the following code runs fine on my machine but produces errors on their server. Currently I don't have access to that server, and this is not a field I know a lot about, so I have to ask you guys.
The page is written in classic ASP (JavaScript for scripting).
The logic goes like this:
conn.Open("myconnection");
bigQuery = "...";
rs = conn.execute(bigQuery);
while (!rs.eof) {
...
smallQuery = "...";
rssmall = conn.execute(smallQuery);
...
rssmall.close();
...
rs.movenext();
}
rs.close();
conn.close();
As I said, this runs fine on my machine, but it returns some error (the worst thing is that I don't even know what error) on the company's server if bigQuery returns more than ~20 rows. Is there something wrong with my code (this really isn't my field, but I guess it is OK to gather data in the loop like this?), or is there something wrong with their IIS server?
Thanks.
edit:
More info:
It's an Access database. Everything is pretty standard:
conn=Server.CreateObject("ADODB.Connection");
conn.Provider="Microsoft.Jet.OLEDB.4.0";
conn.Open("D:/db/testingDb.mdb");
The queries are a bit long, so I won't post them. They are totally ordinary SELECTs, so they aren't the issue.

I inherited a legacy classic ASP application that was very similar (big queries with other queries running within the loop over the first query's results), and it ran fine until forty or more records were returned. The way I "solved" the problem was to instantiate another connection object to run the inner query. So, using your pseudo code, try:
conn.Open("myconnection");
conn2 = Server.CreateObject("ADODB.Connection"); // create the second connection the same way as the first
conn2.Open("myconnection");
bigQuery = "...";
rs = conn.execute(bigQuery);
while (!rs.eof) {
...
smallQuery = "...";
rssmall = conn2.execute(smallQuery);
...
rssmall.close();
...
rs.movenext();
}
rs.close();
conn2.close();
conn.close();

What server are they actually running?
Most newer versions of Windows Server don't come with a 64-bit Jet 4.0 driver at all, so you can't use an Access database with that driver if your app runs as a 64-bit app. You can try running as 32-bit, which might solve the problem.
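On IIS 7 or later the 32-bit switch is an application pool setting; something along these lines (the pool name here is just an example):
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true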
There is an alternative driver, packaged as an Office component (the Access Database Engine), which may be an option.
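In classic ASP that would roughly mean swapping the provider string, assuming the Access Database Engine redistributable is installed on the server:
conn.Provider = "Microsoft.ACE.OLEDB.12.0"
conn.Open("D:/db/testingDb.mdb")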
Try writing a simple test page that literally opens and closes the database connection like so:
<%
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Provider = "Microsoft.Jet.OLEDB.4.0"
conn.Open("D:/db/testingDb.mdb")
Response.Write("Database Opened OK")
conn.Close()
%>
Run this on the production server, and if you see Database Opened OK then you'll know it's definitely the query rather than the database causing the issue.
If you get an error trying to open the database, then you need to change to the newer driver or try the app in 32-bit mode.
If it is the query causing the issue, then you may need to use the additional arguments to the Open() method to try a different cursor (forward-only if you only need to iterate over the results once), which will change how ADODB retrieves the data and hopefully mitigate any performance bottleneck related to the query.
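For example, one way to get a forward-only, read-only cursor is to open a Recordset explicitly instead of using conn.execute; a sketch in the question's JScript (the numeric constants are the standard ADO values for adOpenForwardOnly, adLockReadOnly and adCmdText):
rs = Server.CreateObject("ADODB.Recordset");
rs.Open(bigQuery, conn, 0, 1, 1);  // forward-only, read-only, plain command text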
Edit
If you want to try debugging the code a bit more, add the following at the start of the file. This causes the ASP script processor to continue even if it hits an error:
On Error Resume Next
Then, at intervals throughout the file where you expect an error might have happened, do:
If Err <> 0 Then
Response.Write(Err.Number & " - " & Err.Description)
End If
See this article from ASP 101 for the basics of error handling in ASP and VBScript

Related

Querying from CSV file won't work on my Test Server

I have an online application that queries data from a CSV file that the users send. It works like a charm on the server machine somewhere on the globe.
I decided to clone the application and database to my dev machine, but the query gives me a Recordset error saying "Item cannot be found in the collection...". But wait, don't facepalm so soon.
Here is the code that queries the CSV file:
conExcel.Open "Provider=Microsoft.Jet.OLEDB.4.0;DATA SOURCE=" & Server.MapPath("xls\") &";Extended Properties=""Text;HDR=Yes;IMEX=1; ImportMixedTypes=Text;FMT=Delimited(;)"""
strSQL="SELECT * FROM ["& strFile&"]"
The file has more than 10 columns, and using the code below it only shows me 10 and a fragment of the 11th column. In my online application, however, it shows all of them just fine.
For each x in SQL.Fields
response.write x.Name
Next
I'm guessing it has something to do with my IIS, but I don't know what I could tweak to make both environments behave the same way.
Counting on you, guys.

SQL Azure - Transient "ExecuteReader requires an open connection" exception

I'm using SQL Azure in a Windows Azure app running as a cloud service. Most of the time my database actions work completely fine (that is, after handling all sorts of timeouts and whatnot); however, I'm running into a problem that seems to be transient.
using (var connection = new SqlConnection(m_connectionString))
{
m_ConnectionRetryPolicy.ExecuteAction(() => connection.Open());
using (var command = connection.CreateCommand())
{
command.CommandText = "SELECT * FROM X WHERE Y = Z";
var reader = m_CommandRetryPolicy.ExecuteAction(() => command.ExecuteReader());
return LoadData(reader).FirstOrDefault();
}
}
The line that fails is the Command.ExecuteReader with an:
ExecuteReader requires an open and available Connection. The connection's current state is closed
Things that I have already considered:
I'm not "reusing" an old connection or saving a connection as a member variable
There should be no concurrency issues - the repository class that these methods belong to is created each time it is needed
Has anyone else experienced this? I could of course just add this to the list of exceptions that yield a retry, but I'm not very comfortable with that.
I had a bunch of these errors a few days ago (West Europe) on my production deployment, but they went away by themselves. At the same time I was seeing timeouts, throttling and other errors from SQL Azure. I assume that there was a temporary problem with the platform (or at least the server that I am running on).
You probably aren't doing anything wrong in your code, but are suffering from degraded performance on SQL Azure. Try and handle the errors, perform retries, exponential back-off, queues (to reduce concurrency), splitting your load across databases — that sort of thing.
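As a very rough sketch of that kind of retry with exponential back-off, reusing the names from the question (the retry count and delays are illustrative only; Thread comes from System.Threading):
const int maxRetries = 5;                                    // illustrative
for (int attempt = 0; attempt < maxRetries; attempt++)
{
    try
    {
        using (var connection = new SqlConnection(m_connectionString))
        using (var command = connection.CreateCommand())
        {
            connection.Open();
            command.CommandText = "SELECT * FROM X WHERE Y = Z";
            using (var reader = command.ExecuteReader())
            {
                return LoadData(reader).FirstOrDefault();
            }
        }
    }
    catch (SqlException)
    {
        // ideally check the error number here and only retry transient failures
        if (attempt == maxRetries - 1) throw;                // out of retries, give up
        // back off 1s, 2s, 4s, 8s before the next attempt
        Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
    }
}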
Write everything within a try/catch/finally block, as follows:
try
{
    m_ConnectionRetryPolicy.ExecuteAction(() => connection.Open());
    using (var command = connection.CreateCommand())
    {
        command.CommandText = "SELECT * FROM X WHERE Y = Z";
        var reader = m_CommandRetryPolicy.ExecuteAction(() => command.ExecuteReader());
        return LoadData(reader).FirstOrDefault();
    }
}
catch (Exception ex)
{
    // log ex here, then rethrow so the caller still sees the failure
    throw;
}
finally
{
    connection.Close();
}
Remember to close the connection in the finally block as well.
There is an Enterprise Library that MS has produced specifically for SQL Azure; here are some examples from their Patterns & Practices group.
It's similar to what you are doing, but it does more on the reliability side (and these examples show how to get a reliable connection):
http://msdn.microsoft.com/en-us/library/hh680899(v=pandp.50).aspx
Are you sure it's the reader that's failing and not the opening of the connection? I'm encountering an exception when I wrap the connection.Open() in the m_ConnectionRetryPolicy.ExecuteAction().
However it works just fine for me if I skip the ExecuteAction wrapper and open the connection using connection.OpenWithRetry(m_ConnectionRetryPolicy).
And I'm also using command.ExecuteReaderWithRetry(m_ConnectionRetryPolicy) which is working for me.
I have no idea why it's not working when wrapped in ExecuteAction, though.
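Put together with the code from the question, that looks roughly like this (a sketch; the WithRetry extension methods come from the same transient fault handling library, not something I've written):
using (var connection = new SqlConnection(m_connectionString))
{
    // extension methods instead of wrapping the calls in ExecuteAction
    connection.OpenWithRetry(m_ConnectionRetryPolicy);
    using (var command = connection.CreateCommand())
    {
        command.CommandText = "SELECT * FROM X WHERE Y = Z";
        var reader = command.ExecuteReaderWithRetry(m_CommandRetryPolicy);
        return LoadData(reader).FirstOrDefault();
    }
}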
I believe this means that Azure has closed the connection behind the scenes, without telling the connection pooler. This is by design. So, the connection pooler gives you what it thinks is an available, open connection, but when you try to use it, it finds out it's not open after all.
This seems very clunky to me, but it's the way Azure is at the moment.

web service - web client function will not write to db

I am unable to write any records to my database using a web service. The service is set up OK; I can access it via its URI and query my database through it using a simple page I created.
When it comes to writing to the database, I am not getting any errors, and the instance of my WebClient, which is populated with the variables to write to the db, is holding all the variables OK. But when it comes to actually writing to the db (see the code below), nothing seems to happen, except that the Member ID of the last existing member added to the database is returned.
'assign all abMem fields to values within form to write to database
newMem.Title = ddTitle.SelectedValue
newMem.Initials = txtInitials.Text
newMem.Surname = txtSurname.Text
newMem.Address1 = txtAdd1.Text
newMem.Address2 = txtAdd2.Text
newMem.Address3 = txtAdd3.Text
'etc etc .... additional fields have been removed
Try
cc.Open()
cc.CreateMember(newMem)
returnMem = cc.GetMember(newMem)
MesgBox(returnMem.MemberID & " - Member Created")
cc.Close()
Catch cex As CommunicationException
MesgBox("CommEX - " & cex.Message)
cc.Abort()
Catch tex As TimeoutException
MesgBox("TimeEX - " & tex.Message)
cc.Abort()
Finally
MesgBox("Closed the Client")
End Try
When I run the above, I've noticed in the log file for the service (in the system32 folder on my server) that 2 requests are made each time - presumably one for where I am trying to add a record, and the other, I would think, the request for the ID of this member (which isn't created, hence why I believe it is simply returning the last successful entry in the table).
I know there isn't a problem with the actual web service, as another user is successfully able to add to the db via the service (unfortunately I am unable to simply copy their set-up as they are hitting it via a PHP page), so I know the problem is somewhere in my code.
What I am wondering is: is cc.CreateMember(newMem) the correct syntax for passing a member's details to the function in the web service?
I've rewritten the code (it seems identical to the above) and republished the web service. It seems to be working OK now, so I must have had some silly mistake somewhere!

DB Query no longer recognizes SQL parameters in existing application when debugging in VS2010

I just started working with an application that I inherited from someone else and I'm having some issues. The application is written in C# and runs in VS2010 against the 3.5 framework. I can't run the application on my machine to debug because it will not recognize the way they referenced their parameters when writing their DB queries.
For instance wherever they have a SQL or DB2 query it is written like this:
using (SqlCommand command = new SqlCommand(
"SELECT Field1 FROM Table1 WHERE FieldID=#FieldID", SQLconnection))
{
command.Parameters.AddWithValue("FieldID", 10000);
SqlDataReader reader = command.ExecuteReader();
...
If you notice, the "parameters.AddWithValue("FieldID", 10000);" statement does not include the "#" symbol from the original command text. When I run it on my machine, I get an error message stating that the parameter "FieldID" could not be found.
I change this line:
command.Parameters.AddWithValue("FieldID", 10000);
To this:
command.Parameters.AddWithValue("#FieldID", 10000);
And all is well... until it hits the next SQL call and bombs out with the same error. Obviously this must be a setting within visual studio, but I can't find anything about it on the internet. Half the examples for SQL parameter addition are written including the "#" and the other half do not include it. Most likely I just don't know what to search for.
Last choice is to change every query over to use the "#" at the front of the parameter name, but this is the transportation and operations application used to manage the corporation's shipments and literally has thousands of parameters. Hard to explain the ROI on your project when the answer to the director's question "How's progress?" happens to be "I've been hard at it for a week and I've almost started."
Has anyone run into this problem, or do you know how to turn this setting off so it can resolve the parameter names without the "#"?
Success! System.Data is automatically imported whenever you create a .NET solution. I removed this reference and added it back to make sure that I had the latest version of this library and that fixed the issue. I must have had an old version of this library that was originally pulled in... only thing I can figure.
It's handled by the .NET Framework data providers, not Visual Studio.
It depends on the data source. Look here: Working with Parameter Placeholders
You can try working with the System.Data.Odbc provider and using the question mark (?) placeholder. In this case, don't forget to add the parameters in the same order they appear in the query.
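For example, the contrast looks roughly like this (the connection objects are assumed to exist already; the point is only the placeholder syntax each provider expects):
// System.Data.SqlClient expects named parameters prefixed with @
using (SqlCommand command = new SqlCommand(
    "SELECT Field1 FROM Table1 WHERE FieldID = @FieldID", sqlConnection))
{
    command.Parameters.AddWithValue("@FieldID", 10000);
    SqlDataReader reader = command.ExecuteReader();
}

// System.Data.Odbc uses positional ? placeholders; parameter names are
// ignored, so parameters must be added in the order they appear in the query
using (OdbcCommand command = new OdbcCommand(
    "SELECT Field1 FROM Table1 WHERE FieldID = ?", odbcConnection))
{
    command.Parameters.AddWithValue("FieldID", 10000);
    OdbcDataReader reader = command.ExecuteReader();
}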

Problem during SQL Bulk Load

We've got a really confusing problem. We're trying to test a SQL bulk load using a little app we've written that passes in the data file XML, the schema, and the SQL database connection string.
It's a very straightforward app; here's the main part of the code:
SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class objBL = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
objBL.ConnectionString = "provider=sqloledb;Data Source=SERVER\\SERVER; Database=Main;User Id=Username;Password=password;";
objBL.BulkLoad = true;
objBL.CheckConstraints = true;
objBL.ErrorLogFile = "error.xml";
objBL.KeepIdentity = false;
objBL.Execute("schema.xml", "data.xml");
As you can see, it's very simple but we're getting the following error from the library we're passing this stuff to: Interop.SQLXMLBULKLOADLib.dll.
The message reads:
Failure: Attempted to read or write protected memory. This is often an indication that other memory has been corrupted
We have no idea what's causing it or what it even means.
Before this, we first had an error because SQLXML 4.0 wasn't installed, so that was easy to fix. Then there was an error because it couldn't connect to the database (wrong connection string) - fixed. Now there's this, and we are just baffled.
Thanks for any help. We're really scratching our heads!
I am not familiar with this particular utility (Interop.SQLXMLBULKLOADLib.dll), but have you checked that your XML validates against its schema (.xsd) file? Perhaps the DLL has issues loading the XML data file into memory structures if it is invalid.
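If you want to rule that out quickly, here's a rough sketch of validating the data file against the mapping schema outside the bulk load (file names taken from your snippet; needs System.Xml and System.Xml.Schema):
XmlReaderSettings settings = new XmlReaderSettings();
settings.Schemas.Add(null, "schema.xml");           // the annotated XSD mapping schema
settings.ValidationType = ValidationType.Schema;
settings.ValidationEventHandler += (sender, e) => Console.WriteLine(e.Message);

using (XmlReader reader = XmlReader.Create("data.xml", settings))
{
    while (reader.Read()) { }                        // reading the whole file triggers validation
}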
I'm trying to understand your problem, but I still have some doubts about it.
If you have time, try the link below; I think it will be useful for you:
link text
I know I did something that raised this error message once, but (as often happens) the problem ended up having nothing to do with the error message. Not much help, alas.
Some troubleshooting ideas: try to determine the actual SQL command being generated and submitted by the application to SQL Server (SQL Profiler should help here), and run it as "close" to the database as possible, from within SSMS, using SQLCMD, a direct BCP call, whatever is appropriate. Detailing all the tests you make and the results you get may help.