I have an online application that queries data from CSV files that users upload. It works like a charm on the production server somewhere across the globe.
I cloned the application and database to my dev machine, but the query gives me a Recordset error saying "Item cannot be found in the collection...". But wait, don't facepalm so soon.
Here is the code that queries the CSV file:
' Open a Jet text connection against the folder containing the uploaded CSV files
conExcel.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & Server.MapPath("xls\") & ";Extended Properties=""Text;HDR=Yes;IMEX=1;ImportMixedTypes=Text;FMT=Delimited(;)"""
' The CSV file name itself acts as the table name
strSQL = "SELECT * FROM [" & strFile & "]"
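For what it's worth, the Jet text driver does not always honor the FMT clause in the connection string; on some machines it falls back to the Format value under the registry key HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Text, which defaults to comma-delimited. A schema.ini file in the same folder as the CSV is one way to pin the delimiter per file. A minimal sketch, assuming the uploaded file is named upload.csv (the file name is hypothetical):

[upload.csv]
Format=Delimited(;)
ColNameHeader=True
MaxScanRows=0

If the production server has that registry value set to Delimited(;) while the dev machine still has the default, that alone could explain why the same code splits the columns differently in the two environments.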
The file has more than 10 columns, but using the code below my dev machine only shows me 10 of them plus a fragment of the 11th. In my online application it shows all of them just fine.
' SQL is the Recordset returned by executing strSQL
For Each x In SQL.Fields
    Response.Write x.Name
Next
I'm guessing it has something to do with my IIS, but I don't know what I could tweak to make both environments behave the same way.
Counting on you, guys.
We have a problem with Microsoft Access .mdb files. After some time working with an .mdb file in a multi-user environment, the file becomes corrupted and has to be repaired. After it's repaired, it takes less time to become corrupted again, and at some point, after multiple repairs, the file isn't usable at all anymore.
This problem started to appear after we changed from MS Access Runtime 2010 to MS Access Runtime 2013.
I've already spent some time looking into this problem and this is my theory:
The .mdb file apparently contains a "Database Header Page" (described in a Microsoft white paper called "Understanding Microsoft Jet Locking" from 1996) which stores information about the users' commit bytes (important: 00 00 = writing to disk, 01 00 = accessed a corrupted page). And there is a paragraph in the white paper about this Database Header Page which explains exactly what is happening in our problem:
"[...]Therefore, if a value of 00 00 is present without corresponding user lock [this would be an entry in the corresponding .ldb-file I think], or a value of 0100 is present, users will not be allowed to connect to the database without first executing the repair utility."
So my guess is that after some number of lost or broken connections to the .mdb, this Database Header Page overflows and you have to repair the file. But the repair doesn't remove every entry inside the Database Header Page, so the number of broken connections needed for the file to become corrupted decreases until the file breaks completely.
Now I would like to know if this theory is any good, and if it is, I would like to know:
How can I test this theory (how can I read this Database Header Page of the mdb-file)?
Can I modify the Database Header Page?
Can I modify the Database Header Page while someone is working with the mdb?
I know it's a very specific problem but I hope you guys can help me!
P.S. I can't find a link to the white paper, but the "LDBViewer" package contains it.
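To the first question: the Database Header Page is simply the first page of the .mdb file (2 KB for Jet 3.x, 4 KB for Jet 4.x), so you can dump it and compare the bytes against the offsets described in the white paper. A minimal VBScript sketch, assuming a Jet 4 file; the path is hypothetical, and it is safest to run it against a copy of the file:

' Dump the first 4096 bytes (the Database Header Page) of an .mdb as hex
Const adTypeText = 2
Dim stm, raw, i, row
Set stm = CreateObject("ADODB.Stream")
stm.Type = adTypeText
stm.Charset = "iso-8859-1" ' 1:1 byte-to-character mapping, so AscW() returns the raw byte
stm.Open
stm.LoadFromFile "C:\temp\copy-of-TheDatabase.mdb" ' hypothetical path - use a copy!
raw = stm.ReadText(4096)
stm.Close
row = ""
For i = 1 To Len(raw)
    row = row & Right("0" & Hex(AscW(Mid(raw, i, 1))), 2) & " "
    If i Mod 16 = 0 Then
        WScript.Echo row ' one row of 16 bytes
        row = ""
    End If
Next

As for modifying the page (questions 2 and 3): that would mean writing these bytes back into a live, shared file, which is exactly the kind of unsynchronized write the locking scheme exists to prevent, so I would only ever experiment on a copy that nobody has open.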
A few quick-and-dirty tricks that I often use are enabled by creating a small database which serves one purpose: controlling how the target database is opened.
This database is copied to the desktop of every user, so every user/session has its own.
So I would have
* 1 database on a server location: [TheBigOne.mdb]
* 1 database copied to several desktops: [TheCaller.mdb]
--01--
Every morning, call a function which performs the following steps:
* rename [TheCaller.mdb] to [TheCaller-xxx.mdb]
* create and open [TheCaller-NEW.mdb]
* for all tables, queries, forms, reports, macros and modules (one call per object, using the matching type acTable, acQuery, acForm, acReport, acMacro or acModule):
DoCmd.TransferDatabase acImport, , "TheCaller-xxx.mdb", _
    acTable, strObjectName, strObjectName, False
* rename [TheCaller-NEW.mdb] to [TheCaller.mdb]
--02--
Create a form in [TheCaller.mdb] with one button, with an OnClick event like:
If Dir(ServerLocation & "\TheBigOne.ldb") = "" Then
    ' no lock file present, so open [TheBigOne.mdb]
Else
    MsgBox "Database already in use!"
End If
This is prone to a lot of headaches, because it will happen that the [TheBigOne.ldb] file exists while no one is actually using the database. But at least you will have less database corruption.
--03--
An alternative to procedure --02-- is the command-line switch /excl. At any given time, only one user will be able to work in the database.
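For example, the shortcut on each desktop could have a target like this (paths are hypothetical):

"C:\Program Files\Microsoft Office\Office14\MSACCESS.EXE" "\\server\share\TheBigOne.mdb" /excl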
--04--
Create a second button on the form from point --02-- which opens [TheBigOne.mdb] with the command-line switch /ro. This opens the database in read-only mode for that user, avoiding corruption of the database.
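A minimal sketch of such a button's OnClick handler, with a hypothetical Office path and the ServerLocation variable from --02--:

Private Sub btnReadOnly_Click()
    ' Launch a second Access instance against the shared database, read-only
    Shell """C:\Program Files\Microsoft Office\Office14\MSACCESS.EXE"" """ & _
        ServerLocation & "\TheBigOne.mdb"" /ro", vbNormalFocus
End Sub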
--05--
Create a small backend database db_sessions.mdb on a server location, with a table in it like T_Sessions(id, whois, started_at_timestamp, ended_at_timestamp, excl_or_ro, troubles) for keeping track of [who] [opens] and [closes] the database [TheBigOne.mdb] [when].
If a user wants more privileges than read-only, the following test has to be true: DCount("*", "T_Sessions", "ended_at_timestamp Is Null AND excl_or_ro = 'excl'") = 0. If this test returns false, the troubles field can be used to dispatch messages to the other users.
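A minimal sketch of that gate, assuming T_Sessions is linked into [TheCaller.mdb] (the function name is hypothetical; the session logging itself is left out):

Private Function CanOpenExclusive() As Boolean
    ' True when nobody currently holds an open exclusive session
    CanOpenExclusive = (DCount("*", "T_Sessions", _
        "ended_at_timestamp Is Null AND excl_or_ro = 'excl'") = 0)
End Function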
Today (10:00 AM GMT+2) the code deployed in a production environment started throwing an increasing number of errors while requesting file lists from a Google Drive folder; the error was always 500 "No Individual Errors".
After 2 hours, all the requests failed.
The code regarding the file list request is the following:
'Search for a folder with a specific name
oListReq.Q = "mimeType = 'application/vnd.google-apps.folder' and title = '" & ParentFolder & "' and trashed=false"
oListReq.Fields = "items/id" 'MSO - 20130621 - only the ID is needed
oListReq.MaxResults = 10 'Max 10 files (more than enough; I expect only 1)
'Get the results
oFileList = oListReq.Fetch()
Testing the same requests with the API Explorer, there is no problem and only the ID is returned.
Going step by step to identify the problem, it turns out that every request with the Fields property specified generated a 500 error (other requests in the code use "items(id,alternateLink)", but the result is the same as for the code above).
I temporarily fixed the code by commenting out those lines.
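For clarity, the temporary fix is nothing more than this (the full resource is returned instead of the partial response):

'oListReq.Fields = "items/id" 'disabled until the 500 errors are resolved
oFileList = oListReq.Fetch()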
Could you please investigate why these field filters no longer work with the .NET Client Library?
Sorry about that. This error has been reproduced and Google is investigating it. For now, please turn off the fields filter.
It seems the issue is now fixed. We had the same issue with one of our production applications and had to ship a hotfix, but I ran a test a few minutes ago and it looks like it works again.
I am unable to write any records to my database using a web service. The service is set up OK: I can access it via its URI and also query my database through the service using a simple page I created.
When it comes to writing to the database, I don't get any errors, and the instance of my WebClient that is populated with the values to write to the db is holding all of them OK. But when it comes to actually writing to the db (see the code below), nothing seems to happen, except that the MemberID of the last existing member added to the database is returned.
'assign all abMem fields to values within form to write to database
newMem.Title = ddTitle.SelectedValue
newMem.Initials = txtInitials.Text
newMem.Surname = txtSurname.Text
newMem.Address1 = txtAdd1.Text
newMem.Address2 = txtAdd2.Text
newMem.Address3 = txtAdd3.Text
'etc etc .... additional fields have been removed
Try
    cc.Open()
    cc.CreateMember(newMem)
    returnMem = cc.GetMember(newMem)
    MsgBox(returnMem.MemberID & " - Member Created")
    cc.Close()
Catch cex As CommunicationException
    MsgBox("CommEX - " & cex.Message)
    cc.Abort()
Catch tex As TimeoutException
    MsgBox("TimeEX - " & tex.Message)
    cc.Abort()
Finally
    MsgBox("Closed the Client")
End Try
When I run the above, I've noticed in the log file for the service (in the system32 folder on my server) that two requests are made each time: presumably one where I am trying to add the record, and the other the request for the ID of this member (which isn't created, hence why I believe it is simply returning the last successful entry in the table).
I know there isn't a problem with the web service itself, as another user is successfully able to add to the db via the service (unfortunately I am unable to simply copy their set-up, as they are hitting it via a PHP page), so I know there is a problem somewhere in my code.
What I am wondering is: is cc.CreateMember(newMem) the correct syntax for passing a member's details to the function in the web service?
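One way to take the second lookup out of the equation, assuming you control the service contract (a hypothetical change, not the existing interface), is to have CreateMember return the new ID directly instead of re-querying with GetMember:

'Hypothetical contract change: CreateMember returns the new MemberID
Dim newId As Integer = cc.CreateMember(newMem)
MsgBox(newId & " - Member Created")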
I've rewritten the code (it seems identical to the above) and republished the web service. It seems to be working OK now, so I must have made some silly mistake somewhere!
I just started working with an application that I inherited from someone else and I'm having some issues. The application is written in C# and runs in VS2010 against the 3.5 framework. I can't run the application on my machine to debug it because it won't recognize the way the previous developers referenced their parameters when writing their DB queries.
For instance wherever they have a SQL or DB2 query it is written like this:
using (SqlCommand command = new SqlCommand(
    "SELECT Field1 FROM Table1 WHERE FieldID=@FieldID", SQLconnection))
{
    command.Parameters.AddWithValue("FieldID", 10000);
    SqlDataReader reader = command.ExecuteReader();
    ...
If you look closely, the Parameters.AddWithValue("FieldID", 10000); statement does not include the "@" symbol from the original command text. When I run it on my machine I get an error message stating that the parameter "FieldID" could not be found.
I changed this line:
command.Parameters.AddWithValue("FieldID", 10000);
To this:
command.Parameters.AddWithValue("@FieldID", 10000);
And all is well... until it hits the next SQL call and bombs out with the same error. Obviously this must be a setting within Visual Studio, but I can't find anything about it on the internet. Half the examples for adding SQL parameters are written including the "@" and the other half are not. Most likely I just don't know what to search for.
The last resort is to change every query over to use the "@" at the front of the parameter name, but this is the transportation and operations application used to manage the corporation's shipments, and it literally has thousands of parameters. It's hard to explain the ROI on your project when the answer to the director's question "How's progress?" happens to be "I've been hard at it for a week and I've almost started."
Has anyone run into this problem, or do you know how to turn this setting off so it can resolve the parameter names without the "@"?
Success! System.Data is automatically referenced whenever you create a .NET solution. I removed this reference and added it back to make sure that I had the latest version of the library, and that fixed the issue. I must have had an old version of the library pulled in originally... that's the only thing I can figure.
It's handled by the .NET Framework data providers, not Visual Studio.
It depends on the data source. Look here: Working with Parameter Placeholders
You can try working with the System.Data.Odbc provider and using the question mark (?) placeholder. In this case, don't forget to add the parameters in the same order they appear in the query.
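A quick sketch of that suggestion in VB.NET (odbcConnection is assumed to be an open OdbcConnection; table and field names are taken from the question):

Imports System.Data.Odbc
'...
Using command As New OdbcCommand( _
        "SELECT Field1 FROM Table1 WHERE FieldID = ?", odbcConnection)
    'With ? placeholders the binding is positional; the name below is only cosmetic
    command.Parameters.AddWithValue("FieldID", 10000)
    Using reader As OdbcDataReader = command.ExecuteReader()
        '...
    End Using
End Using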
I am working on a small project for a local firm, and the following code runs fine on my machine but produces errors on their server. Currently I don't have access to that server, and this is not a field that I know a lot about, so I have to ask you guys.
The page is written in the classic ASP (javascript for scripting).
The logic goes like this:
conn.Open("myconnection");
bigQuery = "...";
rs = conn.execute(bigQuery);
while (!rs.eof) {
...
smallQuery = "..."
rssmall = conn.execute(smallQuery);
...
rssmall.close();
...
rs.movenext();
}
rs.close();
conn.close();
As I said, this runs fine on my machine, but it returns some error (the worst thing is that I don't even know which error) on the company's server if bigQuery returns more than ~20 rows. Is there something wrong with my code (this really isn't my field, but I guess it's OK to gather data in a loop like this?), or is there something wrong with their IIS server?
Thanks.
edit:
More info:
It's an Access database. Everything is pretty standard:
conn=Server.CreateObject("ADODB.Connection");
conn.Provider="Microsoft.Jet.OLEDB.4.0";
conn.Open("D:/db/testingDb.mdb");
The queries are a bit long, so I won't post them. They are totally ordinary SELECTs, so they aren't the issue.
I inherited a legacy Classic ASP application that was very similar (a big query with other queries running within the loop that iterates over the first query's results), and it ran fine until forty or more records were returned. The way I "solved" the problem was to instantiate another connection object to run the inner query. So, using your pseudo code, try:
conn.Open("myconnection");
conn2.Open("myconnection");
bigQuery = "...";
rs = conn.execute(bigQuery);
while (!rs.eof) {
...
smallQuery = "..."
rssmall = conn2.execute(smallQuery);
...
rssmall.close();
...
rs.movenext();
}
rs.close();
conn2.close();
conn.close();
What server are they actually running?
Most newer versions of Windows Server don't come with a 64-bit Jet 4.0 driver at all, so you can't use an Access database with that driver if your app runs as a 64-bit app. You can try running the application pool as 32 bit, which might solve the problem.
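For example, on IIS 7 and later you can switch the application pool to 32 bit from an elevated command prompt (the pool name here is hypothetical; use your own):

%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true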
There is an alternative driver, packaged as an Office component (the Microsoft Access Database Engine, a.k.a. ACE), which may be an option.
Try writing a simple test page that literally opens and closes the database connection like so:
<%
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Provider = "Microsoft.Jet.OLEDB.4.0"
conn.Open("D:/db/testingDb.mdb")
Response.Write("Database Opened OK")
conn.Close()
%>
Run this on the production server, and if you see Database Opened OK then you'll know it's definitely the query rather than the database connection causing the issue.
If you get an error trying to open the database, then you need to change to the newer driver or try the app in 32-bit mode.
If it is the query causing the issue, then you may need to use the additional arguments to the Recordset's Open() method to request a different cursor type (forward-only, if you only need to iterate over the results once), which changes how ADODB retrieves the data and will hopefully ease any performance bottleneck related to the query.
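A minimal sketch of that, reusing conn and bigQuery from the question (the constant values come from the ADODB type library):

<%
Const adOpenForwardOnly = 0
Const adLockReadOnly = 1
Dim rs
Set rs = Server.CreateObject("ADODB.Recordset")
' Forward-only, read-only "firehose" cursor: the cheapest way to walk the results once
rs.Open bigQuery, conn, adOpenForwardOnly, adLockReadOnly
%>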
Edit
If you want to debug the code a bit more, add the following at the start of the file. This causes the ASP script processor to continue even if it hits an error:
On Error Resume Next
Then, at intervals throughout the file where you expect an error might have happened, do:
If Err.Number <> 0 Then
Response.Write(Err.Number & " - " & Err.Description)
End If
See this article from ASP 101 for the basics of error handling in ASP and VBScript.