I have some simple code: an if statement checking the HasRows property of a data reader. For some reason, when I run my code in Visual Studio 2017 it takes forever to evaluate and return (while writing this, my code has been running for 4 minutes). Any suggestions?
Dim cmd As OdbcCommand
Dim rdr As OdbcDataReader
cmd = New OdbcCommand("select GLPN,GLFY,GLDCT,GLDOC,GLCO,GLDGJ,GLANI,GLSBL,GLLT,GLCRCD,GLAA,GLU,GLGLC,GLEXA,GLICUT,GLR2,GLR1,GLSFX,GLOKCO" _
& ",GLEXR,GLODOC,GLPKCO,GLPDCT,GLCN,GLDKJ,GLVINV,GLIVD,GLPO,GLDCTO,GLLNID,GLTORG,GLAN8,GLICU,GLOPSQ,GLJBCD" _
& ",GLACR,GLABR2,GLABR1,GLDGJ,GLLT,GLCRCD,GLEXA,GLICUT,GLEXR,GLDKJ,GLIVD,GLAN8,GLICU,GLACR,GLKCO,GLSBLT,GLOBJ,GLSUB,GLJELN,GLEXTL,GLCRR,GLBCRC" _
& " from " _
& "PRODDTA.F0911 where GLPOST = 'P' and GLDGJ >= ? and GLDGJ <= ? and (GLLT = 'AA' or GLLT = 'CA') and GLDOC = 206940", cnn)
cmd.Parameters.Add("?GLUPMJs", OdbcType.Int).Value = todaysdate - 14
cmd.Parameters.Add("?GLUPMJe", OdbcType.Int).Value = todaysdate
cnn.Open()
cmd.CommandTimeout = 300
rdr = cmd.ExecuteReader()
If rdr.HasRows() Then
    'Do a bunch of stuff
End If
Edit 1: Still getting the issue, but I have noticed it's only in one spot. I have the HasRows() check in multiple places and it runs fast (3 ms or so); it's only slow on this one query.
Edit 2: The query I referenced above runs very fast in SQL Developer (1.202 seconds total the last time I tried), and it returns no messages.
Edit 3: I am wondering if it has something to do with the number of fields I am returning; the other queries that run fast on this line return much smaller field counts.
This may not speed it up any, but have you tried:
While rdr.Read()
'Do a bunch of stuff
End While
EDIT:
Dim dt As DataTable = New DataTable
dt.Load(rdr)
If dt.Rows.Count > 0 Then
'Do a bunch of stuff
End If
Maybe the problem is that the query itself is taking a long time to execute. If you are only interested in checking whether any records exist, you could run a query with "SELECT TOP 1 1" and use ExecuteScalar instead of ExecuteReader.
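For illustration, a minimal sketch of that existence check against the query from the question (a sketch only: TOP is SQL Server/Access syntax, and since the question mentions SQL Developer the target may be Oracle, where a "WHERE ROWNUM = 1" filter would be used instead; cnn and todaysdate are taken from the question):

Dim existsCmd As New OdbcCommand( _
    "select top 1 1 from PRODDTA.F0911 where GLPOST = 'P' and GLDGJ >= ? and GLDGJ <= ?" _
    & " and (GLLT = 'AA' or GLLT = 'CA') and GLDOC = 206940", cnn)
existsCmd.Parameters.Add("?GLUPMJs", OdbcType.Int).Value = todaysdate - 14
existsCmd.Parameters.Add("?GLUPMJe", OdbcType.Int).Value = todaysdate

' ExecuteScalar returns Nothing when the query produces no rows at all
If existsCmd.ExecuteScalar() IsNot Nothing Then
    'Do a bunch of stuff
End If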
It seems that you are not the only one having trouble with the HasRows method. In this post, the OP described a problem with that method, although it was with SQL Server. But I think the solution might be useful, since both DataReaders inherit from the same class and are closely related.
If you read the post that they reference in the question and the most voted answer, you'll see that they concluded that HasRows showed a buggy behavior when the SQL query execution returned not only some data results, but also a message. They ended up using an alternative way to check if there were any data on the data reader based on the Read() method, which proved to be more reliable.
In their case, the problem was that the HasRows property was set to False when the reader actually contained data, but maybe it is related to your performance problem. You should definitely give the Read method a try, just to be sure that your query is not the problem.
P.S.: here is another interesting link with a person reporting the same problem.
The reason for the delay on HasRows() is that ExecuteReader doesn't actually execute the query; it defers execution until the results are needed, and HasRows is what triggers it. Thanks to Dwillis for pointing this out.
I tested this by putting a line of code ahead of it; that is where the delay sat, and when HasRows ran it was fast.
Interestingly, the same SQL run in SQL Management Studio runs in 1.5 seconds returning all rows, yet in the code it takes forever, which is a different issue but may also be due to ODBC settings.
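A quick way to confirm where the time goes is to bracket each call with a Stopwatch. A minimal sketch, reusing the cmd and rdr variables from the question:

' Requires Imports System.Diagnostics
Dim sw As Stopwatch = Stopwatch.StartNew()
Dim rdr As OdbcDataReader = cmd.ExecuteReader()
Console.WriteLine("ExecuteReader took " & sw.ElapsedMilliseconds & " ms")

sw.Restart()
Dim gotRows As Boolean = rdr.HasRows
Console.WriteLine("HasRows took " & sw.ElapsedMilliseconds & " ms")

Whichever line reports the large elapsed time is the one actually paying for the query.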
Related
I'm trying to update data from a textbox, and this error shows up. Is my query incorrect?
Try
Dim Str = "UPDATE userinfo SET firstname='" & TextBox1.Text.ToUpper & "',lastname='" & TextBox3.Text.ToUpper & "'," &
"WHERE id='" & Label15.Text
connection.Open()
Dim mysc2 As New MySqlCommand(Str, connection)
mysc2.ExecuteNonQuery()
MsgBox("User successfully updated!", MsgBoxStyle.Information)
connection.Close()
Me.Close()
Catch ex As Exception
MsgBox(ex.Message)
connection.Close()
End Try
As noted, not only is it less than ideal to try and concatenate all those things together, but as you can see it is ALSO really easy to mess up the SQL string.
So, your SQL looks wrong. It should be this:
Dim Str as string
Str = "UPDATE userinfo SET firstname = '" & TextBox1.Text.ToUpper & "'," & _
"lastname = '" & TextBox3.Text.ToUpper & "'" & _
" WHERE id= " & Label15.Text
Now of course, we don't know if "id" is a string or a number type. And if it is a number, then you do NOT surround the Label15.Text value with single quotes. And you had an extra "," after the last textbox in your SQL.
But, as noted, the suggestion here is to use parameters. While this is a wee bit of extra code, such code is LESS prone to concatenation errors, is easier to read, and in fact easier to write. (You can hit Ctrl-D in the code editor to duplicate the line you are on!!!)
And of course, using parameters ALSO gives you AUTOMATIC data typing, so you don't have to worry or think about those delimiters. Not only do numbers take no single quotes around them, date delimiters can be even more of a challenge.
We do away with the messy concatenation.
We do away with the data type issue (quotes or not?).
We ALSO get protection from SQL injection.
So, we getting a real nice truckload of bonus features. I have ALWAYS lamented that many suggest to use parameters, but then don't point out the benefits (beyond that of sql injection).
So you might wind up with an extra line or two of code, but that code is LESS error prone, in most cases easier to read, and ALSO MUCH better able to accommodate more parameters - all without messy string concatenation.
So, our code can be this:
Dim Str As String
Str = "UPDATE userinfo SET firstname = @firstname, lastname = @lastname WHERE id = @id"

Using cmdSQL As New MySqlCommand(Str, connection)
    cmdSQL.Parameters.Add("@firstname", MySqlDbType.VarChar).Value = TextBox1.Text
    cmdSQL.Parameters.Add("@lastname", MySqlDbType.VarChar).Value = TextBox3.Text
    cmdSQL.Parameters.Add("@id", MySqlDbType.Int32).Value = Label15.Text
    cmdSQL.Connection.Open()
    cmdSQL.ExecuteNonQuery()
End Using
Some interesting things to note:
First: it is really nice to read, right?
Second: Easy to code.
When I typed in this:
cmdSQL.Parameters.Add("@firstname", MySqlDbType.VarChar).Value = TextBox1.Text
While on the above line, I hit Ctrl-D two times, which duplicated it.
So now, I just "cursor" up/down into the parameter value area, almost like editing a column in Excel. (I did not HAVE to type those extra lines - just edit the values and change the textbox/label names.)
And BETTER is that while typing in that code, I can EASILY see and read the nice SQL line right above in the code - so I just read off the parameters as I type, and edit the two additional parameter rows. So, in fact, I actually wind up typing LESS than your posted code. And with the nice SQL text in view, I have a LOWER brain workload - I don't have to remember the parameters - I just read them!
Next up:
SQL server does not care about upper vs lower case - so you don't need to worry about upper and lower - they will match regardless.
More next up:
By wrapping the whole deal inside of a using, then we don't have to close the connection - this is correctly handled by the using block (it cleans up for you).
More more next up:
And because we don't use string concatenation for the values, and DO NOT have to worry about quotes or not, and also get sql injection protection?
We ALSO get STRONG data typing.
In other words, the values from the controls are CAST to the correct data types required by the database. I mean, often you can munge in a number with quotes as a string - the server might accept it, might not! - but this way, taking care of the quotes (or no quotes for numbers) is done automatically for you!
So the updated original SQL string should work, but give the parameters a try. And while the Using block is an extra line of code, we don't have to close the connection, so we get back that one line of code!
The major goal here is not that you must do this or do that! Often when starting out, such advice is not all that helpful. Saying "don't do that"? Geesh - what kind of help is that?
In above, I not only said to use parameters, but I ALSO made a long and impassioned case as to why they are better.
The strong data typing is really nice, and protection from SQL injection is in fact a bonus feature here!
The real gold here is that we wound up with not a lot of code, and code that is easier to maintain, easier to read, and even MORE so if we decide in the future to add some more text boxes etc. on the form.
I mean, let's say you have 4 or 5 text boxes. Can you imagine how absolutely difficult that long, huge concatenated string will be to edit, look at, and more so to debug errors in!
So must you use parameters? No, you don't have to - but once you try the above?
You adopt the above because the code is less mental effort and work on your part - and that's really what great code is all about.
I'm not that great of a coder, and thus your long SQL string is too hard for me, too great an effort. Only really good coders can figure out that mess. For me, you have to think of me as a one-cell creature - limited brain power. The same goes for great pool players. They don't actually make the hard and difficult shots - they set themselves up in such a way that they never have to make those hard shots, or in this case, read and maintain difficult and complex code.
Your code can work - but it is beyond my pay grade and ability to read and maintain such code!
Good luck!
I'm currently working with a database of over 50 million records, where I read a file that a person wants to search the database for, etc. I have noticed that my data reader part is running particularly slowly, whereas the query seems almost instant (the database is indexed). I was just wondering, does anyone know why it might be running slow?
con.Open()
Using sw As New StreamWriter("G:\USER\SEARCH-RESULTS.txt")
Try
For Each word As String In result
Using com As New SqlCommand("select t.SmeNbr, t.FilPth, r.MaxDate, t.DteAdd, t.LnePos from (Select SmeNbr, MAX(FilDte) as MaxDate from Test_Table where SmeNbr = @word group by SmeNbr)r inner join Test_Table t on t.SmeNbr = r.SmeNbr and t.FilDte = R.MaxDate", con)
com.Parameters.AddWithValue("@word", word)
Using RDR = com.ExecuteReader
If RDR.HasRows Then
Do While RDR.Read
MyFilePath = RDR.Item("FilPth").ToString()
linePos = RDR.Item("LnePos").ToString()
Using sr As New StreamReader(MyFilePath)
sr.BaseStream.Seek(4096 * (linePos - 1), SeekOrigin.Begin)
FoundWords.Add(sr.ReadLine)
For Each item As String In FoundWords
sw.WriteLine(item)
Next
FoundWords.Clear()
End Using
Loop
Else
Continue For
End If
End Using
End Using
Next
Catch ex As Exception
MessageBox.Show("Couldn't process search")
Finally
con.Close()
End Try
End Using
MsgBox("Complete!")
So it works perfectly, as in it gets the records and bits of info I want very quickly through the query, and even the writing of results to a new file is near instant. I used breakpoints, and like I said, it seems to take ages between the "Using RDR = com.ExecuteReader" and "If RDR.HasRows Then" lines.
Any help or ideas would be greatly appreciated.
com.Parameters.AddWithValue("@word", word)
AddWithValue infers the parameter data type from the provided .NET object value. Since .NET strings are Unicode, this code will add an nvarchar(n) parameter with the length of the actual value. I see from your comments that the actual column data type is char(13), so it would be best to explicitly specify that as the parameter data type:
com.Parameters.Add("@word", SqlDbType.Char, 13).Value = word
The implications of AddWithValue are that indexes might not be used due to the mismatched data type, and there may be many variations of the same query in the SQL Server procedure cache that differ only by length. For these reasons, I suggest one avoid AddWithValue.
Hi friends of Stack Overflow,
I'm writing a question here because I'm having problems detecting why a read of a field of a DataReader sometimes returns an invalid cast exception.
I will give all possible information to help understand my situation. I'm working with ASP.NET 3.5.
I have a Module with a function that receives a SQL query and returns an IDataReader, something like this:
Function get_dr(query As String) As IDataReader
    Dim connection = New SqlClient.SqlConnection("connection string")
    connection.Open()
    Dim command = connection.CreateCommand()
    command.CommandText = query
    Dim reader = command.ExecuteReader(CommandBehavior.CloseConnection)
    Return reader
End Function
I have a Class with a Shared function that obtains a new DataReader and returns a date, something like this:
Public Shared Function getDate() As Date
    Using dr = get_dr("SELECT dbo.getDate()")
        If dr.Read() AndAlso dr(0) IsNot DBNull.Value Then
            Return dr.GetDateTime(0)
        End If
    End Using
End Function
When I call the getDate() function from other code, it sometimes gives me a call stack like this:
System.InvalidCastException: Specified cast is not valid.
at System.Data.SqlClient.SqlBuffer.get_DateTime()
at System.Data.SqlClient.SqlDataReader.GetDateTime(Int32 i)
Why am I sometimes getting this error? I was thinking it is because a lot of users are calling this function in conjunction with other functions of my application (those functions eventually use get_dr too), mixing the data of the DataReader between executions, but I need to know if I'm doing something wrong or could do something better.
Notes:
dbo.getDate is a sql function that ALWAYS returns a date.
Don't worry about badly written code; these are only examples, but they contain what is necessary to understand the scenario.
Sorry for my bad English.
Thanks very much in advance.
One possible reason - you declare connection inside of the function that returns DataReader. When you're out of the function that connection goes out of scope. That means that at some unpredictable point (depends on memory usage etc.) Garbage Collector will collect it. If you try to use the DataReader at that point - all bets are off.
One way to solve it is to declare the connection outside of the get_dr function and pass it in as a parameter. But seeing that you're returning a single value, if you don't plan to use the reader for multiple values, I suggest using ExecuteScalar instead - it will save you a lot of headaches.
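A sketch of what the ExecuteScalar version could look like, reusing the names from the question (a sketch only, not tested against the original schema):

Public Shared Function getDate() As Date
    Using connection As New SqlClient.SqlConnection("connection string")
        connection.Open()
        Using command = connection.CreateCommand()
            command.CommandText = "SELECT dbo.getDate()"
            ' ExecuteScalar returns the first column of the first row,
            ' or Nothing / DBNull when there is no value
            Dim result = command.ExecuteScalar()
            If result IsNot Nothing AndAlso Not TypeOf result Is DBNull Then
                Return CDate(result)
            End If
        End Using
    End Using ' the connection is closed here, so nothing outlives the function
End Function

Because no reader is handed back to the caller, the connection's lifetime is fully contained in the function, which removes the scope problem entirely.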
In my Winforms app which uses a remote database, I have the function below. (I also have two other functions which work similarly: One for scalar queries which returns zero upon failure, the other for updates and inserts which returns false upon failure.)
Currently, ALL data manipulation is pumped through these three functions.
It works fine, but overall please advise if I'd be better off establishing the connection upon launching my app, then closing it as the app is killed? Or at another time? (Again, it's a windows forms app, so it has the potential to be sitting stagnant while a user takes a long lunch break.)
So far, I'm not seeing any ill effects as everything seems to happen "in the blink of an eye"... but am I getting data slower, or are there any other potential hazards, such as memory leaks? Please notice I am closing the connection no matter how the function terminates.
Public Function GetData(ByVal Query As String) As DataTable
Dim Con As New SqlConnection(GlobalConnectionString)
Dim da As New SqlDataAdapter
Dim dt As New DataTable
Try
Con.Open()
da = New SqlDataAdapter(Query, Con)
Con.Close()
da.Fill(dt)
Catch ex As Exception
Debug.Print(Query)
MsgBox("UNABLE TO RETRIEVE DATA" & vbCrLf & vbCrLf & ex.Message, MsgBoxStyle.Critical, "Unable to retrieve data.")
End Try
da.Dispose()
Con.Close()
Return dt
End Function
There are exceptions to this, but best practices in .Net do indeed call for creating a brand new connection object for most queries. Really.
To understand why, first understand that actually connecting to a database involves a lot of work in terms of protocol negotiation, authentication, and more. It's not cheap. To help with this, ADO.Net provides a built-in connection pooling feature. Most platforms take advantage of this to keep connections efficient. The actual SqlConnection, MySqlConnection, or similar object used in your code is comparatively lightweight. When you try to re-use that object, you're optimizing for the small thing (the wrapper) at the expense of the much larger thing (the actual underlying connection resources).
Aside from the benefits created from connection pooling, using a new connection object makes it easier for your app to scale to multiple threads. Imagine writing an app which tries to rely on a single global connection object. Later you build a process which wants to spawn separate threads to work on a long-running task in the background, only to find your connection is blocked, or is itself blocking other normal access to the database. Worse, imagine trying to do this for a web app, and getting it wrong such that the single connection is shared for your entire Application Domain (all users to the site). This is a real thing I've seen happen.
So this is something that your existing code does right.
However, there are two serious problems with the existing method.
The first is that the author seems not to understand when to open and when to close a connection. Using the .Fill() method complicates this, because this method will open and close your connection all on its own.1 When using this method, there is no good reason to see a single call to .Open(), .Close(), or .Dispose() anywhere in that method. When not using the .Fill() method, connections should always be closed as part of a Finally block: and the easiest way to do that is with Using blocks.
The second is SQL Injection. The method as written allows no way to include parameter data in the query. It only allows a completed SQL command string. This practically forces you to write code that will be horribly vulnerable to SQL Injection attacks. If you don't already know what SQL Injection attacks are, stop whatever else you're doing and go spend some time Googling that phrase.
Let me suggest an alternative method for you to address these problems:
Public Function GetData(ByVal Query As String, ParamArray parameters() As SqlParameter) As DataTable
Dim result As New DataTable()
Using Con As New SqlConnection(GlobalConnectionString), _
Cmd As New SqlCommand(Query, Con),
da As New SqlDataAdapter(Cmd)
If parameters IsNot Nothing Then Cmd.Parameters.AddRange(parameters)
Try
da.Fill(result)
Catch ex As Exception
Debug.Print(Query)
'Better to allow higher-level method to handle presenting the error to the user
'Just log it here and Rethrow so presentation tier can catch
Throw
End Try
End Using 'guarantees connection is not left hanging open
Return result
End Function
1See the first paragraph of the "Remarks" section.
This isn't a real "answer" to my own question, but I have something to add and I wanted to add some code.
To Joe: Thank you, my code is well on the way to using parameterized queries almost exclusively. Although I knew what SQL injection attacks were, and that they're a pretty big deal, here's my excuse: in the past I had used stored procedures for parameterized queries, and I kind of hate writing those, and for the first year my code will be used only within my small company of 5 employees, who are family members. I had planned to switch everything to stored procedures later if I sold the software. This approach is better, and I will probably not need stored procedures at all.
I especially like how elegantly parameterized queries handle dates, as I don't have to convert dates to appropriate text. Much easier.
Another advantage I'm seeing: sometimes a "Save" button must execute either an Insert or an Update, depending on whether the record displayed is new. Using parameters allows me to write two alternate short basic queries, but use the same parameters for either, with less code.
Overall, this means a whole lot less code-intensive construction of the query string.
The part I didn't have, and I learned to do it elsewhere, was assigning the parameter array, calling the procedure, so I'm including an example here hoping others find it useful:
Dim query As String = "Select Phone from Employees where EmpNo = @EmployeeNumber and Age = @Age"
Dim params As SqlParameter() = {
    New SqlParameter("@EmployeeNumber", txtEmployeeNumber.Value),
    New SqlParameter("@Age", txtAge.Value)
}
Dim Phone As String = GetData(query, params).Rows(0).Item("Phone").ToString()
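The Insert-or-Update idea mentioned above can be sketched like this (the Employees table, its columns, and the IsNewRecord flag are assumptions for illustration; the statement would be run through the non-query helper described at the top of the question, which returns False on failure):

' Only the SQL text differs; the parameter array is shared.
Dim query As String
If IsNewRecord Then
    query = "INSERT INTO Employees (EmpNo, Age) VALUES (@EmployeeNumber, @Age)"
Else
    query = "UPDATE Employees SET Age = @Age WHERE EmpNo = @EmployeeNumber"
End If

' The same parameter array works for either statement,
' because both use the same parameter names.
Dim params As SqlParameter() = {
    New SqlParameter("@EmployeeNumber", txtEmployeeNumber.Value),
    New SqlParameter("@Age", txtAge.Value)
}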
I'm having some timing problems in a unit test of some of my vb code.
I set up a test harness by checking for and then deleting records added to the db in the previous testing session, and then I test my record-adding code by adding back the same records.
Interestingly this code works fine when there is a break in the debugger, but fails with a "duplicate key" exception when I let it run with no breaks, which leads me to believe there is some kind of concurrency issue.
The basic metacode is as follows:
DoTest()
    Dim j As New mydatacontext
    Dim recs = From myrecs In j.mythings
               Where myrecs.key = "key1" Or myrecs.key = "key2"
               Select myrecs
    If recs.Count > 0 Then
        For Each rec In recs
            j.mythings.DeleteOnSubmit(rec)
        Next
        j.SubmitChanges()
    End If
    j.Dispose()

    Dim tc As New tablecontroller()
    tc.addrecordstomytable("key1", "value1")
    tc.addrecordstomytable("key2", "value2")
End
Class tablecontroller
    Sub addrecordstomytable(key As String, value As String)
        Dim j As New mydatacontext
        Dim thing As New mything
        thing.key = key
        thing.value = value
        j.mythings.InsertOnSubmit(thing)
        j.SubmitChanges()
        j.Dispose()
    End Sub
End Class
I've confirmed that I properly deleted the previously added records, and this works fine, as does adding the new records, when I have a break in the code before I hit the add-records step. But without the break, it throws duplicate key exceptions in the addrecordstomytable method, suggesting that it hasn't grabbed the current version of the table when it creates the new data context in addrecordstomytable, even though the records should already have been deleted.
I've tried refreshing the table, but this doesn't seem to work either.
Note: the backing database is MS SQL Server 10.
Suggestions?
This is not so much an answer as a troubleshooting technique. Have you tried using the Log property? i.e.
j.Log = Console.Out
So that you can see the actual SQL generated and make sure it is what you expect. Other than that, is there anything relevant in your test setup or teardown? Is anything managing a transaction? Are there triggers running? In terms of the latter, if there is a trigger that takes some time to run before the delete is finalized, that might explain what you're seeing. I guess LINQ syntax varies between VB and C#, which surprised me, because I don't think your comparison code as written is valid in C# - you would need ==, not =, for a comparison - but since it works when you break in the debugger...
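For reference, a minimal sketch of wiring up the log in the test, using the context and table names from the question's metacode:

Using j As New mydatacontext()
    j.Log = Console.Out ' every SQL statement LINQ to SQL generates is echoed here
    Dim recs = From myrecs In j.mythings
               Where myrecs.key = "key1" Or myrecs.key = "key2"
               Select myrecs
    For Each rec In recs ' enumeration triggers the SELECT; watch the console output
        j.mythings.DeleteOnSubmit(rec)
    Next
    j.SubmitChanges() ' the DELETE statements are logged at this point
End Using

Comparing the logged statements against what you expect to run (and when they run) should show whether the deletes are actually committed before the inserts start.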