Need to batch a large DataTable and write each batch to a text file - VB.NET

I have a requirement where I need to query a DB and fetch the records into a DataTable. The DataTable has 20,000 records.
I need to batch these records into batches of 100 records each and write each batch to an individual text file.
So far I have been able to batch the records in groups of 100 using IEnumerable(Of DataRow).
I am now facing an issue writing the IEnumerable(Of DataRow) to a text file.
My code is as below:
Dim strsql = "Select * from myTable;"
Dim dt As New DataTable
Using cnn As New SqlConnection(connectionString)
    cnn.Open()
    Using dad As New SqlDataAdapter(strsql, cnn)
        dad.Fill(dt)
    End Using
    cnn.Close()
End Using
Dim chunks = getChunks(dt, 100)
For Each chunk As IEnumerable(Of DataRow) In chunks
    Dim path As String = "myFilePath"
    If Not File.Exists(path) Then
        '** Here I will write my batch into the file.
    End If
Next
Public Iterator Function getChunks(ByVal Tab As DataTable, ByVal size As Integer) As IEnumerable(Of IEnumerable(Of DataRow))
    Dim chunk As List(Of DataRow) = New List(Of DataRow)(size)
    For Each row As DataRow In Tab.Rows
        chunk.Add(row)
        If chunk.Count = size Then
            Yield chunk
            chunk = New List(Of DataRow)(size)
    Next
    If chunk.Any() Then Yield chunk
End Function
Need your help to write the IEnumerable(Of DataRow) to a text file for each batch of records.
Thanks
:)

Your existing code is needlessly complex. If this is all you're doing, then using a DataTable is unnecessary/unwise; this is one of the few occasions where I would advocate a lower-level DataReader to keep the memory impact low.
Writing a db table to files, quick, easy and with low memory consumption:
Dim dr = sqlCommand.ExecuteReader()
Dim sb as New StringBuilder
Dim lineNum = -1
Dim batchSize = 100
While dr.Read()
'turn the row into a string for our file
For x = 0 to dr.FieldCount -1
sb.Append(dr.GetString(x)).Append(",")
Next x
sb.Length -= 1 'remove trailing comma
sb.AppendLine()
'keep track of lines written so we can batch accordingly
lineNum += 1
Dim fileNum = lineNum \ batchSize
File.AppendAllText($"c:\temp\file{fileNum}.csv", sb.ToString())
'clear the stringbuilder
sb.Length = 0
End While
If you really want to use a DataTable, there isn't anything stopping you swapping this While dr.Read() loop for a For Each r As DataRow In myDataTable.Rows
Please note, this isn't an exercise in creating a fully escaped CSV, nor in formatting the data; it demonstrates the concept of taking a firehose of data and writing it to N different files by exploiting integer division: every line number from 0 to 99 divides to 0 (and hence goes in file 0), every line number from 100 to 199 divides to 1 (and hence goes in file 1), and so on, all in a single pass over the stream of data.
You could build the file lines in the StringBuilder and write them out once per batch, when lineNum Mod batchSize = batchSize - 1, if you feel that would be more efficient than calling File.AppendAllText (which opens and closes the file) for every line.
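A minimal sketch of that buffered variant, assuming the same dr reader and c:\temp output path as above (and using GetValue/ToString rather than GetString, so non-string columns don't throw):

```vb
Dim sb As New StringBuilder
Dim lineNum = -1
Dim batchSize = 100
While dr.Read()
    For x = 0 To dr.FieldCount - 1
        sb.Append(dr.GetValue(x).ToString()).Append(","c)
    Next x
    sb.Length -= 1 'remove trailing comma
    sb.AppendLine()
    lineNum += 1
    'flush the buffer once per batch instead of once per line
    If lineNum Mod batchSize = batchSize - 1 Then
        File.AppendAllText($"c:\temp\file{lineNum \ batchSize}.csv", sb.ToString())
        sb.Length = 0
    End If
End While
'write any final partial batch
If sb.Length > 0 Then File.AppendAllText($"c:\temp\file{lineNum \ batchSize}.csv", sb.ToString())
```

Each file is still written in batch order; it is just opened once per batch rather than once per line.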

Tested this with a table of a little over 1,500 records and 10 fields. The file creation took a little over 5 seconds (excluding data access). All things being equal (which I know they are not) that would be over 13 seconds writing the files.
Since your problem was with the iterator, I assume there were no memory issues with the DataTable.
You can include more than one database object in a Using block by using a comma to separate the objects in the Using statement.
Private Sub OPCode()
Dim myFilePath = "C:\Users\xxx\Documents\TestLoop\DataFile"
Dim strsql = "Select * from myTable;"
Dim dt As New DataTable
Using cnn As New SqlConnection(connectionString),
cmd As New SqlCommand(strsql, cnn)
cnn.Open()
dt.Load(cmd.ExecuteReader)
End Using
Dim sw As New Stopwatch
sw.Start()
Dim StartRow = 0
Dim EndRow = 99
Dim FileNum = 1
Dim TopIndex = dt.Rows.Count - 1
Do
For i = StartRow To EndRow
Dim s = String.Join("|", dt.Rows(i).ItemArray)
File.AppendAllText(myFilePath & FileNum & ".txt", s & Environment.NewLine)
Next
FileNum += 1
StartRow += 100
EndRow += 100
If EndRow >= TopIndex Then
EndRow = TopIndex
End If
Loop Until StartRow > TopIndex 'use >, otherwise a final batch of exactly one row is skipped
sw.Stop()
MessageBox.Show(sw.ElapsedMilliseconds.ToString)
End Sub

I thought your code was a great use of an iterator function.
Here is the code for your iterator.
Public Iterator Function getChunks(ByVal Tab As DataTable, ByVal size As Integer) As IEnumerable(Of IEnumerable(Of DataRow))
Dim chunk As List(Of DataRow) = New List(Of DataRow)(size)
For Each row As DataRow In Tab.Rows
chunk.Add(row)
If chunk.Count = size Then
Yield chunk
chunk = New List(Of DataRow)(size)
End If
Next
If chunk.Any() Then Yield chunk
End Function
Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
Dim dt = LoadDataTable()
Dim myFilePath As String = "C:\Users\xxx\Documents\TestLoop\DataFile"
Dim FileNum = 1
For Each chunk As IEnumerable(Of DataRow) In getChunks(dt, 100)
For Each row As DataRow In chunk
Dim s = String.Join("|", row.ItemArray)
File.AppendAllText(myFilePath & FileNum & ".txt", s & Environment.NewLine)
Next
FileNum += 1
Next
MessageBox.Show("Done")
End Sub
You just needed to nest the For Each to get at the data rows.
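The inner loop can also be collapsed: since each chunk is an IEnumerable(Of DataRow), the whole batch can be projected and written in one call (a sketch assuming Imports System.Linq and the same myFilePath naming scheme):

```vb
Dim fileNum = 1
For Each chunk As IEnumerable(Of DataRow) In getChunks(dt, 100)
    'one delimited line per row, whole batch written at once
    Dim lines = chunk.Select(Function(r) String.Join("|", r.ItemArray))
    File.WriteAllLines(myFilePath & fileNum & ".txt", lines)
    fileNum += 1
Next
```

File.WriteAllLines also overwrites rather than appends, which avoids doubling up a file left over from a previous run.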

Related

how to read a specific csv line vb.net

With your permission,
I created a bot to input data into the web using vb.net and selenium. It retrieves its data from a csv.
How do I retrieve data from the csv only as needed? For example, if there are 100 rows, only rows 30-50 should be taken; the loop should not have to run over the whole file.
Dim textFieldParser As TextFieldParser = New TextFieldParser(TextBox1.Text) With
{
.TextFieldType = FieldType.Delimited,
.Delimiters = New String() {","}
}
drv = New ChromeDriver(options)
While Not textFieldParser.EndOfData
Try
Dim strArrays As String() = textFieldParser.ReadFields()
Dim name As String = strArrays(0)
Dim alamat As String = strArrays(1)
Dim notlp As String = strArrays(2)
drv.Navigate().GoToUrl("URL")
Dim Nm = drv.FindElement(By.XPath("/html/body/div[1]/div[3]/form/div[1]/div[1]/div[1]/div/div[2]/input"))
Nm.SendKeys(name)
Threading.Thread.Sleep(3000)
Catch ex As Exception
MsgBox("Line " & ex.Message & " is not valid and will be skipped.")
End Try
End While
Thank you
Here's an example of using TextFieldParser to read one specific line and a specific range of lines. Note that I am using zero-based indexes for the lines. You can adjust as required if you want to use 1-based line numbers.
Public Function GetLine(filePath As String, index As Integer) As String()
Using parser As New TextFieldParser(filePath) With {.Delimiters = {","}}
Dim linesDiscarded = 0
Do Until linesDiscarded = index
parser.ReadLine()
linesDiscarded += 1
Loop
Return parser.ReadFields()
End Using
End Function
Public Function GetLines(filePath As String, startIndex As Integer, count As Integer) As List(Of String())
Using parser As New TextFieldParser(filePath) With {.Delimiters = {","}}
Dim linesDiscarded = 0
Do Until linesDiscarded = startIndex
parser.ReadLine()
linesDiscarded += 1
Loop
Dim lines As New List(Of String())
Do Until lines.Count = count
lines.Add(parser.ReadFields())
Loop
Return lines
End Using
End Function
Simple loops to skip and to take lines.
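For the rows 30-50 example in the question, a call might look like this (assuming 1-based row numbers, so start index 29 and 21 rows, and that TextBox1.Text holds the csv path as in the question):

```vb
Dim rows As List(Of String()) = GetLines(TextBox1.Text, 29, 21)
For Each fields As String() In rows
    Dim name As String = fields(0)
    Dim alamat As String = fields(1)
    Dim notlp As String = fields(2)
    'hand each record to the Selenium part of the bot here
Next
```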

How to put data of MS excel of one column inside array in vb.net

I have data on my MS Excel spreadsheet which contains different columns (Sn, Amount and tech id). I am trying to put all the data of the tech id column into an array like:
mydata = [43219 , 43220 , 43221 , 43222 ,43223 ,43224 , 43225 ]
My code, showing only the main processing function:
Importing :-
Imports System.IO
Imports System.Data.OleDb
main processing function:-
Dim conString1 As String
Dim Mydata(200) As Integer
Dim connection As OleDbConnection
Dim adapter As OleDbDataAdapter
Private Sub LoadData(conStr As String)
connection = New OleDbConnection(conStr)
Dim query As String = "SELECT * FROM [Sheet0$]"
adapter = New OleDbDataAdapter(query, connection)
'Putting data inside array
'For intCount = 0 To lengthofcolumn
'Mydata(intCount) = ?
'Next intCount
Debug.Print(adapter)
End Sub
Calling :-
conString1 = String.Format("Provider = Microsoft.Jet.OLEDB.4.0;Data Source = '{0}'; Extended Properties = Excel 8.0", "F:\MicroTest\data\log.xlsx")
LoadData(conString1)
I am a student and still learning, so please help. I didn't find a solution for this; mostly I found solutions for viewing excel data in a datagrid.
My test data was in B2:B8.
You will need to add the Reference: Microsoft Excel 14.0 Object Library
Dim oExcel As New Microsoft.Office.Interop.Excel.Application
oExcel.Workbooks.Open("C:\TEMP\test_data.xlsx")
Dim oSheet As Microsoft.Office.Interop.Excel.Worksheet = oExcel.Sheets(1)
' I would use list instead of an array.
Dim oTest As New List(Of String)
For Each oValue As String In oSheet.Range("B2:B8").Value2
oTest.Add(oValue)
Next
' Using an array
Dim oData(200) As Integer
Dim iCounter As Integer = 0
For Each oValue As String In oSheet.Range("B2:B8").Value2
oData(iCounter) = CType(oValue, Integer)
iCounter += 1
Next
oExcel.Quit()
I think your approach is good: accessing the file with OleDB rather than opening an instance of Excel.
I used a DataReader and DataTable to collect and hold the data in memory.
The Using...End Using blocks ensure your objects that have a Dispose method are closed and disposed properly even if there is an error.
Private Sub LoadData()
Dim dt As New DataTable()
Dim conStr As String = "Your connection string"
Using con As New OleDbConnection(conStr)
Dim query As String = "SELECT * FROM [Sheet1$]"
Using cmd As New OleDbCommand(query, con)
con.Open()
Using dr As OleDbDataReader = cmd.ExecuteReader()
dt.Load(dr)
End Using
End Using
End Using
'The number of rows in the DataTable less the first 2 rows which are title and blank
'and subtract 1 because vb.net arrays are defined array(upper bound)
Dim arraySize As Integer = dt.Rows.Count - 3
Dim myData(arraySize) As Integer
Dim arrayIndex As Integer = 0
'Putting data inside array
For rowIndex As Integer = 2 To dt.Rows.Count - 1
myData(arrayIndex) = CInt(dt.Rows(rowIndex)(3)) '3 is the index of the TechID column
arrayIndex += 1
Next
'Checking the array - delete in final version
'I used i as a variable name because this is a very tiny
'loop and will be deleted eventually. Otherwise, I would
'have used a more descriptive name.
For Each i As Integer In myData
Debug.Print(i.ToString)
Next
End Sub

get column names Jet OLE DB in vb.net

I've written a function which reads csv files and parametrizes them accordingly. To do that I have a function gettypessql which first queries a sql table to get the data types, so it can adjust the columns which are later inserted into sql. My problem is that when I set HDR=Yes in Jet OLE DB, I only get column names like F1, F2, F3. To circumvent this I've set HDR=No and written some for loops, but now I get only empty strings. What is actually the problem? Here is my code:
Private Function GetCSVFile(ByVal file As String, ByVal min As Integer, ByVal max As Integer) As DataTable
Dim ConStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & TextBox1.Text & ";Extended Properties=""TEXT;HDR=NO;IMEX=1;FMT=Delimited;CharacterSet=65001"""
Dim conn As New OleDb.OleDbConnection(ConStr)
Dim dt As New DataTable
Dim da As OleDb.OleDbDataAdapter = Nothing
getData = Nothing
Try
Dim CMD As String = "Select * from " & _table & ".csv"
da = New OleDb.OleDbDataAdapter(CMD, conn)
da.Fill(min, max, dt)
getData = New DataTable(_table)
Dim firstRow As DataRow = dt.Rows(0)
For i As Integer = 0 To dt.Columns.Count - 1
Dim columnName As String = firstRow(i).ToString()
Dim newColumn As New DataColumn(columnName, mListOfTypes(i))
getData.Columns.Add(newColumn)
Next
For i As Integer = 1 To dt.Rows.Count - 1
Dim row As DataRow = dt.Rows(i)
Dim newRow As DataRow = getData.NewRow()
For j As Integer = 0 To getData.Columns.Count - 1
If row(j).GetType Is GetType(String) Then
Dim colValue As String = row(j).ToString()
colValue = ChangeEncoding(colValue)
colValue = ParseString(colValue)
colValue = ReplaceChars(colValue)
newRow(j) = colValue
Else
newRow(j) = row(j)
End If
Next
getData.Rows.Add(newRow)
Application.DoEvents()
Next
Catch ex As OleDbException
MessageBox.Show(ex.Message)
Catch ex As Exception
MessageBox.Show(ex.Message)
Finally
dt.Dispose()
da.Dispose()
End Try
Return getData
End Function
and get types sql; this one doesn't convert properly, especially doubles:
Private Sub GetTypesSQL()
If (mListOfTypes Is Nothing) Then
mListOfTypes = New List(Of Type)()
End If
mListOfTypes.Clear()
Dim dtTabelShema As DataTable = db.GetDataTable("SELECT TOP 0 * FROM " & _table)
Using dtTabelShema
For Each col As DataColumn In dtTabelShema.Columns
mListOfTypes.Add(col.DataType)
Next
End Using
End Sub
I think you have made it more complicated than it needs to be. For instance, you get the dbSchema by creating an empty DataTable and harvesting the Datatypes from it. Why not just use that first table rather than creating a new table from the Types? The table also need not be reconstructed over and over for each batch of rows imported.
Generally since OleDb will try to infer types from the data, it seems unnecessary and may even get in the way in some cases. Also, you are redoing everything that OleDB does and copying data to a different DT. Given that, I'd skip the overhead OleDB imposes and work with the raw data.
This creates the destination table using the CSV column name and the Type from the Database. If the CSV is not in the same column order as those delivered in a SELECT * query, it will fail.
The following uses a class to map csv columns to db table columns so the code is not depending on the CSVs being in the same order (since they may be generated externally). My sample data CSV is not in the same order:
Public Class CSVMapItem
Public Property CSVIndex As Int32
Public Property ColName As String = ""
'optional
Public Property DataType As Type
Public Sub New(ndx As Int32, csvName As String,
dtCols As DataColumnCollection)
CSVIndex = ndx
For Each dc As DataColumn In dtCols
If String.Compare(dc.ColumnName, csvName, True) = 0 Then
ColName = dc.ColumnName
DataType = dc.DataType
Exit For
End If
Next
If String.IsNullOrEmpty(ColName) Then
Throw New ArgumentException("Cannot find column: " & csvName)
End If
End Sub
End Class
The code to parse the csv uses CSVHelper but in this case the TextFieldParser could be used since the code just reads the CSV rows into a string array.
Dim SQL = String.Format("SELECT * FROM {0} WHERE ID<0", DBTblName)
Dim rowCount As Int32 = 0
Dim totalRows As Int32 = 0
Dim sw As New Stopwatch
sw.Start()
Using dbcon As New MySqlConnection(MySQLConnStr)
Using cmd As New MySqlCommand(SQL, dbcon)
dtSample = New DataTable
dbcon.Open()
' load empty DT, create the insert command
daSample = New MySqlDataAdapter(cmd)
Dim cb = New MySqlCommandBuilder(daSample)
daSample.InsertCommand = cb.GetInsertCommand
dtSample.Load(cmd.ExecuteReader())
' dtSample is not only empty, but has the columns
' we need
Dim csvMap As New List(Of CSVMapItem)
Using sr As New StreamReader(csvfile, False),
parser = New CsvParser(sr)
' col names from CSV
Dim csvNames = parser.Read()
' create a map of CSV index to DT Columnname SEE NOTE
For n As Int32 = 0 To csvNames.Length - 1
csvMap.Add(New CSVMapItem(n, csvNames(n), dtSample.Columns))
Next
' line data read as string
Dim data As String()
data = parser.Read()
Dim dr As DataRow
Do Until data Is Nothing OrElse data.Length = 0
dr = dtSample.NewRow()
For Each item In csvMap
' optional/as needed type conversion
If item.DataType = GetType(Boolean) Then
' "1" wont convert to bool, but (int)1 will
dr(item.ColName) = Convert.ToInt32(data(item.CSVIndex).Trim)
Else
dr(item.ColName) = data(item.CSVIndex).Trim
End If
Next
dtSample.Rows.Add(dr)
rowCount += 1
data = parser.Read()
If rowCount = 50000 OrElse (data Is Nothing OrElse data.Length = 0) Then
totalRows += daSample.Update(dtSample)
' empty the table if there will be more than 100k rows
dtSample.Rows.Clear()
rowCount = 0
End If
Loop
End Using
End Using
End Using
sw.Stop()
Console.WriteLine("Parsed and imported {0} rows in {1}", totalRows,
sw.Elapsed.TotalMinutes)
The processing loop updates the DB every 50K rows in case there are many many rows. It also does it in one pass rather than reading N rows thru OleDB at a time. CsvParser will read one row at a time, so there should never be more than 50,001 rows worth of data on hand at a time.
There may be special cases to handle for type conversions as shown with If item.DataType = GetType(Boolean) Then. A Boolean column read in as "1" cant be directly passed to a Boolean column, so it is converted to integer which can. There could be other conversions such as for funky dates.
Time to process 250,001 rows: 3.7 mins. An app which needs to apply those string transforms to every single string column will take much longer. I'm pretty sure that using the CsvReader in CSVHelper you could have those applied as part of parsing to a Type.
There is a potential disaster waiting to happen since this is meant to be an all-purpose importer/scrubber.
For i As Integer = 0 To dt.Columns.Count - 1
Dim columnName As String = firstRow(i).ToString()
Dim newColumn As New DataColumn(columnName, mListOfTypes(i))
getData.Columns.Add(newColumn)
Next
Both the question and the self-answer build the new table using the column names from the CSV and the DataTypes from a SELECT * query on the destination table. So, it assumes the CSV Columns are in the same order that SELECT * will return them, and that all CSVs will always use the same names as the tables.
The answer above is marginally better in that it finds and matches based on name.
A more robust solution is to write a little utility app where a user maps a DB column name to a CSV index. Save the results to a List(Of CSVMapItem) and serialize it; there could be a whole collection of these saved to disk. Then, rather than creating a map based on dead reckoning, just deserialize the desired one for use as the csvMap in the code above.

Selecting different number of columns in a CSV file

The task is to extract data from multiple CSV files according to a criteria. The file contains a sampleId (this is the criteria) and other columns. At the end of the file there are the measurement values under 0...100 named columns (the numbers are the actual names of the columns). To make it a bit more interesting there can be variations in different CSV files, depending on the customer needs. This means the measurement data count can be 15, 25, 50 etc. but no more than 100 and no variations within one file. This data is always placed in the end of the line, so there is a set of columns before the numbers.
I'd like to have a SQL statement which can accept parameters:
SELECT {0} FROM {1} WHERE sampleId = {2}
0 is the numbers, 1 is the CSV file name and 2 is the sampleId we are looking for. The other solution which came to my mind is to take all the columns after the last fixed column. I don't know whether that is possible or not, just thinking out loud.
Please be descriptive, my SQL knowledge is basic. Any help is really appreciated.
So I finally managed to solve it. The code is in VB.NET, but the logic is quite clear.
Private Function GetDataFromCSV(sampleIds As Integer()) As List(Of KeyValuePair(Of String, List(Of Integer)))
Dim dataFiles() As String = System.IO.Directory.GetFiles(OutputFolder(), "*.CSV")
Dim results As List(Of KeyValuePair(Of String, List(Of Integer))) = New List(Of KeyValuePair(Of String, List(Of Integer)))
If dataFiles.Length > 0 And sampleIds.Length > 0 Then
For index As Integer = 0 To sampleIds.Length - 1
If sampleIds(index) > 0 Then
For Each file In dataFiles
If System.IO.File.Exists(file) Then
Dim currentId As String = sampleIds(index).ToString()
Dim filename As String = Path.GetFileName(file)
Dim strPath As String = Path.GetDirectoryName(file)
Dim conn As OleDb.OleDbConnection = New OleDb.OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & strPath & "; Extended Properties='text; HDR=Yes; FMT=Delimited'")
Dim command As OleDb.OleDbCommand = conn.CreateCommand()
command.CommandText = "SELECT * FROM [" & filename & "]" 'the Sample ID filter is applied in the loop below
conn.Open()
Dim reader As OleDb.OleDbDataReader = command.ExecuteReader()
Dim numberOfFields = reader.FieldCount
While reader.Read()
If reader("Sample ID").ToString() = currentId Then 'If found write particle data into output file
Dim particles As List(Of Integer) = New List(Of Integer)
For field As Integer = 0 To numberOfFields - 1
particles.Add(CInt(reader(field.ToString())))
Next field
results.Add(New KeyValuePair(Of String, List(Of Integer))(currentId, particles))
End If
End While
conn.Close()
End If
Next file
End If
Next index
Return results
Else
MessageBox.Show("Missing csv files or invalid sample Id(s)", "Internal error", MessageBoxButtons.OK, MessageBoxIcon.Exclamation)
Return results
End If
End Function
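As an aside, the WHERE clause the question asked for can be made to work by bracketing the column name (it contains a space) and passing the id as a parameter; a sketch, hedged because the Jet text driver's SQL support is limited:

```vb
Dim command As OleDb.OleDbCommand = conn.CreateCommand()
'bracket [Sample ID] because the column name contains a space
command.CommandText = "SELECT * FROM [" & filename & "] WHERE [Sample ID] = ?"
command.Parameters.AddWithValue("?", currentId)
Dim reader As OleDb.OleDbDataReader = command.ExecuteReader()
```

That would let the driver do the filtering instead of the If check inside the While loop.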

How can I be alerted if my recursion decides to start skipping loops?

I have a nested For / Next recursion at the heart of a web app I am trying to develop. When I programmed it, I verified its output on an example that nested 3-5 levels and thought all was well.
Lately, I decided to verify an entry in the result set that was nested 12 to 15 levels deep. The resulting percentage was incorrect because some progeny were not recursed.
Because my result set can take so long to run (8-13 hours for the more in-depth queries), and because I could run the same test on a subset (~150,000 records) of the full database table (~1,200,000 records), I trialled it against the smaller table.
Lo and behold, it appears to work perfectly. A search that was previously returning ~56,000 records now returned ~126,000 records (meaning it had been skipping a lot of recursions). I verified random sample results to be correct.
Comparing result sets from the same query done on the two database tables, it seems that the missed recursions start to show up with some (not all) of the records that are nested beyond 12 levels deep.
The troubling thing for me is that I need to know when my result set is suspect without having to search the result set for dropped nesting.
Here is the code for the recursive sub:
' lookup table for up to 63 generations '
Dim percentage() As Double = {100, 50, 25, 12.5, 6.25, 3.125, etc...}
' DataTable to display results of looping through the db '
Dim t As DataTable
Dim c As DataColumn
Dim r As DataRow
' columns are: id, name, dad, mom, gender, year born, trait, percentage '
Private Sub GetPct(ByRef progeny As List(Of Int32), ByRef gender As List(Of String), ByVal generations As Int16, ByVal count As Int16)
Dim nxtGeneration As Int16 = generations + 1
Dim nxtPercentage As Double = percentage(nxtGeneration)
For i As Int16 = 0 To count
Dim dbConn As New SqlConnection(connString)
Dim j As Int16 = -1
Dim prog As New List(Of Int32)
Dim gndr As New List(Of String)
If gender(i) = "M" Then
Dim dreader As SqlDataReader
Dim dgetComm As New SqlCommand("d_get", dbConn)
dgetComm.CommandType = CommandType.StoredProcedure
dgetComm.Parameters.Add("@id", SqlDbType.Int)
dgetComm.Parameters("@id").Value = progeny(i)
Using dbConn
Try
dbConn.Open()
dreader = dgetComm.ExecuteReader()
If dreader.HasRows = True Then
While dreader.Read()
j += 1
Dim updated As DataRow = t.Rows.Find(dreader(0))
If updated Is Nothing Then
t.BeginLoadData()
r = t.NewRow()
r(0) = dreader(0)
r(1) = dreader(1)
r(2) = dreader(2)
r(3) = dreader(3)
r(4) = dreader(4)
r(5) = dreader(5)
r(6) = dreader(6)
r(7) = nxtPercentage
t.Rows.Add(r)
t.EndLoadData()
prog.Add(dreader(0))
gndr.Add(dreader(4))
Else
prog.Add(dreader(0))
gndr.Add(dreader(4))
updated(7) += nxtPercentage
End If
End While
End If
dgetComm.Dispose()
dgetComm = Nothing
dreader.Close()
dreader = Nothing
Catch ex As Exception
' modify when going live '
lblDetails.Text &= "Error loading to table with get of dam: " & ex.Message
End Try
End Using
GetPct(prog, gndr, nxtGeneration, j)
Else
Dim sreader As SqlDataReader
Dim sgetComm As New SqlCommand("s_get", dbConn)
sgetComm.CommandType = CommandType.StoredProcedure
sgetComm.Parameters.Add("@id", SqlDbType.Int)
sgetComm.Parameters("@id").Value = progeny(i)
Using dbConn
Try
dbConn.Open()
sreader = sgetComm.ExecuteReader()
If sreader.HasRows = True Then
While sreader.Read()
j += 1
Dim updated As DataRow = t.Rows.Find(sreader(0))
If updated Is Nothing Then
t.BeginLoadData()
r = t.NewRow()
r(0) = sreader(0)
r(1) = sreader(1)
r(2) = sreader(2)
r(3) = sreader(3)
r(4) = sreader(4)
r(5) = sreader(5)
r(6) = sreader(6)
r(7) = nxtPercentage
t.Rows.Add(r)
t.EndLoadData()
prog.Add(sreader(0))
gndr.Add(sreader(4))
Else
prog.Add(sreader(0))
gndr.Add(sreader(4))
updated(7) += nxtPercentage
End If
End While
End If
sgetComm.Dispose()
sgetComm = Nothing
sreader.Close()
sreader = Nothing
Catch ex As Exception
' modify when going live'
lblDetails.Text &= "Error loading to table with get of sire: " & ex.Message
End Try
End Using
GetPct(prog, gndr, nxtGeneration, j)
End If
Next i
End Sub