Get column names with Jet OLE DB in VB.NET

I've written a function which reads CSV files and parameterizes them accordingly. For that I have a function GetTypesSQL which first queries the SQL table to get the data types, so that the columns can be adjusted before the data is later inserted into SQL. My problem is that when I set HDR=Yes in Jet OLE DB I only get column names like F1, F2, F3. To work around this I've set HDR=No and written some for loops, but now I only get empty strings. What is actually the problem? Here is my code:
Private Function GetCSVFile(ByVal file As String, ByVal min As Integer, ByVal max As Integer) As DataTable
Dim ConStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & TextBox1.Text & ";Extended Properties=""TEXT;HDR=NO;IMEX=1;FMT=Delimited;CharacterSet=65001"""
Dim conn As New OleDb.OleDbConnection(ConStr)
Dim dt As New DataTable
Dim da As OleDb.OleDbDataAdapter = Nothing
Dim getData As DataTable = Nothing
Try
Dim CMD As String = "Select * from " & _table & ".csv"
da = New OleDb.OleDbDataAdapter(CMD, conn)
da.Fill(min, max, dt)
getData = New DataTable(_table)
Dim firstRow As DataRow = dt.Rows(0)
For i As Integer = 0 To dt.Columns.Count - 1
Dim columnName As String = firstRow(i).ToString()
Dim newColumn As New DataColumn(columnName, mListOfTypes(i))
getData.Columns.Add(newColumn)
Next
For i As Integer = 1 To dt.Rows.Count - 1
Dim row As DataRow = dt.Rows(i)
Dim newRow As DataRow = getData.NewRow()
For j As Integer = 0 To getData.Columns.Count - 1
If row(j).GetType Is GetType(String) Then
Dim colValue As String = row(j).ToString()
colValue = ChangeEncoding(colValue)
colValue = ParseString(colValue)
colValue = ReplaceChars(colValue)
newRow(j) = colValue
Else
newRow(j) = row(j)
End If
Next
getData.Rows.Add(newRow)
Application.DoEvents()
Next
Catch ex As OleDbException
MessageBox.Show(ex.Message)
Catch ex As Exception
MessageBox.Show(ex.Message)
Finally
dt.Dispose()
da.Dispose()
End Try
Return getData
End Function
And here is GetTypesSQL; this one doesn't convert properly, especially doubles:
Private Sub GetTypesSQL()
If (mListOfTypes Is Nothing) Then
mListOfTypes = New List(Of Type)()
End If
mListOfTypes.Clear()
Dim dtTabelShema As DataTable = db.GetDataTable("SELECT TOP 0 * FROM " & _table)
Using dtTabelShema
For Each col As DataColumn In dtTabelShema.Columns
mListOfTypes.Add(col.DataType)
Next
End Using
End Sub

I think you have made it more complicated than it needs to be. For instance, you get the schema by creating an empty DataTable and harvesting the DataTypes from it. Why not just use that first table rather than creating a new table from the Types? The table also need not be reconstructed over and over for each batch of rows imported.
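For example, a minimal sketch (reusing db.GetDataTable and _table from the question's code) that clones the empty schema table instead of harvesting its types into a list:
' A sketch: instead of copying the DataTypes into mListOfTypes, clone the
' empty schema table. db.GetDataTable and _table are from the question's code.
Dim dtTabelShema As DataTable = db.GetDataTable("SELECT TOP 0 * FROM " & _table)
' Clone() copies the column names and DataTypes, giving an empty, ready-to-fill table
Dim getData As DataTable = dtTabelShema.Clone()
getData.TableName = _table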
Generally, since OleDb will try to infer types from the data, that step seems unnecessary and may even get in the way in some cases. You are also redoing everything OleDb does and copying the data to a different DataTable. Given that, I'd skip the overhead OleDb imposes and work with the raw data.
That approach creates the destination table using the CSV column names and the Types from the database; if the CSV columns are not in the same order as those delivered by a SELECT * query, it will fail.
The following uses a class to map CSV columns to DB table columns so the code does not depend on the CSVs being in the same order (since they may be generated externally). My sample data CSV is not in the same order:
Public Class CSVMapItem
Public Property CSVIndex As Int32
Public Property ColName As String = ""
'optional
Public Property DataType As Type
Public Sub New(ndx As Int32, csvName As String,
dtCols As DataColumnCollection)
CSVIndex = ndx
For Each dc As DataColumn In dtCols
If String.Compare(dc.ColumnName, csvName, True) = 0 Then
ColName = dc.ColumnName
DataType = dc.DataType
Exit For
End If
Next
If String.IsNullOrEmpty(ColName) Then
Throw New ArgumentException("Cannot find column: " & csvName)
End If
End Sub
End Class
The code to parse the CSV uses CsvHelper, but in this case TextFieldParser could be used instead, since the code just reads each CSV row into a string array.
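For reference, a minimal TextFieldParser sketch that yields the same String() per row (TextFieldParser lives in Microsoft.VisualBasic.FileIO; csvfile is the same path variable used below):
' A sketch using TextFieldParser instead of CsvHelper
Using parser As New Microsoft.VisualBasic.FileIO.TextFieldParser(csvfile)
parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited
parser.SetDelimiters(",")
Dim csvNames As String() = parser.ReadFields()   ' header row
Do Until parser.EndOfData
Dim data As String() = parser.ReadFields()       ' one data row as strings
' ... map and add to the DataTable exactly as in the CsvHelper version below
Loop
End Using
The CsvHelper-based version used for the timings below: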
Dim SQL = String.Format("SELECT * FROM {0} WHERE ID<0", DBTblName)
Dim rowCount As Int32 = 0
Dim totalRows As Int32 = 0
Dim sw As New Stopwatch
sw.Start()
Using dbcon As New MySqlConnection(MySQLConnStr)
Using cmd As New MySqlCommand(SQL, dbcon)
dtSample = New DataTable
dbcon.Open()
' load empty DT, create the insert command
daSample = New MySqlDataAdapter(cmd)
Dim cb = New MySqlCommandBuilder(daSample)
daSample.InsertCommand = cb.GetInsertCommand
dtSample.Load(cmd.ExecuteReader())
' dtSample is not only empty, but has the columns
' we need
Dim csvMap As New List(Of CSVMapItem)
Using sr As New StreamReader(csvfile, False),
parser = New CsvParser(sr)
' col names from CSV
Dim csvNames = parser.Read()
' create a map of CSV index to DT Columnname SEE NOTE
For n As Int32 = 0 To csvNames.Length - 1
csvMap.Add(New CSVMapItem(n, csvNames(n), dtSample.Columns))
Next
' line data read as string
Dim data As String()
data = parser.Read()
Dim dr As DataRow
Do Until data Is Nothing OrElse data.Length = 0
dr = dtSample.NewRow()
For Each item In csvMap
' optional/as needed type conversion
If item.DataType = GetType(Boolean) Then
' "1" won't convert to Boolean directly, but an Integer 1 will
dr(item.ColName) = Convert.ToInt32(data(item.CSVIndex).Trim)
Else
dr(item.ColName) = data(item.CSVIndex).Trim
End If
Next
dtSample.Rows.Add(dr)
rowCount += 1
data = parser.Read()
If rowCount = 50000 OrElse (data Is Nothing OrElse data.Length = 0) Then
totalRows += daSample.Update(dtSample)
' empty the table if there will be more than 100k rows
dtSample.Rows.Clear()
rowCount = 0
End If
Loop
End Using
End Using
End Using
sw.Stop()
Console.WriteLine("Parsed and imported {0} rows in {1}", totalRows,
sw.Elapsed.TotalMinutes)
The processing loop updates the DB every 50K rows in case there are very many rows. It also does it in one pass rather than reading N rows through OleDb at a time. CsvParser will read one row at a time, so there should never be more than 50,001 rows' worth of data on hand at a time.
There may be special cases to handle for type conversions, as shown with If item.DataType = GetType(Boolean) Then. A value read in as "1" can't be assigned directly to a Boolean column, so it is converted to an integer, which can be. There could be other conversions, such as for funky dates.
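A date column could be handled the same way; a small helper sketch (the "yyyyMMdd" format is only an assumption and would need to match whatever the CSV actually contains):
' A sketch of a per-type conversion helper
Private Function ConvertField(value As String, targetType As Type) As Object
Dim s = value.Trim()
If targetType Is GetType(Boolean) Then
Return Convert.ToInt32(s)   ' "1"/"0" becomes an Integer, which a Boolean DataColumn accepts
ElseIf targetType Is GetType(DateTime) Then
Return DateTime.ParseExact(s, "yyyyMMdd", Globalization.CultureInfo.InvariantCulture)
Else
Return s
End If
End Function
Inside the mapping loop, dr(item.ColName) = ConvertField(data(item.CSVIndex), item.DataType) would then replace the If/Else block.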
Time to process 250,001 rows: 3.7 minutes. An app which needs to apply those string transforms to every single string column will take much longer. I'm pretty sure that, using the CsvReader in CsvHelper, you could have those applied as part of parsing to a Type.
There is a potential disaster waiting to happen since this is meant to be an all-purpose importer/scrubber.
For i As Integer = 0 To dt.Columns.Count - 1
Dim columnName As String = firstRow(i).ToString()
Dim newColumn As New DataColumn(columnName, mListOfTypes(i))
getData.Columns.Add(newColumn)
Next
Both the question and the self-answer build the new table using the column names from the CSV and the DataTypes from a SELECT * query on the destination table. So, it assumes the CSV Columns are in the same order that SELECT * will return them, and that all CSVs will always use the same names as the tables.
The answer above is marginally better in that it finds and matches based on name.
A more robust solution is to write a little utility app where a user maps a DB column name to a CSV index. Save the results to a List(Of CSVMapItem) and serialize it. There could be a whole collection of these saved to disk. Then, rather than creating a map based on dead reckoning, just deserialize the desired one for use as the csvMap in the above code.
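A sketch of saving and loading such a map with XmlSerializer. It assumes CSVMapItem is given a public parameterless constructor and that DataType is marked <XmlIgnore()> (XmlSerializer cannot handle System.Type); the type can be re-resolved from the destination table after loading:
Imports System.IO
Imports System.Xml.Serialization
' A sketch: persist a user-defined column map and load it later
Module CsvMapStore
Public Sub SaveMap(map As List(Of CSVMapItem), path As String)
Dim ser As New XmlSerializer(GetType(List(Of CSVMapItem)))
Using fs As New FileStream(path, FileMode.Create)
ser.Serialize(fs, map)
End Using
End Sub
Public Function LoadMap(path As String) As List(Of CSVMapItem)
Dim ser As New XmlSerializer(GetType(List(Of CSVMapItem)))
Using fs As New FileStream(path, FileMode.Open)
Return CType(ser.Deserialize(fs), List(Of CSVMapItem))
End Using
End Function
End Module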

Related

Need to Batch a Large DataTable and write each batch to a Text file - VB.Net

I have a requirement where I need to query a DB and fetch the records into a DataTable. The DataTable has 20,000 records.
I need to batch these records in batches of 100 records each and write each batch to an individual text file.
So far I have been able to batch the records in batches of 100 each using IEnumerable(Of DataRow).
I am now facing an issue in writing the IEnumerable(Of DataRow) to a text file.
My code is as below:
Dim strsql = "Select * from myTable;"
Dim dt As New DataTable
Using cnn As New SqlConnection(connectionString)
cnn.Open()
Using dad As New SqlDataAdapter(strsql, cnn)
dad.Fill(dt)
End Using
cnn.Close()
End Using
Dim Chunks = getChunks(dt, 100)
For Each chunk As IEnumerable(Of DataRow) In Chunks
Dim path As String = "myFilePath"
If Not File.Exists(path) Then
'** Here I will write my Batch into the File.
End If
Next
Public Iterator Function getChunks(ByVal Tab As DataTable, ByVal size As Integer) As IEnumerable(Of IEnumerable(Of DataRow))
Dim chunk As List(Of DataRow) = New List(Of DataRow)(size)
For Each row As DataRow In Tab.Rows
chunk.Add(row)
If chunk.Count = size Then
Yield chunk
chunk = New List(Of DataRow)(size)
End If
Next
If chunk.Any() Then Yield chunk
End Function
Need your help to write the IEnumerable of DataRows into a text file for each batch of records.
Thanks
:)
Your existing code is needlessly complex. If this is all you're doing, then using a DataTable is unnecessary/unwise; this is one of the few occasions where I would advocate using a lower-level DataReader to keep the memory impact low.
Writing a DB table to a file, quick, easy, and with low memory consumption:
Dim dr = sqlCommand.ExecuteReader()
Dim sb as New StringBuilder
Dim lineNum = -1
Dim batchSize = 100
While dr.Read()
'turn the row into a string for our file
For x = 0 to dr.FieldCount -1
sb.Append(dr.GetString(x)).Append(",")
Next x
sb.Length -= 1 'remove trailing comma
sb.AppendLine()
'keep track of lines written so we can batch accordingly
lineNum += 1
Dim fileNum = lineNum \ batchSize
File.AppendAllText($"c:\temp\file{fileNum}.csv", sb.ToString())
'clear the stringbuilder
sb.Length = 0
End While
If you really want to use a DataTable, there isn't anything stopping you from swapping this While dr.Read() loop for a For Each r As DataRow In myDatatable.Rows loop.
Please note, this isn't an exercise in creating a fully escaped CSV, nor in formatting the data; it demonstrates the concept of taking a firehose of data and writing it to N different files in a single pass, using the fact that an integer divide by the batch size maps every line number from 0 to 99 to 0 (and hence file 0), every line number from 100 to 199 to 1 (and hence file 1), and so on.
You could also build the file lines in the StringBuilder and write them out once per batch, when lineNum Mod batchSize = batchSize - 1, if you feel that would be more efficient than calling File.AppendAllText (which opens and closes the file) for every row; see the sketch below.
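A sketch of that buffered variant; dr is the DataReader from the snippet above, and GetValue(...).ToString() is used here so non-string columns don't throw:
' Buffer batchSize lines, then write the file once per batch
Dim sb As New StringBuilder
Dim lineNum = -1
Dim batchSize = 100
While dr.Read()
For x = 0 To dr.FieldCount - 1
sb.Append(dr.GetValue(x).ToString()).Append(",")
Next
sb.Length -= 1 'remove trailing comma
sb.AppendLine()
lineNum += 1
If lineNum Mod batchSize = batchSize - 1 Then
File.AppendAllText($"c:\temp\file{lineNum \ batchSize}.csv", sb.ToString())
sb.Length = 0 'start the next batch with an empty buffer
End If
End While
'flush the final partial batch, if any
If sb.Length > 0 Then
File.AppendAllText($"c:\temp\file{lineNum \ batchSize}.csv", sb.ToString())
End If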
Tested this with a table of a little over 1,500 records and 10 fields. The file creation took a little over 5 seconds (excluding data access). All things being equal (which I know they are not) that would be over 13 seconds writing the files.
Since your problem was with the iterator, I assume there were no memory issues with the DataTable.
You can include more than one database object in a Using block by using a comma to designate a list of objects in the Using.
Private Sub OPCode()
Dim myFilePath = "C:\Users\xxx\Documents\TestLoop\DataFile"
Dim strsql = "Select * from myTable;"
Dim dt As New DataTable
Using cnn As New SqlConnection(connectionString),
cmd As New SqlCommand(strsql, cnn)
cnn.Open()
dt.Load(cmd.ExecuteReader)
End Using
Dim sw As New Stopwatch
sw.Start()
Dim StartRow = 0
Dim EndRow = 99
Dim FileNum = 1
Dim TopIndex = dt.Rows.Count - 1
Do
For i = StartRow To EndRow
Dim s = String.Join("|", dt.Rows(i).ItemArray)
File.AppendAllText(myFilePath & FileNum & ".txt", s & Environment.NewLine)
Next
FileNum += 1
StartRow += 100
EndRow += 100
If EndRow >= TopIndex Then
EndRow = TopIndex
End If
Loop Until StartRow >= TopIndex
sw.Stop()
MessageBox.Show(sw.ElapsedMilliseconds.ToString)
End Sub
I thought your code was a great use of an iterator function.
Here is the code for your iterator.
Public Iterator Function getChunks(ByVal Tab As DataTable, ByVal size As Integer) As IEnumerable(Of IEnumerable(Of DataRow))
Dim chunk As List(Of DataRow) = New List(Of DataRow)(size)
For Each row As DataRow In Tab.Rows
chunk.Add(row)
If chunk.Count = size Then
Yield chunk
chunk = New List(Of DataRow)(size)
End If
Next
If chunk.Any() Then Yield chunk
End Function
Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
Dim dt = LoadDataTable()
Dim myFilePath As String = "C:\Users\xxx\Documents\TestLoop\DataFile"
Dim FileNum = 1
For Each chunk As IEnumerable(Of DataRow) In getChunks(dt, 100)
For Each row As DataRow In chunk
Dim s = String.Join("|", row.ItemArray)
File.AppendAllText(myFilePath & FileNum & ".txt", s & Environment.NewLine)
Next
FileNum += 1
Next
MessageBox.Show("Done")
End Sub
You just needed to nest the For Each to get at the data rows.

How to put data from one column of an MS Excel sheet into an array in VB.NET

I have data in my MS Excel spreadsheet which contains different columns (Sn, Amount and tech id). I am trying to put all the data from the tech id column into an array, like:
mydata = [43219 , 43220 , 43221 , 43222 ,43223 ,43224 , 43225 ]
My code (only the main processing function):
Imports:
Imports System.IO
Imports System.Data.OleDb
Main processing function:
Dim conString1 As String
Dim Mydata(200) As Integer
Dim connection As OleDbConnection
Dim adapter As OleDbDataAdapter
Private Sub LoadData(conStr As String)
connection = New OleDbConnection(conStr)
Dim query As String = "SELECT * FROM [Sheet0$]"
adapter = New OleDbDataAdapter(query, connection)
'Putting data inside array
'For intCount = 0 To lengthofcolumn
'Mydata(intCount) = ?
'Next intCount
Debug.Print(adapter)
End Sub
Calling:
conString1 = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source='{0}';Extended Properties='Excel 8.0'", "F:\MicroTest\data\log.xlsx")
LoadData(conString1)
I am a student and I am learning, so please help. I didn't find a solution for this; mostly I found solutions for viewing Excel data in a DataGrid.
My test data was in B2:B8.
You will need to add the Reference: Microsoft Excel 14.0 Object Library
Dim oExcel As New Microsoft.Office.Interop.Excel.Application
oExcel.Workbooks.Open("C:\TEMP\test_data.xlsx")
Dim oSheet As Microsoft.Office.Interop.Excel.Worksheet = oExcel.Sheets(1)
' I would use list instead of an array.
Dim oTest As New List(Of String)
For Each oValue As String In oSheet.Range("B2:B8").Value2
oTest.Add(oValue)
Next
' Using an array
Dim oData(200) As Integer
Dim iCounter As Integer = 0
For Each oValue As String In oSheet.Range("B2:B8").Value2
oData(iCounter) = CType(oValue, Integer)
iCounter += 1
Next
oExcel.Quit()
I think your approach is good, accessing the file with OleDb and not opening an instance of Excel.
I used a DataReader and DataTable to collect and hold the data in memory.
The Using...End Using blocks ensure your objects that have a Dispose method are closed and disposed properly even if there is an error.
Private Sub LoadData()
Dim dt As New DataTable()
Dim conStr As String = "Your connection string"
Using con As New OleDbConnection(conStr)
Dim query As String = "SELECT * FROM [Sheet1$]"
Using cmd As New OleDbCommand(query, con)
con.Open()
Using dr As OleDbDataReader = cmd.ExecuteReader()
dt.Load(dr)
End Using
End Using
End Using
'The number of rows in the DataTable less the first 2 rows which are title and blank
'and subtract 1 because vb.net arrays are defined array(upper bound)
Dim arraySize As Integer = dt.Rows.Count - 3
Dim myData(arraySize) As Integer
Dim arrayIndex As Integer = 0
'Putting data inside array
For rowIndex As Integer = 2 To dt.Rows.Count - 1
myData(arrayIndex) = CInt(dt.Rows(rowIndex)(3)) '3 is the index of the TechID column
arrayIndex += 1
Next
'Checking the array - delete in final version
'I used i as a variable name because this is a very tiny
'loop and will be deleted eventually. Otherwise, I would
'have used a more descriptive name.
For Each i As Integer In myData
Debug.Print(i.ToString)
Next
End Sub

Selecting a different number of columns in a CSV file

The task is to extract data from multiple CSV files according to a criterion. Each file contains a sampleId (this is the criterion) and other columns. At the end of the file there are the measurement values under columns named 0...100 (the numbers are the actual names of the columns). To make it a bit more interesting, there can be variations between CSV files, depending on the customer's needs. This means the measurement column count can be 15, 25, 50 etc., but no more than 100, and there are no variations within one file. This data is always placed at the end of the line, so there is a fixed set of columns before the numbers.
I'd like to have a SQL statement which can accept parameters:
SELECT {0} FROM {1} WHERE sampleId = {2}
{0} is the list of number columns, {1} is the CSV file name, and {2} is the sampleId we are looking for. The other solution that came to my mind is to take all the columns after the last fixed column. I don't know whether that is possible or not, just thinking out loud.
Please be descriptive, my SQL knowledge is basic. Any help is really appreciated.
So I finally managed to solve it. The code is in VB.NET, but the logic is quite clear.
Private Function GetDataFromCSV(sampleIds As Integer()) As List(Of KeyValuePair(Of String, List(Of Integer)))
Dim dataFiles() As String = System.IO.Directory.GetFiles(OutputFolder(), "*.CSV")
Dim results As List(Of KeyValuePair(Of String, List(Of Integer))) = New List(Of KeyValuePair(Of String, List(Of Integer)))
If dataFiles.Length > 0 And sampleIds.Length > 0 Then
For index As Integer = 0 To sampleIds.Length - 1
If sampleIds(index) > 0 Then
For Each file In dataFiles
If System.IO.File.Exists(file) Then
Dim currentId As String = sampleIds(index).ToString()
Dim filename As String = Path.GetFileName(file)
Dim strPath As String = Path.GetDirectoryName(file)
Dim conn As OleDb.OleDbConnection = New OleDb.OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & strPath & "; Extended Properties='text; HDR=Yes; FMT=Delimited'")
Dim command As OleDb.OleDbCommand = conn.CreateCommand()
command.CommandText = "SELECT * FROM [" & filename & "]" 'the WHERE filter on Sample ID is applied in the loop below
conn.Open()
Dim reader As OleDb.OleDbDataReader = command.ExecuteReader()
Dim numberOfFields = reader.FieldCount
While reader.Read()
If reader("Sample ID").ToString() = currentId Then 'If found write particle data into output file
Dim particles As List(Of Integer) = New List(Of Integer)
For field As Integer = 0 To numberOfFields - 1
particles.Add(CInt(reader(field.ToString())))
Next field
results.Add(New KeyValuePair(Of String, List(Of Integer))(currentId, particles))
End If
End While
conn.Close()
End If
Next file
End If
Next index
Return results
Else
MessageBox.Show("Missing csv files or invalid sample Id(s)", "Internal error", MessageBoxButtons.OK, MessageBoxIcon.Exclamation)
End If
End Function

vb.net access database

I am trying to read fields from a query into a text string array. In VB6 I could simply declare the array and then read the fields into it without it caring what type of values were in it. Now when I try to do the same thing I get an "unable to cast COM object of type 'dao.fieldclass' to type 'system.string'" error. Do I need to read the field value into a separate variable and then convert it to a string? The seqNum is what I am having the problem with.
Public dbEngine As dao.DBEngine
Public db As dao.Database, recSet As dao.Recordset
dbEngine = New dao.DBEngine
Dim seqNum As Long
scExportTemplatePath = "M:\robot\scTemplates\"
db = dbEngine.OpenDatabase(scExportStuffPath & "scExport.mdb")
tsOut = fso.CreateTextFile(wildePath & dte & "-" & fle.Name & ".csv", True)
With recSet
.MoveFirst()
Do While Not .EOF
seg = .Fields("segmentID")
If seg <> segHold Then
seqNum = 1
End If
arrOut(0) = .Fields("jobnum_AM")
Loop
End With
You have several problems with this code. In addition to the points mentioned by Jeremy:
What was Long in VB6 is now Integer in VB.NET. Long is a 64 bit integer now.
Use System.IO.Path.Combine in order to combine path strings. Combine automatically adds missing backslashes and removes superfluous ones. Path.Combine(scExportTemplatePath, "scExport.mdb")
The Field property does not have a default property any more. Non-indexed properties are never default properties in VB.NET. Get the field value with .Fields("segmentID").Value.
Convert its value to the appropriate type: seg = Convert.ToInt32(.Fields("segmentID").Value)
Note: VB's Integer type is just an alias for System.Int32.
You are always adding to the same array field. I don't know exactly what you have in mind. If you want to add one field only, you could just use a List(Of String). If you are adding several fields for each record, then a List(Of String()) (i.e. a list of string arrays) would be appropriate. Lists have the advantage that they grow automatically.
Dim list As New List(Of String())
Do While Not .EOF
Dim values = New String(2) {}
values(0) = Convert.ToString(.Fields("field_A").Value)
values(1) = Convert.ToString(.Fields("field_B").Value)
values(2) = Convert.ToString(.Fields("field_C").Value)
list.Add(values)
recSet.MoveNext()
Loop
But it is more comprehensible if you create a custom class for storing your field values:
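A minimal sketch of such a class (the property names are illustrative and simply match the usage shown below):
' A sketch: one strongly named object per record instead of a string array
Public Class User
Public Property FirstName As String
Public Property LastName As String
Public Property DateOfBirth As Date
End Class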
Console.WriteLine("{0} {1} ({2})", user.FirstName, user.LastName, user.DateOfBirth)
... reads much better than:
Console.WriteLine("{0} {1} ({2})", values(0), values(1), values(2))
In VB.NET you have other possibilities to work with databases:
Dim list As New List(Of String())
Using conn = New OleDb.OleDbConnection( _
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=MyPath\MyDb.mdb")
Dim sql = "SELECT myStuff FROM myTable"
Dim command = New OleDbCommand(sql, conn)
conn.Open()
Using reader As OleDbDataReader = command.ExecuteReader()
While reader.Read()
Dim values = New String(reader.FieldCount - 1) {}
For i = 0 To reader.FieldCount - 1
values(i) = Convert.ToString(reader.GetValue(i))
Next
list.Add(values)
End While
End Using
End Using
Note that the Using statement closes the resources automatically at the end. Even if an error occurs and the code is terminated prematurely.
In VB.NET you can write to files like this (without using fso, which is not .NET like)
Using writer As New StreamWriter("myFile.txt", False)
writer.WriteLine("line 1")
writer.WriteLine("line 2")
writer.WriteLine("line 3")
End Using
1) You don't show how you open the Recordset, e.g.:
recSet = db.OpenRecordset("query_name or SQL")
2) You don't have a .MoveNext() in the loop:
With recSet
.MoveFirst()
Do While Not .EOF
seg = .Fields("segmentID")
If seg <> segHold Then
seqNum = 1
End If
arrOut(0) = .Fields("jobnum_AM")
.MoveNext()
Loop
End With

Best way to merge two DataTables

I need to merge two DataTables with a condition. I have a DataTable where the data comes from a local XML database and another DataTable where the data comes from a remote SQL Server.
If any update is made in the remote DataTable, I need to update/merge it into the local DataTable. Here is what I have so far:
Public Sub MergeTwoTable()
Dim SQL As String = ""
Dim RemoteTable As New DataTable
Dim LocalTable As DataTable
Dim dal As New DalComon
Dim yy As Integer = 0
Dim UpdateDate As String
Dim TableName As String = "V_Book_Price"
LocalTable = LoadDataTable(TableName, True)
UpdateDate = LocalTable.Compute("MAX(update_date)", Nothing)
SQL = "select * from V_Book_Price where Update_Date > '" & UpdateDate & "'"
RemoteTable = dal.GetDataSetBySQL(SQL).Tables(0)
If RemoteTable.Rows.Count > 0 Then
For i = 0 To RemoteTable.Rows.Count - 1
Dim st As DataRow
Dim mm() As DataRow = LocalTable.Select("ID = '" & RemoteTable.Rows(i).Item("ID") & "'")
If mm.Length = 0 Then
st = LocalTable.NewRow
For yy = 0 To RemoteTable.Columns.Count - 1
st(yy) = RemoteTable.Rows(i)(yy)
Next
LocalTable.Rows.Add(st)
Else
st = mm(0)
For yy = 0 To RemoteTable.Columns.Count - 1
If IsDate(RemoteTable.Rows(i)(yy)) Then
st(yy) = CDate(RemoteTable.Rows(i)(yy)).ToString("s")
Else
st(yy) = RemoteTable.Rows(i)(yy)
End If
Next
mm = Nothing
End If
Next
End If
End Sub
In this code, rows are fetched from the remote database whose update date is greater than the latest update date in the local table. Both tables have "ID" as the primary key. The code is working well, but the problem is that when more than 1000 records are updated this function takes too long because of the loops.
Not sure if it is applicable, but have you ever looked at the DataTable.LoadDataRow() method?
It seems a good candidate to replace all of your code above.
Your code could be simplified to these lines:
Dim row as DataRow
For Each row in RemoteTable.Rows
LocalTable.LoadDataRow(row.ItemArray, false)
Next
Another alternative could be DataTable.Merge, which could cut your code down to a single line:
LocalTable.Merge(RemoteTable, False)
However, the real effectiveness of these two methods depends on schema compatibility and on the presence of AutoNumber (identity) columns.
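For Merge to update matching rows instead of appending duplicates, both tables need a primary key; a small sketch, assuming ID is the key as stated in the question:
' Give both tables the ID primary key so Merge matches rows on ID
LocalTable.PrimaryKey = New DataColumn() {LocalTable.Columns("ID")}
RemoteTable.PrimaryKey = New DataColumn() {RemoteTable.Columns("ID")}
' preserveChanges:=False lets the incoming (remote) values overwrite the local ones
LocalTable.Merge(RemoteTable, False)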