How do I query SQL data then insert or update depending on the result - vb.net

I am a beginner at this, but let me explain what I need to do and show you my code.
I have a CSV file.
Inside the CSV I have a ProjectNumber, City, State, Country.
I have a SQL table with the same columns.
I want to use VB.NET to check whether the ProjectNumber exists in the SQL table:
if it exists, I want to run an UPDATE statement;
if it does not exist, I want to run an INSERT statement.
I have the program working, but I am wondering whether this is the correct way to do it or whether my code is some hack way of doing it.
LEGEND:
DTTable is the DataTable holding the CSV rows.
dt is the DataTable holding the SQL result data.
First I insert all lines of the CSV into a DataTable:
Dim parser As New FileIO.TextFieldParser(sRemoteAccessFolder & "text.csv")
parser.Delimiters = New String() {","}
parser.ReadLine() 'skip the header row
Do Until parser.EndOfData
DTTable.Rows.Add(parser.ReadFields())
Loop
parser.Close()
Then I use an OleDbDataAdapter to run the SELECT query and fill another DataTable with the result:
SQLString = "select * from tblProjects where ProjectID='" & DTTable.Rows.Item(i).Item("ProjectNumber") & "'"
da = New OleDb.OleDbDataAdapter(SQLString, Conn)
da.Fill(dt)
Then I run an If statement:
If dt.Rows.Count = 0 Then
SQLString = "INSERT STATEMENT HERE"
oCmd = New OleDb.OleDbCommand(SQLString, Conn)
oCmd.ExecuteNonQuery()
Else
SQLString = "UPDATE STATEMENT HERE"
oCmd = New OleDb.OleDbCommand(SQLString, Conn)
oCmd.ExecuteNonQuery()
End If
All of the above code runs inside a For loop that goes through every line in the CSV:
For i = 0 To DTTable.Rows.Count - 1
What do you think?
Please advise.
Thank you.

Personally, I wouldn't use .NET. I would import the table into a temp SQL Server table and then write my queries to insert/update data from the temp table to the regular table. This is certainly the way you want to go if the dataset is large.
If this is a process you need to repeat frequently, you could make an SSIS package.
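For example, here is a rough sketch of that staging-table approach (purely illustrative, not tested against your schema: it assumes SQL Server via System.Data.SqlClient, a connection string in sConnectionString, a staging table named tblProjectsStaging with the same columns as the CSV, and the column names from your question):
' Bulk-load the CSV rows (already in DTTable) into the staging table.
Using bulk As New SqlBulkCopy(sConnectionString)
    bulk.DestinationTableName = "tblProjectsStaging"
    bulk.WriteToServer(DTTable)
End Using

' Then let the server do the update/insert in two set-based statements.
Using conn As New SqlConnection(sConnectionString)
    conn.Open()
    Dim sql As String =
        "UPDATE p SET p.City = s.City, p.State = s.State, p.Country = s.Country " &
        "FROM tblProjects p INNER JOIN tblProjectsStaging s ON p.ProjectID = s.ProjectNumber; " &
        "INSERT INTO tblProjects (ProjectID, City, State, Country) " &
        "SELECT s.ProjectNumber, s.City, s.State, s.Country FROM tblProjectsStaging s " &
        "WHERE NOT EXISTS (SELECT 1 FROM tblProjects p WHERE p.ProjectID = s.ProjectNumber)"
    Using cmd As New SqlCommand(sql, conn)
        cmd.ExecuteNonQuery()
    End Using
End Using
On SQL Server 2008 and later you could also collapse the two statements into a single MERGE, and truncate the staging table before each run if you reuse it.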

I'd run the SELECT query using datareader = command.ExecuteReader(). Then:
If datareader.Read() Then
    'A row came back, so the project already exists: run the UPDATE query here,
    'using datareader(0) in the WHERE predicate
Else
    'No row came back: run the INSERT query here
End If
I should say, I'm a relative novice too, so maybe others can suggest a more elegant way of doing it.
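For what it's worth, one slightly tidier variant is to let the database do the counting and use a parameter for the project number instead of concatenating it into the string; a sketch against the question's tblProjects, assuming the open OleDbConnection named Conn from your code:
Dim checkCmd As New OleDb.OleDbCommand("SELECT COUNT(*) FROM tblProjects WHERE ProjectID = ?", Conn)
checkCmd.Parameters.AddWithValue("?", DTTable.Rows(i)("ProjectNumber"))
Dim exists As Boolean = CInt(checkCmd.ExecuteScalar()) > 0

If exists Then
    'UPDATE statement here
Else
    'INSERT statement here
End If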

Related

Getting "Database is Locked" when trying to move a list of records from one table to another table in SQLite

I have a Public Sub that moves a collection of records from one table to another in the same SQLite database. First it reads a record from strFromTable, then writes it to strToTable, then deletes the record from strFromTable. To speed things up, I've wrapped the whole collection of records in a transaction. When the list involves moving a lot of image blobs, the database gets backed up and throws the exception "The database is locked". I think what is happening is that it hasn't finished writing one record before it starts trying to write the next one. Since SQLite only allows one write at a time, it throws the "locked" exception.
Here is the code that triggers the error when moving a lot of image blobs:
Using SQLconnect = New SQLiteConnection(strDbConnectionString)
SQLconnect.Open()
Using tr = SQLconnect.BeginTransaction()
Using SQLcommand = SQLconnect.CreateCommand
For Each itm As ListViewItem In lvcollection
SQLcommand.CommandText = $"INSERT INTO {strToTable} SELECT * FROM {strFromTable} WHERE id = {itm.Tag}; DELETE FROM {strFromTable} WHERE ID = {itm.Tag};"
SQLcommand.ExecuteNonQuery()
Next
End Using
tr.Commit()
End Using
End Using
When I get rid of the transaction, it executes without error:
Using SQLconnect = New SQLiteConnection(strDbConnectionString)
SQLconnect.Open()
Using SQLcommand = SQLconnect.CreateCommand
For Each itm As ListViewItem In lvcollection
SQLcommand.CommandText = $"INSERT INTO {strToTable} SELECT * FROM {strFromTable} WHERE id = {itm.Tag}; DELETE FROM {strFromTable} WHERE ID = {itm.Tag};"
SQLcommand.ExecuteNonQuery()
Next
End Using
End Using
I'm not very good with DB operations, so I'm sure there is something that needs improvement. Is there a way to make SQLite completely finish the previous INSERT before executing the next INSERT? How can I change my code to allow using a transaction?
Thank you for your help.
OK, here is the solution I decided to go with. I hope this helps someone who finds this in a search:
Dim arrIds(lvcollection.Count - 1) As String
Dim i As Integer
' Load the array with all the Tags in the ListView collection
For i = 0 To lvcollection.Count - 1
arrIds(i) = lvcollection(i).Tag 'item.Tag holds the primary key "id" field in the DB
Next
'Build a comma-separated string of all ids from the array of ids.
Dim strIds As String = String.Join(", ", arrIds)
Using SQLconnect = New SQLiteConnection(strDbConnectionString)
SQLconnect.Open()
Using tr = SQLconnect.BeginTransaction()
Using SQLcommand = SQLconnect.CreateCommand
SQLcommand.CommandText = $"INSERT INTO {strToTable} SELECT * FROM {strFromTable} WHERE id IN ({strIds});"
SQLcommand.ExecuteNonQuery()
SQLcommand.CommandText = $"DELETE FROM {strFromTable} WHERE ID IN ({strIds});"
SQLcommand.ExecuteNonQuery()
End Using
tr.Commit()
End Using
End Using
The IN clause lets me pass all of the "id" values as a single batch. This solution is faster, and because both statements run inside one transaction it is safer than doing them one by one with no transaction.
Thanks for the comments, and best wishes to everyone in their coding.

Pass parameter to a query from another query in Access

I have a parameterized query GET_CUSTOMER:
SELECT * FROM Customer WHERE id = [customer_id]
I want to call this query from another query and pass it a parameter:
SELECT * FROM GET_CUSTOMER(123)
Note the above code is not valid; it is just to give you an idea of what I'm trying to do. Is it possible to do this in MS Access?
UPDATE 1:
The queries I posted are just examples. The actual queries are much more complex. I know I can use table joins, but in my specific case it would be much easier if I could run parameterized queries inside other queries (that are parameterized as well). I can't use Access forms because I'm using Access from my .NET application.
This is how I ended up solving it, with help from https://stackoverflow.com/a/24677391/303463. It turned out that Access shares parameters among all the queries involved, so there is no need to explicitly pass parameters from one query to another.
Query1:
SELECT * FROM Customer WHERE ID > [param1] AND ID < [param2]
Query2:
SELECT * FROM Query1
VB.NET code:
Dim ConnString As String = "Provider=Microsoft.Jet.OleDb.4.0;Data Source=Database.mdb"
Dim SqlString As String = "Query2"
Using Conn As New OleDbConnection(ConnString)
Using Cmd As New OleDbCommand(SqlString, Conn)
Cmd.CommandType = CommandType.StoredProcedure
Cmd.Parameters.AddWithValue("param1", "1")
Cmd.Parameters.AddWithValue("param2", "3")
Conn.Open()
Using reader As OleDbDataReader = Cmd.ExecuteReader()
While reader.Read()
Console.WriteLine(reader("ID"))
End While
End Using
End Using
End Using
You can build the SQL on the fly:
MyID = 123 'prompt, or get some ID from the user
strSQL = "SELECT * FROM tblCustomer WHERE ID IN " & _
"(SELECT id FROM tblTestCustomers WHERE id = " & MyID & ")"
So you can nest queries, or use the output of one query to feed a list of IDs to the second query.
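If you go the dynamic-SQL route from .NET, it may be worth swapping the string concatenation for an OleDb parameter so you don't have to worry about quoting; a minimal sketch, reusing the ConnString from the answer above and a hypothetical MyID value:
Dim sql As String = "SELECT * FROM tblCustomer WHERE ID IN (SELECT id FROM tblTestCustomers WHERE id = ?)"
Using Conn As New OleDbConnection(ConnString)
    Using Cmd As New OleDbCommand(sql, Conn)
        Cmd.Parameters.AddWithValue("?", MyID) ' OleDb parameters are positional
        Conn.Open()
        Using reader As OleDbDataReader = Cmd.ExecuteReader()
            While reader.Read()
                Console.WriteLine(reader("ID"))
            End While
        End Using
    End Using
End Using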

Vb.net pull in a SQL table row by row

I am a little new to using VB.NET and SQL, so I figured I would check with you guys to see if what I am doing makes sense, or if there is a better way. For the first step, I need to read in all the rows from a couple of tables and store the data in the way the code needs to see it. First I get a count:
myCommand = New SqlCommand("SELECT COUNT(*) FROM TableName", SQLConnection)
Try
SQLConnection.Open()
count = myCommand.ExecuteScalar()
Catch ex As SqlException
Finally
SQLConnection.Close()
End Try
Now I just want to iterate through the rows, but I am having a hard time with two parts. First, I cannot figure out the SELECT statement that will let me grab a particular row of the table. I saw the example here: How to select the nth row in a SQL database table?. However, that shows how to do it in SQL only, and I was not sure how well it would translate to a VB.NET call.
Second, in the code above, myCommand.ExecuteScalar() tells VB that we expect a single value back. I believe the SELECT statement will return rows, but I do not know which Execute() method tells the code to expect that.
Thank you in advance.
A simple approach is to use a DataTable which you iterate row by row. You can use a DataAdapter to fill it. Use the Using statement to properly dispose/close objects that implement IDisposable, like the connection:
Dim table = New DataTable
Using sqlConnection = New SqlConnection("ConnectionString")
Using da = New SqlDataAdapter("SELECT Column1, Column2, ColumnX FROM TableName ORDER By Column1", sqlConnection)
' you don't need to open/close the connection with a DataAdapter; Fill does it for you
da.Fill(table)
End Using
End Using
Now you can iterate all rows with a loop:
For Each row As DataRow In table.Rows
Dim col1 As Int32 = row.Field(Of Int32)(0)           ' access by ordinal
Dim col2 As String = row.Field(Of String)("Column2")  ' access by column name
' ...
Next
or use the table as a DataSource for a databound control.
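If you would rather not buffer everything in a DataTable, ExecuteReader is the method that returns rows (ExecuteScalar is only for a single value); a minimal sketch against the same hypothetical table and columns:
Using sqlConnection = New SqlConnection("ConnectionString")
    Using cmd = New SqlCommand("SELECT Column1, Column2 FROM TableName ORDER BY Column1", sqlConnection)
        sqlConnection.Open()
        Using reader = cmd.ExecuteReader()
            While reader.Read()
                Dim col1 As Int32 = reader.GetInt32(0)
                Dim col2 As String = reader.GetString(1)
                ' process the current row here
            End While
        End Using
    End Using
End Using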

Update table while reading

I'm writing a piece of code (VB.NET) to cleanse a (quite big) table of data.
I am connecting to my SQL database, looping through the table, cleansing the data, and adding the cleansed data to a different column.
I'm currently doing an update to the database for each record, in the same loop where I am cleansing the data. I am wondering if there is a more efficient way of doing this, where I would cleanse the data and afterwards send all the updated records to the database in one go.
Simplified code:
'Connect
SQLConn.ConnectionString = strConnection
SQLConn.Open()
SQLCmd.Connection = SQLConn
SQLConn2.ConnectionString = strConnection
SQLConn2.Open()
SQLCmd2.Connection = SQLConn2
'Set query
strSQL = "SELECT Column1 FROM Table1"
SQLCmd.CommandText = strSQL
'Load Query
SQLdr = SQLCmd.ExecuteReader
'Start Cleansing
While SQLdr.Read
Cleansing()
'Add to database
strSQL2 = "UPDATE Table1 SET Clean_data = '" & strClean & "' WHERE Dirty_Data = '" & SQLdr(0).ToString & "'"
SQLCmd2.CommandText = strSQL2
SQLCmd2.ExecuteNonQuery()
End While
'Close Connections
SQLdr.Close()
SQLConn.Close()
SQLConn2.Close()
I'm guessing (from searching for a solution) that it is possible to do the update outside of my loop, but I can't seem to find how to do it specifically.
Many thanks!
Your code is taking a long time because the update is doing a full table scan for every record. You can speed it up by adding an index on the Dirty_Data column.
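For example (a sketch; it assumes SQL Server and the table/column names from the question):
CREATE INDEX IX_Table1_DirtyData ON Table1 (Dirty_Data);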
Essentially, you are reading the data with the SELECT statement, cleaning one row, and then updating it. The preferred "set-based" approach would be to do the whole thing in a single statement:
update table1
set column1 = <fix the dirty data>
where column1 <is dirty>
And you have options in SQL, such as REPLACE(), CASE, and LIKE, that can help with this process.
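For example, if the cleansing boiled down to stripping a character, the whole job could be one statement (purely illustrative: the REPLACE call stands in for whatever your Cleansing() routine actually does, and the column names come from your UPDATE):
UPDATE Table1
SET Clean_Data = REPLACE(Dirty_Data, '-', '')
WHERE Clean_Data IS NULL;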
But you already have the cleaning code external to the database. For this, you want to create and open a cursor, process the record, and then write back. Cursors are relatively slow, compared to in-database operations. But, this is exactly the situation they were designed for -- external code to be applied to individual records.

Is it possible to insert an entire VB.NET DataTable into a SQL Server at once

I have a DataSet in VB.NET (filled via SqlClient), and I want to insert the entire thing into a SQL Server table without having to do the following:
For Each dr As DataRow In MyDataset.Tables(0).Rows
Dim sc As New SqlCommand("INSERT INTO MyNewTable " & _
"VALUES (@column1, @column2)", MyDBConnection)
sc.Parameters.AddWithValue("@column1", dr.Item(0))
sc.Parameters.AddWithValue("@column2", dr.Item(1))
sc.ExecuteNonQuery()
Next
Since I've got close to a million rows (all pretty skinny, so it's not much space), I obviously don't want to run this loop and generate a million INSERT statements.
I know that one option is to use a linked server when I initially fetch the data, since it's coming from another SQL Server, and just have it do the INSERT from there. However, if I already have the data in my application, is there a more efficient way to bulk insert it? Can I somehow pass the DataTable as a parameter to SQL Server and have it sort it out and insert the rows?
Try SqlBulkCopy.
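A minimal sketch of what that could look like (table and column names are placeholders, and MyDBConnection is assumed to be an open SqlConnection; MyDataTable is the table inside your DataSet):
Using bulk As New SqlBulkCopy(MyDBConnection)
    bulk.DestinationTableName = "MyNewTable"
    ' Map source columns to destination columns if names or order differ.
    bulk.ColumnMappings.Add("column1", "column1")
    bulk.ColumnMappings.Add("column2", "column2")
    bulk.WriteToServer(MyDataTable)
End Using
This streams all the rows in one bulk operation instead of issuing a million individual INSERT statements, which is exactly the scenario you describe.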
With SQL Server 2008 you can use table-valued parameters:
Dim sc As New SqlCommand(
"INSERT INTO MyNewTable (field1, field2,...) " &
"SELECT field1, field2,... FROM @MyTable;", MyDBConnection)
sc.Parameters.AddWithValue("@MyTable", MyDataTable) ' pass a DataTable, not the whole DataSet
sc.ExecuteNonQuery()
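One caveat: a table-valued parameter generally needs a matching user-defined table type on the server, and with ad hoc SQL the parameter has to be marked as structured and given that type's name. The AddWithValue line above would then become something like this sketch (dbo.MyTableType is a placeholder name):
With sc.Parameters.AddWithValue("@MyTable", MyDataTable)
    .SqlDbType = SqlDbType.Structured
    .TypeName = "dbo.MyTableType" ' created on the server with CREATE TYPE ... AS TABLE (...)
End With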
Use the SqlDataAdapter's InsertCommand to define your Insert query. Then call the DataAdapter's Update Method with your dataset as a parameter to have it push the data.
Something like:
Dim DA As New SqlDataAdapter
Dim Parm As SqlParameter
DA.InsertCommand = New SqlCommand("INSERT INTO tbl1 (fld0, fld1, fld2) VALUES (@fld0, @fld1, @fld2)", conn)
Parm = DA.InsertCommand.Parameters.Add(New SqlParameter("@fld0", SqlDbType.NVarChar, 50, "fld0"))
Parm = DA.InsertCommand.Parameters.Add(New SqlParameter("@fld1", SqlDbType.NVarChar, 50, "fld1"))
Parm = DA.InsertCommand.Parameters.Add(New SqlParameter("@fld2", SqlDbType.NVarChar, 50, "fld2"))
' Rows in dataset1's "tbl1" table must be in the Added state for the InsertCommand to fire.
DA.Update(dataset1, "tbl1")
You could call .WriteXml() on the DataSet and dump that into the database in one insert.
A much simpler way is to use a table adapter. Then you can call its Update method and pass a DataTable as the argument:
Dim oStronglyTypedTable As StronglyTypedDataTable = GetTable() 'A custom function that builds your table from wherever you want
If oStronglyTypedTable IsNot Nothing Then
Using oAdapter As New StronglyTypedTableAdapter
Dim res As Integer = oAdapter.Update(oStronglyTypedTable)
MsgBox(res & " rows have been updated.")
End Using
End If
Do not forget to change your database's "Copy to Output Directory" property to "Do not copy" and to set your connection string properly...