Copying table contents from one database to another from a VB.NET app using OracleDataAdapter.InsertCommand - vb.net

So the rundown of what I'm trying to achieve: essentially an update app that will pull data from our most recent production databases and copy its contents to the Devl or QA databases. I plan to limit what is chosen to a set number of rows, so the update can happen more regularly by only pulling what we need; right now these databases rarely get updated due to the sheer size of the copy job. The actual PL/SQL commands will be stored in a table that I plan to reference for each table, but I'm currently stuck on the best and easiest way to transfer the data between these two databases while still using my CommandText. I figured the best way would be the OracleDataAdapter.InsertCommand property, but very few examples can be found of what I'm doing. Any suggestions aside from InsertCommand are welcome, as I'm still getting my footing with Oracle altogether.
Dim da As OracleDataAdapter = New OracleDataAdapter
Dim cmd As New OracleCommand()
GenericOraLoginProvider.Connect()
' Create the SelectCommand.
cmd = New OracleCommand("SELECT * FROM TAT_TESTTABLE ", GenericOraLoginProvider.Connection())
da.SelectCommand = cmd
' Create the InsertCommand.
cmd = New OracleCommand("INSERT INTO TAT_TEMP_TESTTABLE", GenericOraLoginProvider.Connection())
da.InsertCommand = cmd
Question: this is an example of what I've been trying as a first step with the InsertCommand. TAT_TESTTABLE and TAT_TEMP_TESTTABLE are just junk tables that I loaded with data to see if I could move things the way I wanted.
As for why I'm asking this question: the data isn't transferring over. While these tables are on the same database now, in the future they will not be, along with the change to the previously mentioned PL/SQL commands. Thank you for any help or words of wisdom you can provide, and sorry for the wall of text; I tried to keep it specific.

Look up SqlBulkCopy. I use this to transfer data between all kinds of vendor databases: https://msdn.microsoft.com/en-us/library/ex21zs8x(v=vs.110).aspx
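For an Oracle-to-Oracle copy, the same pattern is available through ODP.NET's OracleBulkCopy class, which mirrors the SqlBulkCopy API. A minimal sketch in VB.NET, assuming ODP.NET (Oracle.DataAccess.Client) is referenced and reusing the junk table names from the question:
Imports Oracle.DataAccess.Client

Sub CopyTable(sourceConnStr As String, destConnStr As String)
    Using srcConn As New OracleConnection(sourceConnStr),
          dstConn As New OracleConnection(destConnStr)
        srcConn.Open()
        dstConn.Open()
        ' Stream the rows out of the source table...
        Using cmd As New OracleCommand("SELECT * FROM TAT_TESTTABLE", srcConn),
              reader As OracleDataReader = cmd.ExecuteReader()
            ' ...straight into the destination table on the other connection.
            Using bulk As New OracleBulkCopy(dstConn)
                bulk.DestinationTableName = "TAT_TEMP_TESTTABLE"
                bulk.WriteToServer(reader)
            End Using
        End Using
    End Using
End Sub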

Related

Searching a table and finding a record

I am doing a self-study on programming and trying to develop a point of sale.
I am currently struggling with searching a table, finding an item from my inventory table, and then adding the item to my posinvoicedetails table with a default qty of 1.
I cannot understand how to open a connection on two different tables and search one and update the other at the same time.
Can someone please explain the logic behind this? I will try to figure out the code thereafter.
I can open one table, search for an item in a DataGridView, and update records.
Thanks
Your question is quite vague so it's hard to guess what you want exactly. But this is what I think you're looking for.
Open the connection to the database.
Query it to search the specific table (the one you want to search) for what you're looking for.
Close the connection.
Open the connection again.
Update the other table.
P.S. Through all of this, it should be the same database that you're connecting to but different tables within it.
Also, you need this code to connect to the database:
provider = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source ="
dataFile = "C:\Users\file.accdb"
connString = provider & dataFile
myConnection.ConnectionString = connString
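A rough sketch of that flow in VB.NET with OleDb (a single open connection can in fact serve both commands); the Inventory and PosInvoiceDetails table and column names here are only illustrative:
Imports System.Data.OleDb

Sub AddItemToInvoice(itemCode As String)
    Dim connString As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Users\file.accdb"
    Using conn As New OleDbConnection(connString)
        conn.Open()
        ' 1) Search the inventory table for the item.
        Dim price As Object = Nothing
        Using findCmd As New OleDbCommand("SELECT Price FROM Inventory WHERE ItemCode = ?", conn)
            findCmd.Parameters.AddWithValue("?", itemCode)
            price = findCmd.ExecuteScalar()
        End Using
        If price Is Nothing Then Return ' item not found
        ' 2) Add the item to the invoice-details table with a default qty of 1.
        Using insCmd As New OleDbCommand("INSERT INTO PosInvoiceDetails (ItemCode, Qty, Price) VALUES (?, 1, ?)", conn)
            insCmd.Parameters.AddWithValue("?", itemCode)
            insCmd.Parameters.AddWithValue("?", price)
            insCmd.ExecuteNonQuery()
        End Using
    End Using
End Sub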

VB6 Access speed when running UPDATEs on a table

I know VB6 is a bit out of date, but that's the application I have inherited at the moment.
I have to do an update based on the results of an array to an access table.
The array contains a double and the id of the record to update.
The issue is that there are 120,000 records to update but on a test it took 60 seconds just to run it on 374 records.
This is my update code
Dim oCon As New ADODB.Connection
Dim oRs As New ADODB.Recordset
Dim string3 As String
oCon.Open "Driver={Microsoft Access Driver (*.mdb)};Dbq=" & App.Path &"\hhis.mdb;Pwd=/1245;"
oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
string3 = "UPDATE YourRatings SET yourRatings=" & YourTotalRatingsAll(i) & " Where thisID = " & thisID(i) & ";"
oRs.Open string3, oCon, adOpenStatic, adLockOptimistic
Next i
oCon.CommitTrans
oCon.Close
I added the "CommitTrans" based on some other articles that I had read but that doesn't seem to increase the speed.
The other problem is that I am going to have to run another query to rank the highest (1) to lowest (374) and update the db again... although I can probably do something with the array to add that column at the same time.
This seems quite slow to me, especially when other posts mention 200,000 records in 14 seconds.
Have I just missed something?
Would it be quicker to load from a text file?
Thank you in advance for any help.
Mal
With Open you are always constructing a new Recordset object. Try oCon.Execute string3, which only sends the SQL to your database without the Recordset overhead.
Ensure that you have an index on thisID.
Maybe your Access DB sits on a network drive. That could have a large performance impact. Try it locally.
Why are you using the creaky old Access Desktop ODBC Driver instead of the Jet 4.0 OLEDB Provider?
Why are you opening a Recordset to do an UPDATE instead of calling Execute on the Connection?
Is there any reason you can't open the database for exclusive access to perform this operation? Locking overhead is all but eliminated and "cargo cult" techniques like thrashing with BeginTrans/CommitTrans lose any significance.
Do you have an index on your thisID field in the table?
Please do move to the magic bullet .Net as soon as possible. Your programs will be even slower but we won't have to read all the whinging blaming VB6.
Just to add to Wumpz's answer, you might want to try adding the query directly into the Access database as a saved query and calling it using parameters. This SO thread says far more about it.
Parameters are far more stable and far less open to injection than concatenating values into the SQL.

XML file generation using Windows Azure SQL Database

Could someone tell me the best way to generate an XML file with data from a Windows Azure SQL database?
Basically, I want to create an XML file with data from a Windows Azure SQL database by querying a certain table, and the data is huge (around 90 MB). As I need to run this job at least every couple of hours, it should perform very well.
Any suggestions ?
Thanks,
Prav
This is a very general question, and not really specific to SQL Azure, so it's difficult to give you a good answer. I would suggest you research the different ways to query a SQL database, and also the different ways of writing XML. That might give you ideas for more specific questions to ask.
90 MB is not particularly large - it shouldn't be difficult to fit that into memory. But nonetheless you might want to consider approaches that keep only a small part of the data in memory at once, e.g. reading data from a SqlDataReader and immediately writing it to an XmlTextWriter, or something along those lines.
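A minimal sketch of that streaming approach in VB.NET, where each row read from the SqlDataReader is written straight to an XmlWriter so only one row is held in memory at a time; the table name, element names and output path are illustrative:
Imports System.Data.SqlClient
Imports System.Xml

Sub ExportTableToXml(connectionString As String, outputPath As String)
    Using conn As New SqlConnection(connectionString),
          cmd As New SqlCommand("SELECT * FROM SomeTable", conn)
        conn.Open()
        Dim settings As New XmlWriterSettings With {.Indent = True}
        Using reader As SqlDataReader = cmd.ExecuteReader(),
              writer As XmlWriter = XmlWriter.Create(outputPath, settings)
            writer.WriteStartElement("rows")
            While reader.Read()
                writer.WriteStartElement("row")
                For i As Integer = 0 To reader.FieldCount - 1
                    ' EncodeName keeps column names that aren't valid XML names from breaking the writer.
                    writer.WriteElementString(XmlConvert.EncodeName(reader.GetName(i)),
                                              Convert.ToString(reader.GetValue(i)))
                Next
                writer.WriteEndElement()
            End While
            writer.WriteEndElement()
        End Using
    End Using
End Sub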
One way to do what you are looking for is to query the database and have the result saved to an ADO.NET DataTable. Once you have the DataTable, give it a name of your choosing using the TableName property. Then use the WriteXml method of the DataTable to save it to a location of your choosing. Make sure to specify XmlWriteMode.WriteSchema so that you save both the schema and the data.
Note that if the DataTable is going to be 2 GB or greater, you hit the default object memory limit of .NET. One solution is to break your query into smaller chunks and store multiple DataTables in XML format per original query. Another solution is to increase the max size of an object in .NET to greater than 2 GB. This comes with its own set of risks and performance issues. However, to exceed that 2 GB object size restriction, make sure that your application is 64-bit, the application is compiled against .NET 4.5 or later, and the app.config file has gcAllowVeryLargeObjects enabled="true".
using System.Data;
using System.Data.SqlClient;
string someConnectionString = "enter Azure connection string here";
DataTable someDataTable = new DataTable();
SqlConnection someConnection = new SqlConnection(someConnectionString);
someConnection.Open();
SqlDataReader someReader = null;
// enter your query below
SqlCommand someCommand = new SqlCommand("SELECT * FROM [SomeTableName]", someConnection);
// Since you are downloading a large amount of data, I effectively turned the timeout off in the line below
someCommand.CommandTimeout = 0;
someReader = someCommand.ExecuteReader();
someDataTable.Load(someReader);
someConnection.Close();
// you need to name the datatable before saving it in XML
someDataTable.TableName = "anyTableName";
string someFileNameAndLocation = @"C:\Backup\backup1.xml";
// the XmlWriteMode is necessary to save the schema and data of the datatable
someDataTable.WriteXml(someFileNameAndLocation, XmlWriteMode.WriteSchema);

Best way to store SQL scripts in a database table

I need to store stored procedure execution scripts in a database table.
As an example:
exec proc_name 'somedata'
These are for execution at a later time after the data that will be changed has gone through a moderation process.
What is the best way to cleanse the script so that the statement cannot be used for SQL injection?
Is there a specific type of encoding that I can use? Or is it as simple as doing a replacement on the ' character?
Then it sounds like you would want to use a varchar(max) column and have a separate table for parameters. If you use parameters you should be safe from SQL injection. See the quick C# example below:
C# pseudo-code example
// Look up the stored script text (the lookup itself is parameterized, so it is safe):
SqlCommand command = new SqlCommand("select * from myScripts where scriptid = @scriptid", connection);
command.Parameters.AddWithValue("@scriptid", 12);
// ...do the same against myParams to load the stored parameter names and values...
SqlDataReader dr = command.ExecuteReader();
dr.Read();
// Build the stored script as its own command and attach its parameters:
SqlCommand userCommand = new SqlCommand((string)dr["sql"], connection);
foreach (var parameter in scriptParameters)   // scriptParameters loaded from myParams
{
    userCommand.Parameters.AddWithValue(parameter.Name, parameter.Value);
}
userCommand.ExecuteNonQuery();
There is no way to "cleanse" scripts.
The only way to secure your code is to separate the code from data. And "cleanse" the data only.
That's why we have our code separated from data.
The code is solid and secured, and data is variable and has to be "cleansed".
As you are breaking this fundamental law, treating the code as data, there is no way to secure it.
Judging by the sheer unusualness of the task, I'd say there is a proper solution for sure.
You have just chosen the wrong architecture.
So, you'd better ask another question, something like "I want to deal with quite a complex metadata structure (with the structure and the purpose provided)", and you will get a proper solution that requires no storing of SQL code among the data.
You can store your scripts for later execution in either a stored procedure or a scheduled job. I don't see any reason for encoding a stored procedure, as you can set user privileges to prevent different users from reading or even seeing them.

Insert Records repeatedly faster

I'm monitoring a folder for Jpg files and need to process the incoming files.
I decode the filename to get all the information I want and insert into a table and then move the file to another folder.
The file name already contains all the information I want, e.g.
2011--8-27_13:20:45_MyLocation_User1.jpg.
Now I'm using an Insert statement
Private Function InsertToDB(ByVal SourceFile As String, ByVal Date_Time As DateTime, ByVal Loc As String, ByVal User As String) As Boolean
Dim conn As SqlConnection = New SqlConnection(My.Settings.ConString)
Dim sSQL As String = "INSERT INTO StageTbl ...."
Dim cmd As SqlCommand
cmd = New SqlCommand(sSQL, conn)
....Parameters Set ...
conn.Open()
cmd.ExecuteNonQuery()
conn.Close()
conn = Nothing
cmd = Nothing
End Function
The function will be called for every single file found.
Is this the most efficient way? It looks like it is very slow. I need to process about 20~50 files/sec. Probably a stored procedure?
I need to do this as fast as possible. I guess bulk insert is not applicable here.
Please help.
Bulk insert could be applicable here - do you need them to be in the DB instantly, or could you just build up the records in memory and then push them into the database in batches? (See the sketch after this answer.)
Are you multi-threading as well? Otherwise your end-to-end process could fall behind.
Another solution would be to use message queues - pop a message into the queue for every file, then have a process (on a different thread) that is continually reading the queue and adding to the database.
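For the batching approach, a rough sketch in VB.NET: the decoded file details are collected into a DataTable and pushed with SqlBulkCopy once a batch is full. The 500-row threshold and the column names are illustrative; StageTbl and My.Settings.ConString are taken from the question.
Imports System.Data
Imports System.Data.SqlClient

Module BatchInsertSketch
    Private ReadOnly pending As New DataTable("StageTbl")

    Sub Init()
        pending.Columns.Add("SourceFile", GetType(String))
        pending.Columns.Add("DateTaken", GetType(DateTime))
        pending.Columns.Add("Loc", GetType(String))
        pending.Columns.Add("UserName", GetType(String))
    End Sub

    Sub QueueFile(sourceFile As String, dateTaken As DateTime, loc As String, userName As String)
        pending.Rows.Add(sourceFile, dateTaken, loc, userName)
        If pending.Rows.Count >= 500 Then Flush() ' push a batch every 500 files, for example
    End Sub

    Sub Flush()
        If pending.Rows.Count = 0 Then Return
        Using conn As New SqlConnection(My.Settings.ConString)
            conn.Open()
            Using bulk As New SqlBulkCopy(conn)
                bulk.DestinationTableName = "StageTbl"
                bulk.WriteToServer(pending)
            End Using
        End Using
        pending.Clear()
    End Sub
End Module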
There are several things you can do to optimize the speed of this process:
Don't open and close the connection for every insert. That alone will yield a (very) significant performance improvement (unless you were using connection pooling already).
You may gain performance if you disable autocommit and perform inserts in blocks, committing the transaction after every N rows (100-1000 rows is a good number to try for a start); see the sketch after this list.
Some DB systems provide a syntax to allow insertion of multiple rows in a single query. SQL Server doesn't but you may be interested on this: http://blog.sqlauthority.com/2007/06/08/sql-server-insert-multiple-records-using-one-insert-statement-use-of-union-all/
If there are many users/processes accessing this table, access can be slow depending on your transaction isolation level. In your case (20-50 inserts/sec) this shouldn't make a big difference. I don't recommend modifying this unless you understand well what you are doing: http://en.wikipedia.org/wiki/Isolation_%28database_systems%29 and http://technet.microsoft.com/es-es/library/ms173763.aspx .
I don't think a stored procedure will necessarily provide a big performance gain. You are only parsing/planning the insert 20-50 times per second. Use a stored procedure only if it fits your development model well. If all your other queries are in code, you can avoid it.
Ensure your bottleneck really is the database (i.e. that moving the files is not taking a lot of time); the OS should be good at that, so check the points above first. If you find that moving files is your bottleneck, delaying the moves or doing them in the background (on another thread) can help to a certain extent.
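A minimal sketch of the first two points in VB.NET: one connection kept open for the whole run, a single parameterized INSERT reused for every row, and a transaction committed every N rows. The column names, types and the 500-row batch size are illustrative only.
Imports System.Data
Imports System.Data.SqlClient

Sub InsertAll(rows As List(Of String()), connString As String)
    Using conn As New SqlConnection(connString)
        conn.Open()
        Using cmd As New SqlCommand(
            "INSERT INTO StageTbl (SourceFile, Loc, UserName) VALUES (@f, @l, @u)", conn)
            cmd.Parameters.Add("@f", SqlDbType.NVarChar, 260)
            cmd.Parameters.Add("@l", SqlDbType.NVarChar, 100)
            cmd.Parameters.Add("@u", SqlDbType.NVarChar, 100)
            Dim tx As SqlTransaction = conn.BeginTransaction()
            cmd.Transaction = tx
            Dim n As Integer = 0
            For Each r In rows
                cmd.Parameters("@f").Value = r(0)
                cmd.Parameters("@l").Value = r(1)
                cmd.Parameters("@u").Value = r(2)
                cmd.ExecuteNonQuery()
                n += 1
                If n Mod 500 = 0 Then ' commit in blocks rather than per row
                    tx.Commit()
                    tx = conn.BeginTransaction()
                    cmd.Transaction = tx
                End If
            Next
            tx.Commit()
        End Using
    End Using
End Sub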