This question is related to another one I posted earlier.
To recap, I need to fix an issue with an ancient legacy app where people messed up data storage by re-installing the software the wrong way.
The application stores data by saving a record in a SQL DB. Each record holds a reference to a file on disk whose filename is auto-incremented.
Re-installing the app reset the filename auto-increment, so the DB now holds multiple unrelated records that reference the same filename, and I have two directories of files which I obviously cannot merge because of these identical filenames. The files hold no reference back to the DB data, so the only course of action that remains is to filter the DB records on date created and try to rename "EXED" to "IXED" or something like that.
The DB is relatively simple with one table containing a column that holds data of type "Image".
An example content of this image data is as follows:
0x3200001000000000000000200B0000000EFF00000300000031340000000070EC0100002C50000004000000C90000005D010000040000007955B63F4D01000004000000F879883E4F01000004000000BC95563E98010000040000009A99993F4A01000004000000000000004B01000004000000000000009101000004000000000000004E01000004000000721C83425101000004000000D841493F5E01000004000000898828414101000004000000F2D2BD3F4201000004000000FCA9B13F40010000040000007574204244010000040000000000204345010000040000007DD950414601000004000000000000004701000004000000000000009201000004000000000000008701000004000000D2DF13426A0100000400000000005C42740100000400000046B68F40500100000400000018E97A3F7901000004000000FB50CF3C7A01000004000000E645703F99010000040000000000E0404C010000040000008716593F8601000004000000000006439A0100000400000000008040700100000400000063D887449E01000004000000493CBA3E9C0100000400000069699D429B01000004000000DD60CA3F9D0100000400000035DE3C44B4010000040000008B5C744433000000040000003D0ABB4134000000040000000AFF7C44350000000400000093CB3942750400000400000054A69F41BA010000040000002635C64173040000040000008367C24100000080690100002B5000003101000032000010000000000000002009000000000000000100000000000000F00000000000000080080100000100000010000000540100000100000021F0AA42270000000200000010000000540100000200000021F0AA42280000000300000010000000540100000300000059C9E6432900000004000000100000005401000004000000637888442A00000005000000100000005401000005000000DFEF87442B00000006000000100000005401000006000000000000002C00000007000000100000005401000007000000000000002D00000008000000100000005401000008000000000000002D000000090000001000000054010000090000002F353D442D0000000A00000010000000540100000A00000035DE3C44340000000B00000010000000540100000B0000008B5C7444240000009D50000010000000CDCCCC3E2C513B41F65D5F3F2C51BB419E50000010000000CCBA2C3FE17C8C411553B13F83F32142000000403700000000FE0000090000004558454434386262002D50000008000000447973706E6F65008E5000000E00000056454C442052414D502033363000000000F000000000
The data is apparently hex which mostly encodes meaningless junk, but towards the end of the data field it also holds the names of the physical files in the filesystem that are linked to the SQL records:
??#7???????????EXED48bb?-P??????Dyspnoe??P??????VELD RAMP 360
I'm interested in the EXED part.
There is no clear regularity in the offset at which the filename appears and the filename is of variable length (so I do not know beforehand how long the substring will be).
I can call up all records with SQL like this:
SELECT COUNT(*) as "Number of EXED Files after critical date"
FROM [ZAN].[dbo].[zanu]
WHERE udata is not null
and SUBSTRING(udata, 1 , 2147483647) like '%EXED%'
and [udatum] > 0
and CONVERT(date,[udatum]) > CONVERT(date,'20100629')
What I would like to know now is how to replace this EXED substring with something else (e.g. IXED).
I'm unfamiliar with SQL and Googling so far has yielded very little information on my options here.
I also have no other info on the original code that generated this data/the data format/encoding/whatever...
It's a mess really.
Any help is welcome!
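Conceptually, what I think I need is a byte-level replacement that touches only the ASCII bytes of the filename marker and leaves the rest of the binary data alone. A rough C# sketch of that idea (everything here is guesswork on my part, not working code):
using System.Text;

// Sketch: replace the ASCII bytes "EXED" with "IXED" inside the raw image
// value, without converting the whole blob to a string first.
static void PatchFilenameMarker(byte[] udata)
{
    byte[] find = Encoding.ASCII.GetBytes("EXED");
    byte[] repl = Encoding.ASCII.GetBytes("IXED");
    for (int i = 0; i <= udata.Length - find.Length; i++)
    {
        bool match = true;
        for (int j = 0; j < find.Length; j++)
        {
            if (udata[i + j] != find[j]) { match = false; break; }
        }
        if (match)
        {
            // Both markers are four bytes long, so the record layout
            // around the filename is left untouched.
            Array.Copy(repl, 0, udata, i, repl.Length);
        }
    }
}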
An update on this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Data.Linq;
using System.Text;
using System.Data.SqlClient;
using System.Threading;
namespace ZANLinq
{
class Program
{
static void Main(string[] args)
{
try
{
DataContext zanDB = new DataContext(@"Data Source=.\,1433;database=ZAN;Integrated Security=true");
string strSQL = @"SELECT
Idnr,
Udatum,
Uzeit,
Unr,
Uart,
Ubediener,
Uzugriff,
Ugr,
Uflags,
Usize,
Udata
FROM Zanu
WHERE (Udata IS NOT null and SubString(Udata, 1 , 2147483647) LIKE '%EXED%')
AND (Idnr = ' 2')";
var zanQuery = zanDB.ExecuteQuery<Zanu>(strSQL);
List<Zanu> list = zanQuery.ToList<Zanu>();
foreach (Zanu zanTofix in list)
{
string strOriginal = ASCIIEncoding.ASCII.GetString(zanTofix.Udata);
string strFixed = strOriginal.Replace("EXED", "IXED");
zanTofix.Udata = ASCIIEncoding.ASCII.GetBytes(strFixed);
}
zanDB.SubmitChanges();
//Console.WriteLine(zanResults.Count<Zanu>().ToString());
}
catch (SqlException e)
{
Console.WriteLine(e.Message);
}
}
}
}
It finds the records I'm interested in and I can easily manipulate the data, but the commit doesn't work. I'm stumped: there are no exceptions and no indication that the code is wrong.
Anybody have ideas?
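One thing I have since tried, to see what LINQ to SQL is actually doing, is logging the generated SQL and inspecting the pending change set (a small diagnostic sketch using the same zanDB context as above):
// If Updates comes back as 0, the Zanu objects are not being change-tracked
// at all (for example because no primary key is mapped on the Zanu class),
// which would explain a silent SubmitChanges().
zanDB.Log = Console.Out;
ChangeSet pending = zanDB.GetChangeSet();
Console.WriteLine("Pending updates: " + pending.Updates.Count);
zanDB.SubmitChanges();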
UPDATE:
I think the above does not work because my table appears to have a composite PK (I cannot change this).
Since I could not debug this (no info anywhere, no exceptions, just a silent failure of SubmitChanges()) I decided to use another approach and abandon Linq2SQL altogether:
try
{
SqlConnection thisConnection = new SqlConnection(@"Network Library=DBMSSOCN;Data Source=.\,1433;database=ZAN;Integrated Security=SSPI");
DataSet zanDataSet = new DataSet();
SqlDataAdapter zanDa;
SqlCommandBuilder zanCmdBuilder;
thisConnection.Open();
//Initialize the SqlDataAdapter object by specifying a Select command
//that retrieves data from the sample table.
zanDa = new SqlDataAdapter(@"SELECT
Idnr,
Udatum,
Uzeit,
Unr,
Uart,
Ubediener,
Uzugriff,
Ugr,
Uflags,
Usize,
Udata
FROM Zanu
WHERE (Udata IS NOT null and SubString(Udata, 1 , 2147483647) LIKE '%IXED%')
AND (Idnr = ' 2')
AND (Uzeit = '13:21')", thisConnection);
//Initialize the SqlCommandBuilder object to automatically generate and initialize
//the UpdateCommand, InsertCommand, and DeleteCommand properties of the SqlDataAdapter.
zanCmdBuilder = new SqlCommandBuilder(zanDa);
//Populate the DataSet by running the Fill method of the SqlDataAdapter.
zanDa.Fill(zanDataSet, "Zanu");
Console.WriteLine("Records that will be affected: " + zanDataSet.Tables["Zanu"].Rows.Count.ToString());
foreach (DataRow record in zanDataSet.Tables["Zanu"].Rows)
{
string strOriginal = ASCIIEncoding.ASCII.GetString((byte[])record["Udata"]);
string strFixed = strOriginal.Replace("IXED", "EXED");
record["Udata"] = ASCIIEncoding.ASCII.GetBytes(strFixed);
//string strPostMod = ASCIIEncoding.ASCII.GetString((byte[])record["Udata"]);
}
zanDa.Update(zanDataSet, "Zanu");
thisConnection.Close();
Console.ReadLine();
}
catch (SqlException e)
{
Console.WriteLine(e.Message);
}
This seems to work, but any input on why the Linq approach does not work, and on whether or not my second solution is efficient/optimal, is still very much appreciated.
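My best guess so far is that ExecuteQuery<Zanu> only change-tracks the returned objects when the Zanu class maps a primary key, so with the composite key not mapped, SubmitChanges() simply has nothing to write. A mapping along these lines might be what was missing (an untested sketch; the key columns are my guesses, not confirmed against the real schema):
using System.Data.Linq.Mapping;

[Table(Name = "Zanu")]
public class Zanu
{
    // Guessed composite key -- adjust to the table's real key columns.
    [Column(IsPrimaryKey = true)]
    public int Idnr { get; set; }

    [Column(IsPrimaryKey = true)]
    public int Unr { get; set; }

    [Column]
    public int Udatum { get; set; }

    // Keep the image column out of the UPDATE's WHERE clause.
    [Column(UpdateCheck = UpdateCheck.Never)]
    public byte[] Udata { get; set; }
}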
I have a custom extractor, and I'm trying to log some messages from it.
I've tried obvious things like Console.WriteLine, but cannot find where the output goes. However, I found some system logs in adl://<my_DLS>.azuredatalakestore.net/system/jobservice/jobs/Usql/.../<my_job_id>/.
How can I log something? Is it possible to specify a log file somewhere on the Data Lake Store or a Blob Storage Account?
A recent release of U-SQL has added diagnostic logging for UDOs. See the release notes here.
// Enable the diagnostics preview feature
SET @@FeaturePreviews = "DIAGNOSTICS:ON";
// Extract as one column
@input =
EXTRACT col string
FROM "/input/input42.txt"
USING new Utilities.MyExtractor();
@output =
SELECT *
FROM @input;
// Output the file
OUTPUT @output
TO "/output/output.txt"
USING Outputters.Tsv(quoting : false);
This was my diagnostic line from the UDO:
Microsoft.Analytics.Diagnostics.DiagnosticStream.WriteLine(System.String.Format("Concatenations done: {0}", i));
This is the whole UDO:
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.Analytics.Interfaces;
namespace Utilities
{
[SqlUserDefinedExtractor(AtomicFileProcessing = true)]
public class MyExtractor : IExtractor
{
//Contains the row
private readonly Encoding _encoding;
private readonly byte[] _row_delim;
private readonly char _col_delim;
public MyExtractor()
{
_encoding = Encoding.UTF8;
_row_delim = _encoding.GetBytes("\n\n");
_col_delim = '|';
}
public override IEnumerable<IRow> Extract(IUnstructuredReader input, IUpdatableRow output)
{
string s = string.Empty;
string x = string.Empty;
int i = 0;
foreach (var current in input.Split(_row_delim))
{
using (System.IO.StreamReader streamReader = new StreamReader(current, this._encoding))
{
while ((s = streamReader.ReadLine()) != null)
{
//Strip any line feeds
//s = s.Replace("/n", "");
// Concatenate the lines
x += s;
i += 1;
}
Microsoft.Analytics.Diagnostics.DiagnosticStream.WriteLine(System.String.Format("Concatenations done: {0}", i));
//Create the output
output.Set<string>(0, x);
yield return output.AsReadOnly();
// Reset
x = string.Empty;
}
}
}
}
}
And these were my results found in the following directory:
/system/jobservice/jobs/Usql/2017/10/20.../diagnosticstreams
Good question. I have been asking myself the same thing. This is theoretical, but I think it would work (I'll update if I find differently).
One very hacky way is to insert rows into a table with your log messages as a string column. Then you can select those out and filter based on some log_producer_id column. You also get the benefit of logging when part of the script works but later parts do not, assuming the failure does not roll back. The table can also be dumped to a file at the end.
For the error cases, you can use the Job Manager in ADLA to open the job graph and then view the job output. The errors often have detailed information for data-related errors (e.g. the row number in the file with the error and an octal/hex/ascii dump of the row, with the issue marked with ###).
Hope this helps,
J
ps. This isn't a comment or an answer really, since I don't have working code. Please provide feedback if the above ideas are wrong.
There is a problem. I have an old database with some data, and on the other side I have a new database with a new structure.
Now I need the best way (ideas) to copy data from one table to another. The problem is that some tables have at most 1,000 records, some 32,000, some 640,000, and the time to copy 5,000+ records is already really long.
Any best practices? Sample code below ...
public ActionResult ImportTable1()
{
var oldTable1 = context.OLDTABLE.ToList();
foreach (var item in oldTable1)
{
try
{
var cTable = contextNew.NEWTABLE.Where(p => p.fiel1 == item.field1).FirstOrDefault();
if (cTable == null)
{
NEWTABLE nTable = new NEWTABLE
{
field1 = item.field1,
field2 = item.field2
};
contextNew.NEWTABLE.Add(nTable);
}
else
{
cTable.field1 = item.field1;
cTable.field2 = item.field2;
contextNew.Entry(cTable).State = EntityState.Modified;
}
contextNew.SaveChanges();
}
catch (DbEntityValidationException dbEx)
{
foreach (var validationErrors in dbEx.EntityValidationErrors)
{
foreach (var validationError in validationErrors.ValidationErrors)
{
_progresLog = "Property: " + validationError.PropertyName + " Error: " + validationError.ErrorMessage;
}
}
}
}
return PartialView();
}
... so here is the bulk approach now:
public void ExperimentalPartsBulk()
{
string msisDatabase = ConfigurationManager.ConnectionStrings["old"].ToString();
string newDatabase = ConfigurationManager.ConnectionStrings["new"].ToString();
SqlConnection sourceconnection = new SqlConnection(msisDatabase);
SqlConnection sourcedestination = new SqlConnection(newDatabase);
sourceconnection.Open();
SqlCommand cmd = new SqlCommand("Select * from ELEMENTS");
cmd.Connection = sourceconnection;
SqlDataReader reader = cmd.ExecuteReader();
//Connect to Destination DataBase
SqlConnection destinationConnection = new SqlConnection(newDatabase);
destinationConnection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection);
bulkCopy.DestinationTableName = "ELEMENTSNEW";
bulkCopy.ColumnMappings.Clear();
bulkCopy.ColumnMappings.Add("fielString1", "newString1");
bulkCopy.ColumnMappings.Add("fielString2", "newStrin2");
bulkCopy.ColumnMappings.Add("fielFloat1", "newINT1");
bulkCopy.WriteToServer(reader);
reader.Close();
sourceconnection.Close();
sourcedestination.Close();
}
The problem now is with the differences between the two tables:
fielString1 can be null, newString1 can't be;
fielFloat1 is a nullable float, but newINT1 is not nullable.
How do I import with some conditions, or into fields of different types?
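One simple option is to do the conversion in the source SELECT itself, so the reader already hands SqlBulkCopy values that the destination columns accept. A sketch (the empty-string default and the float-to-int rounding are assumptions you may want to change):
// Shape the data in the source query so nulls and types already match
// the destination schema before the bulk copy runs.
SqlCommand cmd = new SqlCommand(@"SELECT
ISNULL(fielString1, '') AS fielString1,
fielString2,
CAST(ROUND(ISNULL(fielFloat1, 0), 0) AS INT) AS fielFloat1
FROM ELEMENTS", sourceconnection);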
Siwek,
Any loop as shown in the first code sample will fail due to performance issues... as you pointed out!
The right approach here is a SQL approach. The idea is to "flush" all data to the new DB. Flush means that ALL records (5,000 or 500,000) are stored to the new DB in one action! Avoid any loops during extracting, filtering, editing and saving of data, because 640,000 loop iterations take a long time...
Bulk copy is one possibility. The issue with bulk copy is that it's hard for you to filter and edit data in this object.
Use an ADO.NET DataSet to get the data from the old DB, filter it, edit it in memory, and flush it to the new DB. A DataSet takes one step per action (extracting, filtering, editing, etc.), no explicit loops.
Or try SQL replication. Replication is the SQL mechanism to copy data from DB "A", table "oneTable", to another DB "B" with a table "AnotherTable" that has a different schema and rules. Try it. I can specify more if you think it's a reasonable solution for you. No code is needed; it can be created using the wizard in SQL Management Studio and run whenever needed (via the SQL Server Agent).
You should seriously consider SSIS or bcp. Otherwise you are looking at a scenario where you're pulling data from the source server all the way down to the client box where the .NET code is executing, then pushing all of that data up to the destination server. Think of the bandwidth being consumed. If you can instead do an SSIS export into the destination, at least you would be eliminating an extra layer of concern.
If you absolutely must pull data down to the client, consider writing the data into bcp-formatted files and then bulk copying them into the destination server.
I'm pretty sure you'll find that both of these paths are significantly faster than plain old ADO.NET approaches.
I am wondering how to insert an image into one of the fields in my PostgreSQL table. I cannot find an appropriate tutorial on this matter. The datatype of the field is oid. Has anyone tried this? Thanks!
// All LargeObject API calls must be within a transaction
conn.setAutoCommit(false);
// Get the Large Object Manager to perform operations with
LargeObjectManager lobj = ((org.postgresql.PGConnection)conn).getLargeObjectAPI();
//create a new large object
int oid = lobj.create(LargeObjectManager.READ | LargeObjectManager.WRITE);
//open the large object for write
LargeObject obj = lobj.open(oid, LargeObjectManager.WRITE);
// Now open the file
File file = new File("myimage.gif");
FileInputStream fis = new FileInputStream(file);
// copy the data from the file to the large object
byte buf[] = new byte[2048];
int s, tl = 0;
while ((s = fis.read(buf, 0, 2048)) > 0)
{
obj.write(buf, 0, s);
tl += s;
}
// Close the large object
obj.close();
//Now insert the row into imagesLO
PreparedStatement ps = conn.prepareStatement("INSERT INTO imagesLO VALUES (?, ?)");
ps.setString(1, file.getName());
ps.setInt(2, oid);
ps.executeUpdate();
ps.close();
fis.close();
I found that sample code here. It is a really good collection of SQL operations.
To quote this site,
PostgreSQL database has a special data type to store binary data
called bytea. This is a non-standard data type. The standard data type
in databases is BLOB.
You need to write a client to read the image file, for example
File img = new File("woman.jpg");
FileInputStream fin = new FileInputStream(img);
Connection con = DriverManager.getConnection(url, user, password);
PreparedStatement pst = con.prepareStatement("INSERT INTO images(data) VALUES(?)");
pst.setBinaryStream(1, fin, (int) img.length());
pst.executeUpdate();
You can either use the bytea type or the large objects facility. However, note that depending on your use case it might not be a good idea to put your images in the DB because of the additional load it may put on the DB server.
Rereading your question, I notice you mentioned you have a field of type oid. If this is an application you are modifying, that suggests to me it is using large objects. These objects get an oid, which you then need to store in another table to keep track of them.
I have a few tables in a C# application I'm currently working on, and for 4 of the 5 tables everything saves perfectly fine with no issues. For the 5th table everything seems good until I reload the program again (without modifying the code or working with a separate install, so the data shouldn't go away). The 4 other tables are fine, but the 5th doesn't have any records in it after the restart (even though it did the last time the program was running). Below are some code excerpts. I have tried a few different solutions online, including building a SQL command string to run against the database manually and creating the row directly, as opposed to the implementation below which uses a generic data row.
//From main window
private void newInvoice_Click(object sender, EventArgs e)
{
PosDatabaseDataSet.InvoicesRow newInvoice = posDatabaseDataSet1.Invoices.NewInvoicesRow();
Invoices iForm = new Invoices(newInvoice, posDatabaseDataSet1, true);
}
//Invoices Table save [Works] (from Invoices.cs)
private void saveInvoice_Click(object sender, EventArgs e)
{
iRecord.Date = Convert.ToDateTime(this.dateField.Text);
iRecord.InvoiceNo = Convert.ToInt32(this.invoiceNumField.Text);
iRecord.Subtotal = (float) Convert.ToDouble(this.subtotalField.Text);
iRecord.Tax1 = (float)Convert.ToDouble(this.hstField.Text);
iRecord.Total = (float)Convert.ToDouble(this.totalField.Text);
iRecord.BillTo = this.billToField.Text;
invoicesBindingSource.EndEdit();
if (newRecord)
{
dSet.Invoices.Rows.Add(iRecord);
invoicesTableAdapter.Adapter.Update(dSet.Invoices);
}
else
{
string connString = Properties.Settings.Default.PosDatabaseConnectionString;
string queryString = "UPDATE dbo.Invoices set ";
queryString += "Date='" + iRecord.Date+"'";
queryString += ", Subtotal=" + iRecord.Subtotal;
queryString += ", Tax1=" + iRecord.Tax1.ToString("N2");
queryString += ", Total=" + iRecord.Total;
queryString += " WHERE InvoiceNo=" + iRecord.InvoiceNo;
using (SqlConnection dbConn = new SqlConnection(connString))
{
SqlCommand command = new SqlCommand(queryString, dbConn);
dbConn.Open();
SqlDataReader r = command.ExecuteReader();
dbConn.Close();
}
}
dSet.Invoices.AcceptChanges();
}
//Invoice Items save [works until restart] (also from Invoices.cs)
private void addLine_Click(object sender, EventArgs e)
{
DataRow iRow = dSet.Tables["InvoiceItems"].NewRow();
iRow["Cost"] = (float)Convert.ToDouble(this.costField.Text);
iRow["Description"] = this.descriptionField.Text;
iRow["InvoiceNo"] = Convert.ToInt32(this.invoiceNumField.Text);
iRow["JobId"] = Convert.ToInt32(this.jobIdField.Text);
iRow["Qty"] = Convert.ToInt32(this.quantityField.Text);
iRow["SalesPerson"] = Convert.ToInt32(this.salesPersonField.Text);
iRow["SKU"] = Convert.ToInt32(this.skuField.Text);
dSet.Tables["InvoiceItems"].Rows.Add(iRow);
invoiceItemsTableAdapter.Adapter.Update(dSet,"InvoiceItems");
PosDatabaseDataSet.InvoiceItemsDataTable dTable = (PosDatabaseDataSet.InvoiceItemsDataTable)dSet.InvoiceItems.Copy();
DataRow[] d = dTable.Select("InvoiceNo=" + invNo.ToString());
invoiceItemsView.DataSource = d;
}
Thanks in advance for any insight.
UPDATE: October 17, 2011. I am still unable to get this working; are there any more ideas out there?
You must execute your SqlCommand in order to persist the changes you made.
using (SqlConnection dbConn = new SqlConnection(connString))
{
dbConn.Open();
SqlCommand command = new SqlCommand(queryString, dbConn);
command.ExecuteNonQuery();
dbConn.Close();
}
The ExecuteReader method is intended (as the name says) to read data from a SQL table. You need to use a different method, as shown above.
We need some more info first; you haven't shown the case where your code fails.
A common mistake in this kind of code is calling DataSet.AcceptChanges() before actually committing the changes to the database.
A second is a conflict between data bound through the binding source and edits made to the dataset directly.
Let's see the appropriate code and we can try and help.
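In the meantime, for reference, the usual shape of the save path looks roughly like this (a sketch reusing the adapter and table names from your snippet):
// Add the new row, then let the adapter write all pending changes.
dSet.Tables["InvoiceItems"].Rows.Add(iRow);
invoiceItemsTableAdapter.Adapter.Update(dSet, "InvoiceItems");
// No explicit AcceptChanges() is needed here: Update() accepts each row once
// it has been written. Calling AcceptChanges() before Update() would mark the
// rows Unchanged and nothing would be persisted.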
Set a breakpoint after the call to invoiceItemsTableAdapter and check the InvoiceItems table for the row you have added. Release the breakpoint and then close your app. Check the database again. I would say that another table may be forcibly overwriting the invoice item table.
Sorry for my English, first of all. I have a problem and need help.
I have a simple tool I made myself in C#. The tool connects to a local or remote Firebird server (v2.5), and it can create a specified .fdb file (database) somewhere on the server.
I also have a file with SQL statements (create table, triggers and so on). I want to execute this file after the database has been created. Executing this file will fill in the structure of the user database: not data, only structure.
But when I try to execute my SQL script, the Firebird server returns
SQL error code = -104 Token unknown line xxx column xxx.
The line it points to is the next CREATE TABLE statement, for example:
CREATE TABLE tb1
(
col1 INTEGER NOT NULL,
col2 VARCHAR(36)
);
/* This next create statement causes an error */
CREATE TABLE tb2
(
col1 INTEGER NOT NULL,
col2 VARCHAR(36)
);
If I leave only one create statement in my file, everything works fine... I don't know if I have explained this clearly; in other words: why can't I execute a full script with many create statements in one transaction? Here is my main method which executes the query:
public static string Do(string conString, string query)
{
using (FbConnection conn = new FbConnection())
{
try
{
conn.ConnectionString = conString;
conn.Open();
FbTransaction trans = conn.BeginTransaction();
FbCommand cmd = new FbCommand(query, conn, trans);
cmd.ExecuteNonQuery();
trans.Commit();
}
catch (Exception ex)
{
System.Windows.MessageBox.Show(ex.ToString());
return "Transaction Fail";
}
}
return "Transaction Commited";
}
The query passed in is the contents of my SQL file.
As Victor already stated in his final comment, you can use the FBScript class for batch execution.
I was just confronted with the same task. This question pointed me in the right direction, but I had to do some further digging.
In this example, the source of the statements is an external script file:
private void ExecuteScript(FbConnection myConnection, string scriptPath) {
if (!File.Exists(scriptPath))
throw new FileNotFoundException("Script not found", scriptPath);
FileInfo file = new FileInfo(scriptPath);
string script = file.OpenText().ReadToEnd();
// use FbScript to parse all statements
FbScript fbs = new FbScript(script);
fbs.Parse();
// execute all statements
FbBatchExecution fbe = new FbBatchExecution(myConnection, fbs);
fbe.Execute(true);
}
This will work fine, but you may wonder why this whole thing isn't surrounded by a transaction. Actually there is no support to "bind" FbBatchExecution to a transaction directly.
The first thing I tried was this (it will not work):
private void ExecuteScript(FbConnection myConnection, string scriptPath) {
using (FbTransaction myTransaction = myConnection.BeginTransaction()) {
if (!File.Exists(scriptPath))
throw new FileNotFoundException("Script not found", scriptPath);
FileInfo file = new FileInfo(scriptPath);
string script = file.OpenText().ReadToEnd();
// use FbScript to parse all statements
FbScript fbs = new FbScript(script);
fbs.Parse();
// execute all statements
FbBatchExecution fbe = new FbBatchExecution(myConnection, fbs);
fbe.Execute(true);
myTransaction.Commit();
}
}
This will result in an exception stating: "Execute requires the Command object to have a Transaction object when the Connection object assigned to the command is in a pending local transaction. The Transaction property of the Command has not been initialized."
This means nothing more than that the commands executed by FbBatchExecution are not assigned to our local transaction surrounding the code block. What helps here is that FbBatchExecution provides
the CommandExecuting event, where we can intercept every command and assign our local transaction like this:
private void ExecuteScript(FbConnection myConnection, string scriptPath) {
using (FbTransaction myTransaction = myConnection.BeginTransaction()) {
if (!File.Exists(scriptPath))
throw new FileNotFoundException("Script not found", scriptPath);
FileInfo file = new FileInfo(scriptPath);
string script = file.OpenText().ReadToEnd();
// use FbScript to parse all statements
FbScript fbs = new FbScript(script);
fbs.Parse();
// execute all statements
FbBatchExecution fbe = new FbBatchExecution(myConnection, fbs);
fbe.CommandExecuting += delegate(object sender, CommandExecutingEventArgs args) {
args.SqlCommand.Transaction = myTransaction;
};
fbe.Execute(true);
// myTransaction.Commit();
}
}
Note that I have commented out the myTransaction.Commit() line. I was a little bit surprised by this behavior, but if you keep that line the transaction will throw an exception stating that it has already been committed. The bool parameter of fbe.Execute(true) is named "autoCommit", but changing this to false seems to have no effect.
I would like some feedback if you see any potential issues with assigning the local transaction this way, or if it has any benefits at all or could as well be omitted.
Probably the error comes from running two create statements in one batch. Would it work if you break it into separate queries? Does it work in your SQL tool?
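If you want to try that without FbScript, something along these lines should work for a plain CREATE TABLE script (a rough sketch; the naive split on ';' assumes there are no triggers or procedure bodies with embedded semicolons):
public static void ExecuteStatements(string conString, string script)
{
    using (FbConnection conn = new FbConnection(conString))
    {
        conn.Open();
        using (FbTransaction trans = conn.BeginTransaction())
        {
            // Execute each statement separately inside one transaction.
            foreach (string statement in script.Split(';'))
            {
                if (statement.Trim().Length == 0) continue;
                using (FbCommand cmd = new FbCommand(statement, conn, trans))
                {
                    cmd.ExecuteNonQuery();
                }
            }
            trans.Commit();
        }
    }
}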