SqlBulkCopy cannot access destination table - sql

I'm trying to read data from files and use bulk copy to insert it into a database table.
When I run my code, I get the error: "Cannot access destination table".
Declaration of FlatTable:
System.Data.DataTable flatTableTempData = new System.Data.DataTable("FlatTable");
DataColumn DistrictColumn = new DataColumn();
DistrictColumn.ColumnName = "DistrictName";
DataColumn TownColumn = new DataColumn();
TownColumn.ColumnName = "TownName";
DataColumn FarmerColumn = new DataColumn();
FarmerColumn.ColumnName = "FarmerName";
flatTableTempData.Columns.Add(DistrictColumn);
flatTableTempData.Columns.Add(TownColumn);
flatTableTempData.Columns.Add(FarmerColumn);
This is my code, with the connection string and the insertion using bulk copy:
using (SqlConnection con = new SqlConnection("Data Source=DRTARIQ-PC\\SQLEXPRESS;Integrated Security=SSPI;Initial Catalog=TestDB2"))
{
    con.Open();
    using (SqlBulkCopy s = new SqlBulkCopy(con))
    {
        s.DestinationTableName = flatTableTempData.TableName;
        foreach (DataColumn column in flatTableTempData.Columns)
            s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
        s.BulkCopyTimeout = 500;
        s.WriteToServer(flatTableTempData);
    }
}

I've encountered the same problem. The table exists and the SQL user has access, but SqlBulkCopy still cannot access it. In my case it turned out that I had disabled the table's indexes to try to insert faster (planning to rebuild them after the bulk copy), and that made the table inaccessible. After I re-enabled the indexes, SqlBulkCopy could access the table again.
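If you have disabled indexes the same way, rebuilding them makes the table accessible again. A minimal sketch, assuming the dbo.FlatTable table and the connection string from the question (ALTER INDEX ... REBUILD also re-enables disabled indexes):
// Rebuild (and thereby re-enable) every index on the table before bulk copying again
using (var con = new SqlConnection("Data Source=DRTARIQ-PC\\SQLEXPRESS;Integrated Security=SSPI;Initial Catalog=TestDB2"))
using (var cmd = new SqlCommand("ALTER INDEX ALL ON dbo.FlatTable REBUILD", con))
{
    con.Open();
    cmd.ExecuteNonQuery();
}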

The table name assigned to SqlBulkCopy's DestinationTableName (the table WriteToServer targets) must be surrounded with [ ] brackets.
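For example, a minimal sketch (the dbo schema and FlatTable name are assumptions; con is an open SqlConnection):
using (SqlBulkCopy s = new SqlBulkCopy(con))
{
    // Bracketing the destination name is required when it contains spaces
    // or reserved words, and is harmless otherwise
    s.DestinationTableName = "[dbo].[FlatTable]";
    s.WriteToServer(flatTableTempData);
}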

Related

SqlDataAdapter.update() not updating database

I am searching for a (PostId, UserId) pair in the PostLikes table using a SqlDataAdapter. If the row is found, I use SqlCommandBuilder.GetDeleteCommand() to generate the delete instruction and delete the row; if it is not found, I use SqlCommandBuilder.GetInsertCommand() to generate the insert command and insert the row via SqlDataAdapter.Update(). But the row is not getting inserted into the database table. Here is what I have done so far:
SqlConnection con = new SqlConnection(connectionStrings);
SqlDataAdapter sqlDataAdapter = new SqlDataAdapter("select * from PostLikes where PostId like "
    + postlike.PostId + " and UserId like "
    + postlike.UserId, con);
DataSet ds = new DataSet();
sqlDataAdapter.Fill(ds, "Result");
con.Open();
SqlCommandBuilder sqlCommandBuilder = new SqlCommandBuilder(sqlDataAdapter);
if (ds.Tables["Result"].Rows.Count == 1)
{
    sqlDataAdapter.DeleteCommand = sqlCommandBuilder.GetDeleteCommand(true);
    msg = "Data is deleted";
}
else
{
    sqlDataAdapter.InsertCommand = sqlCommandBuilder.GetInsertCommand(true);
    msg = "Data is inserted";
}
sqlDataAdapter.Update(ds, "Result");
and the table is PostLikes(LikeId, PostId, UserId).
There are a couple of issues:
You are trying to reuse the same command both to detect whether the row exists and to supply to the SqlDataAdapter for the SqlCommandBuilder.
You should parameterise the initial select query to protect against SQL injection attacks (there is also a minor performance benefit). The CommandBuilder will automatically parameterize the Insert / Delete commands.
After creating the Insert / Delete commands with the SqlCommandBuilder, you then need to change the underlying DataSet in order for any changes to be made to the table during the Update.
Note that many of the Sql* objects are IDisposable and should be disposed as soon as possible; using scopes help here.
var postId = 1;
var userId = 1;
string msg;
using (var con = new SqlConnection(@"data source=..."))
using (var selectCommand = new SqlCommand(
    "select LikeId, PostId, UserId from PostLikes WHERE PostId=@PostId AND UserId=@UserId", con))
using (var sqlDataAdapter = new SqlDataAdapter(selectCommand))
using (var ds = new DataSet())
{
    con.Open();
    selectCommand.Parameters.AddWithValue("@PostId", postId);
    selectCommand.Parameters.AddWithValue("@UserId", userId);
    sqlDataAdapter.Fill(ds, "Result");
    using (var sqlCommandBuilder = new SqlCommandBuilder(sqlDataAdapter))
    {
        if (ds.Tables["Result"].Rows.Count == 1)
        {
            sqlDataAdapter.DeleteCommand = sqlCommandBuilder.GetDeleteCommand(true);
            ds.Tables["Result"].Rows[0].Delete();
            msg = "Data will be deleted";
        }
        else
        {
            sqlDataAdapter.InsertCommand = sqlCommandBuilder.GetInsertCommand(true);
            // Null because LikeId is an identity column and will be assigned by the server
            ds.Tables["Result"].Rows.Add(null, postId, userId);
            msg = "Data will be inserted";
        }
        sqlDataAdapter.Update(ds, "Result");
    }
}
I've assumed the following schema:
CREATE TABLE PostLikes
(
    LikeId INT IDENTITY(1,1) PRIMARY KEY,
    PostId INT,
    UserId INT
)
And I've assumed you want to 'toggle' the insertion or deletion of a row with the (postId, userId) combination.

How to save SQL Server xml column data to a physical path as an .xml file?

I have a SQL Server database table with an xml column named "MESSAGE" which stores XML data.
(The original question included a screenshot of the table here.)
Now I need to get this "MESSAGE" column data and save it to a physical path as an XML file (e.g. test.xml).
Any suggestion on how to implement this using C#.NET?
You could try something like this (using plain ADO.NET and a very basic SQL query):
static void Main(string[] args)
{
    // get connection string from app./web.config
    string connectionString = "server=.;database=yourDB;Integrated Security=SSPI;";
    // define query
    string query = "SELECT MESSAGE FROM dbo.SamTest WHERE ID = 1;";
    // set up connection and command
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand selectCmd = new SqlCommand(query, conn))
    {
        // open connection, execute query to get XML, close connection
        conn.Open();
        string xmlContents = selectCmd.ExecuteScalar().ToString();
        conn.Close();
        // define target file name
        string targetFileName = @"C:\tmp\test.xml";
        // write XML out to file
        File.WriteAllText(targetFileName, xmlContents);
    }
}
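If the XML can be large, ExecuteScalar() materializes the whole document as a single string. A streaming alternative is sketched below, assuming the same dbo.SamTest table and MESSAGE xml column; SqlDataReader.GetSqlXml exposes the column as an XmlReader so it can be piped straight into an XmlWriter:
using (SqlConnection conn = new SqlConnection("server=.;database=yourDB;Integrated Security=SSPI;"))
using (SqlCommand cmd = new SqlCommand("SELECT MESSAGE FROM dbo.SamTest WHERE ID = 1;", conn))
{
    conn.Open();
    // SequentialAccess lets the reader stream large column values
    using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            using (XmlReader xml = reader.GetSqlXml(0).CreateReader())
            using (XmlWriter writer = XmlWriter.Create(@"C:\tmp\test.xml"))
            {
                // Copy the XML node stream directly into the file
                writer.WriteNode(xml, true);
            }
        }
    }
}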

GetOleDbSchemaTable(OleDbSchemaGuid.Indexes, ...) always returning zero rows access database

When querying an Access 2000 database, using:
schemaTable = cn.GetOleDbSchemaTable(OleDbSchemaGuid.Indexes, New Object() {Nothing, Nothing, tableName})
Where cn is a valid and open connection, schemaTable always contains zero rows, despite the tableName specified having many indexes.
The documentation here, http://msdn.microsoft.com/en-us/library/cc668764.aspx, suggests that MS Access provides this information.
What gives?
It appears that when retrieving .Indexes, the third member of the restrictions array corresponds to the index name, not the table name. So to retrieve the indexes for a given table, it looks like we need to retrieve all of the indexes (no restrictions) and then filter out the ones we don't want.
The following C# code works for me:
using (OleDbConnection con = new OleDbConnection())
{
    con.ConnectionString = myConnectionString;
    con.Open();
    object[] restrictions = new object[3];
    System.Data.DataTable table = con.GetOleDbSchemaTable(OleDbSchemaGuid.Indexes, restrictions);
    // Display the contents of the table.
    foreach (System.Data.DataRow row in table.Rows)
    {
        // Column 2 of the Indexes schema rowset is TABLE_NAME
        string tableName = row[2].ToString();
        if (tableName == "Clients")
        {
            foreach (System.Data.DataColumn col in table.Columns)
            {
                Console.WriteLine("{0} = {1}", col.ColumnName, row[col]);
            }
            Console.WriteLine("============================");
        }
    }
    con.Close();
}
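Based on that finding, if you already know the index name you can pass it as the third restriction instead of scanning everything. A sketch ("PK_Clients" is a hypothetical index name; con is the open connection from above):
// For OleDbSchemaGuid.Indexes the third restriction is the INDEX name, not the table name
object[] indexRestrictions = new object[] { null, null, "PK_Clients" };
System.Data.DataTable indexTable = con.GetOleDbSchemaTable(OleDbSchemaGuid.Indexes, indexRestrictions);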

Copy data between databases using Entity Framework, LINQ and MVC

I have a problem. I have an old database with some data, and on the other side a new database with a new structure.
Now I need the best way (ideas) to copy data from one table to another. The problem is that some tables have at most 1,000 records, some 32,000, some 640,000, and copying 5,000+ already takes a really long time.
Any best practices? Sample code below...
public ActionResult ImportTable1()
{
    var oldTable1 = context.OLDTABLE.ToList();
    foreach (var item in oldTable1)
    {
        try
        {
            var cTable = contextNew.NEWTABLE.Where(p => p.fiel1 == item.field1).FirstOrDefault();
            if (cTable == null)
            {
                NEWTABLE nTable = new NEWTABLE
                {
                    field1 = item.field1,
                    field2 = item.field2
                };
                contextNew.NEWTABLE.Add(nTable);
            }
            else
            {
                cTable.field1 = item.field1;
                cTable.field2 = item.field2;
                contextNew.Entry(cTable).State = EntityState.Modified;
            }
            contextNew.SaveChanges();
        }
        catch (DbEntityValidationException dbEx)
        {
            foreach (var validationErrors in dbEx.EntityValidationErrors)
            {
                foreach (var validationError in validationErrors.ValidationErrors)
                {
                    _progresLog = "Property: " + validationError.PropertyName + " Error: " + validationError.ErrorMessage;
                }
            }
        }
    }
    return PartialView();
}
... and here is the bulk copy version:
public void ExperimentalPartsBulk()
{
    string msisDatabase = ConfigurationManager.ConnectionStrings["old"].ToString();
    string newDatabase = ConfigurationManager.ConnectionStrings["new"].ToString();
    // Connect to source database and stream the rows
    SqlConnection sourceconnection = new SqlConnection(msisDatabase);
    sourceconnection.Open();
    SqlCommand cmd = new SqlCommand("Select * from ELEMENTS");
    cmd.Connection = sourceconnection;
    SqlDataReader reader = cmd.ExecuteReader();
    // Connect to destination database
    SqlConnection destinationConnection = new SqlConnection(newDatabase);
    destinationConnection.Open();
    SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection);
    bulkCopy.DestinationTableName = "ELEMENTSNEW";
    bulkCopy.ColumnMappings.Clear();
    bulkCopy.ColumnMappings.Add("fielString1", "newString1");
    bulkCopy.ColumnMappings.Add("fielString2", "newStrin2");
    bulkCopy.ColumnMappings.Add("fielFloat1", "newINT1");
    bulkCopy.WriteToServer(reader);
    reader.Close();
    sourceconnection.Close();
    destinationConnection.Close();
}
The problem now is the differences between the two tables:
fielString1 can be null, newString1 can't be
fielFloat1 is a nullable float, but newINT1 is a non-nullable int
How do I import with some conditions, or into fields of different types?
Siwek,
Any loop as shown in the first code sample will fail due to performance issues, as you pointed out!
The right approach here is a SQL approach. The idea is to "flush" all the data to the new DB: ALL records (5,000 or 500,000) are stored to the new DB in one action, and you avoid any loops while extracting, filtering, editing and saving the data, because 640,000 loop iterations take a long time.
Bulk copy is one possibility. The issue with bulk copy is that it is hard to filter and edit data with this object.
Alternatively, use an ADO.NET DataSet to get the data from the old DB, filter and edit it in memory, and flush it to the new DB in one step per action (extracting, filtering, editing, etc.), with no per-row round trips; see the sketch below.
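A minimal sketch of that DataSet/DataTable approach combined with SqlBulkCopy, including the NULL and float-to-int differences asked about above (table and column names are taken from the question; the "" default and the rounding rule are assumptions to adapt):
public void CopyElementsWithTransform()
{
    string oldDb = ConfigurationManager.ConnectionStrings["old"].ToString();
    string newDb = ConfigurationManager.ConnectionStrings["new"].ToString();
    // 1. Extract: pull the source rows into memory in one go
    var source = new DataTable();
    using (var sourceConnection = new SqlConnection(oldDb))
    using (var adapter = new SqlDataAdapter("SELECT fielString1, fielString2, fielFloat1 FROM ELEMENTS", sourceConnection))
    {
        adapter.Fill(source);
    }
    // 2. Transform: fix NULLs and convert float -> int in memory
    var target = new DataTable();
    target.Columns.Add("newString1", typeof(string));
    target.Columns.Add("newStrin2", typeof(string));
    target.Columns.Add("newINT1", typeof(int));
    foreach (DataRow row in source.Rows)
    {
        string s1 = row.IsNull("fielString1") ? "" : (string)row["fielString1"]; // NULL -> "" (assumed default)
        string s2 = row.IsNull("fielString2") ? "" : (string)row["fielString2"];
        int i1 = row.IsNull("fielFloat1") ? 0 : (int)Math.Round((double)row["fielFloat1"]); // assumed rounding
        target.Rows.Add(s1, s2, i1);
    }
    // 3. Flush: one bulk copy for all rows
    using (var destination = new SqlConnection(newDb))
    using (var bulkCopy = new SqlBulkCopy(destination))
    {
        destination.Open();
        bulkCopy.DestinationTableName = "ELEMENTSNEW";
        bulkCopy.ColumnMappings.Add("newString1", "newString1");
        bulkCopy.ColumnMappings.Add("newStrin2", "newStrin2");
        bulkCopy.ColumnMappings.Add("newINT1", "newINT1");
        bulkCopy.WriteToServer(target);
    }
}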
Or try SQL replication. Replication is the SQL Server mechanism for copying data from table "oneTable" in DB "A" to another DB "B", into a table "AnotherTable" with a different schema and rules. Try it; I can say more if you think it's a reasonable solution for you. No code is needed: it can be set up with a wizard in SQL Server Management Studio and run whenever needed (via a SQL Server Agent job).
You should seriously consider SSIS or bcp. Otherwise you are looking at a scenario where you're pulling data from the source server all the way down to the client box where the .NET code is executing, then pushing all of that data up to the destination server. Think of the bandwidth being consumed. If you can instead do an SSIS export into the destination, at least you eliminate an extra layer of concern.
If you absolutely must pull the data down to the client, consider writing it into bcp-formatted files and then bulk copying them into the destination server.
I'm pretty sure you'll find that both of these paths are significantly faster than plain old ADO.NET approaches.

Problem with batch update using DataAdapter

I am updating a SQL Server 2005 database using a batch update, as shown below:
cmd = new SqlCommand("update Table1 set column1 = @column1 where EmpNo = @EmpNo", con);
cmd.Parameters.Add(new SqlParameter("@column1", SqlDbType.VarChar));
cmd.Parameters["@column1"].SourceVersion = DataRowVersion.Current;
cmd.Parameters["@column1"].SourceColumn = "Column";
cmd.Parameters.Add(new SqlParameter("@EmpNo", SqlDbType.Int));
cmd.Parameters["@EmpNo"].SourceVersion = DataRowVersion.Current;
cmd.Parameters["@EmpNo"].SourceColumn = "EmpNo";
cmd.UpdatedRowSource = UpdateRowSource.None;
sqlDa = new SqlDataAdapter();
con.Open();
sqlDa.UpdateCommand = cmd;
sqlDa.UpdateBatchSize = 10;
sqlDa.Update(dt);
con.Close();
But the data is not updated. I am unable to figure out what the problem is. Any help is appreciated.
I would suggest that you look at dt right before you issue the Update command. Make sure there are some rows with a RowState of Modified or Added; if not, there is nothing in your (I'm assuming) DataTable to push to the database.
Also, try removing the .SourceVersion property assignments.
If everything looks good, start a trace on the database right before you issue the .Update.
These are just a couple of first steps to try; a quick RowState check is sketched below.
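A minimal sketch of that check (dt is the DataTable from the question):
// Count the rows Update() would actually send to the server
int pending = 0;
foreach (DataRow row in dt.Rows)
{
    if (row.RowState == DataRowState.Modified || row.RowState == DataRowState.Added)
        pending++;
}
Console.WriteLine("Rows with pending changes: " + pending);
// If this prints 0, check whether AcceptChanges() was called on dt
// (e.g. by a previous adapter.Update), which resets every RowState to Unchanged.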
A batched SqlDataAdapter approach:
using (SqlCommand insertCommand = new SqlCommand(
    "INSERT BulkLoadTable(FieldA, FieldB) VALUES (@FieldA, @FieldB)", connection))
{
    insertCommand.Parameters.Add("@FieldA", SqlDbType.VarChar, 10, "FieldA");
    insertCommand.Parameters.Add("@FieldB", SqlDbType.Int, 4, "FieldB");
    // Setting UpdatedRowSource is important if you want to batch up the inserts
    insertCommand.UpdatedRowSource = UpdateRowSource.None;
    using (SqlDataAdapter insertAdapter = new SqlDataAdapter())
    {
        insertAdapter.InsertCommand = insertCommand;
        // How many records to send to the database in one go (all of them)
        insertAdapter.UpdateBatchSize = myDataTable.Rows.Count;
        // Send the inserts to the database
        insertAdapter.Update(myDataTable);
    }
}
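As a side note, setting UpdateBatchSize = 0 tells the adapter to use the largest batch size the server can handle, which avoids hard-coding Rows.Count; a value of 1 disables batching entirely.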