Insert SQL query result into Excel worksheet using VSTO

I'm trying to write an Excel application-level add-in. It makes a SQL query against the database and populates a worksheet with the results. I thought it would be simple... but it isn't.
Is there any way to insert a DataTable into an Excel worksheet? Something like this:
using (SqlConnection connection = new SqlConnection(connectionString))
{
    string cmdString = "SELECT * FROM [Table];"; // Simplified query; in my add-in I need to JOIN 3 tables
    SqlCommand cmd = new SqlCommand(cmdString, connection);
    SqlDataAdapter sda = new SqlDataAdapter(cmd);
    DataTable dt = new DataTable();
    sda.Fill(dt); // Fill opens and closes the connection itself
    // What do I need to write here to insert my DataTable contents into the worksheet?
}
Maybe I should use another approach (not a DataTable)? However, my query can return up to 100,000 rows of data with 4 columns, so cell-by-cell pasting will not work, I think.

If you are prepared to jump through a few hoops, then CopyFromRecordset is hands-down the fastest way I have seen to handle this:
http://msdn.microsoft.com/en-us/library/office/ff839240(v=office.15).aspx
I can populate a million rows from SQL Server into Excel in a few seconds using CopyFromRecordset.
Basically, you need to create a recordset instead of a data table, which means using ADO (or DAO). Personally, I like to keep this code separate from everything else and use ADO only for this function, as it has inherent weaknesses: for example, you can't wrap ADO connections in "using" blocks, since the COM objects don't implement IDisposable.
Here is a more complete example (in VB but easy enough to change to C#):
http://support.microsoft.com/kb/246335/
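For illustration, here is a minimal C# sketch of the idea. It assumes your VSTO project references the Excel interop assemblies and the ADODB COM library; the OLE DB connection string, table name, and target cell are placeholders (note that COM ADO needs an OLE DB connection string, not the .NET one):
using Excel = Microsoft.Office.Interop.Excel;

void FillSheetFromSql(Excel.Worksheet sheet, string oleDbConnectionString)
{
    var conn = new ADODB.Connection();
    var rs = new ADODB.Recordset();
    try
    {
        // COM ADO objects are not IDisposable, so open and close manually.
        conn.Open(oleDbConnectionString);
        rs.Open("SELECT * FROM [Table]", conn,
                ADODB.CursorTypeEnum.adOpenForwardOnly,
                ADODB.LockTypeEnum.adLockReadOnly);
        // A single COM call copies the whole recordset into the sheet.
        sheet.Range["A2"].CopyFromRecordset(rs);
    }
    finally
    {
        if (rs.State == (int)ADODB.ObjectStateEnum.adStateOpen) rs.Close();
        if (conn.State == (int)ADODB.ObjectStateEnum.adStateOpen) conn.Close();
    }
}
The forward-only, read-only cursor is the fastest option here, and CopyFromRecordset only needs to read forward anyway.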

Related

ADO.NET with VB.NET - SELECT INTO query

I have been successful in setting up a connection and retrieving information from my MS Access database into my Windows Form in Visual Studio, using VB.NET and SELECT statements like the sample code below.
Dim dataset As New DataSet
Dim datatable As New DataTable
dataset.Tables.Add(datatable)
Dim data_adaptor As OleDbDataAdapter
data_adaptor = New OleDbDataAdapter("SELECT * FROM test1", odbconnect)
data_adaptor.Fill(datatable)
However, when I point my DataAdapter at a query like the one below, I run into trouble executing a SELECT INTO query.
data_adaptor = New OleDbDataAdapter("SELECT first_name INTO newtable FROM test1", odbconnect)
I do not get any error, but the table does not get created in my Access database. I understand that I would have to declare a new DataAdapter to actually get information from the newly created "newtable", but if I execute this and then open my Access database, "newtable" does not exist. However, if I run this query in Access itself, it works, though I get a pop-up box asking me to confirm that I want to paste data into a new table. I imagine this could be a quirk of working with MS Access in this manner. Is there some other setting or object I must use to execute a SELECT INTO query to create a new table?
Do not use an OleDbDataAdapter for this. Simply use an OleDbCommand and call ExecuteNonQuery:
dbCmd = New OleDbCommand("SELECT first_name INTO newtable FROM test1", odbconnect)
dbCmd.ExecuteNonQuery() ' odbconnect must be open when this runs
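For completeness, a minimal self-contained C# sketch of the same pattern (connStr is a hypothetical OleDb connection string):
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand(
        "SELECT first_name INTO newtable FROM test1", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery(); // creates newtable and copies first_name into it
}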

SqlBulkCopy Column Mappings 500 Columns Plus New Columns

I searched through several posted questions related to SqlBulkCopy and ColumnMappings and could not find a solution. I am using C# ASP.NET with MS VS 2010 and SQL Server 2008 R2.
I have been using SqlBulkCopy to load a staging table with data from a DataTable source. I want to add ColumnMappings to my SqlBulkCopy object, but since I have so many columns, I am trying to avoid listing out all 500+ columns in my code that need to be mapped to columns in the DB staging table.
The reason I need to use ColumnMappings is that I have to drop and add columns, so the schema is dynamic. That is not a good design, but for this purpose it is acceptable to our internal client, and they understand the risks.
I have also captured all the column names in a collection, but I don't think ColumnMappings takes collections as source and destination.
I've thought about using two SqlBulkCopy objects to handle this, but I'm not sure that would work. Another option would be to use T-SQL.
What is the best way to handle this or what solution would you suggest I explore?
// set the connection string
string strConn = ConfigurationManager.ConnectionStrings["CPDM"].ToString();
SqlConnection scCPDM = new SqlConnection(strConn);

// create the SqlBulkCopy object
SqlBulkCopy bulkcopy = new SqlBulkCopy(strConn);
bulkcopy.BulkCopyTimeout = 3000;
bulkcopy.DestinationTableName = "CPDE_STAGING";

// need to map all column mappings + new columns
// looking to do something like this --> bulkcopy.ColumnMappings.Add(*,*);
// then this ----> bulkcopy.ColumnMappings.Add(new_column, new_table_column);
// for each new column
bulkcopy.WriteToServer(dtNewRaw);
I ended up using this, and it works great:
// SqlBulkCopy does a blind insert into the DB table, so we have to add
// a column mapping for each column to tell SqlBulkCopy how to map
// source to destination correctly.
foreach (string sColumn in columnNamesNew)
{
    string sNewColumn = sColumn.Replace(' ', '_');
    bulkcopy.ColumnMappings.Add(sColumn, sNewColumn);
}
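A hedged variation on the same idea: the mappings can also be derived straight from the source DataTable, so no separate name collection is needed. This assumes, as in the snippet above, that destination columns use underscores where the source names have spaces:
foreach (DataColumn col in dtNewRaw.Columns)
{
    // Map each source column to its underscore-named destination column.
    bulkcopy.ColumnMappings.Add(col.ColumnName, col.ColumnName.Replace(' ', '_'));
}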

Retrieving Large Table from Access using OleDbDataReader (Slow)

I'm currently migrating an old application that was built in VB6. The application is mainly used to process data from an Access database, and I was using the DAO library to do all the work. I'm now migrating it to VB.NET (using Visual Studio Express 2010) and starting to use the OleDb library, but I'm facing a problem with speed. When I try to open and iterate over a large table of around 7 columns and 25 million rows (the data types are mainly doubles) from a network location, it takes about 10 minutes. With DAO it took around 1.5 minutes to open the table and run through all the records. This is a major difference in speed for my application; I was not expecting that.
This is the procedure I'm using to open that table:
Public Const ConnectionString As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source="
Public Const DataBasePath = "My DataBase Location"
Public BE As OleDbConnection = New OleDbConnection(ConnectionString & DataBasePath)

Public Sub OpenLargeTable()
    BE.Open()
    Dim QueryExecute As OleDbCommand, DataSet As OleDbDataReader
    Try
        QueryExecute = New OleDbCommand("Select * From LargeTable Order By Field1, Field2", BE)
        DataSet = QueryExecute.ExecuteReader() : DataSet.Read()
        QueryExecute.Dispose() : QueryExecute = Nothing
    Catch
        'Error Handling
    End Try
End Sub
I need to sort the table, so I was using a SQL statement with an Order By clause. I have found that without the Order By it is very fast; it's only when I order the data that, compared with DAO, it turns out to be really slow. I have tried creating a query in the Access database that sorts the records and then calling that query from the SQL statement in the OleDbCommand object, but that doesn't help. My question is: is the OleDbDataReader the best method/object for opening a large sorted dataset?
Thanks
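One workaround the question itself points at (a hedged sketch, not a verified fix): since the query is fast without the Order By, read the rows unsorted and sort them in .NET instead. The field names and double types are assumed from the question, and holding 25 million rows in memory is a real constraint to weigh:
using System;
using System.Collections.Generic;
using System.Data.OleDb;

// ConnectionString and DataBasePath are the constants from the question.
var rows = new List<Tuple<double, double>>();
using (var conn = new OleDbConnection(ConnectionString + DataBasePath))
using (var cmd = new OleDbCommand("Select Field1, Field2 From LargeTable", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            rows.Add(Tuple.Create(reader.GetDouble(0), reader.GetDouble(1)));
    }
}
// Sort client-side instead of asking Jet to do it over the network.
rows.Sort((a, b) => a.Item1 != b.Item1
    ? a.Item1.CompareTo(b.Item1)
    : a.Item2.CompareTo(b.Item2));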

DataTable.Load() vs. DataReader

I'm getting a large amount of data, around 200 MB, from the database using a stored procedure. Previously I used the DataTable.Load() method to populate this data into a DataTable, but it caused a performance issue: the application becomes unresponsive because of the size of the data.
Using reader As DbDataReader = cmdHelper.ExecuteReader(CommandBehavior.CloseConnection)
    Using rstResult As New DataTable()
        rstResult.Locale = CultureInfo.InvariantCulture
        rstResult.Load(reader)
        Return rstResult
    End Using
End Using
But now, in order to improve performance, I have started using a DataReader directly. Since a DataReader is a connected architecture, though, the DB connection stays open until the business logic is done.
Dim cnHelper As New DbConnectionHelper(_strDBConnection)
Dim cmdHelper As DbCommandHelper = cnHelper.CreateCommandHelper(strSP)
cmdHelper.Command.CommandType = CommandType.StoredProcedure
Dim reader As DbDataReader = cmdHelper.ExecuteReader(CommandBehavior.CloseConnection)
Return reader
So I don't want to use a DataReader, since DB connections will stay open until the business logic has finished executing.
In this scenario is there any alternative to improve the performance without using DataReader?
In this scenario is there any alternative to improve the performance without using DataReader?
Since a DataReader is connection-oriented, it will keep the connection to the database open. If you can't afford that, and can't load all the data into a DataTable/DataSet either, then I guess the other option would be to load the data from the database in chunks into a DataTable and work on that (see the sketch after the list below). Otherwise I don't think there are other options.
Use a stored procedure to fetch the data.
Optimize your query.
If you do not want to use a DataReader, then load the data in parts (one such technique is paging).
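A minimal sketch of that paging idea, assuming SQL Server 2012+ for OFFSET/FETCH; dbo.BigTable, the Id key column, and the page size are placeholders, and a stored procedure taking the same two parameters would work just as well:
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static IEnumerable<DataTable> LoadInPages(string connectionString, int pageSize)
{
    int offset = 0;
    while (true)
    {
        var page = new DataTable();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT * FROM dbo.BigTable ORDER BY Id " +
            "OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY", conn))
        {
            cmd.Parameters.AddWithValue("@offset", offset);
            cmd.Parameters.AddWithValue("@pageSize", pageSize);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                page.Load(reader); // connection is only open per page
        }
        if (page.Rows.Count == 0) yield break; // no more data
        yield return page;
        offset += pageSize;
    }
}
Each DataTable stays small, and the connection is closed between pages, which addresses both of the poster's complaints.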

How can I take each array item and insert in a SQL database?

I have the following VB.NET code to take the values out of a textbox on a webpage (actually space-delimited tags) and split them, on the space delimiter, into an array. This works exactly how I want.
mySample.Tags = tagsTextBox.Text
Dim tags As String = mySample.Tags
Dim tagarray() As String
Dim count As Integer
tagarray = tags.Split(" "c) ' "c makes the delimiter a Char
For count = 0 To tagarray.Length - 1
    ' per-tag work goes here
Next
My issue is that I don't know how to take each of the values in the array, after this code runs, to insert them as separate records in a table.
I also will not know how many items will be in the array.
As Ian said, this may be vulnerable to SQL injection. At the very least you should do a Server.HtmlEncode() on each tag you want to insert.
To insert your data you could do the following:
using (SqlConnection conn = new SqlConnection(connstring))
using (SqlCommand cmd = conn.CreateCommand())
{
    cmd.CommandText = "INSERT INTO table(tag) VALUES (@tag)";
    cmd.Parameters.Add("@tag", SqlDbType.VarChar);
    conn.Open();
    foreach (string tag in tags)
    {
        // Reuse the same parameter; one round trip per tag.
        cmd.Parameters["@tag"].Value = Server.HtmlEncode(tag);
        cmd.ExecuteNonQuery();
    }
}
This should work properly, but you could also do it in a stored procedure; either way you should be safe against SQL injection since you use parameters.
Also you should see here for a discussion around the use of parameters.
It all depends on the performance requirements and the general practices you use. Rune's answer can be perfectly fine. If you are inserting 100,000 rows, look at a bulk inserter.
If you are used to writing stored procs and you are lucky enough to be running SQL 2008, you can make use of table-valued parameters.
This allows you to do stuff like this:
SqlCommand cmd = new SqlCommand("usp_ins_Portfolio", conn);
cmd.CommandType = CommandType.StoredProcedure;
// add the DataSet's table here as a TVP
SqlParameter sp = cmd.Parameters.AddWithValue("@Portfolio", ds.Tables[0]);
// notice Structured: the parameter is sent as a table-valued parameter
sp.SqlDbType = SqlDbType.Structured;
cmd.ExecuteNonQuery();
Then a single call to a stored proc can insert all the rows required into the Tag table.
For SQL 2005 and below, I usually use a single comma-separated parameter for all the values and split it in T-SQL inside a stored proc. This tends to perform quite well and avoids mucking around with temp tables. It is also secure, but you have to ensure you use a text input parameter for the proc, or have some sort of limit or batching mechanism in code (so you do not truncate long lists).
For ideas on how to split up lists in T-SQL, have a look at Erland's excellent article.
The SQL 2000 version of the article is here.
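To illustrate the comma-separated approach, a sketch under stated assumptions: usp_ins_Tags and its @TagList parameter are hypothetical, and the proc is assumed to split the list server-side using one of the techniques from Erland's article:
using (SqlConnection conn = new SqlConnection(connstring))
using (SqlCommand cmd = new SqlCommand("usp_ins_Tags", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // -1 length = NVARCHAR(MAX), so long tag lists are not truncated.
    cmd.Parameters.Add("@TagList", SqlDbType.NVarChar, -1).Value =
        string.Join(",", tagarray);
    conn.Open();
    cmd.ExecuteNonQuery();
}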