Problem:
I need to save a DataTable to a file in binary format, to keep the process fast, because the DataTable may contain up to ten million rows. XML is not an option because it makes the file very large and the process slow.
I managed to save the DataTable to a binary file and it works fine, but the problem is that when I try to add new rows to the existing binary file (using a DataTable with the same schema but different row data), it copies the schema of the DataTable into the file again, making it very large.
What is needed:
I need to be able to append only rows to the existing binary file. If you take the code below and run it 3 times, it should create the binary file and then add 5 rows per save, i.e. save 15 rows. But this is not the case: it saves table schema & 5 rows (in binary format) >>> then table schema & 5 rows >>> then table schema & 5 rows. The table schema itself is very large and consumes much of the file size. I need to save the schema only once, then the 15 rows.
My Code:
Function GetTable() As DataTable
    Dim table As New DataTable ' Create new DataTable instance.
    ' Create four typed columns in the DataTable.
    table.Columns.Add("Dosage", GetType(Integer))
    table.Columns.Add("Drug", GetType(String))
    table.Columns.Add("Patient", GetType(String))
    table.Columns.Add("Date", GetType(DateTime))
    ' Add five rows with those columns filled in the DataTable.
    table.Rows.Add(25, "Indocin", "David", DateTime.Now)
    table.Rows.Add(50, "Enebrel", "Sam", DateTime.Now)
    table.Rows.Add(10, "Hydralazine", "Christoff", DateTime.Now)
    table.Rows.Add(21, "Combivent", "Janet", DateTime.Now)
    table.Rows.Add(100, "Dilantin", "Melanie", DateTime.Now)
    Return table
End Function
Private Sub SaveDataTabletoBinary()
    ' Requires Imports System.IO and System.Runtime.Serialization.Formatters.
    Dim dt As DataTable = GetTable()
    Dim format As New Binary.BinaryFormatter
    Using fs As New FileStream("c:\sar1.txt", FileMode.Append)
        dt.RemotingFormat = SerializationFormat.Binary ' Other option is SerializationFormat.Xml
        format.Serialize(fs, dt)
    End Using
End Sub
Any ideas?
Thanks
Try using a binary file with a .bin extension.
So the boss comes to me and says "I want the value of each agent and the project on one line and the average of all the other agents on the next line so I can easily see if they are above or below average."
The table looks like this:
dt.Columns.Add("AGENT", GetType(String))
dt.Columns.Add("PROJECT", GetType(String))
dt.Columns.Add("Sales", GetType(Integer))
dt.Columns.Add("Declines", GetType(Integer))
dt.Columns.Add("Margin", GetType(Integer))
OK, it's all good. One row in the DataTable is the agent and project. The next row is the average of all the other agents for that project, like so:
row 1:
John Smith,
ProjectName,
(other column values)
row 2:
John Smith,
ProjectName & " AVERAGE/TOTAL",
(other column values)
The project name is removed in the SSRS report on the AVERAGE/TOTAL line because of space constraints on the piece of paper it is printed on.
I do the sorting in our standard way of sorting a DataTable:
Dim dataView As New DataView(dt1)
dataView.Sort = "AGENT,PROJECT"
dt1 = dataView.ToTable
Return dt1
But now the boss has a new requirement. He wants to be able to sort by other columns in the table, but keep the two rows (the agent/project row and the agent/project AVERAGE/TOTAL row) together. So in essence he wants to sort not one row but the two rows together, and the sort value could be "AGENT,Margin". Obviously, to keep the two rows together I have to find a way to sort on the Project value too.
So I am stumped and would appreciate any thoughts you might have. C# ideas are welcome as well. LINQ is fine, but the result has to end up as a DataTable.
So you create two tables: one with the per-row values and one with the average values. Sort the first table, then loop through its rows and do another loop inside that loop to match the agent and project names in the averages table. It's a hack, but it worked:
If SortValue = "Default" Then
    ' Default: merge the averages back in and sort by agent and project.
    dt1.Merge(dt)
    Dim dataView As New DataView(dt1)
    dataView.Sort = "AGENT,PROJECT"
    dt1 = dataView.ToTable
Else
    ' Sort the per-row table by the requested column(s) first.
    Dim dataView As New DataView(dt)
    dataView.Sort = SortValue
    dt = dataView.ToTable
    Dim dtCopy As DataTable = dt.Clone
    For Each row As DataRow In dt.Rows
        ' Copy the agent/project row, then immediately append its matching
        ' AVERAGE/TOTAL row so the pair stays together after the sort.
        dtCopy.ImportRow(row)
        For i = 0 To dt1.Rows.Count - 1
            If dt1.Rows(i).Item("PROJECT").ToString.Replace(" AVERAGE/TOTAL", "") = row.Item("PROJECT") And dt1.Rows(i).Item("AGENT") = row.Item("AGENT") Then
                dtCopy.Rows.Add(dt1.Rows(i).Item("AGENT"), dt1.Rows(i).Item("PROJECT"), dt1.Rows(i).Item("SALES"), dt1.Rows(i).Item("Declines"), dt1.Rows(i).Item("Margin"))
            End If
        Next
    Next
    dt1 = dtCopy
End If
I have created a textbox and want it to search through a database of customers by name. Most of the existing questions use an external dataset, but this is just a table created in the program from a CSV file.
You could take advantage of a BindingSource used as the DataSource of your DataGridView. That way, by setting the BindingSource's Filter property, you can apply any kind of filter based on your column names.
Please check the following snippet:
Dim dt As New DataTable("Sample")
dt.Columns.Add("Id")
dt.Columns.Add("TimeStamp")
For i As Int32 = 0 To 9999
    dt.Rows.Add(New Object() {i, DateTime.Now})
Next
Dim bs As New BindingSource
bs.DataSource = dt
bs.Filter = "Id > 10 AND Id < 20"
DataGridView1.DataSource = bs
As you can see, I've defined a DataTable with two columns, named "Id" and "TimeStamp". Then, with a simple loop, I've populated the DataTable with 10,000 records, from Id = 0 to Id = 9999.
After that, we declare a BindingSource, specifying our DataTable as its DataSource. On the BindingSource we can set any filter, using the Filter property, the column names, and the common logical operators.
In my example, I've put the filter on the Id column only, to show those records whose Id is between 11 and 19.
Then, we use the BindingSource as our DataGridView's DataSource.
Also note that the filter doesn't need to be applied before assigning the DataGridView's DataSource: after the binding, each change to the filter is reflected immediately in the displayed rows.
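Since your question is about searching customers by name from a textbox, here is a minimal sketch of the same idea (the "Name" column and TextBox1 are hypothetical, and bs would need to be a form-level field so the event handler can reach it):
Private Sub TextBox1_TextChanged(sender As Object, e As EventArgs) Handles TextBox1.TextChanged
    ' Escape single quotes so the typed text cannot break the filter expression.
    Dim searchText As String = TextBox1.Text.Replace("'", "''")
    bs.Filter = String.Format("Name LIKE '%{0}%'", searchText)
End Sub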
Hope this helps
I am having an issue where SqlBulkCopy is not copying all of the data in a column over to the destination table. I have verified that the source data (which is a .CSV file) has values in the column in all of the rows, but only the first 40 or so rows in that column are getting copied over.
The destination table's columns are set to NVARCHAR(255) and all of them are allowed to be nullable.
Here is my function to do the bulk copy:
Private Sub loadDataFromCSV(ByVal pathToFile As String, ByVal connString As String, ByVal file As String, ByVal colCount As Integer)
    Dim fileLocation As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & pathToFile & ";Extended Properties='text;HDR=NO;FMT=Delimited(,)';"
    Dim qry As String = "select * from " & file
    Dim CompData As OleDbDataReader
    Using destConnection As SqlConnection = New SqlConnection(connString)
        destConnection.Open()
        Using sourceConnection As New OleDbConnection(fileLocation)
            Dim cmdSourceData As New OleDbCommand(qry, sourceConnection)
            sourceConnection.Open()
            CompData = cmdSourceData.ExecuteReader()
            Using bulkCopy As SqlClient.SqlBulkCopy = New SqlClient.SqlBulkCopy(connString)
                bulkCopy.DestinationTableName = "dbo.Records"
                bulkCopy.BatchSize = 10000
                bulkCopy.BulkCopyTimeout = 90
                Try
                    bulkCopy.WriteToServer(CompData)
                Catch ex As Exception
                    Console.WriteLine(ex.Message)
                Finally
                    CompData.Close()
                End Try
            End Using
        End Using
    End Using
End Sub
As far as I can tell, all of the data from the table is making it over into the correct columns, with the exception of the 7th column. In the 7th column, I get the first 40 or so rows of data, and then the rest of the values for the column are NULL.
I've run out of ideas for what could be going wrong, so any help would be greatly appreciated.
Thanks.
My guess would be that it is conversion errors. OLEDB will infer the data type of a column based on the first 8 rows (I think), so if your first rows are:
SomeColumn
----------
1
2
3
4
5
6
7
8
9
apple
This initially looks like an integer column, so this is what it is mapped to, but then when it gets to "apple" it can't convert it to an integer, so returns DbNull.
The solution to this is to add IMEX=1 to your connection string; this means that no implicit conversion will be done, and the OleDbDataReader will just read exactly what is in the CSV.
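For the connection string in your question, that would presumably mean adding IMEX=1 inside the Extended Properties (a sketch, not tested against your file):
Dim fileLocation As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & pathToFile & _
    ";Extended Properties='text;HDR=NO;IMEX=1;FMT=Delimited(,)';"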
The downside of this is that you will probably then get conversion errors when trying to call the SqlBulkCopy.WriteToServer(DataReader) method. You may need to create a DataTable in the same format as your database table, iterate over your OleDbDataReader doing explicit conversions where necessary, and then write that DataTable to the database with SqlBulkCopy, roughly like the sketch below.
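A minimal sketch of that approach, reusing CompData (the reader from your question); the column names "Amount" and "Description" are made up, so substitute the real columns of dbo.Records and whatever conversions they need:
Dim staging As New DataTable()
staging.Columns.Add("Amount", GetType(Integer))        ' hypothetical numeric column
staging.Columns.Add("Description", GetType(String))    ' hypothetical text column

While CompData.Read()
    Dim r As DataRow = staging.NewRow()
    ' With IMEX=1 everything comes back as text, so convert explicitly where needed.
    Dim amountText As String = CompData.GetValue(0).ToString()
    If amountText.Length > 0 Then r("Amount") = CInt(amountText)
    r("Description") = CompData.GetValue(1).ToString()
    staging.Rows.Add(r)
End While

Using bulkCopy As New SqlClient.SqlBulkCopy(connString)
    ' Columns here should line up with dbo.Records (by position, or via ColumnMappings).
    bulkCopy.DestinationTableName = "dbo.Records"
    bulkCopy.WriteToServer(staging)
End Using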
As a hacky workaround, I simply put a character string into each column in the top 8 rows of my CSV file. It fools the OLEDB type inference into treating all fields as strings. Then, in SQL, delete the records that contain the character string.
I have this working fine with a DataTable; however, a DataSet does not have the .Rows property. Not all fields are encrypted, so not all of them will need decrypting. I am assuming it would be some kind of loop, like:
For (i = 0, i < DataSet.ColumnIndex [Or something], i++)
However, I am not sure how to perform this.
Essentially, when I bring back data using SELECT queries based on input parameters the user enters (first name, last name), I would like to decrypt specific columns.
How I currently use it:
Try
    For i As Integer = 0 To dt.Rows.Count - 1
        dt.Rows(i)("FIRST_NM_TXT") = clsEncrypt.DecryptData(dt.Rows(i)("FIRST_NM_TXT"))
        dt.Rows(i)("LAST_NM_TXT") = clsEncrypt.DecryptData(dt.Rows(i)("LAST_NM_TXT"))
    Next
Catch ex As Exception
    MessageBox.Show("Either the first name or last name did not match. Please check your spelling.")
End Try
The reason I need a DataSet is that I need to run reports off this decrypted data. I have tried with my DataTable, but have not been successful. From research, it seems as though a DataSet is the common choice anyway.
A DataSet object is just a collection of DataTable objects.
You can access the DataTables in a DataSet by:
Ordinal: Dim MyDataTable As DataTable = MyDataSet.Tables(2), or
Name: Dim MyDataTable As DataTable = MyDataSet.Tables("Customers")
So just use one of the above to get a DataTable out of the DataSet, then decrypt the data the same way:
For i As Integer = 0 To MyDataTable.Rows.Count - 1
    MyDataTable.Rows(i)("FIRST_NM_TXT") = clsEncrypt.DecryptData(MyDataTable.Rows(i)("FIRST_NM_TXT"))
    MyDataTable.Rows(i)("LAST_NM_TXT") = clsEncrypt.DecryptData(MyDataTable.Rows(i)("LAST_NM_TXT"))
Next
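If more than one table in the DataSet contains encrypted columns, the same loop can be wrapped in a pass over the Tables collection; a minimal sketch, assuming the same two column names and your existing clsEncrypt class:
For Each tbl As DataTable In MyDataSet.Tables
    ' Only touch tables that actually have the encrypted columns.
    If tbl.Columns.Contains("FIRST_NM_TXT") Then
        For i As Integer = 0 To tbl.Rows.Count - 1
            tbl.Rows(i)("FIRST_NM_TXT") = clsEncrypt.DecryptData(tbl.Rows(i)("FIRST_NM_TXT"))
            tbl.Rows(i)("LAST_NM_TXT") = clsEncrypt.DecryptData(tbl.Rows(i)("LAST_NM_TXT"))
        Next
    End If
Next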
Firstly I used the following to populate the DataGridView:
dta = New OleDbDataAdapter("Select * From [" & ActName & "$B6:E" & LastEntryRow & "]", cn)
dts = New DataSet
dta.Fill(dts, "Detailtable")
DataGridView1.DataSource = dts
DataGridView1.DataMember = "Detailtable"
I then formatted the DataGridView which included the following code:
Dim currencyCellStyle As New DataGridViewCellStyle
currencyCellStyle.Format = "C2"
With Me.DataGridView1
    .Columns(1).DefaultCellStyle = currencyCellStyle
    .Columns(2).DefaultCellStyle = currencyCellStyle
End With
This worked well. Columns displayed their values as $1234.00.
When new values were added to the columns they immediately displayed as $1234.00. (working so far)
If a column did not have any values when the dataset was made, no values showed in the datagridview for that column. (no problem so far)
However, all new values added to the blank column display as 1234.00, not $1234.00.
I have tried refreshing the DataGridView.
I have re-formatted the DataGridView after the change to the cell.
It still displays as 1234.00.
If I save the changes, recreate the DataSet, and repopulate the DataGridView, all is OK.
How can I get the DataGridView to reflect the correct format ($1234.00) when new values are added directly to the column?
If you use this line
DataGridView1.Columns(1).DefaultCellStyle.Format = "C2"
it should work, but if the column's data type is not Decimal (but TEXT), this kind of format will never be applied.
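In other words, "C2" only formats values that are actually numeric; a value stored as text is displayed as-is. A sketch of one way around it (assuming column index 1 is the currency column and its text values parse as plain numbers): after filling the DataSet, rebuild that column with a Decimal type, so both the existing values and newly entered ones are stored as numbers and the "C2" format can apply.
' Rebuild the filled table so column 1 is Decimal instead of inferred text.
Dim src As DataTable = dts.Tables("Detailtable")
Dim fixedTable As DataTable = src.Clone()
fixedTable.Columns(1).DataType = GetType(Decimal)

For Each row As DataRow In src.Rows
    Dim values As Object() = row.ItemArray
    If IsDBNull(values(1)) OrElse values(1).ToString().Length = 0 Then
        values(1) = DBNull.Value                          ' keep empty cells empty
    Else
        values(1) = Decimal.Parse(values(1).ToString())   ' explicit conversion to Decimal
    End If
    fixedTable.Rows.Add(values)
Next

DataGridView1.DataSource = fixedTable
DataGridView1.Columns(1).DefaultCellStyle.Format = "C2"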