Closed as opinion-based 1 year ago; not accepting answers.
In VB.NET, can a parallel loop be performed on the tables inside a DataSet?
This has been answered.
My project is this:
Read a CSV file into a DataTable in memory.
Group the rows by sorting the data.
Process the data in a synchronous loop, detecting each group by a change in key values.
There are many separate processes that read these CSV files, each with its own layout and processing specifics.
My goal is to process these "groups" of data in parallel without using Entity Framework.
Is the DataSet/DataTable approach the best?
Yes, you can. A DataSet and a DataTable are in-memory collections, so if the tables can be processed independently they are good candidates for a Parallel.ForEach.
' Requires Imports System.Data, System.Linq and System.Threading.Tasks
Parallel.ForEach(
    dataSet.Tables.Cast(Of DataTable),
    Sub(table)
        ProcessTable(table)
    End Sub
)
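If the groups live inside a single DataTable rather than in separate tables (as the question describes), the same pattern can be applied per group. A minimal sketch, assuming a hypothetical key column called "GroupKey" and a ProcessGroup routine of your own (needs a reference to System.Data.DataSetExtensions):

' Placeholders: "GroupKey" is your key column, ProcessGroup is your own routine.
Dim groups = table.AsEnumerable().
    GroupBy(Function(row) row.Field(Of String)("GroupKey"))

Parallel.ForEach(
    groups,
    Sub(grp)
        ' Rows within a group are handled together; groups run concurrently,
        ' so ProcessGroup must not touch shared state without synchronization.
        ProcessGroup(grp.Key, grp.ToList())
    End Sub
)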
Closed as opinion-based 1 year ago; not accepting answers.
I merged two databases for analysis purposes. One of these databases is out of production, so the data is not changing. The other database is live and continuously growing.
What's the best practice in terms of data management/storage? Do I have to delete all rows and reload and union the data from both databases, or is there a better way to manage this?
Thanks in advance
Sam
If you know SSIS, then build a package that checks the keys and, based on that, inserts only the unique rows.
You can easily apply a Lookup transformation in SSIS between the source and the destination.
Let me know if you need any help.
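If you would rather avoid SSIS, the same key check can be expressed directly in T-SQL. A minimal sketch, where the table and column names (Archive_Orders for the merged copy, Live_Orders for the growing source, OrderID as the key) are invented for illustration:

-- Insert only the live rows whose key is not already in the merged table.
-- Table and column names are placeholders for your own schema.
INSERT INTO Archive_Orders (OrderID, CustomerID, OrderDate, Amount)
SELECT  s.OrderID, s.CustomerID, s.OrderDate, s.Amount
FROM    Live_Orders AS s
WHERE   NOT EXISTS (
            SELECT 1
            FROM   Archive_Orders AS t
            WHERE  t.OrderID = s.OrderID
        );

Run on a schedule, this keeps the merged copy in sync incrementally instead of deleting and reloading everything.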
Closed as needing more focus 6 years ago; not accepting answers.
I have come to know that indexing is used to speed up data retrieval. I would like to know more about it. Thanks.
Indexing is a way of sorting a number of records on multiple fields. Creating an index on a field in a table creates another data structure which holds the field value and a pointer to the record it relates to. This index structure is then sorted, allowing binary searches to be performed on it.
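For example, in SQL Server (most engines have a very similar CREATE INDEX statement), creating an index on a frequently searched column looks like this; the table and column names are made up for illustration:

-- Hypothetical table: Customers(CustomerID, LastName, FirstName, City)
-- Create a non-clustered index on the column used in lookups.
CREATE NONCLUSTERED INDEX IX_Customers_LastName
    ON Customers (LastName);

-- Queries that filter or sort on LastName can now use the index
-- instead of scanning the whole table.
SELECT CustomerID, FirstName, LastName
FROM   Customers
WHERE  LastName = 'Smith';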
Closed as needing more focus 7 years ago; not accepting answers.
I have to optimize a number of SQL queries that involve many tables and a lot of data, and I would like to ask some questions:
1) There are situations when the same functions are needed in different queries. Which results in lower processor time: calculating them separately in every query, or calculating them once in one query that the other queries then reference?
2) When creating queries, which is the better way to set up the table relationships: one main query that all of the other queries are related to, or a serial chain of joins between all the tables?
The two main tables are relatively big (~30 MB) Excel tables.
Thank you in advance.
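To make question 1) concrete, here is a rough sketch of the two shapes in generic SQL; the table and column names are invented for illustration:

-- Option B: define the calculation once in a saved query/view and let
-- the other queries build on it.
CREATE VIEW OrderLineTotals AS
SELECT OrderID, Quantity * UnitPrice AS LineTotal
FROM   OrderLines;

SELECT OrderID, SUM(LineTotal) AS OrderTotal
FROM   OrderLineTotals
GROUP BY OrderID;

-- Option A: each query repeats the same calculation inline.
SELECT OrderID, Quantity * UnitPrice AS LineTotal
FROM   OrderLines;

Whether either shape is actually faster depends on the engine and the data (many optimizers simply inline the shared query), so it is worth measuring both.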
Closed as opinion-based 8 years ago; not accepting answers.
I'm wanting to write data into memory only for a temporary time. The format is essentially the same as an SQL table, with say 5 columns and 1,000 rows, give or take. I simply want to store this data and run queries against it to make calculations, sort it, and query it to then produce chart reports and Excel data.
I looked at custom PSObjects and then SQL, and I can't see why I'd use custom PSObjects over SQL. What do you think?
I also couldn't see that adding multiple rows using PSObjects was as straightforward as adding another row in SQL.
Thanks
Steve
I guess it depends on what you're more comfortable with, but if you're going to do it in PowerShell then using PS custom objects seems like a logical choice, since the cmdlets were designed to work with them.
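A minimal sketch of that approach, with invented column names; the rows live only in memory, and the standard cmdlets stand in for the "queries":

# Build an in-memory "table" of custom objects (columns are invented here).
$rows = foreach ($i in 1..1000) {
    [pscustomobject]@{
        Id     = $i
        Region = ('North', 'South', 'East', 'West')[$i % 4]
        Amount = Get-Random -Minimum 10 -Maximum 500
        SoldOn = (Get-Date).AddDays(-($i % 90))
    }
}

# "Queries": filter, sort, and aggregate with the usual cmdlets.
$bigSales = $rows | Where-Object Amount -gt 250 | Sort-Object Amount -Descending
$byRegion = $rows | Group-Object Region |
            Select-Object Name, Count, @{ n = 'Total'; e = { ($_.Group | Measure-Object Amount -Sum).Sum } }

# Hand the results to Excel as CSV.
$byRegion | Export-Csv -Path .\sales-by-region.csv -NoTypeInformation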
Closed as opinion-based 9 years ago; not accepting answers.
I am designing a database for a form which contains many select boxes and checkbox lists.
I am unsure whether to populate these lists from a table in the database or from static options in the HTML select element.
As part of database design best practice, which is the preferred method?
If you expect the form elements (checkboxes, lists) to change often, or to be conditional (based on configurable permissions/roles), then they should come from a database.
However, if they are mostly static (rarely change, not dependent on configurable permissions), then you should hard-code them. The big benefit of hard-coding them is less traffic on your database, which will yield the best performance.
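If you do go the database route, here is a minimal sketch of a lookup table for one of the lists (T-SQL flavoured; the names are invented for illustration):

-- One lookup table per list; the form reads the active options at render time.
CREATE TABLE CountryOptions (
    OptionID    INT          NOT NULL PRIMARY KEY,
    DisplayText VARCHAR(100) NOT NULL,
    SortOrder   INT          NOT NULL,
    IsActive    BIT          NOT NULL DEFAULT 1
);

-- The form populates its select box from this query.
SELECT OptionID, DisplayText
FROM   CountryOptions
WHERE  IsActive = 1
ORDER BY SortOrder;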