Closed 7 years ago. This question needs to be more focused and is not accepting answers.
Which of these two methods is the better way to write a query?
1. Using a subquery
2. Using joins
Which is faster, and better practice, when writing queries?
Thank you.
Shashika
In most cases JOINs are faster than subqueries, and it is very rare for a subquery to be faster.
With JOINs, the RDBMS can create an execution plan that is better suited to your query and can predict which data needs to be loaded, saving time; with subqueries, it may run each query separately and load all of their data before doing the processing.
The good thing about subqueries is that many people find them more readable than JOINs, which is why most newcomers to SQL prefer them; it is the easy way. But when it comes to performance, JOINs are better in most cases, and they are not hard to read either.
— Kronas, Stack Overflow
Speaking as someone who has done a lot of tuning and fixing of legacy code, I prefer joins (and, when possible, with the conditions in the ON clause). They can really improve readability, especially in procedures that are regularly expanded.
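For illustration, here is the same lookup written both ways. The users and orders tables and the 100 threshold are hypothetical, not from the question; this is just a sketch of the two forms:

    -- Subquery form: filter users through an IN subquery
    SELECT u.id, u.name
    FROM users u
    WHERE u.id IN (SELECT o.user_id FROM orders o WHERE o.total > 100);

    -- Join form: the same result, typically easier for the optimizer to plan
    SELECT DISTINCT u.id, u.name
    FROM users u
    JOIN orders o ON o.user_id = u.id AND o.total > 100;

Note the DISTINCT in the join form: a user with several matching orders would otherwise appear once per order, whereas the IN form returns each user at most once.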
Closed 3 years ago. This question needs to be more focused and is not accepting answers.
I'm reworking a huge SQL query (2k lines) containing lots of CASE expressions, where each CASE is another query. I'd like to know what to do and what to avoid, in terms of performance.
Should I write one bigger general query that JOINs everything I'll need, and then condition each CASE on the things I joined?
Or should I write a new query for each CASE (with most of the subqueries in the CASEs using the same tables)?
I've also seen subqueries with an AS at the end, used to reference the resulting data in the SELECT or in conditions.
And WITH, before the SELECT, to much the same effect: creating a kind of temporary table for conditions and display.
Which one is better to use in terms of performance?
Thanks
First of all, check whether some CASE or subselect is repeated several times. If so, put that part of the query into a WITH clause and reference it from the rest of the code, so the select is only done once, in the WITH (a sketch follows below).
Second, try to avoid subselects as much as you can; again, WITH is a good way to do that.
Try to lay out each CASE so it is easy to read vertically, to avoid mistakes when it has to be modified.
Remember that if the WITH is too big, it can consume all the available memory and the query won't run, so put the most repeated statements there, but not all of them.
If possible, split the big query into many little queries and gather them into a package, so it's easier to track errors and keep control of the process.
Edit: all of this assumes you're using Oracle!
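As a sketch of the first point, here is a repeated subselect factored into a WITH clause. The customers/orders tables and columns are hypothetical, and NVL is used since this assumes Oracle:

    -- Before: the same subselect repeated in several CASE branches
    SELECT c.id,
           CASE WHEN c.region = 'EU'
                THEN (SELECT SUM(o.total) FROM orders o WHERE o.customer_id = c.id)
                ELSE 0 END AS eu_total,
           CASE WHEN c.region = 'US'
                THEN (SELECT SUM(o.total) FROM orders o WHERE o.customer_id = c.id)
                ELSE 0 END AS us_total
    FROM customers c;

    -- After: the repeated part is computed once in the WITH clause
    WITH order_totals AS (
        SELECT o.customer_id, SUM(o.total) AS total
        FROM orders o
        GROUP BY o.customer_id
    )
    SELECT c.id,
           CASE WHEN c.region = 'EU' THEN NVL(t.total, 0) ELSE 0 END AS eu_total,
           CASE WHEN c.region = 'US' THEN NVL(t.total, 0) ELSE 0 END AS us_total
    FROM customers c
    LEFT JOIN order_totals t ON t.customer_id = c.id;

(The NVL covers customers with no orders, which the LEFT JOIN returns as NULL.)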
Closed 7 years ago. This question is opinion-based and is not accepting answers.
Optimization was never one of my areas of expertise. I have a users table, and every user has many followers. I'm wondering whether I should use a counter column, in case some user has a million followers: instead of counting rows in a whole table of relations, shouldn't I just read a counter?
I'm working with a SQL database.
Update 1
Right now I'm only planning how I should build my site; I haven't written the code yet. I don't know whether I'll have slow performance, which is why I'm asking.
You should certainly not introduce a counter right away. The counter is redundant data and it will complicate everything. You will have to master the additional complexity and it'll slow down the development process.
Better to start with a normalized model and see how it works. If you really run into performance problems, solve them then.
Remember: premature optimization is the root of all evil.
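A minimal sketch of that normalized starting point (table and column names are hypothetical):

    -- Normalized model: one row per follow relationship
    CREATE TABLE followers (
        user_id     INT NOT NULL,  -- the user being followed
        follower_id INT NOT NULL,  -- the user who follows
        PRIMARY KEY (user_id, follower_id)
    );

    -- Counting one user's followers is an index range scan on the
    -- primary key, not a scan of the whole relations table
    SELECT COUNT(*) FROM followers WHERE user_id = 42;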
It's generally good practice to avoid duplicating data, such as summarizing one table's data in a column of another table.
It depends on what this is for. If this is for reporting, speed is usually not an issue and you can use a join.
If it has to do with the application and you're running into performance issues with a join or a computed column, you may want to consider a summary table generated on a schedule.
If you're not seeing a performance issue, leave it alone.
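If it does come to that, a scheduled summary table might look something like this sketch (names are hypothetical, and the refresh would run from a nightly job or similar):

    -- Summary table holding the precomputed counts
    CREATE TABLE follower_counts (
        user_id        INT PRIMARY KEY,
        follower_count INT NOT NULL
    );

    -- Refresh step: recompute the counts from the normalized data
    DELETE FROM follower_counts;
    INSERT INTO follower_counts (user_id, follower_count)
    SELECT user_id, COUNT(*)
    FROM followers
    GROUP BY user_id;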
Closed 7 years ago. This question needs to be more focused and is not accepting answers.
I have seen it mentioned in two or three places that subquery factoring (the Oracle WITH clause, also called a common table expression) performs better than materialized views in some cases, though no one mentions when.
Are there cases where this is true, where one should use subquery factoring to get better performance?
The Oracle WITH clause is primarily a means to make your statements more readable. When the subquery appears more than once in your statement, Oracle may choose to evaluate the WITH clause only once and put its results into a temporary table. The undocumented hint MATERIALIZE is said to encourage Oracle to do so.
But Oracle may also choose to inline the WITH clause, in which case there is no performance difference at all compared to repeated subqueries. All in all, my experience with the performance benefits of WITH has been somewhat disappointing.
Materialized views, in contrast, do some work up front and may improve performance no matter how many times they are referenced in a statement.
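To illustrate the hint mentioned above (it is undocumented, so treat this as a sketch; the sales table is hypothetical):

    -- Encourage Oracle to evaluate the WITH clause once into a temp table;
    -- the opposite hint, INLINE, asks Oracle to expand it in place instead
    WITH customer_totals AS (
        SELECT /*+ MATERIALIZE */ customer_id, SUM(amount) AS total
        FROM sales
        GROUP BY customer_id
    )
    SELECT customer_id, total
    FROM customer_totals
    WHERE total > (SELECT AVG(total) FROM customer_totals);  -- referenced twice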
Closed 9 years ago. This question is opinion-based and is not accepting answers.
We currently have a complex business object that needs around 30 joins on our SQL database to retrieve one item (and this is our main use case). The database is around 2 GB in SQL Server.
We are using Entity Framework to retrieve data, and it takes around 3.5 seconds to retrieve one item. We have noticed that using subqueries in a parallel invoke performs better than using joins when there are a lot of rows in the other tables (so we have something like 10 subqueries). We don't use stored procedures because we would like to keep the data access layer in "plain C#".
The goal is to retrieve the item in under 1 second without changing the environment too much.
We are looking into NoSQL solutions (RavenDB, Cassandra, Redis with the "document client") and SQL Server's new in-memory database feature.
What do you recommend? Do you think a single stored procedure call from EF would do the job?
EDIT 1:
We have indexes on all the columns we join on.
In my opinion, if you need 30 joins to retrieve one item, something is wrong with the design of your database. It may be correct from the relational point of view, but it is certainly impractical from the functional/performance point of view.
A couple of solutions come to mind:
Denormalize your database design.
I am pretty sure you can reduce the number of joins, and so improve your performance a lot, with that technique (see the sketch at the end of this answer).
http://technet.microsoft.com/en-us/library/cc505841.aspx
Use a NoSQL solution, as you mention.
Given the number of SQL tables involved, this is not going to be an easy change, but maybe you can start by introducing NoSQL as a cache for these complex objects.
NoSQL Use Case Scenarios or WHEN to use NoSQL
Of course, using stored procedures in this case is better and will improve performance, but I don't believe it's going to make a dramatic difference. You should try it and compare. Also review all your indexes.
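A minimal sketch of the denormalization idea, assuming a hypothetical read-side table that flattens a few of the joined attributes (all names here are made up):

    -- Read-optimized table that pre-joins frequently needed attributes;
    -- it must be refreshed when the source rows change (trigger, job, or app code)
    CREATE TABLE item_read_model (
        item_id       INT PRIMARY KEY,
        customer_name NVARCHAR(200),
        category_name NVARCHAR(200),
        total_price   DECIMAL(18, 2)
    );

    -- Retrieval becomes a single-table lookup instead of ~30 joins
    SELECT customer_name, category_name, total_price
    FROM item_read_model
    WHERE item_id = @ItemId;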
Closed 8 years ago. This question needs details or clarity and is not accepting answers.
I know this could be a stupid question, but as a beginner I must ask the experts to clear up my doubt.
When we use Entity Framework to query data from the database by joining multiple tables, it creates a SQL query, and this query is then sent to the database to fetch records.
"We know that if we execute a large query from .NET code, it will
increase network traffic and performance will suffer. So instead of
writing a large query, we create and execute a stored procedure, and
that significantly increases performance."
My question is: isn't EF using the same old approach of creating a large query, which would degrade performance?
Experts please clear my doubts.
Thanks.
Contrary to popular myth, stored procedures are not any faster than a regular query. There are some slight possible direct performance improvements when using stored procedures (execution plan caching, precompilation), but with a modern caching environment and newer query optimizers and performance-analysis engines, the benefits are small at best. Combine this with the fact that these potential optimizations were only ever a small part of generating the query results, the most time-intensive parts being the actual collection, seeking, sorting, and merging of data, and these stored procedure advantages become downright irrelevant.
Now, one other point. There is absolutely no way, ever, that sending 500 bytes for the text of a query versus 50 bytes for the name of a stored procedure is going to have any noticeable effect on a 100 Mb/s link.
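To make the size comparison concrete, here is a sketch of the two round trips (table and procedure names are hypothetical):

    -- What EF typically sends: a parameterized SQL batch, a few hundred bytes
    SELECT o.Id, o.Total, c.Name
    FROM Orders o
    JOIN Customers c ON c.Id = o.CustomerId
    WHERE o.Id = @OrderId;

    -- The stored procedure equivalent: a shorter message, same work server-side
    EXEC dbo.GetOrderWithCustomer @OrderId = 42;

Either way, the bytes on the wire are dwarfed by the result set coming back.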