Is there any way to test run a query and gain insight into possible query performance?
Does using 'custom SQL' instead of joins in Tableau increase the performance of extract refresh on the server? Can someone explain it briefly?
The answer to almost every performance question is first, "it depends," and second, test and understand the measurement results. Real results carry more weight than advice from anyone on the Internet (from me or anyone else).
Still, custom SQL is usually not helpful for increasing performance in Tableau, and often hurts. It is much better to define your relationships in Tableau and let Tableau generate optimized SQL for each view, just as you let a compiler generate optimized machine code.
When you use custom SQL, you prevent Tableau from optimizing the SQL it generates. It has to run the SQL you provide in a subquery.
The best use case for custom SQL in Tableau is to access database specific features, or possibly windowing queries. Most other SQL functionality is available by using the corresponding Tableau features.
If you do have a complex slow custom SQL query that you must use, it is usually a good idea to make an extract so you only pay the performance cost during extract refresh.
So in your case, I'd focus effort on streamlining or eliminating the custom SQL, monitoring the query plan for the generated SQL, and indexing your database to best support that query.
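To make the subquery point concrete, here is a hypothetical illustration (table and column names are invented) of what the wrapped SQL effectively looks like when a Tableau view filters a custom SQL data source:

    -- The inner query stands in for your custom SQL, which Tableau wraps
    -- as a derived table it cannot restructure.
    SELECT t.region, SUM(t.sales) AS total_sales
    FROM (
        -- your custom SQL, executed verbatim
        SELECT r.name AS region, s.amount AS sales, s.order_year
        FROM sales s
        JOIN regions r ON r.id = s.region_id
    ) AS t
    WHERE t.order_year = 2023
    GROUP BY t.region;

Whether the outer filter can be pushed down into the derived table depends on the database's optimizer; with Tableau-generated SQL there is no inner black box to push through in the first place.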
I've always had a problem that goes like this:
Have a slow SQL query that needs to be faster
Be confused why it's slow
Look at execution plan
Realize why it is slow
Know exactly what changes to the execution plan would likely result in a faster query
Attempt to formulate SQL query to get desired execution plan
Repeat previous step ~20 times
Learn to live with a slow query
This is not asking how to formulate the query from a desired plan, since that depends on the situation. The question is: Why bother?
Sure, for simple queries the execution plan is something I don't really care about or want to think about. But for complex queries, it feels like I'm programming in brainf*ck. I don't mean to brag, but I do believe I am much better at formulating execution plans than the optimizer is. Not only that, but it's an extra step that slows everything down more. The optimizer can't know many things that I know. It always feels like I'm fighting it rather than working with it.
I've looked online as best I could, and there seems to be no way to write an execution plan directly, although I could have missed something.
Why do we write SQL queries instead of writing execution plans directly?
Side note: In SQLite, it's always baffled me that despite running in the same process as the program querying it, SQLite still asks for a textual, character-array query and then has to parse it. In the case of dynamic query generation, this means the query is generated and then immediately parsed by the same process.
Because SQL is a fourth-generation programming language (the only successful one I know of): its main feature is that it writes the code for you. It inspects the contents of the database and determines the best way to do things.
That said, there are ways to manually change the execution plan on the fly via RECOMPILE. However, I suggest you stick to using HINTS rather than trying to do anything overly fancy. Generally, the planner does a better job than you possibly could.
One common way to deal with consistently slow execution plans is to add OPTION (RECOMPILE) to the end of your query (WITH RECOMPILE is the equivalent for stored procedures). It causes the execution plan to be recompiled each time you execute the statement, which costs compile time on every run, but it is worth testing to see if it improves a highly active (many reads/writes) database.
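As a rough T-SQL sketch (the table and column names are invented), the query-level recompile and a simple join hint look like this:

    -- Recompile this statement on every execution: avoids a stale cached
    -- plan at the cost of compile time each run.
    DECLARE @CustomerID int = 42;
    SELECT o.OrderID, o.Total
    FROM dbo.Orders AS o
    WHERE o.CustomerID = @CustomerID
    OPTION (RECOMPILE);

    -- A hint constrains the optimizer instead of replacing it entirely:
    SELECT o.OrderID, c.Name
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
    OPTION (HASH JOIN);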
SQL is a declarative language, and its sole purpose is to help you write declarative code. You simply state what you want to achieve; from then on it is the language's responsibility to figure out the best way to achieve it. Because the language still isn't that advanced, users often have to "know exactly what changes to the execution plan would likely result in a faster query". The ideal declarative language would perfectly decipher the user's intentions.
For SQL Server you can use the USE PLAN optimizer hint, which lets you hand SQL Server the execution plan you consider best. There are other optimizer hints as well, such as forcing physical join types, optimizing for particular parameter values, or pinning a specific join order. So yes, you can dictate the best execution plan for the current moment. But you may run into problems later, because the data changes: indexes get added or dropped, and the distribution of values in the columns shifts. Your best plan today can be the worst plan tomorrow. The SQL engine will calculate a good plan in any case, even if it is not the best one.
A SQL query is an abstraction that lets you avoid thinking about the best execution plan. That is an advantage, not a drawback.
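For completeness, a minimal USE PLAN sketch (hypothetical table; the XML placeholder must be replaced with a real plan captured from SQL Server):

    -- Step 1: capture the XML plan of a run you are happy with.
    SET STATISTICS XML ON;
    SELECT o.OrderID FROM dbo.Orders AS o WHERE o.CustomerID = 42;
    SET STATISTICS XML OFF;

    -- Step 2: pin that plan by pasting the captured XML into the hint.
    SELECT o.OrderID FROM dbo.Orders AS o WHERE o.CustomerID = 42
    OPTION (USE PLAN N'<ShowPlanXML ...>');  -- full XML plan goes here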
I've just inherited an old PostgreSQL installation and need to do some diagnostics to find out why this database is running slow. On MS SQL you would use a tool such as Profiler to see what queries are running and then look at their execution plans.
What tools, if any, exist for PostgreSQL that I can do this with? I would appreciate any help since I'm quite new to Postgres.
Use the pg_stat_statements extension to find long-running queries. Then use select * from pg_stat_statements order by total_time/calls desc limit 10 to get the ten longest. Then use explain to see the plan...
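A minimal sketch of that workflow (note that pg_stat_statements must be preloaded before the extension can be created, and that on PostgreSQL 13+ the column is named total_exec_time rather than total_time):

    -- Requires shared_preload_libraries = 'pg_stat_statements' and a restart.
    CREATE EXTENSION IF NOT EXISTS pg_stat_statements;

    -- Ten statements with the highest average time per call.
    SELECT query, calls, total_time / calls AS avg_ms
    FROM pg_stat_statements
    ORDER BY total_time / calls DESC
    LIMIT 10;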
My general approach is usually a mixture of approaches. This requires no extensions.
Set log_min_duration_statement to catch long-running queries. https://dba.stackexchange.com/questions/62842/log-min-duration-statement-setting-is-ignored should get you started; there is a sketch at the end of this answer.
Use profiling of client applications to see which queries they spend their time on. Sometimes you have queries that each take only a short time but are repeated so frequently that they cause performance problems.
Of course explain analyze can help too. If you are looking inside plpgsql functions, however, you often need to pull the queries out and run explain analyze on them directly.
Note: ALWAYS run explain analyze in a transaction that rolls back, or in a read-only transaction, unless you know the statement does not write to the database.
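A sketch of both points, assuming a hypothetical accounts table and an arbitrary 500 ms threshold:

    -- Log every statement slower than 500 ms.
    ALTER SYSTEM SET log_min_duration_statement = '500ms';
    SELECT pg_reload_conf();

    -- EXPLAIN ANALYZE actually executes the statement, so wrap writes in a
    -- transaction and roll it back to discard their effects.
    BEGIN;
    EXPLAIN ANALYZE UPDATE accounts SET balance = balance - 100 WHERE id = 1;
    ROLLBACK;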
What is the most efficient way to get the SQL query from EF? I need this so I can run the query analysis and find its execution plan.
I know I can hook profiler to SQL server, but this step is a pain and a tremendous hit on productivity, almost enough to give up ORM altogether.
Is there a better, more efficient way to optimize EF queries?
The ToTraceString method can do this for you.
If you cast your IQueryable to an ObjectQuery, you can use the ToTraceString() method to see the SQL query.
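A minimal C# sketch, assuming an EF 4/5 ObjectContext-based model (in EF6 ObjectQuery moved to System.Data.Entity.Core.Objects); context, Orders, and Total are hypothetical names standing in for your own model:

    using System;
    using System.Data.Objects;
    using System.Linq;

    // "context" is your ObjectContext; Orders/Total are placeholder names.
    var query = context.Orders.Where(o => o.Total > 100);
    string sql = ((ObjectQuery)query).ToTraceString();
    Console.WriteLine(sql);  // paste into SSMS to inspect the execution plan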
Is there a better, more efficient way to optimize EF queries?
Yes, buy EF Profiler or Huagati Profiler. Alternatively, use the EF provider wrapper for tracing.
I know I can hook profiler to SQL server, but this step is a pain and a tremendous hit on productivity, almost enough to give up ORM altogether.
Profiler is only half of the story. There is also the Database Engine Tuning Advisor, and together these tools are the main tool set if you really want to optimize SQL queries. But optimizing SQL queries you don't have under your control is very hard and sometimes impossible.
I've seen the SQL Server execution plan and the MySQL explain, which give details about where queries are not performing as well as they might. I think I need to create some indexes, but I'm not sure where.
Is there a tool for SQLite?
Don't mind if it's on the Mac or Windows.
SQLite has a built-in query explain statement. You probably want EXPLAIN QUERY PLAN.
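A quick sketch in the sqlite3 shell, with a hypothetical orders table:

    EXPLAIN QUERY PLAN
    SELECT * FROM orders WHERE customer_id = 42;
    -- "SCAN orders" in the output means a full table scan; add an index:
    CREATE INDEX idx_orders_customer ON orders(customer_id);
    -- Re-running the EXPLAIN should now report something like
    -- "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"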