I am using ADO.NET Entity Framework against an Azure SQL database. One of the queries is taking an extremely long time, most likely pulling data it doesn't need. Is there a way to match up the query in C# with the query execution in Azure?
Enable Query Store on Azure SQL Database to identify the T-SQL equivalent of the LINQ query.
The following command enables Query Store:
ALTER DATABASE CURRENT SET QUERY_STORE = ON;
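Once Query Store has collected some data, a rough sketch of how you might look up the captured statements and their timings; YourTableName is a placeholder for part of a table your entity maps to:
SELECT TOP (10)
    qt.query_sql_text,
    rs.avg_duration / 1000.0 AS avg_duration_ms,
    rs.count_executions
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
WHERE qt.query_sql_text LIKE '%YourTableName%'  -- placeholder filter
ORDER BY rs.avg_duration DESC;
The query_sql_text column will show you the exact T-SQL that Entity Framework generated, which you can then match back to the LINQ query in your C# code.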
Hope this helps.
I don't seem to find a way to write the output from a previous step in the flow into a SQL table using the SQL recipes. When I read the documentation, it seems both types of SQL recipe can only take a SQL dataset as input? This can't be right, as you would imagine you would want to create datasets in the flow and then commit them to a database?
https://doc.dataiku.com/dss/latest/code_recipes/sql.html
In the docs above, the input/output parameters are described as needing to be SQL datasets.
Indeed, it doesn't seem possible with a SQL recipe, which executes fully in the database.
That being said, you can probably use a sync recipe to load your non-SQL dataset into your SQL database first, so that you can then run a SQL recipe on it.
On Azure SQL Database:
UPDATE SomeLargeTable
SET [nonPKbutIndexedColumn] = newValue
WHERE [nonPKbutIndexedColumn] = value;
UPDATE SomeLargeTable
SET [nonPKbutIndexedColumn] = newValue
WHERE [PKcolumn] IN (SELECT [PKcolumn]
                     FROM SomeLargeTable
                     WHERE [nonPKbutIndexedColumn] = value);
What about the performance of these queries? Other suggestions are also welcome...
The performance of any Data Manipulation Language (DML) command depends on many factors: the volume of data in the tables, how efficiently the schema is designed, and so on.
As long as your tables are properly indexed, both queries will run fine and there shouldn't be a meaningful difference between them. You can check the time taken by a query at the bottom of the Query Editor in Azure SQL Database.
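For illustration, the kind of index both statements rely on might look like this (table and column names taken from the queries above):
CREATE NONCLUSTERED INDEX IX_SomeLargeTable_nonPKbutIndexedColumn
    ON SomeLargeTable ([nonPKbutIndexedColumn]);
With that index in place, the WHERE clause in either form can seek directly to the affected rows instead of scanning the whole table.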
Additionally, you can use Query Performance Insight in Azure SQL Database, which provides intelligent query analysis for single and pooled databases. It identifies the top resource-consuming and long-running queries in your workload, helping you find the queries to optimize so you can improve overall workload performance and use resources efficiently.
I would like to know if there is any tool that will give me the optimized SQL query for whichever query I specify, so that I can improve my DB as well as query performance. I use SQL Server 2008.
Thanks in advance.
The old rule of databases still applies: don't try to hand-optimize SQL statements, since the query optimizer will do its own optimizations anyway. Instead, do right away what we all do in the end:
Create indexes to increase performance
Don't get me wrong: of course SQL queries can be written stupidly and will therefore perform badly, but as long as you write a sensible, 'normal' query, the query optimizer will do the rest together with the indexes.
SQL Server will even tell you if a query will clearly benefit from an index when you look at the execution plan. It will even generate the DDL statement to create the index, so all you have to do is copy/paste and run it to have the index your query needs.
You can already view the execution plan that SQL Server Management Studio gives you.
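The generated suggestion looks roughly like this; the index name, table, and columns below are placeholders, not output from any real plan:
-- Missing index hint as shown at the top of an actual execution plan in Management Studio
CREATE NONCLUSTERED INDEX [IX_Orders_CustomerId]  -- placeholder name
ON [dbo].[Orders] ([CustomerId])                  -- placeholder key column
INCLUDE ([OrderDate], [TotalDue]);                -- placeholder included columns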
You can try Redgate; they have evaluation versions of most of their products:
Redgate Website
SQL Server 2005 and later ships with the Database Engine Tuning Advisor (the successor to the Index Tuning Wizard). It can help, but tools can't really do much optimization for you because they don't know what you are trying to accomplish.
You might try taking a look instead at some ways in which you can optimize your queries. Here are some links to get you started.
Tips, Tricks, and Advice from the MS SQL Query Optimization Team
SQL Server Rules for Optimizing Queries, best practices
Statistics Used by the Query Optimizer in SQL Server 2008
SQL Server 7.0 / 2000 came with the 'Index Tuning Wizard', so this functionality has been around for a long time.
I'd recommend having a look at select * from sys.dm_db_missing_index_details.
It tells you which indexes are 'missing'; it's trivial to look at that view and then create the indexes.
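A sketch of that lookup, joining the related DMVs to rank suggestions by estimated impact (these views exist since SQL Server 2005 and are cleared on server restart, so treat the output as hints):
SELECT TOP (20)
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns,
    s.user_seeks,
    s.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s ON s.group_handle = g.index_group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;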
I would like to copy parts of an Oracle DB to a SQL Server DB. I need to move the data because the Oracle box is being decommissioned. I only need the data for reference purposes, so I don't need indexes or stored procedures or constraints, etc. All I need is the data.
I have a link to the Oracle DB in SQL Server. I have tested the following query, which seemed to work just fine:
select *
into NewTableName
from linkedserver.OracleTable
I was wondering if there are any potential issues with using this approach?
Using SSIS (SQL Server Integration Services) may be a good alternative, especially if your table names are the same on both servers. Use the Import Wizard and it should create the destination tables for you and let you edit any mappings.
The only issue I see with that is that you will, of course, need to execute it for each and every table you need. Glad you are decommissioning the Oracle server :-). Otherwise, if you are not concerned with indexes or any of the existing sprocs, I don't see any issue with what you are doing.
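If there are many tables, one hedged sketch is to generate those statements from Oracle's catalog via OPENQUERY; linkedserver is the name from the question, while the SCOTT owner is a placeholder you would replace with your actual schema:
-- Build one SELECT ... INTO statement per Oracle table; review and run them by hand.
SELECT 'SELECT * INTO [' + t.TABLE_NAME + '] FROM linkedserver..SCOTT.[' + t.TABLE_NAME + '];'
FROM OPENQUERY(linkedserver, 'SELECT table_name FROM all_tables WHERE owner = ''SCOTT''') AS t;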
The "select " approach could be very slow if tables are large. Consider writing pro*C in that case or use Fastreader http://www.wisdomforce.com/products-FastReader.html
A faster and easier approach might be to use the Data Transformation Services, depending on the number of objects you're trying to copy over.
I have a rather large (many gigabytes) table of data in SQL Server that I wish to move to a table in another database on the same server.
The tables are the same layout.
What would be the most efficient way of going about doing this?
This is a one off operation so no automation is required.
Many thanks.
If it is a one-off operation, why care about top efficiency so much?
SELECT * INTO OtherDatabase..NewTable FROM ThisDatabase..OldTable
or
INSERT INTO OtherDatabase..NewTable
SELECT * FROM ThisDatabase..OldTable
...and let it run overnight. I would dare say that SELECT INTO / INSERT ... SELECT on the same server is not far from the best efficiency you can get anyway. (Note that SELECT INTO creates the target table for you, while INSERT ... SELECT requires it to exist already.)
Or you could use the "SQL Server Import and Export Wizard" in Microsoft SQL Server Management Studio (right-click the database and look under Tasks).
I'd go with Tomalak's answer.
You might want to temporarily put your target database into bulk-logged recovery mode before executing the SELECT INTO, to stop the log file from exploding...
If it's SQL Server 7 or 2000 look at Data Transformation Services (DTS). For SQL 2005 and 2008 look at SQL Server Integration Services (SSIS)
Definitely put the target DB into bulk-logged mode. This will minimally log the operation and speed it up.
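Putting those two tips together, a minimal sketch, reusing the database and table names from the answer above:
ALTER DATABASE OtherDatabase SET RECOVERY BULK_LOGGED;  -- minimally log the load

SELECT * INTO OtherDatabase..NewTable
FROM ThisDatabase..OldTable;

ALTER DATABASE OtherDatabase SET RECOVERY FULL;  -- restore the recovery model (assuming it was FULL)
-- Take a log backup afterwards to keep the log backup chain intact.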