How to run the same query multiple times in SQL Server?
Simple example: I have a query
select * from sys.databases
I want to run it N times because I want to return the data to a dashboard in "real time": until I stop the execution, the select needs to keep running (for example, the way SQL Server Profiler keeps bringing information to the screen until I stop it).
What would be the best way for this type of situation?
Remembering that the query and SQL Server profiler are just examples.
Create a job, add your query as a job step, and under the schedule set the timing for executing your query.
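For example, a minimal sketch using SQL Server Agent's stored procedures (the job name and the one-minute cadence are just placeholders):

USE msdb;
EXEC dbo.sp_add_job @job_name = N'PollDatabases';
EXEC dbo.sp_add_jobstep @job_name = N'PollDatabases',
    @step_name = N'RunQuery',
    @subsystem = N'TSQL',
    @command = N'select * from sys.databases';
EXEC dbo.sp_add_jobschedule @job_name = N'PollDatabases',
    @name = N'EveryMinute',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,       -- repeat on a minute interval
    @freq_subday_interval = 1;   -- every 1 minute
EXEC dbo.sp_add_jobserver @job_name = N'PollDatabases';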
Because your dashboard code must poll the DBMS every so often to get the latest data, you must decide how often as part of your system design.
Once a minute? That is quite frequent polling. But only you can decide how out-of-date a dashboard your users will tolerate.
Whatever you do, avoid promising "real time" if you possibly can. We programmers can't deliver on that promise by polling the database.
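If you just want to watch the results repeat inside an SSMS window, closer to the Profiler-like behavior described in the question, a simple loop works too; this is only a sketch, and the five-second delay is arbitrary:

while 1 = 1
begin
    select * from sys.databases;
    waitfor delay '00:00:05';  -- pause five seconds between iterations
end
-- stop it by cancelling the query in SSMS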
Well, I have a stored procedure on SQL Server 2012. When I execute it with the same parameters from SSMS, it always takes a different amount of time to return results. I have observed that the wait can be anywhere from 10 seconds to 10 minutes. What could be the reason? Where should I start digging? I can't post the code here because it's too large, but I think some common recommendations might apply.
Well, the time difference between runs is rather large, so it could be that the system is under load when you run these queries. Is this in a production environment?
To troubleshoot:
Turn on the Actual Execution Plan and execute the query.
Check that other queries are not blocking yours (sp_who2).
You can also run SQL Profiler while you run your query; a quick sketch of the first two checks follows.
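For example (the procedure name and parameter are hypothetical):

-- from a second window: a non-empty BlkBy column means something is blocking you
exec sp_who2;

-- capture timings and the actual execution plan for a single run
set statistics time on;
set statistics xml on;   -- returns the actual plan as XML
exec dbo.MySlowProc @Param = 1;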
I have written a procedure that runs a bunch of select into statements from remote linked servers.
These servers have been known to simply hang and stop responding for no apparent reason, and when that happens my insert into statement waits endlessly.
Is there a way in SQL server I can monitor the destination table to see if data is going in? I am not using transactions. When I try to select from the destination table, it must be locked because it is basically sitting there waiting. I changed my isolation level to READ UNCOMMITTED and can get a select on the table, but the count isn't moving, so I am assuming the data goes in batches?
I am running a tcpdump on the remote server and can see the packets flowing through, just hoping there is an easier way to see it through MSSQL somewhere.
Any advice appreciated!
SQL profiler is your friend.
Open SSMS and go to Tools -> SQL Server Profiler. Then set up the criteria, and there are A LOT of them. I always start big and whittle it down.
Some advice: make sure you know which account is being used to execute the statement. Also, run it for no more than 10-20 minutes, because these Profiler files can eat up 100 GB per session in minutes. When looking for the statement, start the trace just before you execute the statement and stop it right after you get the result; then find your statement's information and refine the Profiler criteria to what you need.
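Besides Profiler, one low-impact way to watch rows arrive is the partition-stats DMV, which doesn't touch the table itself. This is just a sketch; the table name is hypothetical, and the count is approximate while the load is running:

select sum(row_count) as approx_rows
from sys.dm_db_partition_stats
where object_id = object_id('dbo.DestinationTable')
  and index_id in (0, 1);  -- heap or clustered index only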
We have an AS400 mainframe running our DB2 transactional database. We also have a SQL Server setup that gets loaded nightly with data from the AS400. The SQL Server setup is for reporting.
I can link the two database servers, BUT, there's concern about how big a performance hit DB2 might suffer from queries coming from SQL Server.
Basically, the fear is that if we start hitting DB2 with queries from SQL Server we'll bog down the transactional system and screw up orders and shipping.
Thanks in advance for any knowledge that can be shared.
Anyone who has a pat answer for a performance question is wrong :-) The appropriate answer is always 'it depends.' Performance tuning is best done by measuring, changing one variable, and repeating.
DB2 for i shouldn't even notice if someone executes a 1,000 row SELECT statement. Take Benny's suggestion and run one while the IBM i side watches. If they want a hint, use WRKACTJOB and sort on the Int column; that represents the interactive response time. I'd guess that the query will be complete before they have time to notice that it was active.
If that seems unacceptable to the management, then perhaps offer to test it before or after hours, where it can't possibly impact interactive performance.
As an aside, the RPG guys can create Excel spreadsheets on the fly too. Scott Klement published some RPG wrappers over the Java POI/HSSF classes. Also, Giovanni Perrotti at Easy400.net has some examples of providing an Excel spreadsheet from a web page.
I'd mostly agree with Buck, a 1000 row result set is no big deal...
Unless of course the system is looking through billions of rows across hundreds of tables to get the 1000 rows you are interested in.
Assuming a useful index exists, 1,000 rows shouldn't be a big deal. If you have IBM i Access for Windows installed, there's a component of System i Navigator called "Run SQL Scripts" that includes "Visual Explain", which provides a visual explanation of the query execution plan. Viewing that, you can ensure that an index is being used.
One key thing: make sure the work is being done on the i. When using a standard linked table, MS SQL Server will attempt to pull back all the rows and then apply its own "where":
select * from MYLINK.MYIBMI.MYLIB.MYTABLE where MYKEYFLD = '00335';
Whereas this format sends the statement to the remote server for processing and just gets back the results:
select * from openquery(MYLINK, 'select * from mylib.mytable where MYKEYFLD = ''00335''');
Alternatively, you could ask the i guys to build you a stored procedure that you can call to get back the results you are looking for. Personally, that's my preferred method.
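A sketch of what that call can look like from the SQL Server side (the procedure name and parameter are made up, and the linked server needs RPC Out enabled):

exec ('call MYLIB.GETORDERS(?)', '00335') at MYLINK;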
Charles
My boss and I have been trying to work out an auditing plan for our stored procedures. Currently there are two external applications taking information from our database through stored procedures, and we're interested in auditing when they're being executed and what values are passed as parameters.

So far what I've done is simply create a table for the stored procedures one of the apps is using; since they use the same input parameters, the table has one column per parameter. Obviously this isn't the best choice, but we wanted quick information to see if they were running batch processes and when they were running them. I've tried SQL Server Audit, but it doesn't catch the parameters unless you're executing the SP in a query.
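For reference, here is roughly what my quick-and-dirty version looks like (table, procedure, and parameter names simplified):

create table dbo.SpAuditLog (
    LogId      int identity(1,1) primary key,
    SpName     sysname,
    Param1     varchar(50),
    Param2     varchar(50),
    ExecutedAt datetime default getdate()
);

-- first statement inside each audited procedure:
insert into dbo.SpAuditLog (SpName, Param1, Param2)
values ('dbo.GetOrders', @Param1, @Param2);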
SQL Server Profiler will do this for you; it's included for free. Set up a trace and let it run.
You can also apply quite a bit of filtering to the trace, so you don't need to track everything; you can also direct the output to a file or a SQL table for later analysis. This is probably your best bet for a time-limited audit.
I think I've used the SQL Server Profiler (http://msdn.microsoft.com/en-us/library/ms181091.aspx) in the past to audit SQL execution. It's not something you would run all the time, but you can get a snapshot of what's running and how it's being executed.
I haven't tried using them, but you might look at event notifications and see if they will work for you.
From BOL
Event notifications can be used to do the following:
Log and review changes or activity occurring on the database.
Is there a way to tell MS SQL that a query is not too important and that it can (and should) take its time?
Likewise is there a way to tell MS SQL that it should give higher priority to a query?
Not in versions below SQL Server 2008. In SQL Server 2008 there's the Resource Governor. Using that, you can assign logins to groups based on properties of the login (login name, application name, etc.). The groups can then be assigned to resource pools, and limitations or restrictions in terms of resources can be applied to those pools.
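A minimal sketch of that setup (the pool, group, and login names are hypothetical):

create resource pool LowPriorityPool with (max_cpu_percent = 20);
create workload group LowPriorityGroup using LowPriorityPool;
go
-- the classifier function must live in master and be schema-bound
create function dbo.rgClassifier() returns sysname
with schemabinding
as
begin
    declare @grp sysname = N'default';
    if SUSER_SNAME() = N'ReportingUser'   -- route the low-priority login
        set @grp = N'LowPriorityGroup';
    return @grp;
end;
go
alter resource governor with (classifier_function = dbo.rgClassifier);
alter resource governor reconfigure;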
Versions of SQL Server before 2008 do not have any form of resource governor. There is a SET option called QUERY_GOVERNOR_COST_LIMIT, but it's not quite what you're looking for: it prevents queries from executing based on their estimated cost rather than controlling the resources they use.
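For completeness, this is what that session-level option looks like; the 300 here is an arbitrary ceiling on the optimizer's estimated cost:

set query_governor_cost_limit 300;  -- refuse to start queries estimated to cost more than 300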
I'm not sure if this is what you're asking, but I had a situation where a single UI click added 10,000 records to an email queue (with a lot of data in each body). The email went out over the next several days, so it didn't need to be high priority; in fact, it would bog down the server every time it happened.
I split the procedure into 10,000 individual calls, ran the process from the UI on a different thread (set to low priority), and had it sleep for a second after each procedure call. It took a while, but I had very granular control over exactly what it was doing; a rough T-SQL sketch of the idea is below.
btw, this was NOT spam, so don't flame me thinking it was.
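For what it's worth, here is a rough T-SQL analog of that throttled loop (the procedure name is made up):

declare @i int = 1;
while @i <= 10000
begin
    exec dbo.SendQueuedEmail @QueueId = @i;  -- hypothetical per-item procedure
    waitfor delay '00:00:01';                -- sleep a second between calls
    set @i += 1;
end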