I'm looking for advice on how to stop Excel from going "Not Responding" during a long SQL query.
If I run my VBA code with the data already in place, execution takes less than one second. When the code runs the SQL query, execution time jumps to about 11 seconds, which is the same amount of time the query takes in Microsoft's SQL Server Management Studio.
While the code finishes without further problems every time, Excel becomes unresponsive after about 6 seconds, showing a blank screen and the "Not Responding" text. ScreenUpdating, Calculation, and Events are all disabled.
The user doesn't need to see anything happening while the macro is running, but I'd rather not have my colleagues see a blank screen with the "Not responding" text every time they run the macro.
Any general advice or reading on how to fix this?
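One approach worth sketching (not from the thread, and it assumes the data comes back through ADO; the connection string, SQL text, and target range below are placeholders): open the recordset asynchronously and pump DoEvents while SQL Server works, so Excel keeps processing its paint messages instead of going blank.

'Late-bound ADO constants
Const adOpenStatic As Long = 3
Const adLockReadOnly As Long = 1
Const adCmdText As Long = 1
Const adAsyncExecute As Long = &H10
Const adStateExecuting As Long = &H4

Sub RunQueryWithoutFreezing()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;" 'placeholder connection string
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM dbo.SomeView", cn, adOpenStatic, adLockReadOnly, adCmdText + adAsyncExecute 'placeholder SQL
    Do While (rs.State And adStateExecuting) <> 0
        DoEvents 'yield to Windows so Excel stays responsive while the server works
    Loop
    ActiveSheet.Range("A2").CopyFromRecordset rs 'dump the results wherever your code expects them
    rs.Close
    cn.Close
End Sub

The query itself still takes 11 seconds on the server side; this only changes what Excel does while it waits.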
Related
I hope that you can help me.
Here's my situation: every day I import some data into Power Pivot through a query against a SQL database.
Currently, every morning I open Power Pivot and refresh it to import the previous day's data from the database.
This takes about 20 minutes because there is a lot of data to import.
I was wondering if there is a way to do this during the night, maybe with an automatic refresh, so that I can open the file in the morning and already have the previous day's data.
I hope I was clear with my request; thanks in advance.
If the Excel workbook is on a machine that does not shut down, you can keep the workbook open and configure the query to refresh automatically every x minutes.
Or you can keep the workbook open and run VBA code to refresh the query on a timer.
There are plenty of examples for VBA timers if you just care to search.
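As a rough illustration of the timer option (the 15-minute interval and procedure names are placeholders), something like this in a standard module re-arms itself with Application.OnTime after each refresh:

Public gNextRun As Date

Sub ScheduleRefresh()
    gNextRun = Now + TimeSerial(0, 15, 0)                'next run in 15 minutes
    Application.OnTime gNextRun, "RefreshAndReschedule"
End Sub

Sub RefreshAndReschedule()
    ThisWorkbook.RefreshAll                              'refresh every connection in the workbook
    ScheduleRefresh                                      're-arm the timer
End Sub

Sub StopRefresh()
    On Error Resume Next                                 'ignore the error if no timer is pending
    Application.OnTime gNextRun, "RefreshAndReschedule", , False
End Sub

Call ScheduleRefresh once (for example from Workbook_Open), and remember to call StopRefresh before closing the workbook, otherwise Excel will try to reopen it when the timer fires.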
Or you can configure the queries to refresh automatically when the file is opened, then create a Windows Task Scheduler job to open the workbook at a specific time. Again, the computer running this must be turned on.
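A minimal sketch of that last option, assuming ordinary workbook connections (your connection types may differ): put this in the ThisWorkbook module, let Task Scheduler open the file during the night, and the workbook refreshes, saves, and quits by itself.

Private Sub Workbook_Open()
    Dim wc As Object
    For Each wc In ThisWorkbook.Connections
        On Error Resume Next
        wc.OLEDBConnection.BackgroundQuery = False   'wait for the data; skip connections that aren't OLE DB
        On Error GoTo 0
        wc.Refresh
    Next wc
    ThisWorkbook.Save
    Application.Quit                                 'assumes no other workbooks are open
End Sub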
As you can see, there are many options, and they are all well documented and just a short Google search away.
I'm currently developing a file that is used to fill in a form based on data from the sheet. The issue I'm facing is that the page gets refreshed for each entered field in one section of the code. The refresh time varies from computer to computer.
My current workaround is to insert a wait of approximately 5-6 seconds, which can be inefficient depending on the terminal used. Is there a way to pause execution until the page has finished loading? Also, there are two versions of Excel in use, namely 2013 and 2003. Not sure if that's relevant, but I've had to sort out numerous compatibility issues.
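If the form is being driven through an InternetExplorer automation object (the question doesn't say, so treat this as an assumption), the usual pattern is to poll the browser instead of sleeping for a fixed 5-6 seconds; it behaves the same in Excel 2003 and 2013:

Sub WaitForPage(ByVal ie As Object)
    Const READYSTATE_COMPLETE As Long = 4
    'Keep yielding until the browser reports it has finished loading
    Do While ie.Busy Or ie.readyState <> READYSTATE_COMPLETE
        DoEvents
    Loop
End Sub

Call WaitForPage ie after writing each field that triggers a refresh, then move on to the next field.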
I have some VBA code that runs every 15 minutes continually. This code runs fine at first but after a period of time (which varies from around 4 hours to 5 days) Excel crashes.
It is not a VBA error (with a Debug option); rather, Excel freezes and says the application needs to be restarted.
Am I trying to use Excel in a way it was not designed for? Should I be clearing some memory/cache to avoid this?
I believe it is due to a failure of the application's OnTime method. I had the same problem some time ago with an Excel tool that was scraping data from the web every 5 minutes; sometimes it just crashed with no specific log, error, or warning. Here is the workaround I implemented:
1) In the ThisWorkbook module I put this Workbook_Open event:
Private Sub Workbook_Open()
    myMacro              'the call to my procedure when the workbook is opened
    ThisWorkbook.Save    'I was saving the results; you may not need this
    ThisWorkbook.Close   'I was closing my workbook
End Sub
2) In Windows, I used a tool (usually installed with the system) called Task Scheduler, whose executable lives in the system folder (C:\Windows\System32\taskschd.msc); it's really intuitive and easy to use. I scheduled a task every 5 minutes that consisted of opening the workbook; once that was done, the Workbook_Open macro was triggered, so my procedure was called, and after it finished the workbook was saved and closed by the lines of code posted above.
This is just my opinion (I cannot prove it with technical evidence), but I don't put much trust in Application.OnTime for the long term. It works fine for a small task every 10 seconds over a few minutes, but when the macro has to run regularly and you want to avoid the Excel process crashing once every 5 hours, I would suggest leaving the scheduling to the Microsoft tool rather than to the Excel application method.
MY EXPERIENCE
- My Excel, with the OnTime method in the macro, would crash after anywhere from 1 hour (the shortest) to 7 hours (the longest). I was forced to open a remote connection from home and run it in the morning before going out, to make sure the job was done before I got to work.
- The same macro, with the procedure I described above, never crashed again. I believe the system process is much more reliable than Excel's (but again, I have no evidence to prove this, so don't take it as gospel).
I have a basic SELECT query fired from an Access form that takes a while to execute, so I would like to run the query asynchronously and allow my users to continue using the form (or at least keep them updated on the progress).
The problem is that Access freezes while the code is executing, so to the users it appears to have crashed - even to the point of Windows marking it 'Not Responding' and offering to kill it. Obviously not very user-friendly! I have tried using the code listed on MSDN here and a variant method here.
Both of these solutions do seem to run the query 'asynchronously' (the code block firing the async query completes, and the rs_FetchProgress and rs_FetchComplete events, or the cn_ExecuteComplete event in the second solution, fire and run)... but Access's interface still locks up until the query finishes executing. Calling Repaint and DoEvents in various places (such as in the rs_FetchProgress event) does not seem to have any effect.
I doubt it's relevant, but the view being SELECTed from is in SQL Server; it doesn't return a huge amount of data but does take about 20 seconds to process.
Do not pull a huge recordset when you open the form. Base the form on a query which pulls only a few or even no records when it first loads. Then give the users a method to select a different reasonably-sized subset of records.
Try to avoid pulling huge recordsets regardless of whether your data source is a linked Access table or a client-server database.
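As a rough sketch of that advice in an Access form module (the view, control, and column names are made up for illustration), you can open the form against an empty result set and only pull rows once the user asks for them:

Private Sub Form_Load()
    'Open instantly: the WHERE clause guarantees no rows come back yet
    Me.RecordSource = "SELECT * FROM vwBigView WHERE 1 = 0"
End Sub

Private Sub cmdLoad_Click()
    'Pull only the subset the user asked for (txtCustomerID is a hypothetical textbox)
    Me.RecordSource = "SELECT * FROM vwBigView WHERE CustomerID = " & Me.txtCustomerID
End Sub

Assigning RecordSource requeries the form automatically, so only the requested rows travel over the wire.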
I have a routine that examines thousands of records looking for discrepancies. It can take upwards of 5 minutes to complete, and although I provide a progress bar and an elapsed-time count, I'm not sure I want to encourage folk to press Ctrl+Break to quit the report should it take longer than expected.
A button in the progress bar won't work as the form is non-modal, so is there any neat way of allowing users to quit in this situation?
You need DoEvents and a variable whose scope is greater than the scope of what you're running. That is, if it's just a procedure, you need a module-level variable. If it's more than one module, you need a global variable. See here
Stopwatch at DDoE
Normally, the VB engine will tie up the processor until it's done. With DoEvents, however, VB allows the processor to work on whatever is next in the queue, then return to VB.
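A minimal sketch of that pattern (the variable, procedure, and button names are placeholders): a public flag lives in a standard module, the long loop calls DoEvents periodically, and a button on the modeless progress form flips the flag.

Public gCancelRequested As Boolean   'module-level (or global) cancel flag

Sub RunLongReport()
    Dim i As Long
    gCancelRequested = False
    For i = 1 To 100000
        '... examine record i for discrepancies ...
        If i Mod 100 = 0 Then
            DoEvents                         'let the form's button click be processed
            If gCancelRequested Then Exit For
        End If
    Next i
End Sub

'In the progress form's code module:
'Private Sub cmdCancel_Click()
'    gCancelRequested = True
'End Sub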
I don't think there is a way to do it the way you would want it to work. VBA is a scripting language, so when you start your procedure, it's going to run until it's done. Even if you had another button somewhere that WOULD let you click it while the original procedure was running, I'm not sure how you would reference that procedure and stop it.
You could do something like ask the user if they want to continue, but that would make it run even longer.
You could also have your procedure check for a condition outside of Excel and keep running as long as it's true. Something easy might be to check whether a certain text file is in a folder, as in the sketch below. If you want the procedure to stop, open the folder and move the file; on the loop's next iteration, it won't see the file and will stop running. Kludgy, inefficient, and not elegant, but it would work. You could also have it check a cell, checkbox, radio button, or basically any control in another Excel sheet running in another instance of Excel. Again, kludgy.
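A rough sketch of the flag-file idea (the path is just a placeholder):

Sub RunWhileFlagFileExists()
    Const FLAG_FILE As String = "C:\Temp\keep_running.txt"   'placeholder path
    Do While Len(Dir(FLAG_FILE)) > 0                         'stops when the file is moved or deleted
        '... one unit of work per iteration ...
        DoEvents
    Loop
End Sub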
Ctrl+Break works. Accept it and move on. One neat trick, though: if you password-protect your code and they hit Ctrl+Break, the Debug option is unavailable and they will only get Continue or End.
If this is code that is run frequently, have you considered scripting something that runs it during times when a human is not using the computer? I used to run telnet screen scraping macros that would take hours to go through our widgets, but I always had them run either on a separate computer or when I wasn't there (nights/weekends).