Web application hangs after multiple requests - apache

The application uses Apache HTTP Server as the web server and Tomcat as the application server.
Operations/requests triggered from the UI can take a long time to return from the server, since they do processing such as fetching data from the database and performing calculations on that data. The time depends on the amount of data in the database and the time span of data being processed; depending on the parameters it can be anywhere from 2 minutes to 30 minutes or even an hour.
Apart from these, there are other calls which fetch a small amount of data from the database and return immediately.
Now, when 4 or 5 of these long, heavy calls are running against the server and I make one of the calls that is supposed to be small and return immediately, that call also hangs; it never even reaches my controller.
I am unable to find a way to debug this issue or a resolution. Please let me know if you happen to know how to proceed.
I am using Spring with Hibernate and c3p0 connection pooling.

So I figured out what was wrong with the application and thought I'd share it in case someone somewhere faces the same issue. It turns out nothing was wrong with the application server or the web server; technically speaking, it was the browser's fault.
I found out that a browser can only have a limited number of concurrent connections open to a single domain; for the latest version of Chrome at the time of writing, that limit is 6. All browsers do this to keep a single client from flooding a server (essentially a DDoS-prevention measure).
Since the HTTP calls in my application take a long time to return (they don't complete until the calculations are done), several of them accumulate concurrently; once the 6th concurrent call is in flight, the browser stops sending any further requests and the application feels unresponsive. You can read more about the maximum number of concurrent connections per browser on SO.
A possible solution I have thought of is polling, or better still, long polling. I would have used WebSockets, but that would have required a lot more changes.
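For anyone curious what the polling variant looks like on the Spring side, here is a rough sketch. This is not the actual code from my application; ReportService and the /report/submit and /report/status endpoints are made-up names, and you would want proper error handling, security and job clean-up on top of it.

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class ReportController {

    // Hypothetical stand-in for whatever service actually does the long calculation.
    public interface ReportService {
        String calculate(String params);
    }

    // Heavy calculations run on this pool instead of on the request thread.
    private final ExecutorService executor = Executors.newFixedThreadPool(4);
    private final Map<String, Future<String>> jobs = new ConcurrentHashMap<String, Future<String>>();

    @Autowired
    private ReportService reportService;

    // Kick off the long-running work and return a job id immediately,
    // so the browser connection is released within milliseconds.
    @RequestMapping(value = "/report/submit", method = RequestMethod.POST)
    @ResponseBody
    public String submit(@RequestParam("params") final String params) {
        String jobId = UUID.randomUUID().toString();
        jobs.put(jobId, executor.submit(new Callable<String>() {
            public String call() throws Exception {
                return reportService.calculate(params);
            }
        }));
        return jobId;
    }

    // The UI polls this cheap endpoint every few seconds; each poll returns quickly,
    // so the ~6 connections per domain are never tied up for long.
    @RequestMapping(value = "/report/status/{jobId}", method = RequestMethod.GET)
    @ResponseBody
    public String status(@PathVariable("jobId") String jobId) throws Exception {
        Future<String> result = jobs.get(jobId);
        if (result == null || !result.isDone()) {
            return "PENDING";
        }
        return result.get(); // finished: return the calculated result
    }
}

Long polling has the same shape, except that the status endpoint holds the request open (for example with Spring's DeferredResult) until the job finishes or a timeout expires, so the UI gets the result with less polling traffic.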

Related

OData Service Slow to Wake Up after not used for a while

I have created an OData service that my iPad application uses to talk to a SQL Server database. The problem is that the first time the OData service is accessed each morning (with no one using it overnight), it takes a long time to connect. Once that first connection has been made, all subsequent connections are instant.
Does anyone know what I need to do to stop this from happening? I don't mind extending the timeout of an app pool if needed.
Thanks
Two options:
1) Keep the app alive simply by making occasional calls (every 5 minutes or so). I generally just write a quick vbs or js file to make this call and schedule it as a task (a rough sketch of the idea is below the list).
or (probably cleaner, but I haven't done this yet)
2) See ScottGu's ASP.NET 4.0 approach here
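If it helps, the keep-alive call from option 1 can be anything that fires an HTTP GET on a schedule. I use a vbs/js file plus Task Scheduler, but here is a rough sketch of the same idea in Java; the URL is a placeholder, so point it at any cheap endpoint of your OData service.

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class KeepAlive {
    public static void main(String[] args) {
        // Hit the service every 5 minutes so IIS never idles the app pool out.
        Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(new Runnable() {
            public void run() {
                try {
                    // Placeholder URL - replace with your own service address.
                    HttpURLConnection con = (HttpURLConnection)
                            new URL("http://yourserver/YourODataService.svc/").openConnection();
                    con.setRequestMethod("GET");
                    con.getResponseCode(); // the request itself is what keeps the app pool warm
                    con.disconnect();
                } catch (Exception e) {
                    // A failed ping isn't fatal; the next run will try again.
                }
            }
        }, 0, 5, TimeUnit.MINUTES);
    }
}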

How can I protect my server from multiple queries on port 80?

I have a very simple server running WAMP on a Windows machine, with a PHP script that acts as a simple API for my clients and returns XML. The thing is that the hardware is very modest, and if a user opens the API link and hits F5 many times (calls it repeatedly), server performance drops a little (response time goes up). Is there a way to limit the queries on port 80?
I know how to limit this in the PHP code, but I don't think that is good practice, because even if you limit the queries in PHP the request has already been made, and I'm consuming resources just to check in PHP whether the user is making too many queries.
Well, if you want to catch it before it reaches PHP, an Apache module would be one approach, e.g. mod_cband. Beyond that, your firewall might help you, but I don't know if the default Windows one is up to the task.
Failing that, handling it in your PHP code wouldn't be that bad. Yes, checking a DB consumes time, but it's still faster than collecting and returning the XML.
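If you do end up doing the check in code, the logic can be as small as remembering the last request time per client IP and rejecting anything that arrives too soon. Here is a rough sketch of that check, shown in Java just to illustrate the shape; in PHP you would keep the timestamps in APC, a file or a small DB table, run the check at the very top of the script, and bail out before building any XML.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SimpleThrottle {
    // Allow at most one request per client IP every 2 seconds (tune to taste).
    private static final long MIN_INTERVAL_MS = 2000;
    private final Map<String, Long> lastRequest = new ConcurrentHashMap<String, Long>();

    // Returns true if the request may proceed, false if the client is hammering F5.
    public boolean allow(String clientIp) {
        long now = System.currentTimeMillis();
        Long previous = lastRequest.put(clientIp, now);
        return previous == null || (now - previous) >= MIN_INTERVAL_MS;
    }
}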
Implement access control to the resources, keep track of active sessions and don't initiate heavy tasks while that particular user has a task open...?

monotouch - updating data locally

In my app we serialize the data locally for offline use. To ensure the app is always up to date, I fire off an update on launch.
To do this I have a set of WCF services that each fetch a delta for the requested data. Rather than complicate things, I have one service to update events, one to update stages, one to update acts, etc., which means I have to daisy-chain these calls in the callbacks so they run one after the other.
The problem with this is that they can take a little while to update, and chaining them like this seems a bit clunky.
What is the preferred/advised way of updating from multiple services to achieve what I need here?
Cheers
w://
For Cracklytics (http://cracklytics.com), as well as a few other enterprise apps I've worked on, I run two service calls in parallel instead of one after the other.
I spent quite a lot of time testing the performance of making calls one-at-a-time vs. two-at-a-time vs. three-at-a-time, etc., and I got the best results under 2G and 3G by running 2 threads at once. On wireless, I could start 8-10 threads together and they would run really fast.
Besides those two calls, Cracklytics also downloads a few charts from Google at the same time, but I didn't notice any performance impact from that.
For the implementation, I have one main class that keeps track of all the web service classes and controls when they should be started and finished.
Just as important is figuring out when web service calls should be cancelled; for example, if you're downloading data for a table but the user moves to another screen, you should cancel the call right away so it doesn't hold up the download of data for the next screen.
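The skeleton of the approach is just a small pool capped at two workers. Here is a rough sketch in Java (MonoTouch would of course use C# threads or a queue, but the shape is identical); updateEvents/updateStages/updateActs are stand-ins for your delta service calls.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class DeltaUpdater {
    // Stand-ins for the per-entity delta calls against your services.
    static void updateEvents() { /* fetch and apply the events delta */ }
    static void updateStages() { /* fetch and apply the stages delta */ }
    static void updateActs()   { /* fetch and apply the acts delta */ }

    public static void main(String[] args) throws InterruptedException {
        // Two workers: the sweet spot I found for 2G/3G. Everything else queues behind them.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(new Runnable() { public void run() { updateEvents(); } });
        pool.submit(new Runnable() { public void run() { updateStages(); } });
        pool.submit(new Runnable() { public void run() { updateActs(); } }); // waits for a free worker
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        // If the user leaves the screen, call pool.shutdownNow() instead to drop what's pending.
    }
}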
Hope this helps.

Worker process reached its allowed processing time

We are experiencing this issue approximately once a month. It is very hard to pinpoint the cause, so any help would be appreciated. It causes the app pool to stop and brings the site down. We have gone through all the log files and found nothing conclusive. We are using version 2.0.3 on IIS 6.
I've noticed that IIS defaults web apps to a 29-hour recycle interval, which can be troublesome since it may recycle at times your users do not expect.
For example: the web app starts at 12 am, which means the next day it recycles at 5 am, the day after that at 10 am, the day after that at 3 pm, etc. (this assumes there is enough request activity against your app to keep it alive, so that it does not shut down due to inactivity).
If your web app relies heavily on in-memory session state this is especially bad, because the recycle will kill sessions and possibly force users to re-authenticate and lose any unsaved work (if you don't design your app to work seamlessly with recycling).
Check the recycle schedule and make sure it recycles at a time that you expect. See this for screenshots: http://remy.supertext.ch/2010/08/iis7-worker-process-reached-its-allowed-processing-time-limit/
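For what it's worth, on IIS 7 and later the same settings can also be changed from the command line rather than through the GUI in those screenshots. Roughly the following; the pool name is a placeholder and I'm quoting the appcmd syntax from memory, so verify it against the appcmd documentation before using it:
appcmd set apppool "MyAppPool" /recycling.periodicRestart.time:00:00:00
appcmd set apppool "MyAppPool" /+recycling.periodicRestart.schedule.[value='03:00:00']
The first line turns off the rolling 29-hour time-based recycle; the second adds a fixed recycle at 3 AM, a time your users are unlikely to notice.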
Not sure about the infinite loop suggestion... sounds like you just have a recycling configuration issue to resolve.
This likely indicates an infinite loop in your application code.
Basically, every time a request comes into the web server, IIS hands the request off to a worker process. You can configure in IIS how many of those workers there are, and what the timeout value is. The timeout is to keep things moving in case the application code hangs -- it gets killed so the thread can go back in the pool to keep servicing new requests.
So look through your code for likely infinite loops. Alternatively, it could be an extremely long-running database query that would eventually have finished but exceeded the timeout value. Perhaps your web application gives the end user the opportunity to make too broad a query, one that returns too much data or requires too much DB processing time.
It's hard to give a specific cause for you, of course, but try to think along these lines.
If you're experiencing a crash as a result (sounds like you are) then you might want to grab a copy of Debugging Tools for Windows and spend some time reading Tess Ferrandez' blog--she offers great advice on performing post mortem crash analysis and makes WinDbg a whole lot more approachable.

WinForms ReportViewer: slow initial rendering

UPDATE 2.4.2010
Yeah, this is an old question but I thought I would give an update. So, I'm working with the ReportViewer again and it's still rendering slowly on the initial load. The only difference is that the SQL database is on the reporting server.
UPDATE 3.16.2009
I have done profiling and it's not the SQL that is making the ReportViewer render slowly on the first call. On the first call, the ReportViewer control locks up the UI thread and makes the program unresponsive. After about 5 seconds the ReportViewer will unlock the UI thread and display "Report is being generated" and then finally show the report. I know 5 seconds is not much but this shouldn't be happening. My coworker does the same thing in a program of his and the ReportViewer immediately displays the "Report is being generated" upon any request.
The only difference is that the reporting server is on one server and the data is on another server. However, when I am developing the reports within SSRS, there is no delay.
UPDATE
I have noticed that only the first load of the ReportViewer takes a long time; each subsequent load of the same or different reports loads fast.
I have a WinForms ReportViewer that I'm using in Remote processing mode that can take up to 30 seconds to render when the ReportViewer.RefreshReport() method is called. However, the report itself runs fast.
This is the code to setup my ReportViewer:
rvReport.ProcessingMode = ProcessingMode.Remote
rvReport.ShowParameterPrompts = False
rvReport.ServerReport.ReportServerUrl = New Uri(_reportServerURL)
rvReport.ServerReport.ReportPath = _reportPath
This is where the ReportViewer can take up to 30 seconds to render:
rvReport.RefreshReport()
I found the answer on other forums. MSDN explains that a signed DLL tries to contact a VeriSign web server (a certificate check) and the lookup takes forever... There are two ways to turn it off: one is a checkbox in Internet Explorer, and the other is adding some lines to the app's app.config file.
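For completeness, the app.config lines in question are, as far as I remember from the MSDN article (so double-check before relying on this), the ones that stop the runtime from generating publisher evidence, which is what triggers the VeriSign lookup:
<configuration>
  <runtime>
    <generatePublisherEvidence enabled="false" />
  </runtime>
</configuration>
The Internet Explorer checkbox referred to above is, if I recall correctly, "Check for publisher's certificate revocation" under Tools > Internet Options > Advanced.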
You can pull a report in two modes, local and server. If you're running in local mode, it's going to pull both the data and the report definition onto your machine, then render them both. In server mode, it's going to just let SSRS do all the work, then pull back the information to render.
If you're using local mode, it could be a hardware issue. If you've got a huge dataset, that's a lot of data to store in memory.
Other than that, that's not a lot of info to go on...
Update: since you've noticed it's only the first call that takes a while, have you done any profiling to determine if the bulk of the work is done on the backend SQL calls or is spent in the actual report render?
If it's faster on subsequent calls, it's possible you're (incidentally) caching at one level or another. You can cache reports (http://www.sqlservercurry.com/2007/12/configure-report-to-be-cached-ssrs-2005.html) or it could be that the execution plan to return the data is being cached deep in SQL Server.
In summary of the various ideas already presented, it could be:
startup time for the report viewer infrastructure on the client
cache loading time on the client
query execution time at the server
report rendering time at the server
Try running the report, closing down the client, restarting the client and running the report again. If the report is much faster the second time, repeat this experiment but load, run and unload another large application in between report runs.
If the second report run continues to be much quicker, then the difference you are seeing has more to do with the SQL Server's I/O cache than what's happening on the client. You can further test this by deliberately displacing the MSSQL cache by running a query that pulls a lot of data from tables that aren't used in the report.
All of the above is interesting but unimportant. If you want to ensure snappy report response Reporting Services provides extensive support for scheduled generation of reports, so that when the consumer requests the report, the only delay is network delivery.
If your users insist on reporting on up to the minute (live) data they'll either have to specify tighter constraint parameters or get used to waiting.
ReportServer always takes a while to wake up because it's running under IIS, and there is a process timeout on each app pool. We have the same issue with the report viewer in our ASP.NET application. You could try increasing the app pool keep-alive time (the idle timeout) in the IIS settings.
See here:
http://www.sqlreportingservices.net/Ask/5536.aspx
http://www.developmentnow.com/g/115_2005_9_0_0_597422/First-run-of-reports-is-SLOW.htm
I'm assuming you're running SQL2005 SSRS of course.
One option is to upgrade to 2008 where SSRS no longer depends on IIS.
Thinking way out of the box: is the report server on a different machine from the one running the application? The network could be taking a long time to resolve "reportServerURL"; once resolved, the name would be cached and subsequent calls would be quicker.
I have had this problem before with badly configured DNS servers. Try replacing "reportServerUrl" with the report server's IP address and see if the initial call to the ReportViewer is any faster.
I was having this same problem.
I found out that changing the default printer (a slow network printer here) fixed the problem.
The ReportViewer gets some information from the default printer, and since the network here is very slow, I was getting a 10-second delay.
Hope it helps
UPDATE
I have noticed that only the first load of the ReportViewer takes a long time; each subsequent load of the same or different reports loads fast.
You are set to run in server mode, which means the SSRS server has to do the rendering, so on the first call there will be a delay for one or all of the following reasons (these are the slowest of the bunch; there are others, but they are quicker):
DNS resolution: the URL needs to be resolved to an IP address. Once this is done it is cached locally, which speeds up subsequent calls.
ASP.NET/IIS needs time to warm up. There is all kinds of compilation and initial loading that must occur; once loaded, it remains in the server's memory until you restart IIS or the idle clean-up time is reached.
Reporting Services needs time to warm up in the same way ASP.NET/IIS does.
To test for this, use a network monitor such as Netmon (if you are a Microsoft fan) or Wireshark (my recommendation) and watch the traffic from your machine to the server. You'll see the DNS request go out, then the HTTP requests, and the delay will be in the returning data. On the second call you will see that the DNS check and the response come back much faster.
What you could do to prevent this is a warm-up script. I don't know of one for SSRS, but here is a link to a SharePoint one that would not be hard to adapt, since it has exactly the same issues.
It seems as though you are going after the SSRS report directly. You may want to hit the SSRS web service instead. That may improve your performance.
Here is a possible resolution for your problem:
Try accessing a report from the web before accessing any report from the application.
If the problem doesn't appear, you could make a small application that "preloads" the first report, in order to allow Reporting Services to do its start-up work.
I've seen this kind of solution in some demo applications from Microsoft. The applications were using Analysis Services and Reporting Services.
Good luck otherwise
To my knowledge, it's a problem Microsoft is finding tough to resolve.
The report viewer is only slow on the first rendering of a report; subsequent reports load normally (a bit faster).
To help counter this, add a startup form with a label (Label1) and a Timer (Timer1) control. Set Label1.Text = "Please wait (about 15 secs)" and Timer1.Interval = 3.
In the Form_Load event of the startup form, call Timer1.Start.
In the Tick event of Timer1, call "frmMyReportForm.reportViewer1.SetDisplayMode(Microsoft.Reporting.WinForms.DisplayMode.Normal)".
"frmMyReportForm" is any form in your project containing a ReportViewer control.
All the delays are absorbed here, so that when you generate the actual report there will be no delay.
I hope this might be helpful to my fellow developers.