Calling a long-running VB6 COM object from classic ASP: timeout error

I have a long-running VB6 COM object that is called from a classic ASP page.
It works perfectly when there is not much to do, but it times out when it has a lot to do.
Is there a way of calling it asynchronously so it won't time out?
Or could I show a progress bar that keeps refreshing the client so the page wouldn't time out?
Set objQReport = Server.CreateObject("ReportGenerator")
mainRpt = objQReport.GenerateReport(MySessionRef) ' times out here sometimes
Set objQReport = Nothing
Any tips would be helpful.

Web technology is not really suited for long-running tasks, but you have several options.
One option is to make an AJAX call to a second ASP page. As soon as your ASP page is running, the server will finish processing it, even if the client (the browser that made the AJAX call) is no longer connected.
This method does use web technology to process a long-running task; the downside is that you burden your IIS machine with the long-running task, leaving less capacity for the thing IIS is good at: serving web pages.
So in your landing page (say default.asp), make an AJAX call to your (long-running) report page. How to make an AJAX call depends on which (if any) JavaScript library you use. In jQuery it would be something like this:
<script type="text/javascript">
    /* AJAX call to start the report generation */
    $(document).ready(function () {
        $.get("[URL_OF_YOUR_LONG_RUNNING_PROCESS]", function (data) {
            alert(data);
        });
    });
</script>
As you can see, I am alerting any data that is returned from this URL, but in your case that is probably not what you want; you want your visitor to keep browsing while the long-running process keeps working.
This way, the URL is called asynchronously. The server starts processing the URL and the browser doesn't have to wait for it; the server continues and finishes the long-running task in the background.
Please note that you will still have to increase Server.ScriptTimeout on the ASP page that runs the long process. The AJAX call only lets the user keep browsing; the server still respects the configured Server.ScriptTimeout and will abort the report page if it takes too long.
A widely used second option is a message queue. A message queue accepts messages and guarantees delivery of those messages, even if the computer or network goes down.
Microsoft Windows has MSMQ built in (you'll have to enable it in the Windows features), and you can use it from classic ASP. The queue stores messages and delivers them to a consumer. The consumer is something you need to write yourself: an application that reads the queue and processes the messages in it.
What you do is have ASP write a message to MSMQ containing information on which task to perform and its parameters.
Your consumer application polls MSMQ, reads the message, and starts the long-running process. This then runs completely independently of IIS, and can even run on a totally different computer (MSMQ works across networks).
The downside of this second method is that you have to write the consumer, most likely in a somewhat lower-level language like VB or C# (though you might be able to use Python, for example), and preferably as a service. I don't know how comfortable you are in (one of) these languages, but if you wrote the COM object yourself, it would be trivial to write an executable that polls MSMQ and calls the COM object; see the sketch below.
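For example, here is a minimal C# consumer sketch using System.Messaging. The private queue name (reportQueue) and the plain-string message body are assumptions for illustration; only the ReportGenerator ProgId comes from your question.

using System;
using System.Messaging;

class ReportConsumer
{
    static void Main()
    {
        using (var queue = new MessageQueue(@".\Private$\reportQueue"))
        {
            // Assumes the ASP page writes the session reference as a string body.
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            while (true)
            {
                Message message = queue.Receive(); // blocks until a message arrives
                string sessionRef = (string)message.Body;

                // Late-bound call into the existing VB6 COM object.
                Type comType = Type.GetTypeFromProgID("ReportGenerator");
                dynamic reportGenerator = Activator.CreateInstance(comType);
                reportGenerator.GenerateReport(sessionRef);
            }
        }
    }
}

Run it as a console application for testing, or wrap the loop in a Windows service for production.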
Below are some links to get you started.
http://support.microsoft.com/kb/173339
http://technosock.blogspot.nl/2007/07/microsoft-message-queue-from-classical.html
http://www.informit.com/articles/article.aspx?p=131272&seqNum=6
Hope this helps,
Erik

Async WCF and Protocol Behaviors

FYI: This will be my first real foray into Async/Await; for too long I've been settling for the familiar territory of BackgroundWorker. It's time to move on.
I wish to build a WCF service, self-hosted in a Windows service running on a remote machine in the same LAN, that does this:
Accepts a request for a single .ZIP archive
Creates the archive and packages several files
Returns the archive as its response to the request
I have to support archives as large as 10 GB. Needless to say, this scenario isn't covered by basic WCF defaults; we must take additional steps to meet the requirement: eliminating timeouts while the archive is building, and memory errors while it is being sent. Both occur under a default WCF configuration, depending on the size of the file returned.
My plan is to proceed using task-based asynchronous WCF calls and streaming mode.
I have two concerns:
Is this the proper approach to the problem?
Microsoft has done a nice job of abstracting all of this, but what about the underlying protocols? What goes on 'under the hood'? Does the server keep the connection alive while the archive is building (which could take several minutes), or does it instead close the connection and initiate a new one once the operation is complete, thereby requiring me to properly route the request through the client machine's firewall?
For #2, clearly I'm hoping for the former (keep-alive). But after some searching I'm not easily finding an answer. Perhaps you know.
You need streaming for big payloads; that is the right approach. It has nothing at all to do with asynchronous IO, though. The two are independent: the client cannot even tell whether the server is async internally.
I'll add my standard answers for whether to use async IO or not:
https://stackoverflow.com/a/25087273/122718 Why does the EF 6 tutorial use asynchronous calls?
https://stackoverflow.com/a/12796711/122718 Should we switch to use async I/O by default?
Each request runs over a single connection that is kept alive. This goes both for streaming large amounts of data and for long initial delays. I'm not sure why you are concerned about routing. Does your router kill such long-lived connections? If so, that is a problem in itself.
Regarding keep-alive: nothing needs to go over the wire for that. TCP sessions can stay open indefinitely without any wire traffic.
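For reference, a minimal sketch of what a streamed contract could look like; the names (IArchiveService, GetArchive) are illustrative, not something from the question:

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IArchiveService
{
    // Returning a Stream, combined with transferMode="Streamed" (or
    // "StreamedResponse") on the binding, lets WCF send the archive
    // without buffering the whole 10 GB in memory.
    [OperationContract]
    Stream GetArchive(string archiveName);
}

On the binding you would also raise sendTimeout (to survive the multi-minute build) and, on the client side, maxReceivedMessageSize (to accept the large payload).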

How to capture screenshots of 1000 web pages concurrently in C#

I need to take screenshots of 1000 URLs using Parallel.ForEach in a Windows service. I tried to use the WebBrowser control, but it throws an error since it only runs in an STA. Kindly tell me how to achieve this task using Parallel.ForEach.
Edit: I am using a third-party trial-version DLL in the code below to do the capturing:
Parallel.ForEach(webpages, webPage =>
{
    GetScreenShot(webPage);
});

public void GetScreenShot(string webPage)
{
    WebsitesScreenshot.WebsitesScreenshot _Obj;
    _Obj = new WebsitesScreenshot.WebsitesScreenshot();
    WebsitesScreenshot.WebsitesScreenshot.Result _Result;
    _Result = _Obj.CaptureWebpage(webPage);
    if (_Result == WebsitesScreenshot.WebsitesScreenshot.Result.Captured)
    {
        _Obj.ImageFormat = WebsitesScreenshot.WebsitesScreenshot.ImageFormats.PNG;
        _Obj.SaveImage(somePath);
    }
    _Obj.Dispose();
}
Most of the time this code runs fine for up to about 80 URLs, but after that some tasks become blocked. I don't know why.
Sometimes the error is ContextSwitchDeadlock, as given below:
ContextSwitchDeadlock was detected
Message: The CLR has been unable to transition from COM context 0x44d3a8 to COM context 0x44d5d0 for 60 seconds. The thread that owns the destination context/apartment is most likely either doing a non pumping wait or processing a very long running operation without pumping Windows messages. This situation generally has a negative performance impact and may even lead to the application becoming non responsive or memory usage accumulating continually over time. To avoid this problem, all single threaded apartment (STA) threads should use pumping wait primitives (such as CoWaitForMultipleHandles) and routinely pump messages during long running operations.
This error indicates that a CLR thread has not pumped Windows messages for an extended period of time. If a process is resource-starved, causing extended waits during processing, this error can occur.
Given that you are trying to process 1000 web pages simultaneously, it would be no surprise if at least some of the threads became resource-starved. Personally, I am surprised that you can hit 80 websites without seeing errors.
Back off the number of websites you are trying to process in parallel and your problems will likely disappear. Since you are running the trial version, there is little else you can do. If you licensed the commercial version you might be able to get support from the vendor, but at a guess they would simply tell you to do the same thing.
The WebsitesScreenshot library can be quite resource-intensive, depending on the web page, especially if the pages contain Flash. Think of it as logically equivalent to opening 80 tabs simultaneously in a web browser.
You don't mention whether you are using the 32-bit or the 64-bit version, but the 64-bit version is likely to have fewer resource constraints, especially around memory. IMHO the .NET Framework does a poor job of minimizing memory usage, so memory problems can crop up earlier than you would think.
ADDED
Please try limiting the number of threads first, e.g.
Parallel.ForEach(
    webpages,
    new ParallelOptions { MaxDegreeOfParallelism = 10 }, // 10 thread limit
    webPage => { GetScreenShot(webPage); }
);
Without access to the source code, you may not be able to change the threading model at all. You might also try setting the timeout to a higher value.
I don't have this control myself and am not willing to install it on my machine just to answer a question about changing its threading model. Unless that is a documented feature, you probably won't be able to do it without changing, or at least inspecting, the source.

Find the best solution for reading a file and calling a web service

I have a text file which has about 100,000 identifier records.
I must read every record; for each record I make a request to a web service, receive the result, and write the result to another file.
I'm torn between two solutions:
- Read the identifier file into a list of identifiers, iterate over the list, call the web service, ...
- Read the identifier file line by line, call the web service for each line, ...
Which solution do you think is better? Which will make the program faster?
Thanks to all.
As Dukeling says, using different threads to read the file, send the requests, and write to the output file can make the program faster than either of the single-threaded solutions you propose.
I recommend you start using asynchronous calls to your web service: you make the call but don't wait for the response, and instead handle the responses in a callback. When you make a lot of calls to the web service in parallel (as you want speed), this frees up I/O threads on your machine and can improve the rate at which requests are processed.
You can then have a thread that reads from the file, starts the asynchronous call, and repeats, while the callback writes to the output file. At that level you should implement logic that ensures the responses are written in the right order; see the sketch below.
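For illustration, a minimal C# sketch under some assumptions: the identifiers live in identifiers.txt, the service is reachable over HTTP via HttpClient, and the URL format is made up.

using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class Processor
{
    static readonly HttpClient http = new HttpClient();

    static async Task Main()
    {
        var throttle = new SemaphoreSlim(10); // at most 10 requests in flight
        var tasks = File.ReadLines("identifiers.txt")
            .Select(async id =>
            {
                await throttle.WaitAsync();
                try { return await http.GetStringAsync("http://example.org/svc?id=" + id); }
                finally { throttle.Release(); }
            })
            .ToList(); // materializing starts every (throttled) request

        // Task.WhenAll returns the results in input order, so the output
        // lines match the order of the identifiers in the input file.
        string[] results = await Task.WhenAll(tasks);
        File.WriteAllLines("results.txt", results);
    }
}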
On the other hand, calling the web service once per record may be too chatty.
I would suggest an implementation similar to paging: load a certain number of records, send them to the operation in one request, and receive the responses in bulk. You should take care not to fail the whole batch because of one bad record, have logic for resending only part of the work, and so on.
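As a rough sketch of the paging idea, assuming the service exposed a bulk operation (the IBulkService interface and ProcessBatch method below are hypothetical):

using System.Collections.Generic;
using System.IO;

// Hypothetical bulk contract; the real service would need to offer
// something equivalent for this approach to work.
public interface IBulkService
{
    IEnumerable<string> ProcessBatch(IEnumerable<string> ids);
}

public static class PagedRunner
{
    public static void Run(IBulkService service)
    {
        const int pageSize = 500;               // records per round trip
        var page = new List<string>(pageSize);
        foreach (string id in File.ReadLines("identifiers.txt"))
        {
            page.Add(id);
            if (page.Count == pageSize)
            {
                File.AppendAllLines("results.txt", service.ProcessBatch(page));
                page.Clear();
            }
        }
        if (page.Count > 0)
            File.AppendAllLines("results.txt", service.ProcessBatch(page)); // final partial page
    }
}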

Progress notification in WCF for long-running processes - how?

I have to design and implement a way to deal with long-running processes in a client/server application. A typical long-running process would take 2-3 minutes. I also need to report progress to the UI in the meantime and keep the UI responsive.
With this in mind, I thought of a few solutions:
One async request to start the process, which kicks off the server-side work and returns an assigned LRPID (Long-Running Process ID); the client then polls periodically using that LRPID. (Pro: simple to deploy, no firewall messing around. Con: inelegant, resource-consuming, etc.)
Use a duplex binding (such as NetTcpBinding) and initiate callbacks from the server as progress is made. (Pro: elegant, efficient. Con: deployment nightmare.)
[Your suggestion???]
What would be your take on this?
Here is a post by Dan Wahlin about how to create a WCF Progress Indicator for a Silverlight Application. This should be of some help.
If you do not want to have to worry about the client's firewall, etc., I would probably go with your first solution and use a BackgroundWorker to make the call, in order to keep from blocking the UI thread. I did this recently for an app where a request to generate a report is put on a queue and retrieved once it is done. It seems to work well; a sketch of such a contract follows.
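A minimal sketch of what the start/poll contract could look like (the names ILongRunningService, StartProcess, and GetProgress are illustrative, not from the question):

using System;
using System.ServiceModel;

[ServiceContract]
public interface ILongRunningService
{
    [OperationContract]
    Guid StartProcess();         // kicks off the work and returns the LRPID

    [OperationContract]
    int GetProgress(Guid lrpId); // 0-100; the client polls this periodically
}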
Another way (without having to change the WCF binding) is to use a WebBrowser control in the WPF client, and SignalR to post progress messages from the server to that control.
Note that to avoid the JavaScript errors that happen with the WebBrowser control (by default it emulates Internet Explorer 7, which doesn't seem to be compatible with jQuery), you will need to add keys to the registry on the client machine so that the client app uses IE10 or later; see http://weblog.west-wind.com/posts/2011/May/21/Web-Browser-Control-Specifying-the-IE-Version.
This could be a deployment nuisance, because admin rights seem to be needed to add the registry keys (e.g. on a 64-bit Windows 8.1 PC).
Also, it still seems necessary to call the long-running WCF method on a separate thread, otherwise the WebBrowser control doesn't update its display to show the SignalR messages it is receiving. (This makes sense, because the UI thread would otherwise have to wait until the WCF call had finished.)
But I mention it as an alternative approach using a newer tool (SignalR) :)
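For completeness, a rough sketch of the server side of this approach, assuming self-hosted SignalR 2.x; the hub name ProgressHub and the client-side method name reportProgress are illustrative:

using Microsoft.AspNet.SignalR;

// An empty hub: clients connect to it, and the server broadcasts through it.
public class ProgressHub : Hub { }

public static class Progress
{
    // Call this from the long-running WCF operation as it advances.
    public static void Report(int percent)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        hub.Clients.All.reportProgress(percent); // handled by the JS client in the WebBrowser control
    }
}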

Silverlight WCF Proxy async only?

Why do the Silverlight-generated WCF proxy classes offer only async calls?
There are cases where I don't really need the async pattern (for example, in a BackgroundWorker).
EDIT: Sometimes I need to process the results of two WCF calls. It would have been much simpler if I could have waited (the business logic of the app allows that) for both calls to end and then processed them... but noooo... async! :P
As I understand it, the aim here is to make it hard for people to do the wrong thing (synchronous IO from the UI thread). If you are using the WCF classes, you'll probably have to live with it.
There's actually a technical reason you can't do sync calls, at least from the 'main' browser thread: the browser invokes all the plug-in API calls on the same thread, so if Silverlight were to block that thread while waiting for the network callback, the network callback would never get through and the app would deadlock. That said, the sync API would work fine if initiated from a different thread, i.e. if the application first does a QueueUserWorkItem to get off the browser thread, but we felt it would be confusing to offer the sync option and have it work only some of the time.
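To illustrate that work-around in the 'wait for two calls' scenario from the question, here is a hedged sketch; the proxy type MyServiceClient, its GetFirst/GetSecond operations, and Process are made-up names, and the blocking is only safe because it happens off the browser thread:

using System.Threading;

void LoadBoth(MyServiceClient client) // MyServiceClient is a hypothetical generated proxy
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        string first = null, second = null;
        var firstDone = new ManualResetEvent(false);
        var secondDone = new ManualResetEvent(false);

        client.GetFirstCompleted += (s, e) => { first = e.Result; firstDone.Set(); };
        client.GetSecondCompleted += (s, e) => { second = e.Result; secondDone.Set(); };
        client.GetFirstAsync();
        client.GetSecondAsync();

        firstDone.WaitOne();    // safe here: we are not on the browser thread
        secondDone.WaitOne();
        Process(first, second); // both results are now available
    });
}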
Andrei, there are methods that, even using the async pattern, let you write expressive code that is easy to read and maintain, without going crazy waiting for async requests, simply by changing the way you write your code.
Have a look at this library: http://syncwcf.codeplex.com/