How can I redirect to a "Processing..." page while my CGI program is working?

I am trying to write a Perl script that takes data from a user, creates an HTML file based on that data, redirects to this HTML file, and then performs some computations. The problem I am facing is that the browser does not redirect to the new HTML page until the computations are complete. Please suggest a solution.

It sounds like you need to run your computations in the background. One way to do this is to use a fork() call. I think you may find this question helpful.

See Randal Schwartz's article Watching long processes through CGI.

print "Location: http://yoursite.com/path/to/your/page.html\n";
(note this is how you do it in Perl but "Location" is an HTTP directive. Outputting that string works in any language)
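To make that concrete, here is a minimal sketch combining the fork() suggestion with the Location header: the parent sends the redirect and exits immediately, while a forked child does the slow work. The URL is the placeholder from above and do_long_computation() is a hypothetical stand-in for your real code.

#!/usr/bin/perl
use strict;
use warnings;

# Send the redirect first; the blank line ends the HTTP header block.
print "Location: http://yoursite.com/path/to/your/page.html\n\n";

defined(my $pid = fork()) or die "fork failed: $!";
if ($pid == 0) {
    # Child: close the handles tied to the web server, otherwise the
    # server may hold the connection open until the child exits.
    close STDIN;
    close STDOUT;
    close STDERR;
    do_long_computation();    # hypothetical: your real computation here
    exit 0;
}
# Parent: exit at once so the browser gets the redirect without waiting.
exit 0;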

Thanks for your help. In the meantime I took a different approach. I don't know if it is good or not, but it works: I write the parameters needed for the computations to a logfile, and a cronjob watches it, so as soon as a new entry appears in the logfile the computations start in the background.
Thank you once again.
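For reference, the cron side of that workaround could be as simple as a crontab entry like the following (the script path is a placeholder; note that cron's one-minute granularity means up to a minute of delay before the computations start):

* * * * * /path/to/check_log_and_compute.pl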

Related

How to check the contents of Postgres

I'm running tests with Matchstick and my save() calls don't seem to be working (I set up my tests by saving some entities, but then my application code doesn't see them when it goes to load).
Is there any way to check the current state of the backend and see what's in there? Mainly just trying to troubleshoot.
Turns out, you just have to read the docs
https://thegraph.com/docs/en/developer/matchstick/
logStore()
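For context, a minimal sketch of how logStore() is typically called inside a Matchstick test (AssemblyScript, with import paths as shown in the docs linked above; the test name and body are placeholders):

import { test } from "matchstick-as/assembly/index"
import { logStore } from "matchstick-as/assembly/store"

test("inspect store contents", () => {
  // ... run the mappings / save() calls under test here ...
  logStore() // dumps the entire mocked store so you can see what save() wrote
})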

Intercepting with XMLHttpRequest for a specific address using greasemonkey

I'm trying to write a greasemonkey script that will work on both Chrome and Firefox: a script that blocks XMLHttpRequests to a certain hard-coded URL.
I am kind of new to this area and would appreciate some help.
Thanks.
It is possible now using
@run-at document-start
http://wiki.greasespot.net/Metadata_Block#.40run-at
but it needs more improvement; check this example:
http://userscripts-mirror.org/scripts/show/125936
This is almost impossible to do with Greasemonkey; it is the wrong tool for the job. Here's what to use, most effective first:
1. Set your hardware firewall, or router, to block the URL.
2. Set your software firewall to block the URL.
3. Use Adblock to block the URL.
4. Write a convoluted userscript that tries to block requests from one set of pages to a specific URL (see the sketch below). Note that it potentially has to block inline src requests as well as AJAX, etc.
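For option 4, a sketch of what such a userscript might look like, assuming the target URL is hard-coded and the script runs at document-start so it can wrap XMLHttpRequest before the page's own scripts use it. BLOCKED_URL and the @include pattern are placeholders, and depending on the Greasemonkey sandbox you may need unsafeWindow (with @grant none the script runs in the page context):

// ==UserScript==
// @name        Block one XHR URL
// @include     http://example.com/*
// @run-at      document-start
// @grant       none
// ==/UserScript==
(function () {
    var BLOCKED_URL = "http://example.com/unwanted/endpoint";
    var realOpen = XMLHttpRequest.prototype.open;
    var realSend = XMLHttpRequest.prototype.send;
    XMLHttpRequest.prototype.open = function (method, url) {
        // Remember whether this request targets the blocked URL.
        this._blocked = typeof url === "string" && url.indexOf(BLOCKED_URL) === 0;
        if (!this._blocked) {
            return realOpen.apply(this, arguments);
        }
    };
    XMLHttpRequest.prototype.send = function () {
        if (this._blocked) {
            return; // silently drop the blocked request
        }
        return realSend.apply(this, arguments);
    };
})();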

Proper way to check system requirements for a WordPress plugin

I am curious about the proper way to stop a user from activating my plugin if their system does not meet certain requirements. Doing the checks is easy and I don't need any help with that; I am more curious how to tell WordPress to exit and display an error message.
Currently I have tried both exit($error_message) and die($error_message) in the activation hook method. While my message is displayed and the plugin is not activated, a message saying Fatal Error is also displayed.
Does anyone know of a better way that would display my message in a proper error box without the Fatal error? It just looks really bad for new users to see that.
Thanks for any help in advance.
This is a little undocumented, as you might have noticed. Instead of die(), do it like this:
$plugin = dirname(__FILE__) . '/functions.php';
deactivate_plugins($plugin);
wp_die(
    '<p>The <strong>X</strong> plugin requires WordPress version 2.8 or greater.</p>',
    'Plugin Activation Error',
    array('response' => 200, 'back_link' => TRUE)
);
The lines above wp_die() deactivate this plugin. Note that we use functions.php in this case because that's where I have my Plugin Name metadata comment declaration; if you use a different file, change the code above. Note that the path must match exactly. If you want to see what your path would normally be, use print_r(get_option('active_plugins'));die(); to dump the active plugin paths so you know what path you need. Since the rest of my plugin code lived in plugin_code.php, in the same directory as functions.php, I merely had to use dirname(__FILE__) for the proper path.
Note that the arguments at the end of the wp_die() call are important: they provide a back link and set the response to 200, preventing the HTTP 500 status that wp_die() sends by default.
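Putting those pieces together, a minimal sketch of the full activation hook (the WordPress 2.8 check and plugin name are taken from the snippet above; the function name is a placeholder, and plugin_basename() is used here instead of hand-building the path):

register_activation_hook(__FILE__, 'myplugin_activate');

function myplugin_activate() {
    global $wp_version;
    if (version_compare($wp_version, '2.8', '<')) {
        // Deactivate this plugin, then stop with a clean error page
        // instead of a fatal error.
        deactivate_plugins(plugin_basename(__FILE__));
        wp_die(
            '<p>The <strong>X</strong> plugin requires WordPress version 2.8 or greater.</p>',
            'Plugin Activation Error',
            array('response' => 200, 'back_link' => TRUE)
        );
    }
}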
This is only an idea, but try checking the WordPress version, comparing it, and then using PHP to throw a custom exception/error. PHP 5's try/catch can be a good way to do it. Here are some resources:
http://www.w3schools.com/php/php_exception.asp
http://php.net/manual/en/internals2.opcodes.throw.php
The first link is pretty basic. Thanks! Hope the information is helpful.

IIS/Cache problem?

I have a program that checks whether a file is present every 3 seconds, using WebRequest and WebResponse. If the file is present it does one thing; if not, etc. That part works fine. I have a web page that controls the program by creating the file with a message and other variables as entered on the page, then shooting it over to the folder the program is checking. There is also a "stop" button that deletes the file.
This works well except that after one message is launched and then deleted, when a second message is launched with different contents the program still sees the old message. I watch the file get deleted in IIS, so that is not the issue.
I've thought about meta tags to prevent caching, but would naming the file dynamically also solve this? How would I make the program check for a file when only the first part of the filename is known? I've found solutions for checking directories on local machines, but that won't work here.
Any ideas welcome, thanks.
I'm not that used to IIS, but in Apache you can create a .htaccess file and set/modify HTTP headers.
With Cache-Control you can tell a proxy/browser not to cache a file:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
A solution like this may work in IIS too, if it really is a cache problem.
(To test this, open the page in your preferred browser with caching turned off.)
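For the Apache side of that suggestion, a minimal .htaccess sketch (requires mod_headers; the filename pattern is a placeholder for the message file):

<FilesMatch "^message.*\.txt$">
    Header set Cache-Control "no-cache, no-store, must-revalidate"
</FilesMatch>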
A simple hack is to add something unique to the URL each time:
http://www.yourdomain.com/yourpage.aspx?random=123489797
Adding a random number to the URL forces it to be fresh. Even if you don't use the querystring parameter, IIS doesn't know that, so it executes the page again anyway.
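On the program side, a sketch of that hack, assuming the checker is C#/.NET since the question mentions WebRequest and WebResponse (the nocache parameter name is arbitrary):

using System;
using System.Net;
using System.Net.Cache;

class FileChecker
{
    static bool FileExists(string baseUrl)
    {
        // A unique query string stops any cache between the program and IIS
        // from serving a stale copy; the cache policy belt-and-braces it.
        string url = baseUrl + "?nocache=" + Guid.NewGuid().ToString("N");
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                return response.StatusCode == HttpStatusCode.OK;
            }
        }
        catch (WebException)
        {
            return false; // 404 or connection failure: treat as "file not there"
        }
    }
}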

Upload and download directly - no waiting

I want to program something where you upload a file on one side and the other person can download it the moment the upload starts. I knew of such a service but can't remember the name. If you know the service, I'd like to know the name; if it's not around anymore, I'd like to program it as an open-source project.
And it is supposed to be a website.
What you're describing sounds a lot like Bit Torrent.
You might be able to achieve this by uploading via a custom ISAPI filter (if you use IIS). CGI implementations won't start running your script until the request has completed, which makes sense, as you won't have been told all the values just yet; I suspect ISAPI may fall foul of this as well.
So your next best bet is to write a custom HTTP server that can serve files which haven't finished uploading yet.
I found it: pipebytes.com :)