Delete a file even if it is running - vb.net

I need to remove a file even if it is used by a running process.
Firstly, of course the process needs to be shut down, and after that the file should be deleted, if it exists.
I'm using the following code:
Sample:
Dim processes() As Process = Process.GetProcessesByName("test")
For Each proc As Process In processes
    proc.Kill()
Next
My.Computer.FileSystem.DeleteFile("C:\ProgramFiles\Test\test.exe")
I tried the code above, but it does not work: the process keeps running and the file is not removed. Can you please provide a reliable solution to this issue?
Thanks.

It may take a little while for the after-effects of Process.Kill() to complete.
If you add a delay between the Kill command and the File.Delete, such as Threading.Thread.Sleep(2000), it may be sufficient for the latter to complete.
A more robust solution might involve repeatedly attempting to delete the file, up to some reasonable number of tries, with a small time delay between attempts.
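For example, a minimal sketch of that kill, wait, retry approach (the process name and path come from the question; the timeout and retry values are arbitrary):

Imports System.Diagnostics
Imports System.IO
Imports System.Threading

Module Cleanup
    Sub Main()
        ' Kill every matching process and block until the OS has released it.
        For Each proc As Process In Process.GetProcessesByName("test")
            proc.Kill()
            proc.WaitForExit(5000)
        Next

        ' The file handle may not be released immediately, so retry the delete.
        Dim filePath As String = "C:\ProgramFiles\Test\test.exe"
        For attempt As Integer = 1 To 10
            Try
                If File.Exists(filePath) Then File.Delete(filePath)
                Exit For ' success, or the file was already gone
            Catch ex As IOException
                Thread.Sleep(500) ' still locked; wait and try again
            End Try
        Next
    End Sub
End Module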

Related

Is there a way in vb.net to make a process start after closing my program?

My program checks if there is a new version of itself. If there is, it should exit and start an updater that replaces it and then restarts it.
My problem is that I haven't found any info on how to make a process start right after closing the actual program.
Any suggestions?
Thanks in advance
I intended to add a comment, but I'm too low on points here. The updater itself should probably check whether an instance of your application is still running, in a loop with a timeout, starting from the moment it launches. That way you can start the updater and then close your application; once the updater determines your application is no longer running, it can compare versions and perform the intended update operation.
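To make that hand-off concrete, a rough VB.NET sketch (Updater.exe and MyApp.exe are hypothetical names; error handling omitted):

' --- In the main application, when a new version is found: ---
' Pass our PID so the updater can wait for this process to exit.
Process.Start("Updater.exe", Process.GetCurrentProcess().Id.ToString())
Environment.Exit(0)

' --- In the updater: ---
Dim appPid As Integer = Integer.Parse(My.Application.CommandLineArgs(0))
Try
    Process.GetProcessById(appPid).WaitForExit() ' block until the app is gone
Catch ex As ArgumentException
    ' The application had already exited.
End Try
' ...copy the new exe over the old one, then restart it:
Process.Start("MyApp.exe")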
A possible solution would also be to create a task via Task Scheduler or a cron job, starting an out-of-process application such as CMD.exe, which brings me to my original comment-question: which operating system(s) and platform(s) is your program intended for?

How to close/stop a .NET application and re-execute it?

My application updates (by running a VBA script) a shared Excel workbook, and since it is shared, there shouldn't be problems when someone else is using the same file at the same time. But for some reason, it sometimes simply freezes, without any error message.
Is there a way to programmatically make the application stop/close automatically when frozen, or after some minutes? (In normal conditions, this updating process shouldn't take more than 1 minute.)
And, if possible, to re-launch the app automatically after some minutes, for at least 5 attempts?
That would ensure the process completes successfully.
I have had to do this same thing before, because I had an application that would look for updates to itself on the network and then update itself locally. The problem is, you cannot update the exe that is running.
What I did to get around it is to create another program that would wait a second, update the exe, then run the exe again.
Because I did this with a few different apps, I made my "Updater" generic so I could send some command line parameters and it would use those to copy and run.
If you want to try something else, you might be able to accomplish the same thing by creating a BAT file and running it. I'm not very good with BAT files, so I can't help you there, but it is another way to handle it.
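As for the freeze/relaunch part of the question, a small watchdog sketch along those lines (MyApp.exe, the two-minute limit, and the 5 attempts are assumptions taken from the question):

Imports System.Diagnostics

Module Watchdog
    Sub Main()
        For attempt As Integer = 1 To 5
            Dim app As Process = Process.Start("MyApp.exe")
            ' The update normally takes under a minute; allow two before
            ' treating the application as frozen.
            If app.WaitForExit(120000) Then
                Exit For ' finished normally
            End If
            app.Kill() ' frozen: kill it and let the loop relaunch it
        Next
    End Sub
End Module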

PHP script stops running arbitrarily with no errors

I have a PHP script that seemed to stop running after about 20 minutes.
To try to figure out why, I made a very simple script to see how long it would run without any complex code to confuse me.
I found that the same thing was happening with this simple infinite loop. At some point between 15 and 25 minutes of running, it stops without any message or error. The browser says "Done".
I've been over every single possible thing I could think of:
set_time_limit() (and session.gc_maxlifetime in php.ini)
memory_limit
max_execution_time
The point that the script is stopped is not consistent. Sometimes it will stop at 15 minutes, sometimes 22 minutes.
Please, any help would be greatly appreciated.
It is hosted on a 1and1 server. I contacted them and they don't provide support for bugs caused by developers.
At some point your browser times out and stops loading the page. If you want to test, open up the command line and run the code in there. The script should run indefinitely.
Have you considered just running the script from the command line, eg:
php script.php
and have the script flush out a message every so often to show that it's still running:
<?php
while (true) {
    doWork();
    echo "still alive...";
    flush();
}
In such cases, I turn on all the development settings in php.ini (on a development server, of course). This displays many more messages, including deprecation warnings.
In my experience of debugging long running php scripts, the most common cause was memory allocation failure (Fatal error: Allowed memory size of xxxx bytes exhausted...)
I think what you need to find out is the exact time at which it stops (record an initial time and keep dumping out the current time minus the initial one). There is something on the server side that is stopping the script. Also, consider doing an ini_get() to make sure the execution time is actually 0. If you want, set the time limit to 30 and then, on EVERY iteration of your loop, set it to 30 again; every time you call set_time_limit(), the counter resets, and this might allow you to bypass the actual limits. If this still isn't working, there is something on 1and1's servers that might be killing the script.
Also, did you try ignore_user_abort()?
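Putting those suggestions together, a minimal diagnostic sketch (doWork() is a hypothetical placeholder for the real job):

<?php
ignore_user_abort(true);   // keep going even if the browser disconnects
$start = time();

while (true) {
    set_time_limit(30);    // resets PHP's execution-time counter each pass
    doWork();              // placeholder for the actual work
    echo 'elapsed: ' . (time() - $start) . "s\n";
    flush();               // so you can see exactly when it dies
}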
I appreciate everyone's comments. Especially James Hartig's, you were very helpful and sent me on the right path.
I still don't know what the problem was. I got it to run on the server over SSH, just by using the exec() command as well as ignore_user_abort(). But it would still time out.
So, I just had to break it into small pieces that will run for only about 2 minutes each, and use session variables/arrays to store where I left off.
I'm glad to be done with this fairly simple project now, and am supremely pissed at 1and1. Oh well...
I think this is caused by some process monitor killing off "zombie processes" in order to allow resources for other users.
Run the exec using "2>&1" to log anything including stderr.
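For example (the command and file names are illustrative):

<?php
// Redirect stderr into stdout so the shell's "Killed" message is captured too.
exec('php5-cli -d max_execution_time=0 myscript.php 2>&1', $output);
file_put_contents('script.log', implode("\n", $output) . "\n");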
In my output I managed to catch this:
...
script.sh: line 4: 15932 Killed php5-cli -d max_execution_time=0 -d memory_limit=128M myscript.php
So something (an external force, not PHP itself) is killing my process!
I use IdWebSpace (which is excellent, BTW), but I think most shared hosting providers impose this kind of resource/process control mechanism just to be sane.

Kill process after launching with AuthorizationExecuteWithPrivileges

If I launched a shell script using AuthorizationExecuteWithPrivileges, what would be the easiest way to kill the script and any other processes that it spawned?
Thanks
It's running as root, so you can't kill it from a regular-user process. You're going to have to ask it nicely to exit on its own.
Apple has sample code that uses stdout to pass the PID back to the caller.
Use the communications pipe that AuthorizationExecuteWithPrivileges() returns by reference in its last argument, FILE **communicationPipe, to send a message to the child process that asks it to take itself and its descendants out. It can then kill itself and all its descendants using kill(0, SIGINT), or, if more drastic measures are required, SIGKILL.
The message you use can be as simple as closing the file while the child waits for the file to close; at that point, it knows you're done talking to it and it's time to take itself out.
There are some caveats about the descendants that will actually receive this message, for which see the kill(2) manpage. The caveats mostly won't matter so long as the process you started via AEWP hasn't dropped privileges, though one implicit issue is that this approach won't work if any child processes have put themselves in a new process group.
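As a rough sketch of the child's side, assuming the communication pipe is wired to the tool's standard input (as it is with AuthorizationExecuteWithPrivileges) and that the child shares a process group only with its own descendants:

/* Child-side sketch: drain stdin until the parent closes the pipe (EOF),
   then signal our whole process group, ourselves included. */
#include <signal.h>
#include <unistd.h>

int main(void) {
    char buf[64];
    while (read(STDIN_FILENO, buf, sizeof buf) > 0)
        ;               /* wait for the parent to hang up */
    kill(0, SIGINT);    /* pid 0 = every process in our process group */
    return 0;
}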

How can I speed up batch processing job in Coldfusion?

Every once in a while I am fed a large data file that my client uploads and that needs to be processed through CFML. The problem is that if I put the processing on a CF page, it runs into a timeout issue after 120 seconds. I was able to move the processing code to a CFC, where it seems to not have the timeout issue. However, sometime during the processing, it causes ColdFusion to crash and it has to be restarted. There are a number of database queries (5 or more, a mixture of updates and selects) required for each line (8,000+) of the file I go through, as well as other logic provided by me in the form of CFML.
My question is: what would be the best way to go through this file? One caveat: I am not able to move the file to the database server and process it entirely with the DB. However, would it be more efficient to pass each line to a stored procedure that took care of everything? It would still be a lot of calls to the database, but nothing compared to what I have now. Also, what would be the best way to provide feedback to the user about how much of the file has been processed?
Edit:
I'm running CF 6.1
I just did a similar thing and use CF often for data parsing.
1) Maintain a file upload table (parent table). For every file you upload, keep a record of the file and what status it is in (uploaded, processed, unprocessed).
2) Use a temp table to store all the rows of the data file (child table). Import the entire data file into a temporary table; attempting to do it all in memory will inevitably lead to errors. Each row in this table links to a file upload table entry above.
3) Maintain a processing status - for each row of the data file you bring in, set a "processed/unprocessed" flag. This way, if it breaks, you can start from where you left off. As you run through each line, set it to "processed".
4) Transactions - use cftransaction if possible to commit it all at once, or at least one line at a time (with your 5 queries); see the sketch below. That way, if something goes boom, you don't end up with a row of data that is half computed/processed/updated/tested.
5) Once you're done processing, set the file's entry in the table from step 1 to "processed".
By using the approach above, if something fails you can restart where it left off, or at least have a clearer idea of where to start investigating, or, worst case, clean up your data. You will also have a clear way of displaying to the user the status of the current upload: how far it has gotten, and where it left off if there was an error.
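For step 4, a rough CFML sketch of the per-line transaction (the table, column, datasource, and fileID names are made up for illustration):

<cfquery name="rows" datasource="myDSN">
    SELECT id, line_data
    FROM upload_rows
    WHERE file_id = <cfqueryparam value="#fileID#" cfsqltype="cf_sql_integer">
      AND status = 'unprocessed'
</cfquery>

<cfloop query="rows">
    <cftransaction>
        <!--- ...the 5 or so selects/updates for this line go here... --->
        <cfquery datasource="myDSN">
            UPDATE upload_rows
            SET status = 'processed'
            WHERE id = <cfqueryparam value="#rows.id#" cfsqltype="cf_sql_integer">
        </cfquery>
    </cftransaction>
</cfloop>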
If you have any questions, let me know.
Other thoughts:
You can increase timeouts, give the VM more memory, or move to 64-bit, but all of those only increase the capacity of your system so much. It's a good idea to set these per call and to do so in conjunction with the above.
Java has some neat file processing libraries that are available as CFCs. If you run into a lot of issues with speed, you can use one of those to read the file into a variable and then into the database.
If you are working with XML, do not use ColdFusion's XML parsing. It works well for smaller files but has fits when things get bigger. There are several CFCs out there (check RIAForge, etc.) that wrap some excellent Java libraries for parsing XML data. You can then create a cfquery manually with this data if need be.
It's hard to tell without more info, but from what you have said, I'll throw out three ideas.
The first thing is that, with so many database operations, it's possible you are generating too much debugging output. Make sure that, under Debug Output Settings in the administrator, the following settings are turned off:
Enable Robust Exception Information
Enable AJAX Debug Log Window
Request Debugging Output
The second thing I would do is look at those DB queries and make sure they are optimized. Make sure selects are using indexes, etc.
The third thing I would suspect is that the file hanging out in memory is probably suboptimal.
I would try looping through the file using file looping:
<cfloop file="#VARIABLES.filePath#" index="VARIABLES.line">
    <!--- Code to go here --->
</cfloop>
Have you tried an event gateway? I believe those threads are not subject to the same timeout settings as page request threads.
SQL Server Integration Services (SSIS) is the recommended tool for complex ETL (Extract, Transform, and Load) work, which is what this sounds like. (It can be configured to access files on other servers.) The question might be, can you work up an interface between Cold Fusion and SSIS?
If you can, upgrade to CF8 and take advantage of cfloop file="", which would give you greater speed, and the file would not be held in memory (which is probably the cause of the crashing).
Depending on the situation you are encountering you could also use cfthread to speed up processing.
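For instance (CF8 syntax; the chunking scheme is just illustrative):

<!--- Spawn four worker threads, each taking every fourth line. --->
<cfloop from="1" to="4" index="t">
    <cfthread name="worker#t#" action="run" workerIndex="#t#">
        <!--- process lines where (lineNumber mod 4) + 1 eq attributes.workerIndex --->
    </cfthread>
</cfloop>

<!--- Wait for all workers to finish before reporting completion. --->
<cfthread action="join" name="worker1,worker2,worker3,worker4" />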
Currently, an event gateway is the only way to get around the timeout limits of an HTTP request cycle. CF does not have a way to process CF pages offline; that is, there is no command-line invocation (one of my biggest gripes about CF - very little offline processing).
Your best bet is to use an Event Gateway or rewrite your parsing logic in straight Java.
I had to do the same thing. Ben Nadel has written a bunch of great articles on using Java file I/O to read and write files more speedily, etc.
It really helped improve the performance of our CSV importing application.