How can you delete a NetBeans lock file - netbeans-8

When I try to save a file in NetBeans, I get an error message similar to:
there were some problems while saving xxx.cpp
cause: cannot write to locked file:
/home/xxx/xxx.cfg
I had to kill NetBeans because it froze, and it has left a lock file somewhere.
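One way to clean this up, assuming a default Linux install (the user directory path and the 8.0 version segment below are assumptions, adjust for your setup): make sure no NetBeans process is still running, remove the lock file from the NetBeans user directory, and check whether a leftover process still holds the file that could not be saved.
ps aux | grep -i netbeans        # confirm the frozen IDE is really gone
rm ~/.netbeans/8.0/lock          # the IDE keeps its lock file in the user directory
lsof /home/xxx/xxx.cfg           # show any process that still has the locked file open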

Related

How to find which process has locked a file in VB6

I am facing a sporadic issue in a legacy application developed in VB6. The application produces an intermediate file and then tries to delete it once the required output has been produced. The application usually deletes the file properly, but sometimes I get an error stating "Path/File access error". I have tried adding a delay before the delete, but that has not resolved the issue.
I wanted to check whether there is any way in VB6 to find the name of the process that has acquired the lock. I tried searching but have had no luck so far.
Could anyone please tell me how I can get the name of the process that has locked the file and is preventing the delete?
Please note that this issue happens infrequently.
Windows does not keep a global database of which process has which file open; this is for performance reasons. It does, however, keep a list of other computers that have files open on this computer.
For debugging purposes you can enable such a database. Remember to disable it again afterwards.
To enable it:
Openfiles /local on
then reboot.
To query:
Openfiles /query /v
See:
Openfiles /?
Openfiles /local /?
Openfiles /query /?
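As a quick sketch once tracking is enabled and the machine has been rebooted, you can narrow the verbose listing down to the file the application fails to delete (intermediate.tmp here is just a placeholder name):
Openfiles /query /v /fo csv | findstr /i "intermediate.tmp"
The verbose output includes the name of the process holding the file open, which is what you are after.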

Upon Remote-SSH connect, the setup waits on the file vscode-scp-done.flag; the tar file is present and so are the locks, but the waiting never seems to end

The tar file is present and so are the locks, but the waiting on that flag file never seems to end. Thank you. I am doing this using Visual Studio Code and the Remote-SSH extension.
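One workaround that is sometimes worth trying here (an assumption about your setup, not a confirmed fix): remove the partially installed server on the remote host so Remote-SSH repeats the setup from scratch. By default the server files live under ~/.vscode-server on the remote host.
ssh user@host 'rm -rf ~/.vscode-server'   # user@host is a placeholder for your remote machine
Then reconnect from VS Code and let it reinstall the server.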

How to delete a remote file using Pentaho Kettle

I have a directory on a remote Linux machine where files are archived and kept for a certain period of time. I want to delete a file from that remote machine using a Kettle transformation, based on some condition.
If the file does not exist, the job should not throw any error; if the file exists at the remote location, the job should delete it, or raise an error if deletion fails for some other reason, e.g. a permission issue.
The file name is retrieved as a variable from previous steps of the transformation, and the directory path of the archived files is fixed.
How can I achieve this in a Pentaho Kettle transformation?
Make use of the "Run SSH commands" utility to pass commands to your remote server.
If you run rm -f /path/file, it won't error for a non-existent file.
You can also capture the output and perform error handling (Filter rows, then trigger the appropriate course of action), as in the sketch below.
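A minimal sketch of a command to send through "Run SSH commands", assuming the file name arrives in a Kettle variable named FILENAME and the archive directory is /archive (both are placeholders); it echoes a distinct token per outcome so a later Filter rows step can branch on the captured output:
if [ ! -e /archive/${FILENAME} ]; then echo NOT_FOUND; elif rm /archive/${FILENAME}; then echo DELETED; else echo DELETE_FAILED; fi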
Alternatively, you can mount the remote directory on the machine where Kettle runs and delete the file as a regular local file.
Using SSH is, I think, non-trivial: it takes a lot of experimentation to find out the error types and to find a way to distinguish them. A failure might be an error with the SSH connection or an error deleting the file.

drop repository fails on file error

Context: GraphDB 7.1.0
Using the openrdf-console, when requesting to drop a repository:
drop myrepo .
I get an error/exception:
[ERROR] 2016-09-13 09:44:32,369 [repositories/myrepo | o.o.h.s.ProtocolExceptionResolver] Error while handling request (500)
org.openrdf.http.server.ServerHTTPException: org.openrdf.repository.RepositoryException: Unable to clean up resources for removed repository myrepo
Caused by: java.io.IOException: Unable to delete file /nas/install/graphdb/graphdb-se-7.1.0/graphdb-se-7.1.0/data/repositories/myrepo/storage/.nfs000000016e3e49b200000006
Any further attempt to drop the repo or to add anything to it then fails with the same error.
Apparently GraphDB tries to delete the repository directory without closing the file descriptors pointing to files in that directory.
In my case, the data directory is potentially big and lies on a NAS that is attached through NFS.
When NFS is asked to delete a file that is still open, a temporary .nfs000XXX file is created, and that file makes the directory removal fail.
A workaround is to stop GraphDB, delete the repository's directory by hand, and restart GraphDB.
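As a sketch of that workaround (the paths below come from the error message above; how you stop and start GraphDB depends on your installation):
lsof +D /nas/install/graphdb/graphdb-se-7.1.0/graphdb-se-7.1.0/data/repositories/myrepo    # shows which process still holds the .nfs… files
Stop GraphDB, then:
rm -rf /nas/install/graphdb/graphdb-se-7.1.0/graphdb-se-7.1.0/data/repositories/myrepo
and restart GraphDB.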

My Debian repository is throwing a "Hash Sum mismatch" error

We maintain a Debian repository for an app, and all .deb files are stored in an S3 bucket.
We wrote a script to upload the files and update the Packages.gz file. All went fine until one of the developers found deb-s3 and tried using it.
After the first package upload we started getting this error message:
W: Failed to fetch s3://s3.amazonaws.com/myapp/dists/test/main/binary-amd64/Packages Hash Sum mismatch
I've tried to restore an old version of our Packages.gz file with no success. I've searched for this error, and removing the contents of /var/lib/apt/lists/ does not work either.
What would deb-s3 do that could break our entire repo?
It looks like deb-s3 creates a Release file under dists/test, and that conflicts with the hand-rolled Packages.gz: apt validates Packages against the checksums recorded in the Release file, and since the two no longer matched, it reported a Hash Sum mismatch.
Removing the Release file restored our repository to what it was.
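As a sketch of the cleanup, assuming the bucket layout from the error message above and an already configured AWS CLI (the exact object key is an assumption):
aws s3 rm s3://myapp/dists/test/Release
Then, on a client, clear the cached lists and refresh:
sudo rm -rf /var/lib/apt/lists/*
sudo apt-get update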