NSFileManager: move file to an NTFS hard disk - objective-c

I'm trying to copy some files from the local HD (journaled, of course) to a USB hard disk partitioned as NTFS.
I'm simply using
[fileManager copyItemAtPath:src toPath:dest error:&err]
which works fine.
However, for some files (I can't see what these files have in common), when I try to delete them from the NTFS drive I get a Finder error:
The operation could not be completed.
An unexpected error occurred (error code -50).
I tried searching the web and read that it may be a problem of illegal characters in the name (? / " | > *), but none of those characters appears in the names of the files that give me the error. (I didn't check every file that fails, since I have hundreds of them, but one that gives the error every time is asdf.txt, so I don't think it's related to file names.)
Besides (I don't know whether it's related), when the error popup appears I get, as you know, two choices, yet the file gets deleted even if I cancel the operation.
Any suggestions on how I can solve this? (Except, of course, not using an NTFS drive.)
I'm using Mac OS X 10.7.2 and Xcode 4.2.1.

Related

Why does Google Colab think my file does not exist even though my Drive is mounted?

I made sure every single thing is correct with the file names, their paths, and so on, yet I always get stuck when executing the !sudo shell command that should open and use a given file; it returns this error:
"Error: Cannot read file '/content/-p': No such file or directory"
As I have already said, the file does exist and is located in my Google Drive. I even copy-pasted its exact path to make sure I had it right, and yet the issue is still there. Why? How can I solve it?
Thanks in advance for any help.
Two scenarios in my experience:
(1) It's the first time you run the lines of code after mounting your Google Drive, and for no apparent reason it runs for tens of minutes, produces no output, raises an error (e.g. "File does not exist"), and the session crashes; you repeat the exact same steps, then it works.
(2) There are thousands (or more) of files in the folder that contains the file you're trying to read or write; when a Google Drive folder holds that many files, operations on it may fail for that reason alone. Either way, a quick sanity check like the one below can confirm whether the runtime can actually see the file.
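
A minimal sketch of that check, assuming the standard /content/drive mount point and a hypothetical file name:

import os
from google.colab import drive

# Mounts the drive, or reports that it is already mounted.
drive.mount('/content/drive')

# Hypothetical path purely for illustration; substitute your own.
path = '/content/drive/MyDrive/myfile.txt'

if os.path.exists(path):
    print('Visible to the runtime:', os.path.getsize(path), 'bytes')
else:
    # Either scenario (1) above, or the path itself is wrong.
    print('Not visible; re-check the path or remount the drive.')

If the file still isn't visible right after mounting, remounting with drive.mount('/content/drive', force_remount=True) is worth a try.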

Access 2007 won't open accdb file unless it's renamed

I have some programs written in VB6 which use a 30MB Access 2007 data file named BidBase.accdb. I have been able to open this file in Access 2007 for years, but now when I try, Access says "The document 'BidBase.accdb' caused a serious error the last time it was opened."
Here's where it gets strange:
If I rename the file to anything else, such as BidBase1.accdb, Access will open it.
If I copy the same file to another directory, Access will open it.
My VB6 programs which use this file still open it and work with it okay.
My PC is set up to make daily backups of my VB6 projects to an external hard drive.
Opening the same file in one of those backup directories works, but if I copy that file to the root directory of that drive, it doesn't work. On my internal drive it's just the opposite: it won't load from my VB6 directory but will load when copied to C:\.
I don't have to open it with Access very often. I recently got a new PC on which I installed Access from the original disc, and that's where I first got the error; but when I went back to the old PC, the same thing happened, so it would not seem to be a problem with my PC. In other words, it's hard to imagine that reinstalling Access, or any other potential solution specific to this PC, would help.
In 40 years of working with PCs, this is the most bizarre, inexplicable thing I've ever seen.
Try this:
Backup your file
Rename the file
Open the file in Access
"Save as" the file naming it BidBase.accdb
Close Access
Open BidBase.accdb
I tried loading it under a different name, but when I did a Save As back to BidBase.accdb, it gave me the same error message as when I tried to open it, even though no file exists by that name any more. That indicates Access is keeping track of the problem file by name, but it still doesn't explain why it flagged the file under that name and not the same file under other names.
The next oddity is that it did save the file as BidBase.accdb, but that file is only 5 MB while the original is 31 MB. I've compared the two files and the contents appear to be the same.
And now that my attention has been drawn to that, I realize the largest table in the database has about 10k records, meaning each record in the 31 MB file would average roughly 3,100 bytes, which isn't remotely possible.
I also tried using the newly saved file under the old name in my programs, and they work; but then, they also worked with the old file under the old name when Access wouldn't open it.
So I'll keep working with the new file for a while to make sure it's okay, but this is all still a bizarre mystery to me.

FTP client sees a file that isn't there... How can I successfully delete/overwrite this "ghost" file?

So we have a client that creates "training packages" and then uploads them via FTP to their website. They create the training packages in PowerPoint, then use a program to convert them into html/swf files and package them within a folder. When they upload, they use FileZilla and just transfer the entire folder over. The folder is uniquely named and uses no spaces or special characters.
These files have uploaded fine for about a year. Recently, they've run into a problem. Whenever they try to upload a training package folder, they are immediately presented with the "This file already exists, do you want to overwrite?" message. Except... the folder they're moving is brand new, and the file it's asking to overwrite DOESN'T EXIST. When they choose "Overwrite", the file looks like it transfers, but the file size is wrong and the training package doesn't work correctly.
This happens with every training package they try to upload, so it's not just one badly generated package. Also, it's always the same file that has the problem: it's the main "player" for the training package, and though it contains different content for every package, it has the same file name (cplayer.swf) every time.
Things they've tried without success:
-Re-uploading the file again by itself, and overwriting
-Deleting the "bad" file and re-uploading the single file - Get the overwrite message again, even though the file DOES NOT EXIST.
-Renaming the file on the server and re-uploading the single file - Get the overwrite message.
-Renaming the single file locally within the package and uploading/renaming it - Won't let us rename because the file already exists.
-Used another FTP client - Same results as above, so not a client-specific problem.
-Used a different FTP login - Same results as above, so not a permissions problem.
Other things of note:
-The file is small--it's not a timeout problem. Plus, all other files upload fine, and some are a lot larger.
-They've emailed this file to me, and I've uploaded it successfully.
I am completely at my wits end. Does anyone have any ideas where I can at least troubleshoot a little further?
Thanks for the non-help, the downvote, and the general lack of response on what was a pretty serious issue for me.
In case anyone else has a similar problem, here's what was going on:
Antivirus software (specifically Malwarebytes) was blocking THIS ONE SINGLE FILE. All I had to do was exclude the folder that contained the file.

How to determine if an NTFS folder is corrupted?

Recently, I encountered an unknown problem causing a particular folder on an NTFS volume to become corrupted on multiple computers. I need to detect whether the folder is corrupted and then perform actions like relocating the folder or sending notifications, but I do not know how to do that yet. The normal APIs, like OpenFile/CreateFile, seem to malfunction on the corrupted folder, so I cannot use them to determine whether a folder is corrupted. I therefore plan to parse the MFT structure and check for problems directly.
Therefore, I began to study the NTFS MFT structure. I found that $Volume has a dirty flag that determines whether a drive needs chkdsk, but it is not directly related to file corruption and is also set if Windows shuts down unexpectedly. I failed to find a particular flag, or anything else in the MFT structure, that determines whether an INDEX or FILE record is corrupted.
Is there a way to determine whether an NTFS folder is corrupted?
Any help is appreciated!
I found three things related to NTFS disk corruption. The list is incomplete; without up-to-date NTFS source code, it is very hard to find out what Microsoft is really doing in chkdsk. I will just post what I found in case anyone needs it.
1. Dirty flag in the $Volume metadata file record
If the dirty flag in $Volume is set, the operating system will perform a disk scan at boot-up. I believe the NTFS driver sets the flag when it encounters a disk error during operation.
2 "BAAD" in identification field of a file record
If there is something wrong with file record, for example USA/USN unmatched, then MFT may replace "FILE" with "BAAD" in identification field of a file record structure. It can be used to identify corrupted file/directory quickly.
3. Compare the USA/USN in every FILE/INDX record
Both the FILE and INDX structures contain an Update Sequence Array (USA) and Update Sequence Number (USN) for corruption checks. Scanning through the volume and comparing the USA against the USN can help you discover corruption, as the sketch below shows.
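
A minimal sketch of check 3 (in Python, as an illustration; the offsets follow the documented NTFS record layout, but treat this as a starting point rather than a vetted tool):

import struct

SECTOR = 512

def classify_record(record):
    """Classify a raw MFT FILE record: 'OK', 'BAAD', or a USA/USN mismatch."""
    magic = record[0:4]
    if magic == b'BAAD':
        return 'BAAD'                  # check 2: NTFS already flagged this record
    if magic != b'FILE':
        return 'unknown signature'
    # Bytes 4-5: offset of the Update Sequence Array (USA);
    # bytes 6-7: its length in 16-bit words (the USN plus one word per sector).
    usa_offset, usa_count = struct.unpack_from('<HH', record, 4)
    usn = record[usa_offset:usa_offset + 2]
    # The last two bytes of every sector must equal the USN; if they don't,
    # the record was only partially written to disk (a torn write).
    for i in range(1, usa_count):
        if record[i * SECTOR - 2:i * SECTOR] != usn:
            return 'USA/USN mismatch'
    return 'OK'

Feed it raw records read out of $MFT (typically 1024 bytes each); INDX blocks carry the same fixup structure, with b'INDX' as the signature.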

Watch folder for files being Read

I am trying to watch files in a directory to determine when they are opened/accessed. I thought FileSystemWatcher would do the trick, using the Changed event.
The problem is that some applications do not take a lock on the file they open/access, nor do they change either the date modified or date accessed (even after fsutil behavior set disablelastaccess 0). Notepad, for example: apparently it makes a copy of the file in memory and works on that until you save, and it doesn't update the date accessed either.
How can I monitor a directory of files and be notified when a file is simply opened/accessed by any program (e.g. Notepad)? Files may be opened from another computer, not necessarily on the computer running the "watcher".
I found lots of similar questions but did not see one focusing on file "access".
This is quite normal. Updating an existing file in place is quite dangerous, since it can cause irretrievable data loss; a disk error (like a full disk) halfway through writing is very bad news. The common algorithm (sketched in code after the list):
rename the original file
write a new file using the original name
no error: delete the renamed file
error: delete the new file, rename original file back
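
In code, the pattern looks roughly like this (a Python sketch of the same steps; real editors use unique temporary names and platform-native calls):

import os

def safe_save(path, data):
    backup = path + '.bak'               # hypothetical backup name, for illustration
    os.rename(path, backup)              # rename the original file
    try:
        with open(path, 'wb') as f:      # write a new file using the original name
            f.write(data)
    except OSError:
        if os.path.exists(path):
            os.remove(path)              # error: delete the new file...
        os.rename(backup, path)          # ...and rename the original file back
        raise
    else:
        os.remove(backup)                # no error: delete the renamed file

The original file's bytes are never rewritten in place; the save only creates, renames, and deletes.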
Clearly this doesn't cause a Changed event to be raised; no existing file was changed.
Sorry, I didn't read the question well enough. There is no notification whatsoever for an app just opening a file for reading; FSW can only detect changes to the file system. There is no ready alternative either: this requires a custom file system filter driver that snoops on driver requests, like the kind SysInternals' ProcMon utility uses. I'm not aware of such a driver that's ready for use from a C# program, and you can't write one in C# either. This just isn't a common requirement.