I am using the "Overwrite if different size or source newer" option in FileZilla when I am uploading files, which is a great feature. But I need one more step before I push files which would be to scan the whole upload section and show me what files are different sizes or sources newer then I would confirm the exact changes.
Is this possible with FileZilla, or would I need other software to do that?
Go to View > Directory Comparison > Click "Hide Identical Files". Then, you can toggle Directory Comparison with CTRL+O. I also recommend using the hotkey to toggle Synchronized Browsing (CTRL+Y).
You can choose to compare by file size or by modification time. If you're going to rely on the modification time, I would also enable "Preserve timestamps of transferred files" under the Transfer menu (CTRL+U to toggle). That makes more sense than having to worry about uploading your file within a minute to keep the two copies "identical". You can also set a time threshold for the comparison: go to Edit > Settings > Interface > File Lists, and under "Directory Comparison" on that screen you can increase the Comparison threshold.
Is that what you're looking for?
I am working with a vendor to debug an issue with a Lotus Domino 6.5 application. The vendor had originally asked me to make a copy of the application by going to File > Database > New Copy > Design Only, and then to copy some sample documents (with sensitive info removed) from the original database to the new database. But when I copy documents in this way, the views do not display correctly.
I then tried making a copy of the application by going to File > Database > New Copy > Design and Documents, and then deleting all but a few documents. When I copy documents in this way, the views display correctly, but the database size is huge, which makes me think that the original documents have not really been deleted. For privacy reasons, all deleted documents need to be unrecoverable.
My question: How can I purge documents that have been marked for deletion in this way?
While in the database, choose File - Application - Properties from the menu, click the last tab (the propeller-hat) and deselect these two database properties if they are selected:
"Don't overwrite free space" - Obviously you want to overwrite the free space.
"Allow Soft Deletions" - Deselecting this will purge any documents that are deleted but recoverable.
Once you've done that, choose File - Replication - Options for this Application..., click the "Other" tab, and change the date/time to the current date/time. This gets rid of the "deletion stubs" which remain for a time so that, if replication occurs with a different machine, that machine learns that the document was deleted rather than telling yours "Oh! Here's a brand new document!" This is completely unnecessary unless you want to ensure that even the count or internal ids of deleted documents are removed. The actual content will be gone.
After that, compact the database because you'll have free space or, if you want to be really paranoid, create a copy of your copy.
It's been a long time since I did any Lotus Notes development, but you can compact the database.
Open a database.
Choose File - Database - Properties.
Click the Info tab.
Click % used.
If the percentage of a database in use drops below 90% (it contains more than 10% unused space), click Compact to compact the database.
Remote clients will upload images (and perhaps some instructional files in specially formatted text) to a "drop folder." Once the upload is complete we need to begin processing these images. It would be an easy, but flawed, solution to just have a script automatically begin processing any files in the folder every few seconds (the files can be moved out of the folder once processed); but problems would arise when attempting to process large images which are only partially transferred.
What are some tricks I can use to ensure the files are fully uploaded before processing them?
A few of my own thoughts:
The script can check the validity of the file; i.e., a partial JPEG would result in an error, and you could respond to that error in the script, though this would be fairly CPU intensive. Some formats have special markers at the end of the file, but I can't count on that since I'm not sure which formats I'll be dealing with. (A sketch of such a marker check follows after these thoughts.)
I've heard of "file handles" but haven't really figured out the basics of what they are and how I can tell if there is a "file handle" on a particular file. Basically the FTP daemon (actually, I'm on Windows, so "service") would keep a "handle" on the file while it's being uploaded and you would know not to process that file. These are just a few of my thoughts but I'm not really sure if they will work or if there are better or more accepted ways of solving this problem.
If you have a server-side scripted upload system (PHP, ASP, JSP, whatever), you could instruct the script to call another script to process the files, or have it create a flag file indicating the upload is done, something like the sketch below.
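As a rough sketch of the flag-file idea (my own illustration, with a hypothetical folder name; the original answer did not include code), the upload handler would create an empty "<name>.done" file once it has finished writing, and the processing script would only pick up files whose flag exists:

import os

UPLOAD_DIR = "/var/uploads"  # hypothetical drop folder

def finished_uploads():
    """Yield uploads whose '<name>.done' flag has already been written by the upload script."""
    for name in os.listdir(UPLOAD_DIR):
        if name.endswith(".done"):
            continue
        if os.path.exists(os.path.join(UPLOAD_DIR, name + ".done")):
            yield os.path.join(UPLOAD_DIR, name)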
If your server is Linux-based, you can use lsof to check if the file is open. As your ftp/script/cgi will close the file after upload completes, lsof will not show the file in the list.
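For example (a sketch of my own, assuming the standard lsof utility is on the PATH), a processing script could skip any file that lsof still reports as open:

import subprocess

def is_still_open(path):
    """Return True if lsof reports the file as open by some process."""
    result = subprocess.run(["lsof", path],
                            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0  # lsof exits 0 when it finds at least one open instance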
If your server is Windows-based, you can use Process Explorer to list the open files.
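If you need to automate the Windows case rather than inspect it interactively, one common trick (my own suggestion, not part of the answer above) is to try to move the file: while the FTP service still holds it open for writing, the rename typically fails, so you skip the file and retry on the next pass:

import os

def try_claim(path, work_dir):
    """Try to move the file into a working directory. On Windows the move usually
    fails while another process (e.g. the FTP service) still holds the file open,
    which we take to mean the upload is still in progress."""
    target = os.path.join(work_dir, os.path.basename(path))
    try:
        os.rename(path, target)
        return target  # claimed: safe to process
    except OSError:
        return None    # still being written; try again on the next pass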
By what method are your users uploading the images?
Is there a way (via shell extension or registry setting) to tell Windows Explorer that it shouldn't read files in the folder being shown in order to extract metadata or create thumbnails?
The problem is that when the user navigates to the folder, Windows Explorer attempts to read all files in the folder and extract certain metadata from them. If the medium is slow, this takes ages and causes unnecessary load on the file system. This is especially true in case of thumbnails, when the whole graphic file is read.
I am looking for ways to do this (restrict Explorer) in code, so "don't use Thumbnail mode" is not an acceptable answer :).
Update: per-user settings won't work, unfortunately, because we as a disk provider can deal only with our own disk (and the user might want separate settings for regular disks and virtual disks). I believe there must be some way to "explain" to the OS that the drive is slow.
Maybe there's some IRP at the driver level that we need to handle to tell the OS that the medium is slow?
Is there a way (via shell extension or registry setting) to tell Windows Explorer that it shouldn't read files in the folder being shown in order to extract metadata or create thumbnails?
Not that I know of, but depending on your priorities regarding the use-case details you outlined, there are still two options that might approximate the desired result:
Via group policy
Note that this essentially expands on the network-folder aspect of Fred's answer, which you dismissed in your update; however, you say you are able to deploy shell extensions or registry settings, and the following two group policies simply apply the latter by administrative means:
User Configuration -> Administrative Templates -> Windows Components -> Windows Explorer:
Turn off the display of thumbnails and only display icons **on network folders**
Turns off the caching of thumbnails in hidden thumbs.db files.
This boils down to the following registry settings:
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer]
"DisableThumbnailsOnNetworkFolders"=dword:00000001
"DisableThumbsDBOnNetworkFolders"=dword:00000001
Of course this is still not per folder, but at least limited to network folders and ignores regular disks and virtual disks.
Via hackish workaround
Given your statement that "we as a disk provider can deal only with our own disk", there might be a hackish workaround, though I'm afraid it lacks the last mile (untested by myself).
Starting from Chris W. Rea's own answer to "How can I suppress those annoying Thumbs.db files in Windows Vista and Windows 7?":
Also worth knowing: In Vista and Windows 7, Thumbs.db applies to network folders only. For local folders, Vista and Windows 7 instead save thumbnail cache information to a database in a local folder at "%userprofile%\AppData\Local\Microsoft\Windows\Explorer"
Continuing from there, Wil claims the following potentially clever solution works on a per-folder basis:
Go to the drive and create a file called thumbs.db (in notepad or anything), then change the permissions on the file for everyone (including SYSTEM) to deny all.
Unfortunately, aside from the automation required to create the dummy thumbs.db in each folder, the outcome depends on how Explorer reacts to the inaccessible file - because caching is optional per group policy, Explorer might well display thumbnails without caching them, making the bandwidth issue even worse ...
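If you still want to try it, here is a minimal sketch of how the automation might look (my own assumption about the workflow, not part of the quoted answers): create an empty Thumbs.db in every folder of the drive and deny all access to it via icacls, using SIDs so it also works on non-English systems:

import os
import subprocess

DRIVE_ROOT = "V:\\"  # hypothetical root of the virtual disk

for folder, _dirs, _files in os.walk(DRIVE_ROOT):
    dummy = os.path.join(folder, "Thumbs.db")
    if not os.path.exists(dummy):
        open(dummy, "wb").close()  # create an empty placeholder file
    for sid in ("*S-1-1-0", "*S-1-5-18"):  # Everyone, SYSTEM
        subprocess.run(["icacls", dummy, "/deny", sid + ":(F)"], check=False)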
Good luck!
I'm not sure whether you can disable thumbnail generation/display for specific folders, but this article talks about a script which can quickly toggle it via a context menu.
The script modifies a value in the registry key HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced\. I suppose you could find something similar in that key for the other metadata. ShowInfoTip sounds promising. There might be relevant information in other nearby keys.
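As a rough illustration (my own sketch; ShowInfoTip is only a guess at the relevant value, so verify the effect before relying on it), toggling such a value from a script could look like this:

import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"

def set_show_info_tip(enabled):
    """Set the ShowInfoTip value (1 = on, 0 = off) under HKCU for the current user."""
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ShowInfoTip", 0, winreg.REG_DWORD, 1 if enabled else 0)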
This may be a complete non-answer depending on your needs, but how about storing the files without the file extensions that the OS wants to make thumbnails of? Call it file.jpg.abc and Explorer won't be reading it for thumbnails, for sure.
I have a folder full of binary files and I want to make a change to these files so that their hashes change. I want to do this in a fashion that doesn't permanently corrupt the files, meaning the change should still allow each file to operate normally, or I should be able to undo the change at any point.
Does anyone know of a script I could use to do this, or maybe a program that will automate it?
Cheers
UPDATE
It's an edge case that I am trying to deal with. I have a system that only allows me to store a file with a given hash once, hence I want to change the content hash of the file so it can be stored. Note that the system in question is not one I control or can change.
Couldn't I just add a random "1" to the end of the file and then remove it afterward without breaking anything? I'm just not sure how to script this - as in, how to modify the binary data in this way. Note that I'm in a Windows environment.
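In case it helps, here is a minimal sketch of that idea (my own; whether the consuming application tolerates a trailing extra byte is format-dependent, as the answers below point out):

import os

def append_marker(path):
    """Append a single '1' byte so the file's content hash changes."""
    with open(path, "ab") as f:
        f.write(b"1")

def remove_marker(path):
    """Undo append_marker by truncating the last byte."""
    with open(path, "r+b") as f:
        f.truncate(os.path.getsize(path) - 1)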
Without knowing the format of the files, we can't tell. It may in fact be impossible - for instance if these binary files are self-signed with some private key. Changing any single bit within the file is likely to render it invalid.
Is your hash calculated purely from the contents, and not any other metadata that you can change (such as filename or modified date)? If so, you're probably out of luck. If the hash is meant to detect when the content changes, but you're trying to change the hash without actually changing the content, you've clearly got a problem...
What is the hash used for? Why do you want to change it? There may be an alternative solution if you could give us more information about the bigger picture.
EDIT: One alternative is to effectively create your own container format - so while a file is stored in your container format, it's not usable in its original form, but it can be extracted easily. Your container could be as simple as "add four bytes at the end as a seed to disturb the hash" - "extracting" the file would just involve copying it and removing the last four bytes. But the important point is that what you end up with isn't an MP3 file or whatever you started with - it's your custom format, simple as it is. You need to package/extract the file any time you interact with the store.
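A minimal sketch of that container idea (my own illustration, with hypothetical file names): packaging appends four random seed bytes, extracting strips them again:

import os
import shutil

def package(original, packaged):
    """Copy the original file and append four random seed bytes to disturb its hash."""
    shutil.copyfile(original, packaged)
    with open(packaged, "ab") as f:
        f.write(os.urandom(4))

def extract(packaged, restored):
    """Copy the packaged file and drop the four seed bytes to recover the original."""
    shutil.copyfile(packaged, restored)
    with open(restored, "r+b") as f:
        f.truncate(os.path.getsize(restored) - 4)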
Suppose a user selects a file in a dialogue box, and the app then opens the file for reading, etc. Users can open "incorrect" files--they can select a binary file, for example, even if the file they're supposed to be selecting is a text file.
I recognize that sometimes improper file types generate exceptions, which can be handled. But sometimes the files don't create exceptions; instead, they just cause the application to work improperly.
What's the standard way to code for these kinds of situations?
1. Put a unique identifier into the file (usually the first line or some tag).
2. Restrict the file extension.
3. Do a check on the file to verify that it's OK.

Use 1 if possible, or else use both 2 and 3. (A sketch of option 1 follows below.)
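As a minimal sketch of option 1 (my own illustration; the identifier bytes are hypothetical), read the first few bytes of the selected file and compare them to the marker your format writes:

MAGIC = b"MYAPPv1"  # hypothetical identifier your application writes at the start of its files

def looks_like_our_file(path):
    """Cheap sanity check before fully parsing a user-selected file."""
    with open(path, "rb") as f:
        return f.read(len(MAGIC)) == MAGIC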
A lot of operating systems help you out with this by providing filesystem APIs that are at least somewhat file-type-aware (in Cocoa for Mac OS X, there's a setAllowedFileTypes: method on NSOpenPanel, for example). Aside from that, you should make sure to define your file format in a way that's easy for you to identify when your program opens a file. A few well-known bytes at the start of your file are probably enough to protect you from most random-file problems.