Empty files on S3 prevent downloading using s3cmd and s3sync - amazon-s3

I am trying to set up a backup/restore using S3. The upload sync worked well using s3sync. However, next to each folder there is an empty file with a matching name. I read somewhere that this is created to define the folder structure, but I am not sure about that, as it doesn't happen when I create a folder using a different method (S3Fox, etc.).
These empty files prevent me from restoring the directories/files. When I do s3cmd sync, I get an error message "can not make directory: File exists", because it first creates that empty file and then fails when trying to create the directory. Any ideas how I can solve this problem?
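One workaround I have considered (only a sketch, assuming the empty markers are the only zero-byte objects under the prefix, and using a hypothetical bucket name) is to delete the zero-byte keys before syncing down:
s3cmd ls --recursive s3://my-bucket/backup/ | awk '$3 == 0 {print $4}' | xargs -r -n 1 s3cmd del
But I would rather not delete objects from the bucket if there is a cleaner way.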

Related

How can I have Alluxio show all the not-yet-accessed files in the directory?

When I mounted an S3 bucket under alluxio://s3/, the bucket already had objects. However, when I get the directory listing (either via alluxio fs ls, by listing the FUSE-mounted directory, or on the web UI) I see no files. When I write a new file or read an already existing object via Alluxio, it appears in the directory listing. Is there a way I can have Alluxio show all the not-yet-accessed files in the directory (rather than only showing files after writing or accessing them)?
A simple way is to run bin/alluxio fs loadMetadata /s3 to force a refresh of the Alluxio directory. There are other ways to trigger it; check out the “How to Trigger Metadata Sync” section in this blog post:
https://www.alluxio.io/blog/metadata-synchronization-in-alluxio-design-implementation-and-optimization/
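As a minimal sketch (assuming the bucket is already mounted at /s3):
bin/alluxio fs ls /s3              # only shows objects already accessed through Alluxio
bin/alluxio fs loadMetadata /s3    # pull the object listing from the underlying S3 bucket
bin/alluxio fs ls /s3              # now shows all objects under the mount point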

How do I fix Directus error code 3 when downloading stored files

I can store files to the specified storage location via the GUI. I can see the files are in the storage location.
When I try to download them using the GUI, I get this every time.
{"error":{"code":3,"message":"Unauthorized request","class":"Directus\\Exception\\UnauthorizedException","file":"\/var\/www\/directus\/src\/helpers\/app.php","line":287}}
When I try the links from the File library, I get the same error.
I found some old topics concerning a "_" project. I do not see any "_" entries in my project.php configuration.
Everyone has read permissions for the storage directory.
The rest of the system appears to run without error.
Check the uploads folder on the server and what has been set; the default should be the standard Directus layout. Then, if you want to access the 300x300 version, the URL should look like:
domain.com/public/uploads/Directus/generated/w300,h300,fcrop,q80/file-name.jpg

How to prevent file access if any file fails to write to a required folder

I have 3 XML files to be written to a folder for a client. While writing, 2 of the files got written perfectly but the 3rd file failed. What are the ways by which I can prevent the client from opening any file, or have all the files deleted or locked, if anything fails?
If your application has delete privileges on the system, keep a record of the filenames of the files you're writing. If a file fails for whatever reason, go through the list of file names and delete the files from the directory. A simple string list with a for loop should do it.
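A rough sketch of that idea (file and folder names are hypothetical; the same pattern applies with a string list and a for loop in whatever language the application uses):
written=""
for f in one.xml two.xml three.xml; do
  if cp "staging/$f" "client-folder/$f"; then
    written="$written client-folder/$f"
  else
    # a write failed: remove everything delivered so far
    rm -f $written
    exit 1
  fi
done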

Sync with S3 using s3cmd, but don't re-download files that only changed name

I'm syncing a bunch of files between my computer and Amazon S3. Say a couple of the files change name, but their content is still the same. Do I have to have the local file removed by s3cmd and then the "new" file re-downloaded, just because it has a new name? Or is there any other way of checking for changes? I would like s3cmd to, in that case, simply change the name of the local file in accordance with the new name on the server.
s3cmd upstream (the github.com/s3tools/s3cmd master branch) and 1.5.0-rc1, the latest published version, can figure this out, provided you used a recent version with the --preserve option to put the file into S3 in the first place, so that the md5sum of each file is stored. Using the md5sums, it knows that you have a duplicate (even if renamed) file locally, and won't re-download it, but will instead do a local copy (or hardlink) from the existing file system name to the name from S3.
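For example (a sketch with a hypothetical bucket name): upload with --preserve so the checksums are stored, then sync down later.
s3cmd sync --preserve ./backup/ s3://my-bucket/backup/
s3cmd sync s3://my-bucket/backup/ ./restore/    # renamed-but-identical files become local copies/hardlinks, not re-downloads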

Flowgear access to files on the local file system

I am creating a Flowgear workflow that needs to process a raft of XML data.
I have the XML data contained in a set of .xml files (approximately 400 files) in a folder on my local machine's hard drive, and I want to read them into a workflow, run an XSLT transform, and then write out the resultant XML to another folder on the same local hard drive.
How do I get the flowgear workflow to read these files?
It depends on the use case. The File Enumerator works exceptionally well to loop (as in for-each) through each file. Sometimes, though, one wants to get a list of files in a particular folder and check whether a file has been found or not. For this, I would recommend a C# script that gets the list of files:
// {FilePath} and {extension} are placeholders for the folder path and file extension
Directory.GetFiles(@"{FilePath}", "*.{extension}", SearchOption.TopDirectoryOnly);
From there, use the File node to read, write, or delete files from a file directory.
NB! You will need to install a DropPoint on the PC/server to allow access to the files. For more information regarding DropPoints, see the Flowgear documentation.
You can use a File Enumerator or a File Watcher to read the files in. The difference is that a File Enumerator will enumerate all the files in a folder once, while a File Watcher will watch a folder indefinitely and provide new files to the workflow as they are copied into the folder.
You can then use the File node to write the files back to the file system.