How to delete files in subfolders in Amazon S3 using Apache Camel

I am using Apache Camel to upload and delete files from S3.
While uploading, if the file name contains a '\', it creates folders and subfolders and puts the files inside them.
.setHeader(AWS2S3Constants.KEY, simple("${header.FileName}"))
.setHeader(AWS2S3Constants.CONTENT_LENGTH, simple("${header.fileLength}"))
.to("aws2-s3://{{aws_bucket}}?deleteAfterWrite=false&region={{aws_region}}&accessKey={{aws_access_key}}&secretKey=RAW({{aws_access_secret}})")
But while deleting, if the file name contains a '\', the route is not able to delete the files from the respective subfolders.
.setHeader(AWS2S3Constants.KEY, simple("${header.FileName}"))
.setHeader(AWS2S3Constants.S3_OPERATION, constant("deleteObject"))
.to("aws2-s3://{{aws_bucket}}?region={{aws_region}}&accessKey={{aws_access_key}}&secretKey=RAW({{aws_access_secret}})&fileName=${header.FileName}")
Is there some property or something that I need to specify so that it deletes the files from the subfolders?
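S3 itself has no real folders: object keys are flat strings, and only '/' is rendered as a folder separator, so deleteObject has to receive the exact key that was stored at upload time. Note also that a plain .to() URI is static, so the trailing fileName=${header.FileName} is never evaluated as a Simple expression; the deleteObject operation takes its key from the AWS2S3Constants.KEY header. Below is a minimal sketch of a delete route that normalizes Windows-style separators before setting the key; the direct:deleteFromS3 endpoint and the '\'-to-'/' normalization are assumptions for illustration, not the asker's code.

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.aws2.s3.AWS2S3Constants;

public class DeleteFromS3Route extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:deleteFromS3") // hypothetical entry point
            .process(exchange -> {
                // Assumption: the same '\' -> '/' mapping happened at upload,
                // so normalizing here makes the delete key match the stored key.
                String key = exchange.getIn()
                        .getHeader("FileName", String.class)
                        .replace('\\', '/');
                exchange.getIn().setHeader(AWS2S3Constants.KEY, key);
            })
            .setHeader(AWS2S3Constants.S3_OPERATION, constant("deleteObject"))
            .to("aws2-s3://{{aws_bucket}}?region={{aws_region}}"
                + "&accessKey={{aws_access_key}}&secretKey=RAW({{aws_access_secret}})");
    }
}

If the key genuinely must appear in the endpoint URI, .toD() (the dynamic variant of .to()) is the Camel mechanism that evaluates Simple expressions in the URI.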

Related

Apple Automator to add a password to all files within a folder, including subfolders, and overwrite the files

I have a folder with multiple subfolders; each subfolder contains one or more PDF files.
I would like to add a password to each file and save over the original.
My current Automator workflow can get all the files and add the password, but then cannot put the files back into their original folders.

How to delete all the .icls files

I want to ask: is there a way to delete all the .icls files from the IntelliJ IDEA color schemes at once?
In which folder are the imported files stored?

Boto3: Get Folder names without files within them?

I am using Boto3. I have a bucket, and within it a prefix that contains directories. Those directories then contain a bunch of files.
E.g.:
Bucket
  Prefix
    Directories (I want these) - 700 items
      Each directory has about 100 files
Every time I call bucket.objects.filter(Prefix=target_dir).all(), it takes over a minute to complete, because it loads every directory and all the files within them. Is there a way to load only the directories?
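The usual way to get only the "directory" level is a delimiter-based listing: with Delimiter='/', S3 returns one CommonPrefix per subdirectory instead of every object underneath, and a single call can return up to 1,000 prefixes, so 700 directories come back in one round trip. A minimal sketch with the AWS SDK for Java v2 follows (boto3's list_objects_v2 accepts the same Delimiter parameter); the bucket and prefix values are placeholders.

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CommonPrefix;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Response;

public class ListDirectoriesOnly {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.create()) {
            // Delimiter="/" collapses everything below one level into
            // CommonPrefixes, so the individual files are never listed.
            ListObjectsV2Request request = ListObjectsV2Request.builder()
                    .bucket("my-bucket")    // placeholder bucket name
                    .prefix("my/prefix/")   // the target_dir prefix
                    .delimiter("/")
                    .build();
            ListObjectsV2Response response = s3.listObjectsV2(request);
            for (CommonPrefix dir : response.commonPrefixes()) {
                System.out.println(dir.prefix()); // one line per "directory"
            }
        }
    }
}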

How to copy an artifacts folder to an FTP folder in TFS?

I'm trying to publish artifacts and the files in their subfolders as well. I've read all the docs provided by Microsoft here and used them, but none of them worked for me.
I've tried file patterns such as:
** => which copied all root files to FTP
**\* => which copied all subfolder files to FTP's root directory
What I want is a folder-to-folder copy to FTP as well:
-artifacts            ftp
--a.dll               --a.dll
--subfolder           --subfolder
---subfolder_1.dll    ---subfolder_1.dll
What's happening is:
ftp
--a.dll
--subfolder_1.dll
It's copying all subfolder files to the root directory of the FTP server. I've used both cURL and the FTP task; both give the same result.
How can I achieve a folder-to-folder copy in TFS 2017?
It's not related to file patterns; to upload the entire folder content recursively, simply specify **.
All you have to do is check Preserve file paths under the Advanced options:
If selected, the relative local directory structure is recreated under
the remote directory where files are uploaded. Otherwise, files are
uploaded directly to the remote directory without creating additional
subdirectories.
For example, suppose your source folder is: /home/user/source/ and
contains the file: foo/bar/foobar.txt, and your remote directory
is: /uploads/. If selected, the file is uploaded to:
/uploads/foo/bar/foobar.txt. Otherwise, to: /uploads/foobar.txt.

How to download multiple file objects of a particular directory from AWS S3 using ASP.NET Core?

How can I download multiple file objects of a particular directory (e.g., a folder named by batch number, containing multiple subfolders with files such as images or PDF files) from AWS S3 using ASP.NET Core?
In this regard, I would recommend creating a zipped file with all the directory contents and downloading it as one file.
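As a sketch of that suggestion: list every object under the prefix and stream each one into a single zip archive. This illustration uses the AWS SDK for Java v2 rather than the .NET SDK the question targets, so treat it as the pattern, not a drop-in answer; the ZipPrefixDownload name and the bucket/prefix arguments are made up.

import java.io.OutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import software.amazon.awssdk.core.ResponseInputStream;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.S3Object;

public class ZipPrefixDownload {
    // Streams every object under `prefix` into one zip written to `out`.
    public static void zipPrefix(S3Client s3, String bucket, String prefix,
                                 OutputStream out) throws Exception {
        try (ZipOutputStream zip = new ZipOutputStream(out)) {
            ListObjectsV2Request listReq = ListObjectsV2Request.builder()
                    .bucket(bucket).prefix(prefix).build();
            // The paginator follows continuation tokens automatically.
            for (S3Object obj : s3.listObjectsV2Paginator(listReq).contents()) {
                zip.putNextEntry(new ZipEntry(obj.key())); // key becomes the zip path
                try (ResponseInputStream<GetObjectResponse> body =
                         s3.getObject(GetObjectRequest.builder()
                                 .bucket(bucket).key(obj.key()).build())) {
                    body.transferTo(zip); // copy object bytes into the entry
                }
                zip.closeEntry();
            }
        }
    }
}

In ASP.NET Core the same shape maps onto the AWS SDK for .NET's ListObjectsV2 and GetObject calls, writing the zip to the HTTP response stream.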