Boto3: Get Folder names without files within them? - amazon-s3

I am using Boto3. I have a bucket, and within it a prefix that contains directories. Those directories then contain a bunch of files.
E.g.:
Bucket
  Prefix
    Directories (700 items; these are what I want)
      Each directory has ~100 files
Every time I call bucket.objects.filter(Prefix=target_dir).all() it takes over a minute to complete, because it loads every directory and all the files within them. Is there a way to list only the directories?
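One way (a sketch, with placeholder bucket and prefix names) is to list with Delimiter="/": S3 then rolls everything below each sub-prefix into a single CommonPrefixes entry, so the files themselves are never enumerated:

import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# With Delimiter="/", keys below each sub-prefix are grouped into
# CommonPrefixes, so only the 700 "directory" names come back.
prefixes = []
for page in paginator.paginate(Bucket="my-bucket", Prefix="target_dir/", Delimiter="/"):
    for cp in page.get("CommonPrefixes", []):
        prefixes.append(cp["Prefix"])

print(prefixes)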

Related

How to delete files in subfolder in Amazon S3 using Apache Camel

I am using Apache Camel to upload and delete files from S3.
While uploading files, if the file name contains a '/', it creates folders and subfolders and puts the files inside them.
.setHeader(AWS2S3Constants.KEY, simple("${header.FileName}"))
.setHeader(AWS2S3Constants.CONTENT_LENGTH, simple("${header.fileLength}"))
.to("aws2-s3://{{aws_bucket}}?deleteAfterWrite=false&region={{aws_region}}&accessKey={{aws_access_key}}&secretKey=RAW({{aws_access_secret}})")
But while deleting files, if the file name contains a '/', it is not able to delete the files from the respective subfolders.
.setHeader(AWS2S3Constants.KEY, simple("${header.FileName}"))
.setHeader(AWS2S3Constants.S3_OPERATION,constant("deleteObject"))
.to("aws2-s3://{{aws_bucket}}?region={{aws_region}}&accessKey={{aws_access_key}}&secretKey=RAW({{aws_access_secret}})&fileName=${header.FileName}")
Is there some property or something that I need to specify so that it deletes the files from the subfolders?

WinSCP: Is it possible to prevent folder of certain name from being uploaded?

I often use NPM, and the node_modules folder contains thousands upon thousands of subdirectories. Even though the files are small, a remote transfer of that many files and directories takes hours.
Many times I am deploying a whole directory and don't realize that somewhere deep in the tree there is a node_modules folder I have not manually deleted.
Is it possible to ignore folders of a certain name, or to be alerted when uploading such a folder?
In git I can just ignore it with **/node_modules, but I don't know of an equivalent for FTP transfers.

rsync backs up everything

I am trying to back up some of the essential folders under / on my Ubuntu system. I am using
sudo rsync -aAXv --delete --include="/etc" --include="/home" --include="/usr/local" // /home/$USER/Desktop/bkup/
This command should copy only the /etc, /home, and /usr/local directories and leave the rest of the files. But when I run it, it copies every directory and every file under /.
I am not sure what I am doing wrong here.
Includes without any excludes are meaningless.
--exclude='*' would exclude everything not explicitly included, from every subfolder, even the included ones.
--exclude='*/' would exclude every directory not explicitly included, but allow copying files within included directories (and the root).
--exclude='/*' would exclude all root directories and files not explicitly included, but allow all directories and files within included directories. You probably want this one.
You should add your exclude rule after your include rules: for each directory and file, the first matching include/exclude rule wins, and the default (when no rule matches) is to include. Note that /usr/local needs extra care: with --exclude='/*', rsync never descends into /usr at all, so you also need --include='/usr/' (plus --exclude='/usr/*' if you want only /usr/local and not the rest of /usr).
By "root" I mean the root of the copied directory, not the root of the whole file system.
P.S. Your command also has the destination directory inside the source directory; you probably want an exclude rule for that!
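Putting those rules together, something like this should work (a sketch, untested; note the extra /usr rules and the exclude for the destination directory):
sudo rsync -aAXv --delete --include="/etc" --include="/home" --include="/usr/" --include="/usr/local" --exclude="/usr/*" --exclude="/home/*/Desktop/bkup" --exclude="/*" / /home/$USER/Desktop/bkup/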

Uploadify - How to upload contents of a folder recursively

I need to upload files in a folder recursively.
Is it possible to use Uploadify to upload contents of a folder recursively by selecting the folder instead of selecting individual files in a folder?
I would not think so. The browser is in control of the file selection dialog (which will not allow you to select folders, only files).
I have not tried the file drag-and-drop feature of the paid UploadiFive (HTML5 version), but it might support dragging a folder. I am not sure what HTML5 allows with drag-and-drop.

S3: Move all files from subdirectories into a common directory

I have a lot of subdirectories containing a lot of images (millions) on S3. Having the files in these subdirectories has turned out to be a lot of trouble, and since all file names are actually unique, there is no reason for them to reside in subdirectories. So I need a fast and scalable way to move all the files from the subdirectories into one common directory, or alternatively to delete the subdirectories without deleting the files.
Is there a way to do this?
I'm on Ruby, but open to almost anything.
I have added a comment to your other question, explaining why S3 does not have folders, but file name prefixes instead (See Amazon AWS IOS SDK: How to list ALL file names in a FOLDER).
With that in mind, you will probably need a combination of two S3 API calls to achieve what you want: copying each file to a new key (with the prefix removed from the name) and deleting the original. Maybe there is a Ruby S3 SDK or framework out there exposing a rename feature, but under the hood it will likely be a copy/delete, as in the sketch below.
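For example, in Python with boto3 (a sketch only; you are on Ruby, and the bucket name and "images/" prefix here are hypothetical):

import boto3

s3 = boto3.resource("s3")
bucket = s3.Bucket("my-bucket")  # hypothetical bucket name

# S3 has no real rename: "move" each object by copying it to a key with
# the directory prefix stripped, then deleting the original.
for obj in bucket.objects.filter(Prefix="images/"):
    flat_key = obj.key.rsplit("/", 1)[-1]   # keep only the file name
    if not flat_key:                        # skip zero-byte "folder" placeholder keys
        continue
    bucket.Object(flat_key).copy({"Bucket": bucket.name, "Key": obj.key})
    obj.delete()

For millions of objects you would want to parallelize this and batch the deletes (delete_objects accepts up to 1,000 keys per call).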
Related question: Amazon S3 boto: How do you rename a file in a bucket?