How to download multiple file objects from a particular directory on AWS S3 using ASP.NET Core?

How can I download multiple file objects from a particular directory (e.g. a folder named by batch number, containing multiple subfolders that hold files such as images or PDF files) from AWS S3 using ASP.NET Core?

In this regard I would recommend creating a zip file with all the directory contents and downloading it as one file. A sketch of this approach follows.
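A minimal sketch, assuming the AWSSDK.S3 NuGet package and an IAmazonS3 client registered with dependency injection; the bucket name, route, and controller name are placeholders rather than anything from the original question:

using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Microsoft.AspNetCore.Mvc;

public class S3DownloadController : Controller
{
    private readonly IAmazonS3 _s3;

    public S3DownloadController(IAmazonS3 s3) => _s3 = s3;

    // Hypothetical route; "prefix" is the S3 key prefix acting as the directory
    [HttpGet("download")]
    public async Task<IActionResult> DownloadDirectory(string prefix)
    {
        var zipStream = new MemoryStream();
        using (var zip = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
        {
            var request = new ListObjectsV2Request
            {
                BucketName = "my-bucket", // placeholder bucket name
                Prefix = prefix
            };
            ListObjectsV2Response response;
            do
            {
                // List every object under the prefix, page by page
                response = await _s3.ListObjectsV2Async(request);
                foreach (var s3Object in response.S3Objects)
                {
                    // Stream each object into a zip entry named after its key
                    using var obj = await _s3.GetObjectAsync("my-bucket", s3Object.Key);
                    var entry = zip.CreateEntry(s3Object.Key);
                    using var entryStream = entry.Open();
                    await obj.ResponseStream.CopyToAsync(entryStream);
                }
                request.ContinuationToken = response.NextContinuationToken;
            } while (response.IsTruncated == true);
        }
        zipStream.Position = 0;
        return File(zipStream, "application/zip", "download.zip");
    }
}

For very large folders, buffering the whole zip in a MemoryStream may be impractical; streaming the archive directly to the response body would be the natural next refinement.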

Related

How can I have Alluxio show all the not-yet-accessed files in the directory?

When I mounted an S3 bucket under alluxio://s3/, the bucket already had objects. However, when I get the directory listing (either by alluxio fs ls, by ls on the FUSE-mounted directory, or on the web UI), I see no files. When I write a new file or read an already-existing object via Alluxio, it appears in the directory listing. Is there a way I can have Alluxio show all the not-yet-accessed files in the directory (rather than only showing files after writing or accessing them)?
A simple way is to force a refresh of the Alluxio directory:
bin/alluxio fs loadMetadata /s3
There are other ways to trigger it; check out the "How to Trigger Metadata Sync" section in this blog post:
https://www.alluxio.io/blog/metadata-synchronization-in-alluxio-design-implementation-and-optimization/

How to add the files of an S3 bucket folder to a zip file and download the zip file

I have a folder in an S3 bucket. I want to zip the files inside it and then download the zip file. Whatever I found was related to Lambda. Is there a way I can do it without using Lambda? If not, what is the proper way to do it?
Thank you in advance.
S3 can't zip it on the fly for you, since it's only a file storage service. You could use Lambda of course, but the simplest way to download a "folder" on S3 is to use the AWS CLI:
aws s3 sync s3://<bucket_name>/<folder_key> <local_dest_path>
You can then zip it on your local machine if needed.
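For example, assuming the standard zip utility is available locally:
zip -r folder.zip <local_dest_path>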

How to copy artifacts folder to ftp folder in TFS?

I'm trying to publish the artifacts folder and the files in its subfolders as well. I've read all the docs provided by Microsoft and used them, but none of them worked for me.
I've tried file patterns such as:
** => which copied all root files to the FTP server
**\* => which copied all subfolder files to the FTP server's root directory
What I want is to copy folder to folder on the FTP server as well:
-artifacts                  ftp
--a.dll                     --a.dll
--subfolder                 --subfolder
---subfolder_1.dll          ---subfolder_1.dll
What's happening instead is:
ftp
--a.dll
--subfolder_1.dll
It's copying all subdirectory files to the root directory of the FTP server. I've used both cURL and the FTP task, and both give me the same result.
How can I achieve a folder-to-folder copy in TFS 2017?
It's not related to file patterns; to upload the entire folder content recursively, simply specify **.
All you have to do is check the Preserve file paths checkbox under Advanced options:
If selected, the relative local directory structure is recreated under the remote directory where files are uploaded. Otherwise, files are uploaded directly to the remote directory without creating additional subdirectories.
For example, suppose your source folder is /home/user/source/ and contains the file foo/bar/foobar.txt, and your remote directory is /uploads/. If selected, the file is uploaded to /uploads/foo/bar/foobar.txt. Otherwise, to /uploads/foobar.txt.

S3: Move all files from subdirectories into a common directory

I have a lot of subdirectories containing a lot of images (millions) on S3. Having the files in these subdirectories has turned out to be a lot of trouble, and since all file names are actually unique, there is no reason why they should reside in subdirectories. So I need to find a fast and scalable way to move all files from the subdirectories into one common directory, or alternatively to delete the subdirectories without deleting the files.
Is there a way to do this?
I'm on Ruby, but open to almost anything.
I have added a comment to your other question explaining why S3 does not have folders but file-name prefixes instead (see Amazon AWS iOS SDK: How to list ALL file names in a FOLDER).
With that in mind, you will probably need to use a combination of two S3 API calls to achieve what you want: copy a file to a new one (removing the prefix from the file name) and delete the original. Maybe there is a Ruby S3 SDK or framework out there exposing a rename feature, but under the hood it will likely be a copy/delete, as sketched below.
Related question: Amazon S3 boto: How do you rename a file in a bucket?
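A minimal sketch of the copy-then-delete pattern. The question is about Ruby, but the same two calls exist in every AWS SDK; this version uses the AWS SDK for .NET, and the bucket name and key handling are illustrative assumptions:

using System.Threading.Tasks;
using Amazon.S3;

public static class S3Mover
{
    // Moves one object from a subdirectory to the bucket root via copy + delete.
    public static async Task MoveToRootAsync(IAmazonS3 s3, string bucket, string key)
    {
        // Keep only the unique file name, dropping the "folder" prefix
        var newKey = key.Substring(key.LastIndexOf('/') + 1);

        await s3.CopyObjectAsync(bucket, key, bucket, newKey); // copy to new key
        await s3.DeleteObjectAsync(bucket, key);               // delete the original
    }
}

With millions of objects you would list keys page by page (ListObjectsV2) and issue these calls concurrently, since each object requires its own copy/delete round trip.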

iOS: save all files from a remote directory to the local documents directory

I am successfully sending an NSURLRequest to download a file at a known path and then saving it to the documents directory. However, now I need to download all the files that a remote directory contains. I know the path of the directory but not which files are inside. How can I list those files in order to build the paths to download them? Thank you.
Unfortunately, unless you parse an index file containing a list of the files, this is not possible, as the HTTP protocol does not support directory listing. You would have to use an FTP server instead.