Dropbox images to s3 bucket using java - amazon-s3

We have a huge set of images in Dropbox, and manually downloading them and uploading them to S3 is not practical. Can anyone suggest the best method to transfer the Dropbox files, with their folder structure, to an S3 bucket? Java is the technology we are planning to use.
Thanks in advance.

There's no shortcut for this, since you don't have access to Dropbox's servers. Even if you did own the servers the files were stored on, you'd face a similar issue: How to upload files directly to Amazon S3 from a remote server?

You can either write a program to copy the files from Dropbox to AWS S3, or you can use a third-party tool.
DropBox
Here is the official GitHub project dropbox-sdk-java with code examples.
AWS
Here is the AWS Java SDK example code for upload-object
Full API references
Here are the APIs for Dropbox getting-started-api
Here are the APIs for AWS S3 API Reference
Here is the Javadoc for the AWS Java SDK
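If you go the DIY route, the two SDKs above can be combined so each file is streamed straight from Dropbox into S3 without touching local disk. The sketch below is untested and the access token, bucket name, and folder path are placeholders; it assumes the dropbox-core-sdk and aws-java-sdk-s3 dependencies are on the classpath and that AWS credentials come from the default provider chain (error handling is elided).

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.dropbox.core.DbxRequestConfig;
import com.dropbox.core.v2.DbxClientV2;
import com.dropbox.core.v2.files.FileMetadata;
import com.dropbox.core.v2.files.ListFolderResult;
import com.dropbox.core.v2.files.Metadata;

import java.io.InputStream;

public class DropboxToS3 {
    public static void main(String[] args) throws Exception {
        String token = System.getenv("DROPBOX_TOKEN"); // placeholder
        String bucket = "my-target-bucket";            // placeholder

        DbxClientV2 dropbox = new DbxClientV2(
                DbxRequestConfig.newBuilder("dropbox-to-s3").build(), token);
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Recursively list the source folder, one page at a time.
        ListFolderResult page = dropbox.files()
                .listFolderBuilder("/images").withRecursive(true).start();
        while (true) {
            for (Metadata entry : page.getEntries()) {
                if (entry instanceof FileMetadata) {
                    FileMetadata file = (FileMetadata) entry;
                    // Keep the Dropbox folder structure as the S3 key (drop the leading '/').
                    String key = file.getPathLower().substring(1);
                    ObjectMetadata meta = new ObjectMetadata();
                    meta.setContentLength(file.getSize()); // known length lets S3 take the raw stream
                    try (InputStream in = dropbox.files()
                            .download(file.getPathLower()).getInputStream()) {
                        s3.putObject(bucket, key, in, meta);
                    }
                }
            }
            if (!page.getHasMore()) break;
            page = dropbox.files().listFolderContinue(page.getCursor());
        }
    }
}
```

For very large files you could swap `putObject` for the SDK's TransferManager to get multipart uploads; either way, the machine running this only proxies the bytes, it never stores them.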
Tools
There are third-party tools for doing this.
Here are two examples:
inclowdz
multcloud

Related

TCL Amazon S3 Interaction

I have a TCL/TK Windows application that creates a small executable that I distribute to my customers. Because it is an exe file I cannot email it. Instead I upload it to an Amazon S3 bucket, create a URL link, and email the link to them. They download the file from the link and run the exe.
What I would like to do is add the ability to upload to an Amazon bucket from within the application, so that I can upload the file and create a URL that I can copy and email to the customer. I have seen Amazon S3 APIs written for other languages, Python and Java, but not TCL. Has anyone done this? How hard is it? Can you point me to a tutorial?
Actually, I do not have to use an S3 bucket. If there is another suggestion for how to distribute small files to customers from within TCL programs, I am open to it. Besides what has been laid out above, the only other requirements are that multiple people must be able to upload to the same location, the TCL program runs on Windows, and I would prefer not to use a 3rd-party program. Security is not a major concern, nor is privacy; those things are handled in other ways.
Actually, Tcl does provide an S3 package, but since I don't have an Amazon S3 account, I cannot test it out.

Play framework 2.2.1 uploading videos to server on cloudbees

I am trying to get my Play 2.2.1 Java application to upload video files to the server.
I am using CloudBees for hosting the application, and I am not sure what the correct way to do this is.
Should the model contain only a string (path) to the file's location on the server?
Or a @Lob with binary data? If so, how do I extract the binary data from the request?
What is the common practice?
Can you show/direct me to Java code snippets showing how to do this properly?
What you don't specify is the place where you would like to store your video files. Be aware that for a multi-tenant service like CloudBees offers, the filesystem is not persistent. You have more information about this here.
For video storage you could use Amazon S3 or YouTube, for example. For Amazon S3 you can read this article, which explains how to do it. You can look up more examples on the Internet. It is pretty straightforward, and as long as you create your bucket in Virginia (US) or Dublin (Europe), you will have low latency between your app and your file store.
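To answer the model question concretely: the common practice is to store only a string (the S3 key or URL) in the model, not a @Lob with the binary data, and to push the uploaded file to S3 from the controller. The following is a rough, untested sketch of a Play 2.2 Java action; the `Video` model, the `s3` client field, the form field name, and the bucket name are all hypothetical placeholders you would replace with your own.

```java
import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

import play.mvc.Controller;
import play.mvc.Http.MultipartFormData;
import play.mvc.Result;

public class Videos extends Controller {
    // Configured once; in a real app this would live in a plugin or DI component.
    private static final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    public static Result upload() {
        MultipartFormData body = request().body().asMultipartFormData();
        MultipartFormData.FilePart part = body.getFile("video"); // field name from your form
        if (part == null) {
            return badRequest("Missing file");
        }
        File file = part.getFile(); // Play buffers the upload to a temp file for you

        // Push the bytes to S3, then persist only the resulting key/URL.
        String key = "videos/" + part.getFilename();
        s3.putObject("my-bucket", key, file);

        Video video = new Video(); // hypothetical model with a String url field
        video.url = "https://my-bucket.s3.amazonaws.com/" + key;
        video.save();
        return ok(video.url);
    }
}
```

Storing only the string keeps your database small and sidesteps CloudBees' non-persistent filesystem entirely, since the bytes never need to survive on the app server.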

How do I simply download an object from S3 in Mac OS X?

After seeing this question on aws, I have downloaded the AWS iOS SDK and linked the .framework into my OSX app. If I try to import AWSiOSSDK/S3/AmazonS3Client.h into one of my classes, I now get a linker error for the classes whose header files are in the aws framework, and all I want to do is write a method to download a file from S3 and save it to a specific folder.
I tried setting up a TVM (token vending machine) on Elastic Beanstalk, but don't know how/if I should use it for authenticating, and I just can't seem to find documentation on how to download a file from S3 via a Mac App.
Currently if I try to access an S3 URL it fails due to missing credentials. I don't want to make the s3 files public, so I need to send creds with the request. Anyone who can point me in the right direction to get started or any guidance is appreciated!
While it is not strictly an "SDK for OSX", you might find the AWS SDK for iOS a good starting point.
SDK download
SDK source on GitHub

How can I remotely upload files to Amazon S3?

I am looking for a way to transfer files from a server to Amazon S3 bucket, without first downloading the files to my computer. All of the files I plan to transfer can be accessed publicly (e.g. http://something.com/file.ext). Everything I tried only allows me to directly upload files from my Mac to S3.
P.S. Although I have access to windows, a Mac app that can do this would be great... or maybe a browser-based solution :)
You can check out this PHP class (and a net tuts tutorial on it), it works well, I've been using it for a while now. It includes bucket creation, deletion, adding files and more. You can easily add files remotely from another server, or from the same server you're running it on.

Updating permissions on Amazon S3 files that were uploaded via JungleDisk

I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a Cloudfront distribution. i.e. I can access it via an http:// URL and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL then that URL now returns a 200.
I really, really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to have to switch to a different tool (especially if I have to buy it) just to change the permissions, and that seems really slow anyway, because such tools generally traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down which is a real pain.
I'm considering writing a manual tool to traverse my tree and set everything to 'read' but I'd rather not do this if this is a problem someone else has already solved.
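For what it's worth, that manual tool is only a few lines with the AWS SDK for Java. Below is an untested sketch (the bucket name is a placeholder, credentials come from the default provider chain, and the aws-java-sdk-s3 dependency is assumed). Note it only fixes existing objects; files JungleDisk uploads later will need the same treatment, or a bucket policy granting public read, which covers future objects automatically.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class MakePublicRead {
    public static void main(String[] args) {
        String bucket = "my-cloudfront-bucket"; // placeholder
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Walk every object in the bucket, page by page.
        ObjectListing page = s3.listObjects(bucket);
        while (true) {
            for (S3ObjectSummary obj : page.getObjectSummaries()) {
                // Apply the canned "public-read" ACL so CloudFront/browser GETs return 200.
                s3.setObjectAcl(bucket, obj.getKey(), CannedAccessControlList.PublicRead);
            }
            if (!page.isTruncated()) break;
            page = s3.listNextBatchOfObjects(page);
        }
    }
}
```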
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows you can use the CloudBerry Explorer Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web based tool, you can use S3fm, free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a 3rd party web site.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.