I am trying to get my Play 2.2.1 Java application to upload video files to a server.
I am using CloudBees to host the application, and I am not sure what the correct way to do this is.
What should be in the model: only a String (path) pointing to the file location on the server?
Or a @Lob with the binary data? If so, how do I extract the binary data from the request?
What is the common practice?
Can you show me, or direct me to, Java code snippets showing how to do this properly?
What you don't specify is where you would like to store your video files. Be aware that for a multi-tenant service like the one CloudBees offers, the filesystem is not persistent. You can find more information about this here.
For video storage you could use Amazon S3 or YouTube, for example. For Amazon S3 you can read this article, which explains how to do it. You can look up more examples on the Internet. It is pretty straightforward, and as long as you create your bucket in Virginia (US) or Dublin (Europe), you will have low latency between your app and your file storage.
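As for the model question: common practice is to store only a String (the S3 key or resulting URL) in your model rather than a @Lob with binary data, and to push the bytes themselves to S3. Below is a minimal sketch of a Play 2.2 Java controller along those lines, assuming the AWS SDK for Java v1 is on the classpath; the form field name, bucket name, and Video model are hypothetical placeholders.

    import java.io.File;
    import java.util.UUID;

    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.services.s3.AmazonS3Client;

    import play.mvc.Controller;
    import play.mvc.Http.MultipartFormData;
    import play.mvc.Http.MultipartFormData.FilePart;
    import play.mvc.Result;

    public class Videos extends Controller {

        public static Result upload() {
            // Play buffers multipart uploads to a temp file for you
            MultipartFormData body = request().body().asMultipartFormData();
            FilePart part = body.getFile("video");   // "video" is a hypothetical form field name
            if (part == null) {
                return badRequest("Missing file");
            }

            File temp = part.getFile();
            String key = "videos/" + UUID.randomUUID() + "-" + part.getFilename();

            // Credentials and bucket name would normally come from configuration
            AmazonS3Client s3 = new AmazonS3Client(
                    new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY"));
            s3.putObject("my-video-bucket", key, temp);

            // Persist only the key (a String) in your model, e.g.:
            // Video video = new Video(); video.s3Key = key; video.save();
            return ok("Uploaded to S3 as " + key);
        }
    }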
I've searched a lot but couldn't find an example.
I want to use nanoFramework as a web server where I can upload and download, e.g., a JSON file from the browser that holds all my settings. Is this possible?
Otherwise, if I want to change some settings, I have to rebuild the whole solution and upload it.
Thanks in advance
You can use the Storage libraries to store that JSON file. The actual storage can be backed by flash (using SPIFFS), an SD card, or a USB mass storage device; this depends on the hardware platform that you are using. Check the Storage samples in our samples repo here.
Downloading a file is pretty straightforward: you just need to serve the respective HTTP request. Check the HTTP samples in our samples repo here.
Uploading a file is a matter of handling the POST request and grabbing the data sent by the client browser.
We have a huge set of images in Dropbox, and manually downloading them and uploading them to S3 is not practical. Can anyone suggest the best method to transfer Dropbox files, with their folder structure, to an S3 bucket? Java is the technology that we are planning to use.
Thanks in advance.
There's no direct fix for this, since you don't have access to Dropbox's servers. Even if you did own the servers the files were stored on, you'd face a similar issue: How to upload files directly to Amazon S3 from a remote server?
You can either write a program that copies the files from Dropbox to AWS S3 (a rough sketch is shown after the references below), or use a third-party tool.
Dropbox
Here is the official GitHub project dropbox-sdk-java with code examples.
AWS
Here is the AWS Java SDK example code for upload-object
Full API references
Here are the API docs for Dropbox: getting-started-api
Here is the AWS S3 API Reference
Here is the Javadoc for the AWS Java SDK
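For the "write a program" option, here is a rough sketch combining the two SDKs. It assumes v2 of dropbox-sdk-java and v1 of the AWS SDK for Java, plus placeholder values for the access token, folder, and bucket; a production version would also need error handling, retries, and support for very large files.

    import java.io.File;
    import java.io.FileOutputStream;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.dropbox.core.DbxRequestConfig;
    import com.dropbox.core.v2.DbxClientV2;
    import com.dropbox.core.v2.files.FileMetadata;
    import com.dropbox.core.v2.files.ListFolderResult;
    import com.dropbox.core.v2.files.Metadata;

    public class DropboxToS3 {

        public static void main(String[] args) throws Exception {
            // Placeholder credentials and names -- substitute your own
            DbxClientV2 dropbox = new DbxClientV2(
                    DbxRequestConfig.newBuilder("dropbox-to-s3").build(), "DROPBOX_ACCESS_TOKEN");
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            String bucket = "my-target-bucket";

            // List the Dropbox folder recursively, following pagination cursors
            ListFolderResult page = dropbox.files()
                    .listFolderBuilder("/images").withRecursive(true).start();
            while (true) {
                for (Metadata entry : page.getEntries()) {
                    if (entry instanceof FileMetadata) {
                        String path = entry.getPathLower();   // e.g. "/images/2020/cat.jpg"
                        File temp = File.createTempFile("dropbox", ".tmp");
                        try (FileOutputStream out = new FileOutputStream(temp)) {
                            dropbox.files().download(path).download(out);
                        }
                        // Reuse the Dropbox path (minus the leading slash) as the S3 key
                        // so the folder structure is preserved
                        s3.putObject(bucket, path.substring(1), temp);
                        temp.delete();
                    }
                }
                if (!page.getHasMore()) {
                    break;
                }
                page = dropbox.files().listFolderContinue(page.getCursor());
            }
        }
    }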
Tools
There are third-party tools for doing this.
Here are two examples:
inclowdz
multcloud
I am developing an application that needs to fetch some data from an XML file for the automatic update process and for some other functions. This approach requires the file to be available at a direct link, so the application can be hard-coded to use that specific URL.
I heard that you can use many free file-sharing services such as Google Drive, Box, and Dropbox. Can you tell me whether that's true? And are there any other services besides the ones I mentioned?
I don't need web hosting that supports PHP and other frameworks; I just want to store files and have my application access them when required.
Yes, both Dropbox and Google Drive provide web hosting of your public folders, but there is a 10GB bandwidth limit with Dropbox.
You can use any free web hosting like 110mb or 5gbfree too.
You can try GitHub, Bitbucket, or mega.co.nz.
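Whichever host you pick, the application side is just an HTTP GET against the direct link. A minimal sketch in Java, assuming a hypothetical URL and a hypothetical <latestVersion> element in the update file:

    import java.io.InputStream;
    import java.net.URL;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class UpdateChecker {

        // Hypothetical direct link -- use the raw/public URL your host gives you
        private static final String UPDATE_URL = "https://example.com/myapp/update.xml";

        public static void main(String[] args) throws Exception {
            try (InputStream in = new URL(UPDATE_URL).openStream()) {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(in);
                // Assumes the XML contains a <latestVersion> element
                String latest = doc.getElementsByTagName("latestVersion")
                        .item(0).getTextContent();
                System.out.println("Latest version advertised: " + latest);
            }
        }
    }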
We are moving away from using a local filesystem for all our media and moving it over to Cloud Files (Rackspace). The downside is that we use the ImageManager plugin, which seems to rely heavily on filesystem commands to generate lists of files, etc.
Have any of you had experience with using TinyMCE and inserting images and assets from a cloud storage service?
I am starting to use Jungle Disk to upload files to an Amazon S3 bucket that backs a CloudFront distribution, i.e. I can access the files via an http:// URL and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files, so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL, that URL then returns a 200.
I really, really like the simplicity of dragging files to a network drive. Jungle Disk is the best program I've found that does this reliably without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to have to go to a different tool (especially one I have to buy) just to change the permissions, and that seems really slow anyway because such tools generally traverse the whole directory structure.
Jungle Disk provides some kind of 'web access', but this is a paid feature and I'm not sure whether it will work.
S3 doesn't appear to propagate permissions down, which is a real pain.
I'm considering writing my own tool to traverse the tree and set everything to 'read', but I'd rather not do that if it's a problem someone else has already solved.
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows, you can use the CloudBerry Explorer Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web-based tool, you can use S3fm, a free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a third-party website.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.
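If none of those tools fit and you do end up writing the manual tool mentioned in the question, the traversal is only a few lines with the AWS SDK for Java. A sketch assuming SDK v1 and a placeholder bucket name:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.CannedAccessControlList;
    import com.amazonaws.services.s3.model.ObjectListing;
    import com.amazonaws.services.s3.model.S3ObjectSummary;

    public class MakeBucketPublicRead {

        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            String bucket = "my-cloudfront-bucket";   // placeholder name

            // Walk every object in the bucket, following the paginated listing,
            // and set the canned 'public-read' ACL on each one
            ObjectListing listing = s3.listObjects(bucket);
            while (true) {
                for (S3ObjectSummary object : listing.getObjectSummaries()) {
                    s3.setObjectAcl(bucket, object.getKey(), CannedAccessControlList.PublicRead);
                }
                if (!listing.isTruncated()) {
                    break;
                }
                listing = s3.listNextBatchOfObjects(listing);
            }
        }
    }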