Method for user uploading a large ZIP file to server

I am using OpenCart right now to allow customers to buy products from my website.
Once they have successfully purchased an item, I would like them to be able to upload a large file (preferably via FTP) to my server. I'm expecting somewhere between 10 MB and 100 MB.
Is this possible and how would you recommend I put this together?
I'm guessing I'd have to create a private directory on my server using the orderID.
Should I just create an FTP user with write-only access to a specific private directory on the server and ask them to upload '<%OrderID%>.zip' there?
Or is there a better way of handling this? Should I consider Dropbox, Amazon S3 or even Plupload?
Many thanks for any help with this :-)
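If you do end up preferring the Amazon S3 option over FTP, one common pattern is to hand each customer a time-limited, size-capped upload URL after purchase, so no credentials ever leave your server. A minimal sketch in Python using boto3, assuming a hypothetical bucket name and a per-order key layout:

    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3")

    order_id = "12345"  # taken from the completed OpenCart order
    upload = s3.generate_presigned_post(
        Bucket="my-customer-uploads",  # assumed bucket name
        Key="orders/%s.zip" % order_id,
        Conditions=[["content-length-range", 0, 100 * 1024 * 1024]],  # cap at 100 MB
        ExpiresIn=3600,  # the upload link expires after an hour
    )
    # 'upload' holds a URL plus form fields the customer's browser can
    # POST the zip to; no AWS credentials are exposed to the customer.
    print(upload["url"], upload["fields"])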

Related

Can I transfer images between Shopify sites?

I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally I'd keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text areas of pages/blogs, so it's not going to be practical to do this manually; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
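If there are too many images for one-off curl calls, the same loop is easy to script. A quick sketch in Python, assuming the export is named products_export.csv and uses Shopify's standard "Image Src" column (adjust both if yours differ):

    import csv
    import os
    import urllib.request

    with open("products_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            url = row.get("Image Src")
            if not url:
                continue
            # Strip any query string before deriving a local filename.
            filename = os.path.basename(url.split("?")[0])
            urllib.request.urlretrieve(url, filename)
            print("saved", filename)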
Have you tried transferring between the two servers using FTP? If you have SSH access:
Log in to the server via SSH.
Change to the directory containing the file (or the desired location).
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details.
Use cd to navigate to the destination directory.
Issue the binary command to switch to binary transfer mode.
Issue hash if you want a progress indicator.
To send a file from the server you SSHed into, issue the put <filename> command; to pull a file from the other server to the one you are logged into, use get <filename> instead.
Wait for the transfer to complete; large files may take a while.
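If you'd rather not drive the ftp client by hand, the same push can be scripted. A rough equivalent in Python using the standard library's ftplib, run from the server you SSHed into (the host, credentials, and filenames below are placeholders):

    from ftplib import FTP

    ftp = FTP("other.server.example.com")  # placeholder address
    ftp.login("username", "password")      # placeholder credentials
    ftp.cwd("/path/to/destination")

    # storbinary is the scripted equivalent of "binary" followed by "put".
    with open("images.tar.gz", "rb") as f:
        ftp.storbinary("STOR images.tar.gz", f)

    ftp.quit()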

Transfer from one cPanel to another cPanel without WHM access

I have cPanel access on two different servers and would like to transfer from one server to another. The original size of the account on the first server is close to 15GB.
Currently, the only two ways I can think of are:
Back up using cPanel, then restore on the second server. But this process times out; I get a "Failed - Network error" message.
Use an FTP app like FileZilla to log in and transfer files that way. I haven't tested this, but I think it first downloads the files to my local machine (temp folder) and then uploads them to the second server.
My problem with option 2 is that this means I will end up using 30GB of data transfer if it actually does that.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I can suggest a few tips.
1. Backup using cPanel, then restore on the second server.
The backup creation fails because of the large account size, so that approach is not possible.
2. Use an FTP app like FileZilla to log in and transfer the files.
You can download the whole content to your PC, or download it folder by folder (home, mail, and so on), and upload it to the new cPanel account using FTP.
You could also open a support ticket with your hosting provider; they can create the backup from the server backend using /scripts/pkgacct, which packages the whole account into a single archive for restoring on the new server.
Thank you.

Can I easily limit which files a user can download from an Amazon S3 server?

I have tried looking for an answer to this but I think I am perhaps using the wrong terminology so I figure I will give this a shot.
I have a Rails app where a company can have an account with multiple users, each with various permissions, etc. Part of the system will be the ability to upload files, and I am looking at S3 for storage. What I want is a way to ensure that users from Company A can only download the files associated with that company.
I get the impression I can't, unless I restrict the downloads to my deployment server's IP range (which will be Heroku) and then feed the files through a controller and a send_file() call. This would work, but then I am reading data from S3 to Heroku and back to the user, versus direct from S3 to the user.
If I went with the send_file method, can I close off my S3 server to the outside world and have my Heroku app send the file directly?
A less secure idea I had was to create a unique slug for each file and store it under that name to prevent random guessing of files, e.g. http://mys3server/W4YIU5YIU6YIBKKD.jpg. This would be quick and dirty but not 100% secure.
Amazon S3 buckets support policies for granting or denying access based on different conditions. You could probably use those to protect your files from different user groups. Have a look at the policy documentation to get an idea of what is possible. After that you can switch over to the AWS policy generator to generate a valid policy depending on your needs.
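A different technique that often fits this per-company check more directly than bucket policies: keep the bucket fully private and have your app generate short-lived signed URLs after it has verified the user itself. Sketched here in Python with boto3 (a Rails app would do the same through the aws-sdk gem); the bucket and key names are made up:

    import boto3

    s3 = boto3.client("s3")

    # Run this only after your controller has confirmed the current
    # user belongs to the company that owns the file.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-app-files", "Key": "company-a/report.pdf"},
        ExpiresIn=300,  # the link stops working after five minutes
    )
    # Redirect the user to 'url' instead of streaming the bytes
    # through your Heroku dyno with send_file.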

Correct server scheme for uploading pictures to Amazon Web Services

I want to upload pictures to AWS S3 from the iPhone. Every user should be able to upload pictures, but the pictures must remain private to each user.
My question is very simple. Since I have no real experience with servers, I was wondering which of the following two approaches is better.
1) Use some kind of token vending machine system to grant the user access to the S3 bucket so they can upload directly.
2) Send the picture to a servlet on EC2 and have the virtual server place it in S3 storage.
Edit: I would also need to retrieve the pictures. Should I do that directly or through the servlet?
Thanks in advance.
Personally, I don't think it's a good idea to use a token vending machine to upload the data directly from the iPhone, because it's much harder to control the access privileges, etc. If you have the chance, use EC2 and a servlet, but that will add cost to your solution.
Also, when dealing with S3 you need to take into consideration that some files are not available right after you save them. Look at this answer from the S3 FAQ.
For retrieving data directly from S3 you will need to deal with the privileges issue again. Check the access model for S3, but again, it's probably easier to manage access to non-public files via the servlet. The good news is that there is no data transfer charge for data transferred between EC2 and S3 within the same region.
Another important point in favor of the latter solution:
high performance in handling load and network speeds within the Amazon ecosystem. With direct uploads, the client would have to handle the complex asynchronous operations of multipart uploads, etc., instead of focusing on the presentation and rendering of the image.
The servlet hosted on EC2 would be way more powerful than what you can do on your phone.
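To make the servlet approach concrete, here is the shape of the relay, sketched in Python with Flask rather than a Java servlet (the structure is the same): the phone POSTs the picture, the server identifies the user, and the file is written to S3 under a per-user prefix so each user's pictures stay private. The header-based authentication and bucket name are stand-ins:

    import boto3
    from flask import Flask, request

    app = Flask(__name__)
    s3 = boto3.client("s3")

    @app.route("/upload", methods=["POST"])
    def upload():
        # Stand-in for real authentication of the calling user.
        user_id = request.headers.get("X-User-Id")
        picture = request.files["picture"]
        # Keying objects by user ID keeps each user's pictures private.
        s3.upload_fileobj(picture, "my-picture-bucket",
                          "%s/%s" % (user_id, picture.filename))
        return "", 204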

Allowing users to download files as a batch from AWS S3 or CloudFront

I have a website that allows users to search for music tracks and download the ones they select as MP3s.
I have the site on my server and all of the MP3s on S3, distributed via CloudFront. So far so good.
The client now wishes for users to be able to select a number of music tracks and then download them all in bulk, as a batch, instead of one at a time.
Usually I would place all the files in a zip and then present the user with a link to that new zip file to download. In this case, as the files are on S3, that would require me to first copy all the files from S3 to my web server, process them into a zip, and then serve the download from my server.
Is there any way I can create a zip on S3 or CloudFront, or some way to batch/group files into a zip?
Maybe I could set up an EC2 instance to handle this?
I would greatly appreciate some direction.
Best
Joe
I am afraid you won't be able to create the batches without additional processing. Firing up an EC2 instance might be an option to create a batch per user.
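For a sense of what that processing could look like, here is a minimal sketch of the EC2-side job in Python with boto3; the bucket name and track keys are made up, and in practice the keys would come from the user's selection. The instance pulls the chosen objects, bundles them into one zip, and the zip can then be served (or pushed back to S3) for download:

    import boto3
    import zipfile

    s3 = boto3.client("s3")

    def build_batch(bucket, keys, zip_path):
        # Download each selected track and add it to a single zip file.
        with zipfile.ZipFile(zip_path, "w") as zf:
            for key in keys:
                local = "/tmp/" + key.split("/")[-1]
                s3.download_file(bucket, key, local)
                zf.write(local, arcname=key.split("/")[-1])
        return zip_path

    build_batch("my-music-bucket",
                ["tracks/a.mp3", "tracks/b.mp3"],
                "/tmp/batch.zip")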
I am facing the exact same problem. So far the only thing I was able to find is Amazon's s3 sync command:
https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
In my case, I am using Rails + its Paperclip addon, which means that I have no way to easily download all of the user's images in one go, because the files are scattered across a lot of subdirectories.
However, if you can group your user's files in a better way, say like this:
/users/<ID>/images/...
/users/<ID>/songs/...
...etc., then you can solve your problem right away with:
aws s3 sync s3://<your_bucket_name>/users/<user_id>/songs /cache/<user_id>
Do bear in mind you'll have to give your server the proper credentials so the S3 CLI tools can work without prompting for usernames/passwords.
And that should sort you.
Additional discussion here:
Downloading an entire S3 bucket?
S3 serves one object per HTTP request.
So the answer is to use threads to achieve the same effect.
The Java API offers TransferManager for this:
http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html
You can get great performance with multiple threads.
There is no bulk download, sorry.
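The same multi-threaded idea outside Java: a small sketch in Python with boto3 and a thread pool, where each thread issues its own GET. Bucket and key names are placeholders:

    import concurrent.futures
    import boto3

    s3 = boto3.client("s3")   # a single client can be shared across threads
    BUCKET = "my-music-bucket"  # assumed name

    def fetch(key):
        # One HTTP request per object, run in parallel by the pool.
        s3.download_file(BUCKET, key, "/tmp/" + key.replace("/", "_"))
        return key

    keys = ["tracks/a.mp3", "tracks/b.mp3", "tracks/c.mp3"]
    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        for done in pool.map(fetch, keys):
            print("downloaded", done)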