We have to upload a lot of VirtualBox images which are between 1 GB and 6 GB.
So I would prefer to use FTP for the upload and then include the files in MediaWiki.
Is there a way to do this?
Currently I use a jailed FTP user who can upload to a folder, and then use the UploadLocal extension to include the files.
But this only works for files smaller than around 1 GB. If we upload bigger files we get a timeout, and even after setting PHP's max_execution_time to 3000 s the import stops after about 60 s with a 504 Gateway Timeout (which is also the only thing appearing in the logs).
So is there a better way of doing this?
You can import the files from the shell using maintenance/importImages.php. Alternatively, enable upload by URL by flipping $wgAllowCopyUploads, $wgAllowAsyncCopyUploads and friends (this requires that the job queue be run from cron jobs). Or decide whether you need to upload these files into MediaWiki at all, because just linking to them might suffice.
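A minimal sketch of the maintenance-script route (the folder, file extensions, wiki user and comment are only examples, and exact option names can vary slightly between MediaWiki versions; the extensions also have to be allowed in $wgFileExtensions):

# run from the MediaWiki installation directory; /data/ftp-uploads is the jailed FTP folder
php maintenance/importImages.php --extensions=ova,vdi,vmdk --user=Admin --comment="Bulk FTP import" /data/ftp-uploads

Because the import runs from the command line, it is not subject to PHP's web-request execution or upload limits.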
I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally I would keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text area of pages/blogs, so it's not going to be practical to do this manually in any way; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
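A rough sketch of such a script, assuming you have already extracted the image URLs from the CSV into a file with one URL per line (the file name here is made up):

# download every image listed in image_urls.txt, keeping the file name from each URL
while read -r url; do
  curl -sS -O "$url"
done < image_urls.txt

You can then re-upload the saved images to the other site and, if the URLs change, run your automated replace over the CSVs before importing them.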
Have you tried transferring between the two sites using FTP? If you have SSH access (an example session is sketched after these steps):
Log in to the source server via SSH.
Change to the directory that holds the files (or to your desired location).
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details.
Use cd to move to the destination directory on the remote side.
Use the binary command.
Use hash if you want a progress indicator.
If sending a file from the server you SSHed into, issue the put <filename> command; if you want to pull a file from the other server to the one you are logged into, use get <filename> instead.
Wait for the transfer to complete - it might take a while.
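A minimal sketch of such a session (host names, paths and file names are made up; the lines starting with # are annotations, not commands to type):

# on the machine that currently holds the files
ssh user@source.example.com
# set the local side of the transfer by moving into the image folder
cd /var/www/current-site/images
# open an FTP session to the other server and authenticate when prompted
ftp files.destination.example.com
# inside the ftp prompt: pick the destination folder, switch to binary mode, show progress marks
cd /var/www/dev-site/images
binary
hash
# push one file to the remote server (use get to pull instead, or mput *.jpg for many files)
put product-photo.jpg
bye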
I'm using AjaXplorer to give my clients access to a shared directory stored in Amazon S3. I installed the SDK, configured the plugin (http://ajaxplorer.info/plugins/access/s3/), and can upload and download files, but the upload size is limited by my host's PHP limit, which is 64 MB.
Is there a way I can upload directly to S3 without going through my host, to improve speed and be subject to S3's limits rather than PHP's?
Thanks
I don't think that is possible, because the upload has to reach the PHP script on the server first, and only then is the file transferred to the bucket.
Maybe. The only way around this is to use some jQuery or JavaScript that bypasses your server/PHP entirely and streams directly into S3. This involves enabling CORS on the bucket and creating a signed policy on the fly to allow your uploads, but it can be done!
I ran into just this issue with some inordinately large media files for our website users that I no longer wanted to host on the web servers themselves.
The best place to start, IMHO, is here:
https://github.com/blueimp/jQuery-File-Upload
A demo is here:
https://blueimp.github.io/jQuery-File-Upload/
This was written to upload+write files to a variety of locations, including S3. The only tricky bits are getting your MIME type correct for each particular upload, and getting your bucket policy the way you need it.
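For the CORS part, a sketch of what enabling it can look like with the AWS CLI (the bucket name, allowed origin and exact rule set are assumptions; adjust them to your own policy, and note that the signed upload policy itself still has to be generated server-side, as the jQuery-File-Upload docs describe):

# write a minimal CORS rule set allowing browser PUT/POST uploads from your site
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://www.example.com"],
      "AllowedMethods": ["PUT", "POST", "GET"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF

# apply the rules to the bucket (needs credentials allowed to change bucket CORS settings)
aws s3api put-bucket-cors --bucket my-upload-bucket --cors-configuration file://cors.json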
For real-time online tournaments I upload a bunch of HTML pages generated by our tournament software. On the web server I do whatever needs to be done with these files, so the tournament software is "integrated" into our website.
Now we want other people to run these tournaments, which means they have to upload the generated HTML/CSS files themselves. Doing that via HTTP upload is too much work and takes too much time.
I wanted to create another FTP account with access only to the tournament directory. So far so good.
But I want to limit the uploaded file types to just HTML and CSS files, so that they can only upload static content via FTP (I am just being cautious: I do not want them to be able to upload PHP files with potentially dangerous code, or other unexpected file types).
Is this possible?
Files are just sequences of bytes; the extension has nothing to do with it, the danger is in how you read them. If you make the uploaded files non-executable, you are safe. Even if you somehow only allow .css files to be uploaded, you cannot tell whether a file is safe by its extension alone, because an attacker may have changed the extension by hand. Also, uploaded PHP files will not be a problem if you put the upload folder somewhere Apache does not execute from, i.e. not below your www folder.
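As one concrete way to make uploads non-executable (a sketch only, assuming Apache with mod_php and AllowOverride permitting these directives; the path is made up), you can drop an .htaccess into the FTP upload directory that switches PHP off there:

# tell Apache never to run PHP for anything inside the tournament upload folder
cat > /var/www/tournaments/.htaccess <<'EOF'
php_flag engine off
RemoveHandler .php .phtml
EOF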
I have a website that allows users to search for music tracks and download the ones they select as MP3s.
I have the site on my server and all of the MP3s on S3, distributed via CloudFront. So far so good.
The client now wants users to be able to select a number of music tracks and then download them all in bulk, or as a batch, instead of one at a time.
Usually I would place all the files in a zip and then present the user a link to that new zip file to download. In this case, as the files are on S3, that would require me to first copy all the files from S3 to my web server, process them into a zip, and then serve the download from my server.
Is there any way I can create a zip on S3 or CloudFront, or is there some way to batch/group files into a zip?
Maybe I could set up an EC2 instance to handle this?
I would greatly appreciate some direction.
Best
Joe
I am afraid you won't be able to create the batches without additional processing. Firing up an EC2 instance might be an option, creating a batch per user.
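If you do go the EC2 route, the per-user batch job can be as simple as a sketch like this (the bucket name, keys and paths are made up; it assumes the AWS CLI is installed and has credentials):

# copy just the tracks the user selected from S3 to local disk
aws s3 cp s3://my-music-bucket/tracks/track-001.mp3 ./batch/
aws s3 cp s3://my-music-bucket/tracks/track-042.mp3 ./batch/

# zip them up and push the archive back to S3, then hand the user a link to it
zip -j user-123-batch.zip ./batch/*.mp3
aws s3 cp user-123-batch.zip s3://my-music-bucket/batches/user-123-batch.zip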
I am facing the exact same problem. So far the only thing I was able to find is the AWS CLI's s3 sync command:
https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
In my case, I am using Rails + its Paperclip add-on, which means I have no way to easily download all of a user's images in one go, because the files are scattered across a lot of subdirectories.
However, if you can group your user's files in a better way, say like this:
/users/<ID>/images/...
/users/<ID>/songs/...
...etc., then you can solve your problem right away with:
aws s3 sync s3://<your_bucket_name>/users/<user_id>/songs /cache/<user_id>
Do bear in mind that you'll have to give your server the proper credentials so the S3 CLI tools can work without prompting for usernames/passwords.
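On the server that usually just means configuring the CLI once, or exporting the keys in the environment of whatever process runs the sync (the values below are placeholders):

# one-off interactive setup, stores the keys in ~/.aws/credentials
aws configure

# ...or set them in the environment of the process that runs the sync
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...
export AWS_DEFAULT_REGION=eu-west-1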
And that should sort you.
Additional discussion here:
Downloading an entire S3 bucket?
S3 is based on single HTTP requests.
So the answer is to use threads to achieve the same thing.
The Java API has TransferManager for this:
http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html
You can get great performance with multiple threads.
There is no bulk download, sorry.
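If you are scripting this rather than using the Java SDK, note that the AWS CLI already parallelises transfers internally, so a recursive copy gives you the multi-threaded behaviour without writing thread code yourself (bucket and paths here are examples):

# optionally raise the number of concurrent requests the CLI will use
aws configure set default.s3.max_concurrent_requests 20

# download a whole prefix using parallel requests
aws s3 cp s3://my-music-bucket/tracks/ ./tracks/ --recursive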
How do I get a status report of all files currently being uploaded via HTTP form-based file upload on an Apache server?
I don't believe you can do this with Apache itself. The upload looks like nothing more than a POST as far as Apache is concerned. There are modules and other servers that do special processing on uploads, so you may have some luck there. It would probably be easier to keep track of it in your application.
Check out SWFUpload; it uses Flash (in a nice way) to assist with managing multiple uploads.
There are events you can monitor for how many files of a set have been uploaded.