FTP upload: possible to restrict to static content only? - apache

For realtime online tournaments I upload a bunch of HTML pages generated by our tournament software. On the webserver I do whatever needs to be done with these files, so the tournament software is "integrated" into our website.
Now we want other people to run these tournaments, which means they have to upload the generated HTML/CSS files themselves. Uploading via HTTP is really too much work and takes too much time.
I wanted to create another FTP account with access only to the tournament directory. So far so good.
But I want to limit the allowed file types to just HTML and CSS, so they can only upload static content via FTP (I am just paranoid; I don't want them to be able to upload PHP files with potentially dangerous code, or other unexpected file types).
Is this possible?

Files are just sequences of bytes; the extension by itself means nothing, and the danger lies in how you process them. Make sure the uploaded files are non-executable and you are reasonably safe. Even if you somehow restrict uploads to .css files, you cannot tell whether a file is safe from the extension alone, because an attacker may simply have renamed it. Also, uploaded PHP files will not be a problem if you point the FTP account at a folder Apache does not serve, i.e. not below your www folder.
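To back this up on the Apache side, you can also make sure nothing in the tournament directory is ever handed to an interpreter. A minimal sketch, assuming a Debian-style Apache layout with mod_php; the /var/www/tournaments path is a placeholder, and note that php_admin_flag has to live in the server config, not in a .htaccess:

# Sketch: per-directory config that disables script execution in the upload area
sudo tee /etc/apache2/conf-available/tournament-uploads.conf >/dev/null <<'EOF'
<Directory /var/www/tournaments>
    # mod_php: never execute files in this tree as PHP
    php_admin_flag engine off
    # drop any handler mapping for PHP-ish extensions
    RemoveHandler .php .phtml
    # no CGI execution either, and ignore any uploaded .htaccess
    Options -ExecCGI
    AllowOverride None
</Directory>
EOF
# Debian/Ubuntu helper; on other layouts, Include the file from httpd.conf instead
sudo a2enconf tournament-uploads && sudo systemctl reload apache2

With something like that in place it matters much less which file types the FTP account is allowed to send, because nothing under the tournament directory will ever be executed.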

Related

Can I transfer images between Shopify sites?

I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally I'd keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text areas of pages/blogs, so it's not practical to do this manually in any way; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
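If the image URLs are all in the export CSV, a small loop can fetch the lot. A rough sketch, assuming the links point at cdn.shopify.com and that a simple grep for that host is enough to pull them out of your particular export:

# pull every CDN image URL out of the export and download it, keeping the original filename
grep -o 'https://cdn\.shopify\.com[^",]*' products_export.csv | sort -u |
while read -r url; do
  # strip any query string before using the last path segment as the local filename
  curl -sS -o "$(basename "${url%%\?*}")" "$url"
done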
Have you tried transferring between the two sites using FTP? If you have SSH access:
log in to the server via SSH
change to the directory where the files live (or where you want them to go)
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details
use cd to move to the source/destination location on that server
use the binary command
use hash if you want a progress indicator
if you are sending a file from the server you SSHed into, issue put <filename>; if you want to pull a file from the other server to the one you are logged into, use get <filename> instead
Wait for the transfer to complete; it might take a while. A sample session is sketched below.
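Roughly, the session would look like this (hostnames, paths, and filenames are placeholders):

ssh user@source-server.example.com
cd /var/www/html/images
ftp destination-server.example.com
# at the ftp> prompt:
#   cd /var/www/html/images
#   binary
#   hash
#   put image001.jpg
#   mput *.jpg
#   bye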

Web server: how to allow users to browse into compressed files?

I'm looking for an Apache configuration, module, or perhaps cgi/php/whatever script to allow users of my website to not only browse directories but also browse right into compressed .zip files (*) in those directories as if they were regular directories.
Background:
I run an Apache web server and one of the services it provides is a simple directory listing to some files that people need access to.
Some of these files are compressed (*) and generally that is good because people normally want to download the whole thing. However it would be really useful if it were also possible to seamlessly browse into the compressed files just as if they were regular folders.
While I could write a CGI script to do this myself, surely this has been done before; surely, even though my web searches have not turned one up, there's an Apache module to do it.
Any pointers?
Steve
(*) Using .zip, not .tar.bz2, alas. It's a requirement, don't blame me.
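For what it's worth, the roll-your-own CGI route mentioned in the question doesn't need to be much more than a wrapper around unzip -l. A very rough sketch, assuming the archives live under /var/www/downloads; the query-string handling is purely illustrative, and a real script would have to validate the parameter to prevent path traversal:

#!/bin/sh
# zipls.cgi - list the contents of a .zip passed as ?file=name.zip (illustrative only)
echo "Content-Type: text/plain"
echo
# QUERY_STRING is expected to look like file=archive.zip
name="${QUERY_STRING#file=}"
unzip -l "/var/www/downloads/$name"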

How to upload large files to MediaWiki in an efficient way

We have to upload a lot of VirtualBox images which are between 1 GB and 6 GB.
So I would prefer to use FTP for the upload and then include the files in MediaWiki.
Is there a way to do this?
Currently I use a jailed FTP user who can upload to a folder and then use the UploadLocal extension to include the files.
But this only works for files smaller than around 1 GB. With bigger files we get a timeout, and even after setting PHP's max_execution_time to 3000s the import stops after about 60s with a 504 gateway timeout (which is also the only thing appearing in the logs).
So is there a better way of doing this?
You can import files from the shell using maintenance/importImages.php. Alternatively, enable upload by URL by flipping $wgAllowCopyUploads, $wgAllowAsyncCopyUploads and friends (this requires that the job queue be run via cron). Or decide whether you need to upload these files into MediaWiki at all, because just linking to them might suffice.
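For the importImages.php route, something like this run on the server would pick up the files your jailed FTP user has already uploaded. The paths are placeholders and the available flags vary between MediaWiki versions, so check php maintenance/importImages.php --help first:

# run from the wiki's installation directory
cd /var/www/mediawiki
# these extensions also need to be allowed via $wgFileExtensions in LocalSettings.php
php maintenance/importImages.php \
    --comment="Bulk import of VirtualBox images" \
    --extensions=ova,vdi,vmdk \
    /home/ftpjail/uploads

Because this runs from the command line it is not subject to the web server's upload limits or gateway timeouts.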

Uploading multiple files given only relative local path

Say I have a user, and that user has an XML file which, among other things, includes the relative (to the XML file) path to one or more images stored on their local machine. I want them to be able to upload this XML file to a web server, and automatically upload the images.
So my XML file might contain:
<tag>Images\img_20120905_015463548.jpg</tag>
and I want to upload both the XML file and img_20120905_015463548.jpg in one operation.
The problem is, as best I can tell, I can't get a local web page to grab the images automatically using JS/jQuery due to the pesky web browser security model that won't allow me to upload arbitrary files off the local computer, or even know the real path of the XML file. After bashing my head against a brick wall for a few hours, I've come up with two possible solutions:
Upload the XML file; the server strips out the image file addresses and asks the user to locate each one. While it would get the job done, it's ugly and error-prone.
Use a batch file (or similar) to copy the XML file and images to a public-facing web server that the user can access on the local network, and then supply the public address of the XML file to my web server. It can then grab the images off the local public server. Problem: my IT department are too competent to allow users file access to public-facing servers. :)
Is there any solution out there I might have missed, that allows the user to upload multiple files given filenames only specified as a relative path?
Thanks in advance. :)
If you are not restricted to a web-only solution, this would be achievable using a plugin or desktop application. For instance, a desktop .NET or Java WebStart application or a signed and therefore trusted Java applet would be able to access the local XML file and any associated image files, then upload them to the web server using a POST, web services or WebDAV.
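In the same spirit as the asker's "batch file" idea, even a small script run on the user's machine can parse the XML and push everything up in one multipart POST. A minimal sketch, where the upload URL, the form field names, and the <tag> element are assumptions taken from the example in the question (paths containing spaces would need more care):

#!/bin/sh
# upload data.xml plus every image it references in <tag>...</tag> elements
XML="data.xml"
set -- -F "xml=@$XML"
# pull out the relative paths and convert Windows backslashes to slashes
for p in $(sed -n 's:.*<tag>\(.*\)</tag>.*:\1:p' "$XML" | tr '\\' '/'); do
  set -- "$@" -F "images[]=@$p"
done
curl "$@" https://example.com/upload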

Display list of files and folders using the MediaFire API

I tried to use the MediaFire API, but when I call folder/get_info it doesn't return the file & folder array like the example shows.
Full url I used: http://www.mediafire.com/api/folder/get_info.php?folder_key=l461cm2d8hfxd
What's wrong with my attempt? Thank you.
You can try the MediaFire API PHP Library. This class currently implements all the functions in the Mediafire API.
OK, I just took a look at their API documentation. They've updated the get_info function for folders and have taken out the file tree...
So if you are uploading via the dropbox (which doesn't return the quickkey associated with the file), you can never dropbox-upload a file and then use the API to find it and download it. This renders their API as useless as tits on a boar hog.
The point of a dropbox is to allow remote uploads to a folder; you then know the folder key, so you can query the API and get back the quickkeys of the documents in the folder, which then allows you to manage those files remotely: move, delete, download, etc. Now you cannot do this.
Even though get_info no longer returns the file tree, folder search can resolve at least some of the issues with retrieving quickkeys. In my case I searched for ".mp3" and was handed all the MP3s in my folder.
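For reference, a search call would look something like the following; the endpoint and parameter names are assumptions that mirror the get_info URL above, so check MediaFire's current API documentation (some calls also require a session token):

curl "http://www.mediafire.com/api/folder/search.php?folder_key=l461cm2d8hfxd&search_text=.mp3&response_format=json"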