upload big files to server - file-upload

I want to upload big files to a server. What is the best way to do this:
1) using a node.js library, such as formidable
2) using the nginx upload module
or maybe some other faster and better solution?

If you just want to upload big files, nginx would be the better solution.
However, if you want to stream files (start serving the download while the upload is still in progress), then node.js would be the right tool.
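For the node.js route, here is a minimal sketch using the formidable package (assuming Node.js and npm install formidable; the upload directory and size limit below are placeholders). formidable streams the incoming request body to disk instead of buffering it in memory, which is what makes big uploads workable. If you put nginx in front of it, remember to raise nginx's client_max_body_size as well, since its default will reject large request bodies.

import http from "node:http";
import formidable from "formidable";

const server = http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/upload") {
    const form = formidable({
      uploadDir: "/tmp/uploads",            // placeholder: must already exist
      maxFileSize: 5 * 1024 * 1024 * 1024,  // allow files up to 5 GB
    });
    // formidable consumes the request stream and writes each file to
    // uploadDir as it arrives, so memory use stays flat for huge files.
    form.parse(req, (err, fields, files) => {
      if (err) {
        res.writeHead(500);
        res.end("upload failed");
        return;
      }
      res.writeHead(200);
      res.end("upload complete");
    });
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(8080);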

Related

How to use Cloudflare, and how is it different from Pinata?

For my IPFS server, I use pinata.cloud and upload my content there. But it is slow, and I have heard people say that Cloudflare's IPFS gateway is faster. However, I couldn't find anything on how to upload to Cloudflare's IPFS server instead of Pinata.
I might have misunderstood the whole thing. Does it really matter where I upload my content, as long as I upload it to IPFS? For example, is there any difference between using IPFS Desktop and using Pinata to upload my content to IPFS? How is Cloudflare faster, then? Or is it?

How to upload multiple images from an FTP media server to Cloudinary?

I'm looking for a solution to upload multiple images from an FTP media server to Cloudinary. I searched the net and found these links:
How can I bulk upload my images?
Bulk upload large images to cloudinary
Data upload options:
If your images are already publicly available online, you can specify their remote HTTP or HTTPS URLs instead of uploading the actual data. In this case, Cloudinary will fetch the image from its remote URL for you. This option allows for a much faster migration of your existing images.
There is no information about uploading images from an FTP media server or anything like that. All the available solutions use a script and then upload the images one by one. In my case, my server has many folders of images, each folder has many sub-folders, and there are about 10,000 images in total. How can I do this?
You can upload to Cloudinary from an FTP source like this (in PHP):
\Cloudinary\Uploader::upload('ftp://username:password@ftp.mydomain.com/my_image.jpg');
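If you are working in Node rather than PHP, the same remote-fetch idea looks roughly like this with the Cloudinary Node SDK (a sketch, not a drop-in script: the account credentials, FTP host, and file list are placeholders, and in a real migration you would build the list by walking the FTP folders and sub-folders with an FTP client library):

import { v2 as cloudinary } from "cloudinary";

cloudinary.config({
  cloud_name: "my_cloud",   // placeholders: your Cloudinary account values
  api_key: "my_key",
  api_secret: "my_secret",
});

// Placeholder list; in practice, collect these paths by listing the
// FTP directory tree recursively.
const files = ["folder1/img1.jpg", "folder1/sub/img2.jpg"];

for (const path of files) {
  // Cloudinary fetches each file from the remote FTP URL itself, so the
  // image bytes never pass through the machine running this script.
  const result = await cloudinary.uploader.upload(
    `ftp://username:password@ftp.mydomain.com/${path}`
  );
  console.log(result.public_id);
}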

handling file upload and serving in a distributed web application

I'm going to deploy a web application with multiple Pyramid application servers and nginx as a load balancer.
This application will have a feature for uploading files which should be available for downloading afterwards.
The total size of uploaded files may be very large, so I'd like to deploy a separate file webserver to serve these static files (this is one reason why I don't like the rsync solution proposed here).
What is the best solution to handle file upload and synchronization in this case? I was thinking about NFS or something like that, but I'm not sure it is a good way to solve the problem. I suppose there must be some best practices here, or even a tool or library for these purposes.
UPDATE:
I don't want to use cloud services like Dropbox; it would be nicer to find a synchronization solution inside the network segment.
UPDATE2:
I ended up setting up NFS; for now it works perfectly.
This is not really a Python or Pyramid related question, but you should investigate distributed file systems and CDNs, both of which exist for exactly this kind of thing. GridFS is easy enough to get going with, but there are plenty of other options. Both Amazon and Google have similar services.
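If you want to try the GridFS route, here is a minimal sketch with the official MongoDB Node driver (npm install mongodb); the connection string, database name, and file paths are placeholders. The point is that every application server behind the load balancer reads and writes the same shared storage:

import { MongoClient, GridFSBucket } from "mongodb";
import { createReadStream } from "node:fs";

const client = await MongoClient.connect("mongodb://localhost:27017");
const bucket = new GridFSBucket(client.db("uploads"));

// Any app server behind the load balancer can store a file...
createReadStream("/tmp/report.pdf")
  .pipe(bucket.openUploadStream("report.pdf"))
  .on("finish", () => console.log("stored"));

// ...and any other server can stream it back to a client later, e.g.:
// bucket.openDownloadStreamByName("report.pdf").pipe(res);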

Where to save the uploaded files?

I am developing a web application to upload .mp3 files and play them. I can upload the files successfully and am saving them in the C:/uploads folder. I understand that since it's a web application, we need to save them on the Apache web server itself, but I am not sure where to save them.
Thanks,
Serenity.
You can use content repositories to store uploaded data; I think this is a common approach. For instance, take a look at Apache JackRabbit: with it you won't have to hunt for uploaded files on the hard drive, and you get a web interface as well as other tools for connecting to the repository and browsing the files in it.
As an alternative to JackRabbit, you can try Alfresco CMS; they both implement JCR. Other implementations are listed here (you will find them at the bottom of that page).

How can I remotely upload files to Amazon S3?

I am looking for a way to transfer files from a server to an Amazon S3 bucket without first downloading the files to my computer. All of the files I plan to transfer can be accessed publicly (e.g. http://something.com/file.ext). Everything I have tried only lets me upload files directly from my Mac to S3.
P.S. Although I have access to Windows, a Mac app that can do this would be great... or maybe a browser-based solution :)
You can check out this PHP class (and a Nettuts+ tutorial on it); it works well, and I've been using it for a while now. It supports bucket creation, deletion, adding files, and more. You can easily add files remotely from another server, or from the same server you're running it on.
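If you would rather script it yourself, here is a rough sketch with the AWS SDK for JavaScript v3 (npm install @aws-sdk/client-s3); the region, bucket name, and key are placeholders. Run it on the source server, or any machine other than your Mac, and the file bytes never touch your computer:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// The file is already publicly accessible, so any machine can fetch it.
const url = "http://something.com/file.ext";
const response = await fetch(url);               // global fetch, Node 18+
const body = Buffer.from(await response.arrayBuffer());

await s3.send(new PutObjectCommand({
  Bucket: "my-bucket", // placeholder: an existing bucket you own
  Key: "file.ext",
  Body: body,
}));
console.log("uploaded");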