Programmable FTP server - authentication

We have a need to set up an FTP server. Many clients will upload files regularly, and each client will use a different FTP account. We will be called by an external system to provision a new client - we can either take the username/password from the external system or generate them and pass them back. We will probably want to create a directory per client for them to upload files into.
When a client uploads a file, we want to be notified, process the file, and pass it on to another external system (then rename or otherwise mark the file as processed).
So I am after suggestions for an FTP server that can have accounts added programmatically. Ideally, it would also handle the directories and new-upload notifications, but we can do these in other ways.
This would be on Red Hat Linux (ideally), or Solaris is an option.

Google says Red Hat 9 comes with vsftpd installed.
https://security.appspot.com/vsftpd.html
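If you go the vsftpd route, one common pattern is virtual users backed by a Berkeley DB file, so accounts can be added from a script without creating system users. A minimal sketch of a provisioning script follows; the paths, the guest account (ftp), and the base directory are assumptions to adapt, and it presumes vsftpd is configured with guest_enable=YES and a PAM service using pam_userdb:

    #!/bin/bash
    # provision_client.sh - hypothetical sketch: add a vsftpd virtual
    # user and create a per-client upload directory.
    set -e
    USER="$1"
    PASS="$2"                      # or generate one here and pass it back
    BASE=/srv/ftp/clients          # assumed per-client base directory

    # Append the username/password pair to the plain-text list, then
    # rebuild the Berkeley DB file that pam_userdb authenticates against.
    printf '%s\n%s\n' "$USER" "$PASS" >> /etc/vsftpd/virtual_users.txt
    db_load -T -t hash -f /etc/vsftpd/virtual_users.txt /etc/vsftpd/virtual_users.db

    # Per-client directory, owned by the guest user vsftpd maps virtual users to.
    mkdir -p "$BASE/$USER"
    chown ftp:ftp "$BASE/$USER"

For the new-upload notifications, a small watcher built on inotifywait (from the inotify-tools package) is one option. A sketch, where process_file stands in for your own handler:

    #!/bin/bash
    # watch_uploads.sh - hypothetical sketch: react to completed uploads.
    # close_write fires once the client has finished writing the file.
    inotifywait -m -r -e close_write --format '%w%f' /srv/ftp/clients |
    while read -r FILE; do
        process_file "$FILE" && mv "$FILE" "$FILE.processed"
    done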

Related

Transfer from one cPanel to another cPanel without WHM access

I have cPanel access on two different servers and would like to transfer from one server to another. The original size of the account on the first server is close to 15GB.
Currently, the only two ways I can think of are:
1. Back up using cPanel, then restore on the second server. But this process times out with a "Failed - Network error" message.
2. Use an FTP app like FileZilla to log in and transfer the files. I haven't tested this, but I think it first downloads the files to my local machine (temp folder) and then uploads them to the second server.
My problem with option 2 is that it means I will end up using 30 GB of data transfer if it actually works that way.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I can suggest a few tips.
1. Backup using cPanel, then restore on the second server: the backup creation fails because of the account's size, so that approach is not going to work.
2. Use an FTP app like FileZilla: you can download the whole account to your PC, or download it one folder at a time (home, mail, etc.), and then upload it to the new cPanel account over FTP.
Alternatively, please open a support ticket with your hosting provider; they can create the backup from the server backend using /scripts/pkgacct.
Thank you.
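If you have SSH access on either server, there is a third option that avoids the double download/upload entirely: copy directly between the two machines. A sketch, where the hostnames, usernames, and paths are placeholders for your own:

    # Run on the new server: pull the home directory straight from the
    # old server over SSH. --partial lets the 15 GB transfer resume if
    # the connection drops part-way.
    rsync -avz --partial olduser@old-server.example.com:/home/olduser/ /home/newuser/

    # Or, if support generated a full account backup with /scripts/pkgacct,
    # copy that single archive across and ask them to restore it:
    scp olduser@old-server.example.com:/home/cpmove-olduser.tar.gz /home/

Note that plain cPanel accounts do not always include shell access, so you may need your host to enable it first.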

How to upload all the files in a folder using FlowForce Server's system ftp store method?

I have set up FlowForce Server on my local PC and was able to run a couple of sample jobs. Then I tried to set up an FTP store job by providing the required details, and I was able to upload a specific file to the FTP server as well.
But now I need to know whether there is a way to upload all the files in a particular folder to a target folder on the FTP server. I'd appreciate it if someone with knowledge of FlowForce Server could share their thoughts on this.
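I am not sure whether the built-in ftp store function accepts wildcards in your FlowForce version, but if it only handles single files, one workaround is to have the job run a shell command step that calls lftp, whose mirror -R command uploads a whole local folder. A sketch, with the host, credentials, and both folder paths as placeholders:

    # Upload everything in /data/outgoing to /incoming on the FTP server.
    # mirror -R reverses the usual direction (local -> remote).
    lftp -u myuser,mypass ftp.example.com -e "mirror -R /data/outgoing /incoming; quit"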

Tiny FTP client with pre-configured settings? - a simple file uploader?

I am looking to have users upload files to an FTP server using a "pre-configured" FTP client. By that I mean: the FTP client's connection settings have already been set in the client they have downloaded. Ideally, users should be able to just drag and drop files into a simple window and have them upload.
I have found two applications which allow me to do this:
"FTP Maker" (softhing.com/ftpmaker.html)
This allows you to configure the FTP connection details and add a logo. You then hit a button which generates the "uploader application". This can then be distributed to users, who have to configure nothing. While this works, it doesn't have as many features as...
"FTP Uploader Creator" (devzerog.com)
Same as above, except the application can zip before uploading, can resume uploads, and can also send an email after the upload has completed. These are handy features I wish FTP Maker had. The issue with this application is that its developer seems to have gone out of business and only the thirty-day trial is available...
Another application is "FTPcreator" (ftpcreator.com). Unfortunately this is a little outside my price range.
I am also aware of options such as Dropbox, FTPbox, etc.
Do you know of any super-lite FTP clients which I could pre-configure before sending out to users? Ideally it should have the features of "FTP Uploader Creator" above. I believe this sort of tool might be used in IT to allow users to send files directly to an FTP server.
I know I could do this through a webpage. However, the files will be particularly large and well over my host's Apache upload file size limit.
I have spent literally hours looking for alternatives! Any help would be greatly appreciated!
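If none of the ready-made tools work out, a no-frills fallback is a tiny wrapper script around curl with the connection details baked in; on Windows the same idea can be wrapped in a .bat file that users drag files onto, since dropped files arrive as arguments. A sketch with placeholder credentials (baking credentials into a distributed script is, of course, a security trade-off):

    #!/bin/bash
    # upload.sh - hypothetical sketch: each file passed as an argument
    # is uploaded to the server, keeping its original name.
    for f in "$@"; do
        curl -T "$f" "ftp://myuser:mypass@ftp.example.com/uploads/"
    done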

What is the fastest way to upload big files to the server?

I have got a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may create issues if the connection is broken.
SFTP will have the same issue as well.
Do you have your own computer available on the internet through a public IP as well?
In that case you may try to set up a simple HTTP server (if you have Windows, just set up IIS) and then use a download manager on the dedicated server (depending on its OS) to download the file over HTTP (it can use multiple streams for that), or do this through a torrent.
There are trackers, like http://openbittorrent.com/, which will allow you to keep the file on your computer and then use a torrent client to upload the file to the dedicated server.
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux, Windows, Mac OS X, and DOS:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download
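Since it is a dedicated server you most likely have SSH access, in which case rsync is the usual answer to the broken-connection problem: it runs over SSH and can resume a partial file instead of restarting. A sketch, with the file path and hostname as placeholders:

    # --partial keeps the partially-transferred file on the server so a
    # re-run resumes the 4 GB upload rather than starting from scratch.
    rsync --partial --progress -e ssh /path/to/bigfile user@server.example.com:/srv/uploads/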

Multiple Website Backup

Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on the server where the backups will be stored.
I would like to be able to add each website's login info, and have it connect, create a zip file or similar, and send it back to the backup server to be saved.
But it would also need to run as a cron job so everything is backed up at least daily.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.
It would be heavily used, and it needs a GUI so less technical people can use it too.
Does anyone know of anything similar to what we need?
HTTrack website mirroring utility.
wget and scripts
rsync and FTP login (or SFTP for security)
Git can be used for backup and has security features and networking ability.
7-Zip can be called from the command line to create a zip file.
In any case you will need to implement either secure FTP (SSH-secured) or a password-secured upload form. If you feel clever you might use WebDAV.
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backup server, make a cron script download all the backups and store them (sketch below)
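A minimal sketch of that last step, assuming each site exposes its generated ZIP at a known URL behind the .htpasswd credentials (the site list, URL layout, and paths are all placeholders):

    #!/bin/bash
    # fetch_backups.sh - hypothetical sketch; run nightly from cron, e.g.
    #   0 3 * * * /usr/local/bin/fetch_backups.sh
    DEST=/var/backups/sites
    DATE=$(date +%F)

    # /etc/backup-sites.conf holds one site|user|password triple per line.
    while IFS='|' read -r SITE USER PASS; do
        mkdir -p "$DEST/$SITE"
        # Each site is assumed to serve its ZIP at /backup/site.zip
        wget -q --user "$USER" --password "$PASS" \
             -O "$DEST/$SITE/$DATE.zip" "https://$SITE/backup/site.zip"
    done < /etc/backup-sites.conf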