Tiny FTP client with pre-configured settings - a simple file uploader?

I am looking to have users upload files to an FTP server using a "pre-configured" FTP client. By that I mean the client's connection settings have already been set in the copy they download. Ideally, users should be able to just drag and drop files into a simple window and have them upload.
I have found two applications which allow me to do this:
"FTP Maker" (softhing.com/ftpmaker.html)
This allows you to configure the FTP connection details and add a logo; you then hit a button which generates the "uploader application". This can then be distributed to users, who have to configure nothing. While this works, it doesn't have as many features as...
"FTP Uploader Creator" (devzerog.com)
Same as above, except the application can zip files before uploading, resume uploads, and send an email after the upload has completed. These are handy features I wish FTP Maker had. The issue with this application is that its developer seems to have gone out of business and only the thirty-day trial is available...
Another application is "FTPcreator" (ftpcreator.com). Unfortunately this is a little outside my price range.
I am also aware of options such as Dropbox, FTPbox, etc.
Do you know of any super-lite FTP clients which I could pre-configure before sending out to users? Ideally it should have the features of "FTP Uploader Creator" above. I believe this sort of tool might be used in IT to allow users to send files directly to an FTP server.
I know I could do this through a webpage. However, the files will be particularly large - well over my host's Apache upload file size limit.
I have spent literally hours looking for an alternative! Any help would be greatly appreciated!
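For reference, the core of such an uploader is fairly small. Here is a rough, illustrative Python sketch of the zip-then-upload idea (the host, credentials and paths are placeholders, and resume and e-mail notification are left out); a real tool would wrap this in a drag-and-drop window:

    # Minimal sketch of a "pre-configured" uploader: the connection details are
    # baked in, so the user only supplies the files to send. All values below
    # are placeholders, not a real server.
    import os
    import sys
    import zipfile
    from ftplib import FTP

    HOST = "ftp.example.com"     # pre-configured connection details (placeholders)
    USER = "uploader"
    PASSWORD = "secret"
    REMOTE_DIR = "/incoming"

    def upload(paths):
        # Zip everything first so only a single file has to be transferred.
        archive = "upload.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for path in paths:
                zf.write(path, arcname=os.path.basename(path))
        ftp = FTP(HOST)
        ftp.login(USER, PASSWORD)
        ftp.cwd(REMOTE_DIR)
        with open(archive, "rb") as fh:
            ftp.storbinary("STOR " + os.path.basename(archive), fh)
        ftp.quit()

    if __name__ == "__main__":
        upload(sys.argv[1:])     # e.g. python uploader.py report.pdf video.mp4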

Related

Transfer from one cPanel to another cPanel without WHM access

I have cPanel access on two different servers and would like to transfer from one server to another. The original size of the account on the first server is close to 15GB.
Currently, the only two ways I can think of are:
Backup using cPanel, then restore on the second server. But this process times out and I get a "Failed - Network error" message.
Use an FTP app like FileZilla to log in and transfer the files. I haven't tested this, but I think it first downloads the files to my local machine (temp folder) and then uploads them to the second server.
My problem with option 2 is that it means I will end up using 30GB of data transfer if that is really how it works.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I can suggest a few tips.
1. Backup using cPanel, then restore on the second server: the backup creation fails because of the size of the account, so this route is not possible.
2. Use an FTP app like FileZilla to log in and transfer the files: you can download the whole account to your PC, or download it folder by folder (home, mail, and so on), and then upload it to the new cPanel account over FTP.
Alternatively, please open a support ticket with your hosting provider; they can create the backup from the server back end using /scripts/pkgacct.
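One way to avoid routing the data through your own PC is to pull the backup created by /scripts/pkgacct directly from the old server to the new one, so the 15GB crosses the network only once. A rough Python illustration, run on the destination server (the host, credentials and archive name below are placeholders):

    # Illustrative only: fetch a cPanel backup archive straight from the old
    # server onto the new one over FTP, so the data is transferred a single time.
    from ftplib import FTP

    OLD_HOST = "old-server.example.com"    # placeholder connection details
    USER = "cpaneluser"
    PASSWORD = "secret"
    ARCHIVE = "cpmove-cpaneluser.tar.gz"   # name of the archive made by pkgacct

    ftp = FTP(OLD_HOST)
    ftp.login(USER, PASSWORD)
    with open(ARCHIVE, "wb") as fh:
        # Stream the archive to disk on the new server, 8 KB at a time.
        ftp.retrbinary("RETR " + ARCHIVE, fh.write, blocksize=8192)
    ftp.quit()

The archive can then be restored on the new server (for example by the new host's support, since restoring usually needs root access).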
Thank you.

How to upload all the files in a folder from FlowForce Server using the system FTP store method?

I have set up FlowForce Server on my local PC and was able to run a couple of sample jobs. I then tried to set up an FTP store job by providing the required details, and I was able to upload a specific file to the FTP server as well.
But now I need to know whether it is possible to upload all the files in a particular folder to a target folder on the FTP server. I would appreciate it if someone with knowledge of FlowForce Server could share their thoughts on this.
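Setting FlowForce specifics aside (this is not FlowForce's own API), the operation being described is just "loop over a local folder and store each file on the FTP server". A generic Python sketch of that loop, with the server details and folder names as placeholders:

    # Generic sketch, not FlowForce-specific: upload every file in a local
    # folder to a target folder on an FTP server. Host, credentials and paths
    # are placeholders.
    import os
    from ftplib import FTP

    LOCAL_DIR = "C:/data/outgoing"
    REMOTE_DIR = "/target"

    ftp = FTP("ftp.example.com")
    ftp.login("user", "password")
    ftp.cwd(REMOTE_DIR)

    for name in os.listdir(LOCAL_DIR):
        path = os.path.join(LOCAL_DIR, name)
        if os.path.isfile(path):                    # skip sub-directories
            with open(path, "rb") as fh:
                ftp.storbinary("STOR " + name, fh)  # store under the same name

    ftp.quit()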

Would a public upload folder be a security issue?

On my site, users can upload a file to the server and then view all uploaded files in a directory called "public uploads", where everyone can see the files that have been uploaded by other users. It's the Apache directory listing page that says "Index of /uploads". It's sort of a file-sharing hub where people can download and share each other's files.
Would there be any security issues with this?
Can a user, say, upload a malicious PHP script, and execute it from the client side?
How can I resolve these issues, should they exist?
Possibly; it all depends on the server, PHP, and Apache configuration.
See OWASP's Unrestricted File Upload vulnerability page for some of the risks:
The impact of this vulnerability is high, supposed code can be executed in the server context or on the client. The likelihood of a detection for the attacker is high. The prevalence is common. As a result the severity of this type of vulnerability is High.
- The web server can be compromised by uploading and executing a web-shell which can run commands, browse system files, browse local resources, attack other servers, and exploit the local vulnerabilities, and so forth. This may also result in a defacement.
- An attacker might be able to put a phishing page into the website.
- An attacker might be able to put stored XSS into the website.
- This vulnerability can make the website vulnerable to some other types of attacks such as XSS.
- Picture uploads may trigger vulnerabilities in broken picture libraries on a client (libtiff, IE had problems in the past) if the picture is published 1:1.
- Script code or other code may be embedded in the uploaded file, which gets executed if the picture is published 1:1.
- Local vulnerabilities of real-time monitoring tools, such as an antivirus, can be exploited.
- A malicious file (Unix shell script, windows virus, reverse shell) can be uploaded on the server in order to execute code by an administrator or webmaster later -- on the server or on a client of the admin or webmaster.
- The web server might be used as a server in order to host of malware, illegal software, porn, and other objects.
See my other post for some general guidelines on making file uploads safe.
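As a rough illustration of those guidelines (written in Python here rather than PHP; the extension whitelist and storage directory are assumptions, not a drop-in for any particular framework): validate the extension against a whitelist, discard the user-supplied filename, and store the file outside the web root under a random name.

    # Illustrative upload hardening: whitelist extensions, ignore the
    # user-supplied filename, and store the file outside the web root under a
    # random name so nothing uploaded can be executed or guessed.
    import os
    import uuid

    ALLOWED_EXTENSIONS = {".jpg", ".png", ".gif", ".pdf", ".zip"}  # assumption
    STORAGE_DIR = "/var/uploads"   # outside the web root (assumption)

    def save_upload(original_name, data):
        ext = os.path.splitext(original_name)[1].lower()
        if ext not in ALLOWED_EXTENSIONS:
            raise ValueError("file type not allowed")
        stored_name = uuid.uuid4().hex + ext       # random name, original discarded
        with open(os.path.join(STORAGE_DIR, stored_name), "wb") as fh:
            fh.write(data)
        return stored_name  # record this (plus the original name) in a database

Files served back to users should then be sent with a forced-download disposition rather than executed or rendered in place.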
Allowing users to upload files to a public folder does not in itself pose a risk to your server; users cannot run these files on your server (provided the server is not configured to execute anything in that directory).
It does pose a risk to the users who download any of these files. The files may contain a virus or malware, and opening any of them is a high security risk for your users. I'm not sure you're doing them a favor by offering such a feature.

SFTP automation using WinSCP or FileZilla

So, as part of my daily job, I have to transfer one file from each of our customers' servers to our internal server, and any responses back.
Each customer effectively has one file up and one file down each day.
I have an SFTP server here that I can use, and it is already used manually for a few sites.
I'm looking to automate as many sites as possible using batch files on a scheduled task.
Initially, I'm looking at automating the internal side of the process.
We simply have a requests folder that needs to import from the SFTP server (then delete the original on the SFTP) and a responses folder which needs to be copied to a 'sent' folder and then exported to the SFTP (also deleting the original).
On the SFTP server I have a "to site" and a "from site" folder. Each file is site-specific, followed by a variable, so SiteNameImport.<variable> and SiteNameExport.<variable>
EDIT:
I'm asking this because I'm a novice at scripting and basically have no idea what to do.
I've tried reading the automation guide on the WinSCP website, but a lot of it means nothing to me.
FileZilla doesn't support automation; you're better off with WinSCP. They have some scripting examples on their site, as well as any other information you'll need to build the script's functionality. You'll just need to add the specifics (like deleting sent files and so on). CuteFTP is another solution you can script with, but I believe you have to pay for a licence. For that I'd suggest VBScript; examples can be found online.
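If the WinSCP scripting guide is heavy going, the same import/export steps can also be written as a short Python script using the paramiko library and run from the same scheduled task. The sketch below is only an outline of the workflow described in the question; the host, credentials, folder names and file layout are all placeholders.

    # Outline of the described workflow with paramiko (placeholders throughout):
    # pull files from the SFTP "from site" folder into the local requests folder
    # and delete the remote originals, then push each local response to the
    # "to site" folder, archiving a copy into "sent" and deleting the original.
    import os
    import shutil
    import paramiko

    HOST, USER, PASSWORD = "sftp.example.com", "user", "password"
    REMOTE_IN, REMOTE_OUT = "/from_site", "/to_site"
    REQUESTS_DIR, RESPONSES_DIR, SENT_DIR = r"C:\requests", r"C:\responses", r"C:\sent"

    transport = paramiko.Transport((HOST, 22))
    transport.connect(username=USER, password=PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Import: download each file, then remove it from the SFTP server.
    for name in sftp.listdir(REMOTE_IN):
        sftp.get(REMOTE_IN + "/" + name, os.path.join(REQUESTS_DIR, name))
        sftp.remove(REMOTE_IN + "/" + name)

    # Export: upload each response, keep a copy in "sent", delete the original.
    for name in os.listdir(RESPONSES_DIR):
        local = os.path.join(RESPONSES_DIR, name)
        sftp.put(local, REMOTE_OUT + "/" + name)
        shutil.copy(local, os.path.join(SENT_DIR, name))
        os.remove(local)

    sftp.close()
    transport.close()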

Programmable FTP server

We need to set up an FTP server. Many clients will upload files regularly, and each client will use a different FTP account. We will be called by an external system to provision a new client - we can either take a username/password from the external system or generate them and pass them back. We will probably want to create a directory per client for them to upload files to.
When a client uploads a file, we want to be notified, process the file, and pass it on to another external system (then rename or otherwise mark the file as processed).
So I am after suggestions for an FTP server that can have accounts added programmatically. Ideally it would also handle the directories and new-upload notifications, but we can do those in other ways.
This would ideally be on Red Hat Linux, though Solaris is also an option.
Google says Red Hat 9 comes with vsftpd installed.
https://security.appspot.com/vsftpd.html
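If writing a small amount of code is acceptable, another route is the Python pyftpdlib library, which lets you add accounts programmatically and gives you a callback for every completed upload - which covers the provisioning and notification requirements directly. A minimal sketch (the usernames, paths, port and the processing step are placeholders):

    # Minimal pyftpdlib sketch: accounts are added in code, and an upload
    # callback fires after every completed STOR, which is where the processing
    # and hand-off to the external system would go. Names and paths are placeholders.
    import os
    from pyftpdlib.authorizers import DummyAuthorizer
    from pyftpdlib.handlers import FTPHandler
    from pyftpdlib.servers import FTPServer

    authorizer = DummyAuthorizer()

    def provision_client(username, password):
        home = os.path.join("/srv/ftp", username)   # one directory per client
        os.makedirs(home, exist_ok=True)
        # "elradfmw" grants read/write permissions within the client's own directory.
        authorizer.add_user(username, password, home, perm="elradfmw")

    class UploadHandler(FTPHandler):
        def on_file_received(self, file):
            # Called after each successful upload: process the file, pass it on,
            # then mark it as handled (here, by renaming it).
            os.rename(file, file + ".processed")

    provision_client("client1", "secret")   # normally driven by the external system

    handler = UploadHandler
    handler.authorizer = authorizer
    FTPServer(("0.0.0.0", 2121), handler).serve_forever()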