How do FTP uploads work in PHP? - file-upload

I am using the http://phpseclib.sourceforge.net/ library to upload files over FTP. I can't understand the concept behind how FTP handles the file data during the upload.
What happens is:
I created a file upload form with a submit button. When I choose a file and click submit, the loader starts, but the file does not arrive on the server. My expectation was that clicking the submit button would read the data from the file and push it to the server according to the packet size configured in phpseclib.
Can anyone explain what I have misunderstood, or what is happening while the loader is showing in the browser?
EDIT:
The file upload itself has no issues. The only question is why the FTP transfer starts so late. While uploading, does PHP move the file into some temporary directory on the server? If so, why do I need the FTP upload at all?
I tested with 100 MB files and they were uploaded. What I don't understand is why the transfer doesn't start immediately after I click the submit button.

Do you have enctype="multipart/form-data" for your form?
In HTML, forms need the enctype="multipart/form-data" attribute to upload a file.
The form usually looks like this:
<form id="form_id" enctype="multipart/form-data" method="POST">
<input type="file" name="file" />
<input type="submit" name="submit" value="submit" />
</form>
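Once the browser has finished the HTTP upload, PHP exposes the file through the $_FILES superglobal. Here is a minimal sketch of a handler for the form above; the uploads directory is an assumption for illustration:

<?php
// upload.php - hypothetical handler for the form above.
// By the time this script runs, PHP has already written the whole upload
// to a temporary file on the web server.
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $tmpPath = $_FILES['file']['tmp_name'];   // e.g. /tmp/phpXXXXXX
    $target  = __DIR__ . '/uploads/' . basename($_FILES['file']['name']);
    if (move_uploaded_file($tmpPath, $target)) {
        // Only from this point on could an FTP/SFTP transfer to another server begin.
        echo 'Upload complete: ' . htmlspecialchars($target);
    } else {
        echo 'Could not move the uploaded file.';
    }
}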

I'm confused. Can you post the relevant PHP code that shows
your handling of the POST request that uploads the file
how you call the FTP library to initiate the FTP transfer
What I think is happening is this: your user uploads a file to your web server, then you initiate FTP from your web server to the FTP server. There are two uploads here: one via HTTP and one via FTP. You won't see the FTP upload commence until the HTTP upload is complete.

Great question; you are dealing with two transactions here. The first transaction puts the file into a location on your web server. This uses HTTP as the transport via a POST request (also likely the slowest part of the whole process).
Once the initial client-side upload is complete, the file is stored on your web server, where an S/FTP script can then transfer it.
Reading the comments below, what you wanted was to transfer the file using FTP from the client side, and that's a perfectly viable goal. However, this is the process you currently have implemented.
Your current process:
User A uploads a file via HTTP using a web page you host.
User A waits until file upload has completed before closing his browser.
File is saved in the directory path specified in the upload script.
S/FTP script reads the file and initiates a connection with a foreign S/FTP server and begins transferring the file to that server.
If step 4 is redundant, then the S/FTP script is not required at all, unless what you wanted was to transfer via S/FTP from the client.
Your intention:
User A uploads a file via S/FTP using a browser based S/FTP client.
User A waits for file upload to complete.
File is saved in the directory path specified in the upload script.
In the comments you mentioned possibly implementing a Flex solution. Here are some resources I found that might help.
Flex FTP based client
Flex based FTP Client question on Stack Exchange
Implement an S/FTP client on the server side with phpseclib.
SFTP.php on Gist (for line numbers and highlighting)
Original PHPSec SFTP.php
I copied the original from the SourceForge site to Gist so you can use the line numbers as a guide. The SFTP library expects either a full path to the file (e.g. /tmp/somefilehere) or a valid PHP file resource, like the one returned by fopen: $fp = @fopen('/tmp/somefilehere', 'rb'); see line 1132 on the Gist for an example.
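For illustration, here is a minimal sketch of the server-side transfer with phpseclib 1.x (the SourceForge release linked above); the host, credentials, and paths are placeholders rather than values from the question:

<?php
// Assumes phpseclib 1.x is on the include path.
include 'Net/SFTP.php';

$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('username', 'password')) {
    exit('SFTP login failed');
}

// NET_SFTP_LOCAL_FILE tells put() to treat the second argument as a local
// file path - here, the file the HTTP upload left on the web server -
// rather than as literal string data.
$sftp->put('/remote/path/upload.dat', '/tmp/somefilehere', NET_SFTP_LOCAL_FILE);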
Once authenticated, the transfer will be quick compared to the initial upload. Your server is likely in a data centre with much more bandwidth, so file transfers are much faster.
You probably want to initiate an S/FTP transaction from your web browser. It's possible, just not with server-side scripting languages like PHP, Python, or Ruby. You can S/FTP from the browser with Flash, Flex, or Java, and probably some Windows technologies too.

Related

How can I set a file upload function?

I am creating an SAPUI5 web app with a file upload function. I tried this example from SAPUI5 Explored: sap.m.sample.UploadCollection
I am trying, with my trial account in SAP Web IDE, to set up the upload function (UploadCollection).
The issue is that it does not allow me to upload a file to the project folder or a local desktop folder.
If I upload a file it appears, but I can't open it and I get an HTTP 405 error.
Any ideas what the problem is?
As you can already see in your post's comments, you need a backend for this task. The UploadCollection control is only usable with a backend behind it that receives the transmitted file from the control.
On the page https://sapui5.hana.ondemand.com/#/api/sap.m.UploadCollection you see
This control allows you to upload single or multiple files from your devices (desktop, tablet or phone) and attach them to the application
where you can replace "application" with "receiving backend".
Independent of this, may I ask where you think the file should be uploaded, if not to a backend system? When you choose a file from your local storage, it doesn't make sense to upload it again to your local storage.
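To make "receiving backend" concrete, here is a rough sketch of such an endpoint in PHP; it assumes the control posts the file as multipart/form-data, and the target directory is a placeholder:

<?php
// upload_endpoint.php - hypothetical backend the upload control would point its
// uploadUrl at. It saves whatever files arrive in the multipart request.
foreach ($_FILES as $file) {
    if ($file['error'] === UPLOAD_ERR_OK) {
        // Store the file somewhere the backend owns - not the user's local folders.
        move_uploaded_file($file['tmp_name'], '/var/www/uploads/' . basename($file['name']));
    }
}
http_response_code(201); // signal success back to the control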

Export file from NetSuite's FileCabinet to FTP

File resides in the NetSuite file cabinet and needs to be placed on an FTP server each day.
I'm not sure how to handle this via Suitelet/RESTlet, or if it's possible - but would prefer to not use an external source/application.
My current and hopefully temporary workaround is a local scheduled task to run a script to pull files from NetSuite & upload to the FTP.
In SuiteScript 2.0, although plain (unsecured) FTP is still not supported, SS2.0 does have the capability to do SFTP. See http://www.upilioconsulting.com/blog/netsuite-2016-2-sftp-suitescript-2-0/
In SuiteScript 1.0 it's not supported. The workaround is to write middleware code (e.g. in PHP) and let the middleware do the FTP transfer.
Netsuite doesn't interact with FTP.
You need a bridge server of some sort that runs a web app (full blown Apache or nginx running PHP or just a simple Node service)
Just get a server and install some web server/web service and POST your files to it (nlapiRequestURL with a Scheduled script). Have the web app on the bridge server send the files to the FTP server. If you are using Netsuite you can afford the cost of the bridge server.
One possible solution is to create a saved search on the Documents to list out all the files in Netsuite filtering by createdate or lastmodifieddate. Create a scheduler to fetch only the new files and save them locally where you want.
Note that all the files will be base64-encoded strings; you need to decode them again to obtain the file.
As bknights said NetSuite doesn't support FTP. You need a web server(any server side language can do for that matter, I have written one in Node.js), to receive the files.
The content of a text file will be plain text, so no decode logic is required for text files. However, binary/PDF/image and other files will be in base64 format, as NetSuite's JavaScript has no way of handling binary data, so make sure you decode them before you create the file on your FTP server.
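As a rough illustration of the bridge idea described above, here is a hedged PHP sketch of a middleware endpoint that receives a file posted from NetSuite (for example via nlapiRequestURL) and forwards it to an FTP server. The parameter names, host, and credentials are assumptions, not part of any NetSuite API:

<?php
// bridge.php - hypothetical middleware: NetSuite POSTs a file here, we push it to FTP.
$name    = basename($_POST['filename'] ?? 'upload.dat');
$content = $_POST['content'] ?? '';

// Binary/PDF/image content arrives base64-encoded from NetSuite; decode when flagged.
if (($_POST['encoding'] ?? '') === 'base64') {
    $content = base64_decode($content);
}

// Write to a temporary file, then forward it with PHP's built-in FTP extension.
$tmp = tempnam(sys_get_temp_dir(), 'ns_');
file_put_contents($tmp, $content);

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true);
ftp_put($conn, '/incoming/' . $name, $tmp, FTP_BINARY);
ftp_close($conn);
unlink($tmp);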

How to detect that a file is being uploaded over FTP

My application keeps watch on a set of folders where users can upload files. When a file upload is finished I have to process the file, but I don't know how to detect that a file has not finished uploading.
Is there any way to detect that a file has not yet been released by the FTP server?
There's no generic solution to this problem.
Some FTP servers lock the file being uploaded, preventing you from accessing it, while the file is still being uploaded. For example IIS FTP server does that. Most other FTP servers do not. See my answer at Prevent file from being accessed as it's being uploaded.
There are some common workarounds to the problem (originally posted in SFTP file lock mechanism, but relevant for the FTP too):
You can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some FTP servers have this functionality built-in. For example ProFTPD with its HiddenStores directive.
A gross hack is to periodically check the file attributes (size and time) and consider the upload finished if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell that the file is still incomplete.
Some FTP servers allow you to configure a hook to be called, when an upload is finished. You can make use of that. For example ProFTPD has a mod_exec module (see the ExecOnCommand directive).
I use ftputil to implement this work-around:
connect to ftp server
list all files of the directory
call stat() on each file
wait N seconds
For each file, call stat() again. If the result is different, skip this file, since it was modified during the last N seconds.
If the stat() result is unchanged, download the file.
This whole FTP-fetching approach is old and obsolete technology. I hope the customer will use a modern HTTP API next time :-)
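The answer above uses ftputil, a Python library; as a rough equivalent in PHP, the same polling idea can be sketched with the built-in FTP extension (host, credentials, directories, and the wait interval are placeholders):

<?php
// Poll a remote directory and only download files whose size and mtime have
// been stable for $wait seconds - a sketch of the stat-and-wait idea above.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true);

$wait  = 30;
$files = ftp_nlist($conn, '/incoming');

$before = [];
foreach ($files as $f) {
    $before[$f] = [ftp_size($conn, $f), ftp_mdtm($conn, $f)];
}

sleep($wait);

foreach ($files as $f) {
    $after = [ftp_size($conn, $f), ftp_mdtm($conn, $f)];
    if ($after === $before[$f]) {
        // Unchanged during the interval - assume the upload is complete.
        ftp_get($conn, '/local/' . basename($f), $f, FTP_BINARY);
    }
}
ftp_close($conn);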
If you are reading files of particular extensions, use WinSCP for the transfer. It creates a temporary file with the extension .filepart and renames it to the actual file name once the file has been fully transferred.
I hope it will help someone.
This is a classic problem with FTP transfers. The only mostly reliable method I've found is to send a file, then send a second short "marker" file just to tell the recipient the transfer of the first is complete. You can use a file naming convention and just check for existence of the second file.
You might get fancy and make the content of the second file a checksum of the first file. Then you could verify the first file. (You don't have the problem with the second file because you just wait until file size = checksum size).
And of course this only works if you can get the sender to send a second file.
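As a hedged sketch of this marker-file convention in PHP (the file names, the choice of MD5, and the processFile() handler are assumptions for illustration):

<?php
// Process data.bin only once its companion marker data.bin.md5 has arrived
// and the checksum it carries matches the received file.
$data   = '/incoming/data.bin';
$marker = $data . '.md5';

if (file_exists($marker)) {
    $expected = trim(file_get_contents($marker));
    if (md5_file($data) === $expected) {
        // Transfer is complete and intact; safe to process.
        processFile($data); // hypothetical handler for the finished file
    }
}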

Pyramid/Pylons: How to check if an uploaded file is complete in a POST request?

I'm building a web tool which allows users to upload PDFs to a server using their web browsers. The server is based on Python (Paste + Pyramid).
The problem I have right now is the following: If a user uploads a rather large file (let's say 100 MB) and they cancel the upload before it is completed, my handler code on the server is still called (instead of the request being aborted).
The problem is that the request.POST['myfile'].file is incomplete when that happens. This effectively means that the PDF file is corrupted if I simply write it to some place on the server.
When I watch the server's log, it shows a "broken pipe" exception within the Paste server; however I have no idea how to catch that exception and have it prevent my view/handler code from executing and storing the incomplete file.
It seems the paster HTTP server does not correctly validate the uploaded form data and simply passes the request down the WSGI pipeline, even if the connection (HTTP POST) was closed by the user.
I worked around this issue by simply setting up NGINX to act as a reverse proxy. This also adds some security benefits as it might be better tested than paster.
Update:
My main problem was that I was using runserver (the built-in web server of manage.py). After some trial and error we ended up using WSGI.
More specifically, uWSGI and Nginx as web server. Static content is served directly by Nginx while dynamic pages are piped through uWSGI and are handled by the Python web app.
Unless you are doing something fancy (like tracking the upload progress, etc.), your Pylons controller should not be invoked until the entire file has been uploaded.

How do I get a status report of all files currently being uploaded via a HTTP form on an Apache Server?

How do I get a status report of all files currently being uploaded via HTTP form based file upload on an Apache Server?
I don't believe you can do this with Apache itself. The upload looks like nothing more than a POST as far as Apache cares. There are modules and other servers that do special processing to uploads so you may have some luck there. It would probably be easier to keep track of it in your application.
Check out SWFUpload; it uses Flash (in a nice way) to assist with managing multiple uploads.
There are events you can monitor for how many files of a set have been uploaded.