Is there a client or plugin that lets me put a frame/upload field on my web site so that my users can upload files to my FTP server without FTP client software of their own? Preferably one where I can pre-configure the FTP username/password automatically (from data I have stored about the logged-in user)?
In a digital signage solution we let users upload video files to their account, and we currently handle this via FTP (the file is then renamed and moved, so users can NOT use this feature to share video files). Furthermore, it is solely about videos they made themselves, so there is no license breakage. Just as a disclaimer =)
Try third-party software like Uploadify, or a service like http://www.net2ftp.com/, which can be somewhat integrated with PHP.
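If you would rather build it yourself, the usual pattern is a small server-side handler that accepts the browser upload and relays it to the FTP server using credentials looked up for the logged-in user. A minimal sketch in Python (Flask plus ftplib; the hostname, the /upload route, and the ftp_credentials_for helper are all hypothetical):

```python
from ftplib import FTP

from flask import Flask, abort, request
from werkzeug.utils import secure_filename

app = Flask(__name__)

def ftp_credentials_for(user):
    """Hypothetical lookup of the FTP username/password you already
    store for the logged-in user."""
    raise NotImplementedError

@app.route("/upload", methods=["POST"])
def upload():
    uploaded = request.files.get("file")
    if uploaded is None:
        abort(400, "no file supplied")

    username, password = ftp_credentials_for(request.remote_user)
    filename = secure_filename(uploaded.filename)

    # Relay the upload straight to the FTP server; the FTP
    # credentials never reach the browser.
    with FTP("ftp.example.com") as ftp:  # hypothetical host
        ftp.login(username, password)
        ftp.storbinary(f"STOR {filename}", uploaded.stream)

    return "ok"
```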
I am trying to download (back up) images that customers upload for products that take custom logos (typically JPG, PNG, PDF, etc.). These customer files can be downloaded by clicking a hyperlink on the BigCommerce admin page for the order in question. The link is not a link to the image path but rather a link to a service that sends the file to the browser. In other words, you have to be authenticated into the admin site to download the file. The URL looks like this:
https://mystore.com/internalapi/v1/orders/383945/products/251438/attributes/561518/download
https://mystore.com/internalapi/v1/orders/{order id}/products/{lineItem id}/attributes/{option id}/download
These are easily constructed in the API itself for a given order. If I use the link in a browser tab while I'm logged into the admin site, the file downloads.
But I am trying to write an app to automatically download all the files (there are thousands). When I use this URL in an app, I get an authentication error. I first tried my regular API credentials, then the credentials I use to log into the admin site. Both give me an authentication error.
I could not find anything documented about this so-called "internalapi". Has anyone ever tried to use this "internal" API that the admin site uses?
I believe authentication is cookie-based for that internal API, but there could be problems with using our non-publicly-documented internal APIs in production, i.e. we may make future updates that would be breaking changes.
Images attached to orders through a file-upload option are also copied to WebDAV, in the dav/product_images/configured_products folder. Another way to do this would be to use a WebDAV client library like easywebdav to connect and download the files.
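If it helps, a minimal sketch of that approach with easywebdav (the hostname, credentials, and local backup folder are placeholders; the WebDAV username/password come from the BigCommerce control panel):

```python
import os

import easywebdav

# Placeholder host/credentials: use your store's WebDAV hostname and
# the WebDAV username/password from the control panel.
webdav = easywebdav.connect(
    "mystore.com",
    username="user@mystore.com",
    password="webdav-password",
    protocol="https",
)

remote_dir = "dav/product_images/configured_products"
os.makedirs("backup", exist_ok=True)

for entry in webdav.ls(remote_dir):
    if entry.name.endswith("/"):  # skip the collection entry itself
        continue
    filename = os.path.basename(entry.name)
    webdav.download(entry.name, os.path.join("backup", filename))
```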
Theoretical question here: could WebRTC be misused to push files to everyone browsing a site?
Here's a scenario:
A malicious actor shares a malicious file via a WebRTC platform; once someone browses that platform, the file would be pushed onto their system and perform its malicious act.
I know that when a page tries to access the webcam, WebRTC asks for the user's permission. Is it the same with file sharing?
JavaScript doesn't have APIs that execute a file on the user's system, for a number of good reasons.
WebRTC doesn't add such an API (and, in itself, doesn't have a file-sharing API either). A data channel only delivers raw bytes to the receiving page's JavaScript; nothing is written to disk or executed unless the user explicitly chooses to save it.
I am creating a web service that mashes up Dropbox, SoundCloud and WordPress.
I need a callback when a user places a file in their Dropbox folder so that I can update the browser user interface. Since it is possible to ask for a download link locally before a file has completely synced, I naturally expected it to be possible to get a callback when file sync starts, on a file-by-file basis.
However, in my experience /delta only shows files that have finished syncing.
Is there a way to know when file sync starts? If it is not possible via the Core API, could it be done with a small client applet (Java or something)?
The Dropbox API doesn't currently expose any notion of a pending upload or file sync status. It can only return information about files that have finished uploading.
Likewise, even with a client app running on the same OS, there currently isn't an interface for communicating with the official Dropbox desktop client to get this information.
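The closest workaround is to poll /delta and react as soon as a file appears, i.e. once it has finished syncing. A minimal sketch, assuming the (legacy) Core API Python SDK and an access token you have already obtained:

```python
import time

from dropbox.client import DropboxClient

client = DropboxClient("ACCESS_TOKEN")  # placeholder token

cursor = None
while True:
    result = client.delta(cursor)
    cursor = result["cursor"]
    for path, metadata in result["entries"]:
        # metadata is None for deletions; otherwise the file (or
        # folder) has finished syncing to Dropbox.
        if metadata is not None and not metadata["is_dir"]:
            print("finished syncing:", path)
    if not result["has_more"]:
        time.sleep(10)  # poll interval; tune to taste
```

This still only fires after a sync completes; there is no event for a sync starting.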
We have been working on a web service (http://www.genomespace.org/) that allows computational biology tools that implement our REST API to, among other things, read and write files stored on "The Cloud".
Our default file storage is a common Amazon S3 bucket, but we now allow users to mount their own private S3 bucket as well as files on Dropbox.
We are now trying to enable similar functionality for Google Drive and have run into two problems unique to Google Drive that we have not encountered with S3 or Dropbox:
1. The only way to allow clients that are not Google-authenticated to read files unobtrusively is to make the files "Public". Our preference would be that once the user has authorized access to our application via OAuth2, their files could remain "Private" in Google Drive. However, even though the user has already authorized our web service for offline access to their "Private" files, we have not found a way to generate a URL that a client authorized by our system can use to GET the file directly without also being logged into Google. The closest we have come is changing the file permissions to "Anyone with Link", except that for files larger than 20 MB Google insists on returning an intermediate web page warning that the file has not been scanned for viruses. Besides making us mess with file permissions, this would break our existing clients. Only when the file is "Public" and we use URLs of the form https://googledrive.com/host/PARENT_FOLDER_ID/FILENAME can non-Google clients read the files without interference.

2. We have not found any way for clients that are not Google-authenticated to upload a file to Google Drive. Our API allows our authorized clients to PUT files directly to the backing file storage using URLs provided by our server. However, even if a folder is marked "Public", the client needs Google authentication credentials to save to Google Drive.

We could deal with both issues by adding intermediate hops through our system (e.g., our web server would first download the file from Google Drive and then let the client GET it), but this would be woefully inefficient and, hopefully, unnecessary. These problems have been discussed multiple times before on Stack Overflow (e.g., here and here); we have read the responses very carefully but have not seen any recent discussion.
The Google folks direct their API users to post on Stack Overflow for support, so I am hoping for a fresh look from insiders.
The general answer is: don't make the Drive requests through the user's browser. Instead, do everything from your servers. You are the one holding the (refresh) tokens for your users, so you should make all requests as a proxy between the user and Drive. The same goes for downloading: you download the file and return it to the user. As long as you use each user's own token, there shouldn't be rate limit/quota issues.
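A minimal sketch of that proxy pattern using the Google API Python client (the Flask route and the credentials_for lookup are hypothetical; you would rebuild the user's Credentials from the refresh token you stored at authorization time):

```python
import io

from flask import Flask, Response
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

app = Flask(__name__)

def credentials_for(user_id):
    """Hypothetical lookup: rebuild google.oauth2 Credentials from
    the refresh token stored when this user authorized the app."""
    raise NotImplementedError

@app.route("/files/<user_id>/<file_id>")
def proxy_download(user_id, file_id):
    drive = build("drive", "v3", credentials=credentials_for(user_id))

    # Fetch the file server-side with the user's own token; the file
    # can stay "Private" in Drive and no permissions need to change.
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, drive.files().get_media(fileId=file_id))
    done = False
    while not done:
        _, done = downloader.next_chunk()

    # Return the bytes to the (non-Google) client.
    return Response(buf.getvalue(), mimetype="application/octet-stream")
```

Buffering the whole file in memory keeps the sketch short; for large files you would stream it chunk by chunk instead.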
I have a requirement for an HTML FTP front end that AD users can access. When users are prompted for their AD credentials, IIS/FTP should map them directly to their own folders. What I want is an HTML form page/website: whether it is accessed internally or externally, users are prompted for and supply their AD credentials. Based on the permissions set on the folders, each user can access only their own folder and upload only to it. The upload is a Browse button that lets them upload the specified documents to their folder only. Then, when they log into their computer at work, they can see what has been uploaded and open those documents directly.
This is actually for part-time lecturers; currently all the shares are on a Windows 2008 R2 server, with permissions set based on AD.
Is this doable via HTML, some other code, or FTP itself? Third-party software would be fine too.
It might be worth taking a look at Plupload; depending on your usage, you might need to pay for licensing: http://www.plupload.com
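Plupload covers the browser side; you would still need a small server-side handler that drops each upload into the authenticated user's folder. A hedged sketch in Python (the share root is a placeholder, and it assumes IIS Windows authentication in front of the handler so REMOTE_USER carries the AD account; note it writes with the web app's identity, so honoring per-user NTFS permissions would additionally require impersonation):

```python
import os

from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
UPLOAD_ROOT = r"\\fileserver\lecturers"  # placeholder share root

@app.route("/upload", methods=["POST"])
def upload():
    # With Windows authentication in front of this handler,
    # REMOTE_USER carries the AD account (e.g. DOMAIN\jsmith).
    account = request.environ.get("REMOTE_USER", "anonymous")
    folder = os.path.join(UPLOAD_ROOT, secure_filename(account.split("\\")[-1]))
    os.makedirs(folder, exist_ok=True)

    # Plupload posts the file (optionally in chunks) with these fields.
    chunk = int(request.form.get("chunk", 0))
    filename = secure_filename(request.form.get("name", "upload.bin"))
    mode = "wb" if chunk == 0 else "ab"
    with open(os.path.join(folder, filename), mode) as out:
        out.write(request.files["file"].read())

    return "ok"
```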