I am trying to move to Express 4.x and I am having trouble figuring out what to do about file uploads.
The migration docs say I need to come up with an alternative, but they don't really give an example. They also dropped support for a reason, so which file upload package should I use?
I also read that you should not attach a form upload handler to every POST by using:
app.use(multer(...));
Is there a good way to attach these to just a particular route?
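Something route-scoped is what I'm after; as a sketch with current multer (untested on my end; 'avatar' is just a placeholder field name):

// Hypothetical route-scoped setup: the multer middleware runs only on this route.
var express = require('express');
var multer = require('multer');

var app = express();
var upload = multer({ dest: 'uploads/' }); // files land in ./uploads

app.post('/profile', upload.single('avatar'), function (req, res) {
    // req.file holds the uploaded file; req.body holds any text fields
    res.send('uploaded');
});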
formidable or busboy are options.
Express 4.x lost this feature because it is no longer built on top of Connect.
I use connect-busboy; I could not get plain busboy to work. Other packages I came across but have never played with are flow and parted. There is also an npm package, mongoose-file, which seems to let you add a file upload path to a Mongoose schema for uploads to the server. It may be handy if you are also storing other data in MongoDB.
Basic connect-busboy and Formidable example using Express V4.2:
Node/Express file upload
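For reference, a minimal connect-busboy sketch (this assumes the older busboy 'file' event signature from that era; newer busboy versions pass an info object instead of separate arguments):

var fs = require('fs');
var busboy = require('connect-busboy');

app.use('/upload', busboy()); // scope the middleware to the upload route only

app.post('/upload', function (req, res) {
    req.pipe(req.busboy); // hand the raw request stream to busboy
    req.busboy.on('file', function (fieldname, fileStream, filename) {
        fileStream.pipe(fs.createWriteStream('./uploads/' + filename));
    });
    req.busboy.on('finish', function () {
        res.send('upload complete');
    });
});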
I'm creating a Vue app and I'm trying to fetch data from a Strapi API into my Vue.js app, but everything on Strapi gets deleted after a few hours. Does anyone have a solution to this?
Every time Heroku cycles its dynos, your data will disappear. That is because Heroku's filesystem is ephemeral: changes made to it do not persist. However, you can use their mLab add-on for MongoDB or host your database on MongoDB Atlas (which is what I use). Then for media, you will need to use an external provider like AWS S3.
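For the database part, a sketch of what config/database.js can look like with Strapi v3's mongoose connector (DATABASE_URI is your Atlas connection string; verify the exact option names against your Strapi version):

// config/database.js (Strapi v3 style, sketch)
module.exports = ({ env }) => ({
  defaultConnection: 'default',
  connections: {
    default: {
      connector: 'mongoose',
      settings: {
        uri: env('DATABASE_URI'), // MongoDB Atlas connection string from the environment
      },
      options: {
        ssl: true,
      },
    },
  },
});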
The same thing kept happening to me... I had to switch to MongoDB. You can also use Cloudinary for media.
I would like to access the list of all uploads that have been added to a given project on my company's GitLab server.
I don't mean versioned files; I mean attached files: binaries and other types of files that have been attached to issues, merge requests, etc.
It's OK if I have to use the API for that.
What I've tried
My first approach was through GET /projects/:id/repository/files/:file_path, but that's for the versioned files.
Then, I found out about POST /projects/:id/uploads, but that's only for uploading and not for listing already uploaded files.
Is there a way to list all those uploaded files?
I believe this is not possible.
There is an open issue for retrieving specific files which has not received much attention:
https://gitlab.com/gitlab-org/gitlab-ce/issues/55520
Hopefully, an endpoint like the following will eventually be added:
GET /projects/:id/uploads
I had the same question, and after getting in touch with GitLab support they confirmed that this is not currently implemented (as of November 2021) and forwarded me the following three feature requests:
API list all files on a project: https://gitlab.com/gitlab-org/gitlab/-/issues/197361
Attachment Manager: https://gitlab.com/gitlab-org/gitlab/-/issues/16229
Retrieve uploaded files using API: https://gitlab.com/gitlab-org/gitlab/-/issues/25838
A workaround is to export the whole project; the uploads are included in the resulting archive, where you can list them.
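As a sketch of that workaround with the project export API (Node 18+ for built-in fetch; GITLAB_URL, PROJECT_ID, and GITLAB_TOKEN are placeholders):

const base = `${process.env.GITLAB_URL}/api/v4/projects/${process.env.PROJECT_ID}`;
const headers = { 'PRIVATE-TOKEN': process.env.GITLAB_TOKEN };

async function downloadExport() {
  // Schedule the export, then poll until GitLab reports it finished.
  await fetch(`${base}/export`, { method: 'POST', headers });
  let status;
  do {
    await new Promise((resolve) => setTimeout(resolve, 5000));
    const res = await fetch(`${base}/export`, { headers });
    status = (await res.json()).export_status;
  } while (status !== 'finished');
  // The downloaded archive contains the attachments under an "uploads/" directory.
  return fetch(`${base}/export/download`, { headers });
}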
A file resides in the NetSuite file cabinet and needs to be placed on an FTP server each day.
I'm not sure how to handle this via a Suitelet/RESTlet, or if it's even possible, but I would prefer not to use an external source/application.
My current, and hopefully temporary, workaround is a local scheduled task that runs a script to pull files from NetSuite and upload them to the FTP server.
In SuiteScript 2.0, unsecured FTP is still not supported, but SS2.0 does have the capability to do SFTP. See http://www.upilioconsulting.com/blog/netsuite-2016-2-sftp-suitescript-2-0/
In SuiteScript 1.0, it's not supported. The workaround is to write middleware code (e.g. in PHP) and let the middleware do the FTP transfer.
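For the SS2.0 route, a minimal N/sftp sketch in a scheduled script might look like this (the username, password GUID, host key, and file id are placeholders you must set up in your account first):

/**
 * @NApiVersion 2.x
 * @NScriptType ScheduledScript
 */
define(['N/sftp', 'N/file'], function (sftp, file) {
    function execute(context) {
        var connection = sftp.createConnection({
            username: 'ftpuser',                 // placeholder
            passwordGuid: 'YOUR_PASSWORD_GUID',  // created ahead of time via a credential field
            url: 'sftp.example.com',
            hostKey: 'YOUR_HOST_KEY',
            directory: '/outbound'
        });
        var fileToSend = file.load({ id: 123 }); // file cabinet internal id (placeholder)
        connection.upload({
            file: fileToSend,
            filename: fileToSend.name,
            replaceExisting: true
        });
    }
    return { execute: execute };
});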
NetSuite doesn't interact with FTP.
You need a bridge server of some sort that runs a web app (full-blown Apache or nginx running PHP, or just a simple Node service).
Just get a server, install some web server/web service, and POST your files to it (nlapiRequestURL with a scheduled script). Have the web app on the bridge server send the files to the FTP server. If you are using NetSuite, you can afford the cost of the bridge server.
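A sketch of the NetSuite side of that, as an SS1.0 scheduled script posting to a hypothetical /upload endpoint on the bridge:

function scheduled(type) {
    var f = nlapiLoadFile(123); // file cabinet internal id (placeholder)
    var payload = JSON.stringify({
        name: f.getName(),
        contents: f.getValue() // base64 for binary file types, plain text otherwise
    });
    nlapiRequestURL('https://bridge.example.com/upload', payload, {
        'Content-Type': 'application/json'
    });
}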
One possible solution is to create a saved search on Documents to list all the files in NetSuite, filtering by createdate or lastmodifieddate. Create a scheduled job to fetch only the new files and save them locally where you want.
Note that the files will be base64-encoded strings; you need to decode them to obtain the file.
As bknights said, NetSuite doesn't support FTP. You need a web server (any server-side language will do; I have written one in Node.js) to receive the files.
The content of a text file arrives as plain text, so no decode logic is required there. However, binary/PDF/image and other files will be in base64 format, as NetSuite's JavaScript has no way of handling binary data. So make sure you decode the content before you create the file on your FTP server.
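The decode step on the receiving server is a one-liner with Buffer; a sketch, assuming the JSON payload shape from the scheduled-script example above:

var fs = require('fs');

function saveIncomingFile(body) {
    // Undo NetSuite's base64 encoding, then write the raw bytes out
    var buffer = Buffer.from(body.contents, 'base64');
    fs.writeFileSync('/tmp/' + body.name, buffer);
}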
I am using Red5 for streaming videos in my project, and I am able to play videos from the local system that are saved in the default folder "streams".
Now I want to customize the path and serve the videos from S3. How do I configure Red5 to work with S3? Is this good practice?
I've got code using the IStreamFilenameGenerator that works with S3; I'll warn you now that it may not work with the latest jets3t library, but you'll get the point of how it works by looking through the source. One issue you must understand when using S3 is that you cannot "record" to the bucket on the fly; your FLV files can only be transferred to S3 once the file is finalized. There is an example upload call in the Application class. "Play" from S3, on the other hand, will work as expected.
I added the S3 code to the red5-examples repo: https://github.com/Red5/red5-examples
Search for:
https://stackoverflow.com/search?q=IStreamFilenameGenerator
or https://www.google.com/search?q=IStreamFilenameGenerator+example
and you will find some examples of how to modify the path(s).
Alternatively, you could of course simply mount a drive into the streams folder, or I guess a symbolic link would even work. But that is not as flexible as IStreamFilenameGenerator, which lets you generate exactly the path string you want.
Sebastian
I would like to upload a form from a web page and directly save the file to S3 without first saving it to disk. This node.js app will be deployed to Heroku, where there is no local disk to save the file to.
The node-formidable library provides a great way to upload files and save them to disk. I am not sure how to stop formidable (or connect-form) from saving the file first. The Knox library, on the other hand, provides a way to read a file from disk and save it to Amazon S3.
1) Is there a way to hook into formidable's events (on Data) to send the stream to Knox's events, so that I can directly save the uploaded file in my Amazon S3 bucket?
2) Are there any libraries or code snippets that can allow me to directly take the uploaded file and save it to Amazon S3 using node.js?
There is a similar question here but the answers there do not address NOT saving the file to disk.
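Roughly what I'm imagining, as an untested sketch (my suspicion is that knox's putStream wants a Content-Length header, which a multipart part doesn't carry, and that this is exactly where it falls down):

var formidable = require('formidable');
var knox = require('knox');

var client = knox.createClient({
    key: process.env.S3_KEY,       // placeholder credentials
    secret: process.env.S3_SECRET,
    bucket: 'my-bucket'
});

// req is the incoming HTTP request carrying the multipart form
var form = new formidable.IncomingForm();
form.onPart = function (part) {
    if (!part.filename) return form.handlePart(part); // default handling for text fields
    client.putStream(part, '/' + part.filename,
        { 'Content-Type': part.mime },                // no Content-Length known here
        function (err, s3res) { /* S3 response */ });
};
form.parse(req);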
It looks like there is no good way to do it. One reason might be that the node-formidable library saves the uploaded file to disk; I could not find any option to do otherwise. The knox library takes the file saved on disk and, using your Amazon S3 credentials, uploads it to Amazon.
Since I cannot save files locally on Heroku, I ended up using the transloadit service. Though their authentication docs have some learning curve, I found the service useful.
For those who want to use transloadit from node.js, the following code sample may help (the transloadit page had only Ruby and PHP examples); the string being signed is, in transloadit's case, your JSON-encoded params field:
var crypto = require('crypto');

// HMAC-SHA1 of the string to sign, keyed with your transloadit auth secret,
// hex-encoded as transloadit expects
var signature = crypto.createHmac('sha1', 'auth secret')
    .update('some string')
    .digest('hex');

console.log(signature);
This is Andy, creator of AwsSum:
https://github.com/appsattic/node-awssum/
I just released v0.2.0 of this library. It uploads the files that were created by Express's bodyParser(), though as you say, this won't work on Heroku:
https://github.com/appsattic/connect-stream-s3
However, I shall be looking at adding the ability to stream from formidable directly to S3 in the next (v0.3.0) version. For the moment though, take a look and see if it can help. :)