send file and text in the same response in express

I have developed an API in express that returns text and images separately in two different calls, but for performance reasons I would like to return the text and image (probably in a blob object) in the same response.
Now I'm doing it with the res.send() and res.sendFile() methods but I haven't been able to read anything in the documentation about how to send both at the same time. How could I do it?
Examples of what I'm doing now:
async getImage(req, res) {
  // Doing things with req
  let mypath = './foo/bar';
  // res.sendFile() expects a file path; the Content-Type is inferred from the extension
  return res.sendFile(path.resolve(mypath));
}
async getText(req, res) {
  // Doing things with req
  return res.status(200).send({ message: 'foo' });
}
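One approach (not from this thread, just a sketch) is to read the image yourself, base64-encode it, and send it alongside the text in a single JSON response; the route below is hypothetical and reuses the mypath variable from the question:

const fs = require('fs/promises');
const path = require('path');

async function getImageAndText(req, res) {
  // Doing things with req
  let mypath = './foo/bar';
  // read the image into a Buffer and base64-encode it so it fits inside JSON
  const imageBuffer = await fs.readFile(path.resolve(mypath));
  return res.status(200).send({
    message: 'foo',
    image: imageBuffer.toString('base64'),
    imageType: 'image/png'
  });
}

The client then decodes the image field back to binary (Buffer.from(image, 'base64') in Node, or a data: URI in the browser); a multipart response would achieve the same single round trip without the base64 overhead.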

Related

Sending response in async function

I need to return an array of labels, but I can only return 1 of the labels so far. The error which I get is "Cannot set headers after they are sent to the client". So I tried res.write and placed res.end after my for loop, but then I get the obvious error of doing a res.end before a res.write. How do I solve this?
for (let i = 0; i < arr.length; i++) {
  request.get(arr[i], function (error, response, body) {
    if (!error && response.statusCode == 200) {
      myfunction();
      async function myfunction() {
        const Labels = await Somefunctioncallwhoseresponseigetlater(body);
        res.send(Labels);
      }
    }
  });
}
New code-
async function getDataSendResponse(res) {
  let allLabels = [];
  for (let url of arr) {
    let body = await got(url).buffer();
    var imgbuff = Buffer.from(body, 'base64');
    const imageLabels = await rekognition.detectLabels(imgbuff);
    allLabels.push(...imageLabels);
  }
  res.send(allLabels);
}
The error I have with this code is
"Resolver: AsyncResolver
TypeError: Cannot destructure property Resolver of 'undefined' or 'null'."
You are trying to call res.send() inside a for loop. That means you'll be trying to call it more than once. You can't do that. You get to send one response for any given http request, and res.send() sends an entire response. So, when you try to call it again inside the loop, you get the warning you see.
If what you're trying to do is to send an array of labels, then you need to accumulate the array of labels first and then make one call to res.send() to send the final array.
You don't show the whole calling context here, but making the following assumptions:
- Somefunctioncallwhoseresponseigetlater() returns a promise that resolves when it is done
- You want to accumulate all the labels you collected in your loop
- Your Labels variable is an array
- Your http request body is fetched as a binary buffer via .buffer(); if the endpoint returns text or JSON instead, change that call to .text() or .json()
then you can do it like this:
const got = require('got');

async function getDataSendResponse(res) {
  let allLabels = [];
  for (let url of arr) {
    let body = await got(url).buffer();
    const labels = await Somefunctioncallwhoseresponseigetlater(body);
    allLabels.push(...labels);
  }
  res.send(allLabels);
}
Note, I'm using the got() library instead of the deprecated request() library, both because request() is now deprecated and because this type of code is way easier when you have an http library that supports promises (like got() does).
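For completeness (not part of the original answer), a hypothetical Express route handler wiring this up might look like:

app.get('/labels', async (req, res) => {
  try {
    await getDataSendResponse(res);
  } catch (err) {
    console.error(err);
    res.sendStatus(500);
  }
});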

HapiJS reply with readable stream

For one call, I am replying with a huge JSON object which sometimes causes the Node event loop to become blocked. As such, I'm using the Big Friendly JSON package to stream the JSON instead. My issue is I cannot figure out how to actually reply with the stream.
My original code was simply
let searchResults = s3Access.getSavedSearch(guid).Body;
searchResults = JSON.parse(searchResults.toString());
return reply(searchResults);
Works great but bogs down on huge payloads
I've tried things like using the Big Friendly JSON package (https://gitlab.com/philbooth/bfj):
const stream = bfj.streamify(searchResults);
return reply(stream); // according to docs it's a readable stream
But then my browser complained about an empty response. I then tried to add the below to the reply, same result.
.header('content-encoding', 'json')
.header('Content-Length', stream.length);
I also tried return reply(null, stream); but that produced a ton of node errors
Is there some other way I need to organize this? My understanding was I could just reply a readable stream and Hapi would take care of it, but the response keeps showing up as empty.
Did you try using h.response()? Here h is the reply toolkit.
Example:
handler: async (request, h) => {
  const { limit, sortBy, order } = request.query;
  const queryString = {
    where: { status: 1 },
    limit,
    order: [[sortBy, order]],
  };
  let userList = {};
  try {
    userList = await _getList(User, queryString);
  } catch (e) {
    // throw new Boom(e);
    Boom.badRequest(i18n.__('controllers.user.fetchUser'), e);
  }
  return h.response(userList);
}
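To address the original streaming question (a sketch, not taken from this answer, assuming hapi 17+ and the bfj package mentioned in the question): hapi accepts a readable stream as the response source, so the bfj stream can be returned directly with an explicit content type:

const bfj = require('bfj');

// inside an async handler(request, h)
const searchResults = JSON.parse(s3Access.getSavedSearch(guid).Body.toString());
const stream = bfj.streamify(searchResults);

// hapi streams this to the client; set the type explicitly,
// and leave Content-Length unset since the stream length is unknown up front
return h.response(stream).type('application/json');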

Need my server to return a response that includes a data error. Need client to see what was wrong with data in request

As it will become quickly apparent, I have never seriously written a webserver before
Here is the current scenario:
1. Clients make requests to webserver, asking to save some data
2. Server looks at payload, and makes 2 checks:
   a. Is this client banned from saving data?
   b. Does the payload of this data pass a language filter?
3. Server responds with success, or one of those 2 errors
My endpoint is written with Express in TypeScript
class ChatRequest {
  public uid: string;
  public message: string;
}

export const register = (app: express.Application, deps: dependencies.IDependencies) => {
  app.post("/sendChat", (req: express.Request, res: express.Response) => {
    transformAndValidate(ChatRequest, req.body)
      .then((chatRequest: ChatRequest) => {
        const payload = {
          message: chatRequest.message,
          uid: chatRequest.uid
        };
        // Check if uid is banned here
        // Check if payload passes language filter here
        // Save payload here
        res.sendStatus(200);
      }, (err) => {
        deps.logger.error(err);
        res.sendStatus(503);
      });
  });
};
I have been using this article for reference:
https://hackernoon.com/the-request-sent-bad-data-whats-the-response-94088bd290a
But I think my conclusion is that they are discussing something slightly different.
So from my understanding, I can just make up HTTP codes...
so I could just do res.sendStatus(499); if the uid is banned, and maybe res.sendStatus(498); if the payload doesn't pass language filter
Then my client can just read the Int statusCode and quickly determine the failure.
But even though I think I can do that, and it would work, it doesn't seem right?
Should I instead be using a standard HTTP Response Code? https://developer.mozilla.org/en-US/docs/Web/HTTP/Status
And then add in the body of the response, a String or something that my client can parse to determine the error?
The String parsing seems way harder to maintain, but technically seems more "legal" if that makes sense?
What is the best way for me to have a client determine the type of server-side error?
I decided to return 400 with a JSON mapping errors to bools
if (isProfane(message)) {
  res.status(400).json({ messageContentBlocked: true });
}
In this way the client can receive multiple errors for the request at once, and it's more explicit
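For illustration (a sketch, not the poster's actual code; isBanned and isProfane are assumed helpers), accumulating both checks into one 400 response could look like:

const errors = {};
if (isBanned(uid)) {
  errors.senderBanned = true;
}
if (isProfane(message)) {
  errors.messageContentBlocked = true;
}
if (Object.keys(errors).length > 0) {
  return res.status(400).json(errors);
}
// save payload and report success
return res.sendStatus(200);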
And in case anyone is googling around, I am using RxSwift/RxCocoa
Here is how I handle the error on the client:
extension Error {
    var chatMessageBlockedURLError: Bool {
        guard let rxCocoaURLError = self as? RxCocoaURLError else { return false }
        switch rxCocoaURLError {
        case let .httpRequestFailed(response, data):
            guard response.statusCode == 400, let data = data else { return false }
            let decoder = JSONDecoder()
            decoder.dateDecodingStrategy = .millisecondsSince1970
            guard let errors = try? decoder.decode([String: Bool].self, from: data) else { return false }
            return errors["messageContentBlocked"] == true
        default:
            return false
        }
    }
}

Dojo datagrid jsonrest response headers

I'd like to use custom headers to provide some more information about the response data. Is it possible to get the headers in a response from a dojo datagrid hooked up to a jsonRest object via an object store (dojo 1.7)? I see this is possible when you are making the XHR request, but in this case it is being made by the grid.
The API provides an event for a response error which returns the response object:
on(this.grid, 'FetchError', function (response, req) {
  var header = response.xhr.getAllResponseHeaders();
});
Using this I am successfully able to access my custom response headers. However, there doesn't appear to be a way to get the response object when the request is successful. I have been using the undocumented private event _onFetchComplete with aspect.after; however, this does not allow access to the response object, just the returned values:
aspect.after(this.grid, '_onFetchComplete', function (response, request) {
  // unable to get headers, response is the returned values
}, true);
Edit:
I managed to get something working, but I suspect it is over-engineered and someone with a better understanding could come up with a simpler solution. I ended up using aspect.around so I could get hold of the Deferred object in the REST store which is returned to the object store. There I added a new function to the deferred that returns the headers. I then hooked into the onFetch of the object store using dojo hitch (because I needed the results in the current scope). It seems messy to me:
aspect.around(restStore, "query", function (original) {
  return function (method, args) {
    var def = original.call(this, method, args);
    def.headers = deferred1.then(function () {
      var hd = def.ioArgs.xhr.getResponseHeader("myHeader");
      return hd;
    });
    return def;
  };
});
aspect.after(objectStore, 'onFetch', lang.hitch(this, function (response) {
  response.headers.then(lang.hitch(this, function (evt) {
    var headerResult = evt;
  }));
}), true);
Is there a better way?
I solved this today after reading this post, thought I'd feed back.
dojo/store/JsonRest solves it also but my code ended up slightly different.
var MyStore = declare(JsonRest, {
  query: function () {
    var results = this.inherited(arguments);
    console.log('Results: ', results);
    results.response.then(function (res) {
      var myheader = res.xhr.getResponseHeader('My-Header');
      doSomethingWith(myheader);
    });
    return results;
  }
});
So you override the normal query() function, let it execute and return its promise, then attach your own listener to the resolution of its response member, in which you can access the xhr object that holds the headers. This ought to let you interpret the JsonRest result while still fitting nicely into the chain of whatever invokes query().
One word of warning, this code is modified for posting here, and actually inherited from another intermediary class that also overrode query(), but the basics here are pretty sound.
If what you want is to get info back from the server, a custom key-value pair in a cookie can also be a solution; that was my case. At first I was looking for a custom response header, but I couldn't make it work, so I went the cookie route and read the info after the grid data is fetched:
dojo.connect(grid, "_onFetchComplete", function () {
  doSomethingWith(dojo.cookie("My-Key"));
});
This is useful, for example, to present a SUM(field) for all rows in a paginated datagrid, and not only those included in the current page. On the server you can fetch both the COUNT and the SUM; the COUNT will be sent in the Content-Range header and the SUM can be sent in the cookie.
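A minimal server-side sketch of that idea (hypothetical, written with Express here rather than whatever backend was actually used; queryPage and queryStats are assumed helpers) would set both the Content-Range header and the cookie before returning the page of rows:

app.get('/rows', async (req, res) => {
  // queryPage() returns the rows for the requested page plus its offset,
  // queryStats() returns the total COUNT and SUM over all matching rows
  const { rows, offset } = await queryPage(req.query);
  const { count, sum } = await queryStats(req.query);

  res.set('Content-Range', 'items ' + offset + '-' + (offset + rows.length - 1) + '/' + count);
  res.cookie('My-Key', String(sum));
  res.json(rows);
});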

when to check for file size/mimetype in node.js upload script?

I created an upload script in node.js using express/formidable. It basically works, but I am wondering where and when to check the uploaded file, e.g. for the maximum file size or whether the file's mimetype is actually allowed.
My program looks like this:
app.post('/', function(req, res, next) {
  req.form.on('progress', function(bytesReceived, bytesExpected) {
    // ... do stuff
  });
  req.form.complete(function(err, fields, files) {
    console.log('\nuploaded %s to %s', files.image.filename, files.image.path);
    // ... do stuff
  });
});
It seems to me that the only viable place for checking the mimetype/file size is the complete event where I can reliably use the filesystem functions to get the size of the uploaded file in /tmp/ – but that seems like a not so good idea because:
the possibly malicious/too large file is already uploaded on my server
the user experience is poor – you watch the upload progress just to be told afterwards that it didn't work
What's the best practice for implementing this? I found quite a few examples for file uploads in node.js but none seemed to do the security checks I would need.
With help from some guys at the node IRC and the node mailing list, here is what I do:
I am using formidable to handle the file upload. Using the progress event I can check the maximum filesize like this:
form.on('progress', function(bytesReceived, bytesExpected) {
  if (bytesReceived > MAX_UPLOAD_SIZE) {
    console.log('### ERROR: FILE TOO LARGE');
  }
});
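The snippet above only logs the violation; to actually stop an oversized upload (my addition, not part of the original answer), one common approach is to destroy the incoming request once the limit is exceeded:

form.on('progress', function (bytesReceived, bytesExpected) {
  if (bytesReceived > MAX_UPLOAD_SIZE) {
    console.log('### ERROR: FILE TOO LARGE, aborting request');
    // tear down the underlying stream so no more bytes are read;
    // the client sees the connection drop instead of a normal response
    req.destroy();
  }
});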
Reliably checking the mimetype is much more difficult. The basic idea is to use the progress event, then, once enough of the file has been uploaded, run a file --mime-type call and check the output of that external command. Simplified, it looks like this:
// exec comes from Node's child_process module
var exec = require('child_process').exec;

// contains the path of the uploaded file,
// is grabbed in the fileBegin event below
var tmpPath;

form.on('progress', function validateMimetype(bytesReceived, bytesExpected) {
  var percent = (bytesReceived / bytesExpected * 100) | 0;
  // pretty basic check if enough bytes of the file are written to disk,
  // might be too naive if the file is small!
  if (tmpPath && percent > 25) {
    var child = exec('file --mime-type ' + tmpPath, function (err, stdout, stderr) {
      var mimetype = stdout.substring(stdout.lastIndexOf(':') + 2, stdout.lastIndexOf('\n'));
      console.log('### file CALL OUTPUT', err, stdout, stderr);
      if (err || stderr) {
        console.log('### ERROR: MIMETYPE COULD NOT BE DETECTED');
      } else if (!ALLOWED_MIME_TYPES[mimetype]) {
        console.log('### ERROR: INVALID MIMETYPE', mimetype);
      } else {
        console.log('### MIMETYPE VALIDATION COMPLETE');
      }
    });
    form.removeListener('progress', validateMimetype);
  }
});

form.on('fileBegin', function grabTmpPath(_, fileInfo) {
  if (fileInfo.path) {
    tmpPath = fileInfo.path;
    form.removeListener('fileBegin', grabTmpPath);
  }
});
The new version of Connect (2.x) has this already baked into the bodyParser using the limit middleware: https://github.com/senchalabs/connect/blob/master/lib/middleware/multipart.js#L44-61
I think it's much better this way as you just kill the request when it exceeds the maximum limit instead of just stopping the formidable parser (and letting the request "go on").
More about the limit middleware: http://www.senchalabs.org/connect/limit.html
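For reference (my reading of the linked middleware, not verified against the thread), wiring the limit into a Connect 2.x app looked roughly like this; the 5mb cap is a placeholder:

var connect = require('connect');

var app = connect()
  // multipart() enforces the byte cap internally and kills over-limit requests
  .use(connect.multipart({ limit: '5mb' }))
  .use(function (req, res) {
    res.end('upload ok');
  });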