I have a Node/Express.js app that was generated using the Yeoman full-stack generator. I have swapped out Mongo/Mongoose for Cloudant DB (which is essentially a paid-for version of CouchDB). I have written a wrapper for the Cloudant Node.js library which handles cookie auth with my instance via an init() method wrapped in a promise. I have refactored my application to not start the Express server until the connection to the DB has been established, as per the snippet below taken from my app.js:
myDb.init(config).then(function (db) {
  logger.write(1001, '', 'Connection to Cloudant Established');
  // Start server
  server.listen(config.port, config.ip, function () {
    logger.write(1001, "", 'Express server listening on ' + config.port + ', in ' + app.get('env') + ' mode');
  });
});
On my Express routes I have introduced a new middleware which attaches the DB object to the request for use across the middleware chain, as per below. This gets the DB connection object before setting the two collections to use.
exports.beforeAll = function (req, res, next) {
  req.my = {};
  // Adding my-db
  req.my.db = {};
  req.my.db.connection = myDb.getDbConnection();
  req.my.db.orders = req.my.db.connection.use(dbOrders);
  req.my.db.dbRefData = req.my.db.connection.use(dbRefData);
  next();
};
This mechanism works when I manually drive my APIs through Postman, as the Express server won't start until after the promise from the DB connection has been resolved. However, when running my automated tests the first few tests are now always failing, because the application has not finished initialising against the DB before Jasmine starts to run my tests against the APIs. I can see in my logs the requests coming through and myDb.getDbConnection() in the middleware returning undefined. I am using supertest and node-jasmine to run my tests. For example:
'use strict';

var app = require('../../app');
var request = require('supertest');

describe('GET /api/content', function () {
  it('should respond with JSON object', function (done) {
    request(app)
      .get('/api/content')
      .expect(200)
      .expect('Content-Type', /json/)
      .end(function (err, res) {
        if (err) return done(err);
        expect(res.body).toEqual(jasmine.any(Object));
        done();
      });
  });
});
So, my question is: how can I prevent supertest from making requests until the server.listen() step has completed as a result of the myDb.init() promise being resolved? Or perhaps there is some kind of Jasmine beforeAll that I can use to stop it running the describes until the promise has been resolved?
You could make your app return an EventEmitter which emits a "ready" event when it has completed its initialisation.
Then your test code, in a before clause, can wait until the "ready" event arrives from the app before proceeding with the tests.
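A minimal sketch of that pattern, reusing the names from the question's app.js; the app.ready property and the "ready" event name are purely illustrative:

// app.js (sketch): signal readiness once the DB connection is up
var EventEmitter = require('events').EventEmitter;

app.ready = new EventEmitter();          // illustrative property, not an Express API

myDb.init(config).then(function (db) {
  logger.write(1001, '', 'Connection to Cloudant Established');
  server.listen(config.port, config.ip, function () {
    app.ready.emit('ready');             // the app is now safe to hit
  });
});

module.exports = app;

// spec file (sketch): hold the suite until the app reports ready
var app = require('../../app');

beforeAll(function (done) {
  // If initialisation could finish before this listener attaches,
  // expose a promise or a boolean flag from app.js instead of a bare event.
  app.ready.on('ready', done);
});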
Related
I'm making a simple API in Express.js. I have an endpoint where I'll make a call to the GitHub API in turn. My front-end application will utilise it. Here is my code:
api.js
const express = require('express');
const router = express.Router();
...
var http = require('http');
var https = require("https");
router.get('/github', function (req, res) {
  // APPROACH 1: Failed : fetch is not defined
  // fetch('https://api.github.com/users/tmtanzeel')
  //   .then(response => response.json())
  //   .then(json => console.log(json))

  // APPROACH 2: Failed : throw er; // Unhandled 'error' event
  /*try {
    https.get('https://api.github.com/users/tmtanzeel', function (res) {
      console.log(res);
    })
  } catch (error) {
    ...
  }*/
});
Both my approaches are failing. I have almost no experience with Express. Please pitch in.
The second method is almost correct. Add the error handler and send the caller the data you just received.
https.get('https://api.github.com/users/tmtanzeel', function (apiRes) {
  apiRes.pipe(res);
}).on('error', (e) => {
  console.error(e);
  res.status(500).send('Something went wrong');
});
Handling the response (stream) received from the API call can be done in two ways:
Using pipes, which is automatic
Handling read events and managing the data writing manually (see the sketch below)
I have used the first approach.
However, it is strongly recommended that you get a sound knowledge of handling streams if you use Node.js. Streams are the basis of Node.js requests and responses.
https://nodejs.dev/learn/nodejs-streams
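For completeness, here is a minimal sketch of the second option, gathering the chunks by hand before replying. It reuses the names from the snippet above and assumes nothing beyond core https and Express:

https.get('https://api.github.com/users/tmtanzeel', function (apiRes) {
  var body = '';
  apiRes.setEncoding('utf8');
  apiRes.on('data', function (chunk) {   // collect the payload chunk by chunk
    body += chunk;
  });
  apiRes.on('end', function () {         // reply once the upstream response ends
    res.type('application/json');
    res.status(apiRes.statusCode).send(body);
  });
}).on('error', (e) => {
  console.error(e);
  res.status(500).send('Something went wrong');
});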
You should use Express to handle incoming requests (for example, if your web app fetches data from your (Express) server).
Read the docs: http://expressjs.com
Your first attempt failed because fetch is an implementation provided by web browsers, not by Node.js.
If you want to use fetch, try: https://www.npmjs.com/package/node-fetch
It's a well-documented, easy-to-use fetch function.
From your examples, everything seems fine except that I can't see you sending the returned data to the client.
You can try something like adding res.send(data) to send the data and make it available on the /github route, as sketched below.
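A rough sketch of that with node-fetch v2 (which can still be loaded with require()); the route body is an assumption, since the question only shows the failed attempts:

const fetch = require('node-fetch');     // npm install node-fetch@2

router.get('/github', async function (req, res) {
  try {
    const apiRes = await fetch('https://api.github.com/users/tmtanzeel');
    const data = await apiRes.json();
    res.send(data);                      // make the data available on /github
  } catch (err) {
    console.error(err);
    res.status(500).send('Something went wrong');
  }
});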
I am using Jest and supertest to spin up an instance of Express and run tests against it.
I am facing a busy-port problem that I still cannot solve.
In my test I do the following:
import supertest from 'supertest';
const agent = supertest(app);
Then I make requests with agent and everything works fine.
Until I run another test.
In app.js I have:
var app = express();
app.post('/auth/changePassword',VerifyToken, auth.changePassword);
app.listen(4001, function () {
  console.log('Server is running');
});
So the first spec runs perfectly. But the second tries to listen on a port that is already in use.
I really do not know how to close the connection here.
I tried app.close() but got an error that no such method exists. That is clear; I have to assign
server = app.listen(4001, function () {
  console.log('Server is running');
});
server.close();
But I do not know how to do that.
I also tried to preset the agent in jest.setup and assign it to a global variable:
import app from "../../server/app";
import supertest from 'supertest';
const agent = supertest(app);
global.agent = agent;
But the situation is the same: the first test passes, and the second tries to start Express on the same port.
Supertest is able to start and stop your app for you; you should never need to explicitly kill the Express server during the tests (even when running in parallel).
The problem is that in app.js your server is actually started. This means that when the app tests are run, your server is started every time app.js is imported (or once per test case).
You can remedy this by splitting the server start logic into a separate file, so that importing app.js doesn't start the server (it just returns an Express app instance). The pattern I normally use for this is:
// app.js
import express from 'express';

const app = express();

app.get("/example", (req, res) => {
  return res.status(200).json({ data: "running" });
});

export default app;

// server.js
import app from "./app";

app.listen(3000, () => console.log("listening on 3000"));

// app.spec.js
import app from "./app";
import request from "supertest";

it("should be running", async () => {
  const result = await request(app).get("/example");
  expect(result.body).toEqual({ data: "running" });
});
I'm trying to mock API calls from Detox tests and nothing seems to be working. Nock would, in theory, do exactly what I want, but when I run my tests with nock debugging enabled it doesn't see any of the requests being made from my app. I'm using axios to make requests and I've tried setting the adapter on my axios instance to the http adapter.
Any suggestions on how to get nock working with Detox, or on another mocking library you have had success with, are appreciated. Thanks!
What I ended up doing was leveraging the mocking specified in the Detox docs (a separate *.e2e.js file hitting a different endpoint during tests). You define these special files that only run during e2e; I've set mine up to only hit localhost:9091, and I then run an Express server on that port serving the endpoints I need.
Maybe an ugly way to do it; I would love suggestions!
My mock file:
// src/helpers/axios.e2e.js
import axios from 'axios';
const instance = axios.create({
  baseURL: `http://localhost:9091/api`,
});

export default instance;
Here's how I am running an express server during tests:
// e2e/mytest.e2e.js
const express = require('express');
let server;
beforeAll(async () => {
  const app = express();

  app.post('/api/users/register/', (req, res) => {
    res.status(400);
    res.send({ "email": ["Test error: Email is required!"] });
  });

  await new Promise(function (resolve) {
    server = app.listen(9091, "127.0.0.1", function () {
      console.log(` Running server on '${JSON.stringify(server.address())}'...`);
      resolve();
    });
  });
});

afterAll(() => {
  server.close();
});
Correct me if I am wrong, but getServerSideProps is used to pre-render data on each request? If I use the standard redis npm module in getServerSideProps I get the error net.isIP is not a function. From what I have researched, this is due to the client trying to use the redis functions.
I am trying to create an application where session data is saved in a redis key based on a cookie token. Based on the user ID a database is called and data is rendered to the component. I can get the cookie token in getServerSideProps, but if I run client.get(token) I get the error net.isIP is not a function at runtime.
Am I not using getServerSideProps correctly, or should I be using a different method/function? I am new to the whole Next.js world. I appreciate the help.
If I use the same functionality in an /api route everything works correctly.
import util from 'util';
import client from '../libs/redis' // using 'redis' module

// ... my component here

export async function getServerSideProps(context) {
  const get = util.promisify(client.get).bind(client);
  const name = await get('mytoken') // error `net.isIP is not a function`

  return {
    props: {
      name
    },
  }
}

// redis.js
const redis = require("redis");

const client = redis.createClient();

client.on("error", function (error) {
  console.error(error);
});

module.exports = client
I upgraded to next version 10.0.6 from 9.3 and I do not receive the error anymore.
I'm using promises to wrap asynchronous (Mongo) DB ops at the end of an (expressJS) route.
I want to try and figure out how to test the following code.
userService
userService.findOne = function (id) {
  var deferred = q.defer();

  User.findOne({ "_id": id })
    .exec(function (error, user) {
      if (error) {
        deferred.reject(error);
      } else {
        deferred.resolve(user);
      }
    });

  return deferred.promise;
};
userRoute
var user = function (req, res) {
  var userId = req.params.id
    , userService = req.load("userService");
    // custom middleware that enables me to inject mocks

  return userService.findOne(userId)
    .then(function (user) {
      console.log("called then");
      res.json({
        msg: "foo"
      });
    }).catch(function (error) {
      console.log("called catch");
      res.json({
        error: error
      });
    }).done();
};
Here's an attempt to test the above with mocha
userTest
it("when resolved", function (done) {
var jsonSpy = sinon.spy(httpMock.res, "json")
, httpMock = require("/path/to/mock/http/object")
, serviceMock = require("/path/to/mock/service"),
, deferred = q.defer()
, findStub = sinon.stub(serviceMock, "findOne")
.returns(deferred.promise)
, loadStub = sinon.stub(httpMock.req, "load")
.returns(serviceMock),
retPromise;
// trigger route
routes.user(httpMock.req, httpMock.res);
// force promise to resolve?
deferred.resolve();
expect(jsonSpy.called).to.be.true; // fails
// chai as promised
retPromise = findStub.returnValues[0];
expect(retPromise).to.be.fulfilled; // passes
});
The http mock is just an empty object with no-ops where Express.js would normally start rendering stuff. I've added some logging inside those no-ops to get an idea of how this hangs together.
This isn't really working out. I want to verify how the whole thing is integrated, to establish some sort of regression suite, but I've effectively mocked it to smithereens and I'm just testing my mocks (not entirely successfully at that).
I'm also noticing that the console logs inside my http mocks triggered by then and catch are firing twice, but the jsonSpy that is invoked inside the actual code (verified by logging out the sinon spy within the userRoute code) is not called in the test.
Has anyone got advice on integration testing strategies for Express apps backed by Mongo?
It looks to me like you're not giving your promise an opportunity to fire before you check if the result has been called. You need to wait asynchronously for userService.findOne()'s promise chain to complete before jsonSpy.called will be set. Try this instead:
// start of code as normal
q.when(
  routes.user(httpMock.req, httpMock.res),
  function () { expect(jsonSpy.called).to.be.true; }
);
deferred.resolve();
// rest of code as normal
That should chain off the routes.user() promise and pass as expected.
One word of caution: I'm not familiar with your framework, so I don't know if it will wait patiently for all async events to go off. If it's giving you problems calling back into your defer chain, you may want to try nodeunit instead, which handles async tests very well (IMO).
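An alternative sketch, assuming the route is changed to return the promise chain instead of terminating it with .done(): drive mocha's done callback from that chain.

// Sketch: only valid if userRoute returns the chain rather than calling .done()
it("when resolved", function (done) {
  // ... stubs set up as in the test above ...
  deferred.resolve({ _id: "123" });      // hypothetical resolved user

  routes.user(httpMock.req, httpMock.res)
    .then(function () {
      expect(jsonSpy.called).to.be.true;
      done();
    })
    .catch(done);
});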