I am building a REST API in Express and I'm trying to mock Redis in my Jasmine unit tests (using redis-mock).
If supertest is making a request to the API, how do I tell my app to use the mock Redis instead of the actual Redis? I know I'll probably need to create separate modules for Redis that I can swap out somehow; I'm just not sure how to swap it out for a regular request vs. a supertest request.
Unit test:
describe('Instance API v1', () => {
  it('returns a list of instances', (done) => {
    request(app.default)
      .get('/v1/instance')
      .set('Authorization', 'Bearer ' + authToken)
      .expect(200)
      .expect('Content-Type', 'application/json; charset=utf-8')
      .end((error) => (error) ? done.fail(error) : done());
  });
});
Route handler:
getAll = (request: express.Request, response: express.Response) => {
  let redis: RedisClient = request.app.locals.redisclient;
  let prefix = 'instance/*';
  const scanner = new RedisScan(redis);
  scanner.scan(prefix, (err, matchingKeys) => {
    if (err) throw err;
    // matchingKeys will be an array of strings if matches were found,
    // otherwise it will be an empty array.
    console.log(matchingKeys);
    response.json(matchingKeys);
  });
};
In my previous experience, I didn't mock Redis in integration tests, so I could test the full flow with real functionality.
If you want to mock Redis, you must do it before you require and initialize your application in the test, something like:
beforeAll(function() { // Jasmine's beforeAll (the Mocha equivalent is before)
  const matchingKeys = '1234';
  sinon.stub(RedisScan.prototype, 'scan').yields(null, matchingKeys);
  const app = require('./app');
  return app.init(); // your init or whatever function initiates your Express app
});
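Alternatively, since you mentioned splitting Redis into its own module: you can have that module decide which client to create based on the environment, so supertest requests hit redis-mock while normal runs use the real client. A minimal sketch, assuming a hypothetical redis-client.js module and a NODE_ENV=test convention (neither is in your code):

// redis-client.js (hypothetical module name)
// Pick redis-mock in tests, the real client everywhere else.
const redis = process.env.NODE_ENV === 'test'
  ? require('redis-mock')
  : require('redis');

module.exports = redis.createClient();

Then wire it up once at startup (app.locals.redisclient = require('./redis-client');) and both regular and supertest requests will transparently use whichever client the environment selected.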
Hope it helps
Looking at using Next.js and the new API routes with Apollo Server. However, I'm not using one source of data; I will be using Contentful and Shopify, and there could be more third-party APIs.
How do you consume multiple third-party APIs and access all the data through a custom endpoint?
Any examples?
You can combine multiple GraphQL APIs into a unified endpoint using a technique called GraphQL schema stitching. Apollo has an extensive guide on the topic.
I've also written a two-part guide on how to stitch the Contentful API together with other APIs: Part 1, Part 2.
All of these examples are built around Apollo Server, but the same approach works with Apollo Client as well. So you can choose to do the stitching either as part of your API (merging your GraphQL API, Shopify and Contentful into a unified endpoint) or client side. Both have advantages and disadvantages: doing it server side generally results in fewer requests from your client, while doing it client side removes the single point of failure that is your stitching proxy.
A basic set-up, using Apollo Server, would be the following:
const {ApolloServer} = require('apollo-server');
const {HttpLink} = require('apollo-link-http');
const {setContext} = require('apollo-link-context');
const {
  introspectSchema,
  makeRemoteExecutableSchema,
  mergeSchemas
} = require('graphql-tools');
const fetch = require('node-fetch');

const {GITHUB_TOKEN, CONTENTFUL_SPACE, CONTENTFUL_TOKEN} = require('./config.json');

const gitHubLink = setContext((request) => ({
  headers: {
    'Authorization': `Bearer ${GITHUB_TOKEN}`,
  }
})).concat(new HttpLink({uri: 'https://api.github.com/graphql', fetch}));

const contentfulLink = setContext((request) => ({
  headers: {
    'Authorization': `Bearer ${CONTENTFUL_TOKEN}`,
  }
})).concat(new HttpLink({uri: `https://graphql.contentful.com/content/v1/spaces/${CONTENTFUL_SPACE}`, fetch}));

async function startServer() {
  const gitHubRemoteSchema = await introspectSchema(gitHubLink);
  const gitHubSchema = makeRemoteExecutableSchema({
    schema: gitHubRemoteSchema,
    link: gitHubLink,
  });

  const contentfulRemoteSchema = await introspectSchema(contentfulLink);
  const contentfulSchema = makeRemoteExecutableSchema({
    schema: contentfulRemoteSchema,
    link: contentfulLink,
  });

  const schema = mergeSchemas({
    schemas: [
      gitHubSchema,
      contentfulSchema
    ]
  });

  const server = new ApolloServer({schema});
  return server.listen();
}

startServer().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`);
});
Note that this will just merge the two GraphQL schemas. You might have conflicts if two APIs have the same name for a type or root field.
To make sure you don't have any conflicts, I suggest you transform the schemas to give the types and root fields unique prefixes. This could look like this:
const {
  transformSchema,
  RenameTypes,
  RenameRootFields
} = require('graphql-tools');

async function getContentfulSchema() {
  const contentfulRemoteSchema = await introspectSchema(contentfulLink);
  const contentfulSchema = makeRemoteExecutableSchema({
    schema: contentfulRemoteSchema,
    link: contentfulLink,
  });

  // Rename conflicting types and root fields so they don't clash
  // with the other schema when the two are merged.
  return transformSchema(contentfulSchema, [
    new RenameTypes((name) => {
      if (name.includes('Repository')) {
        return name.replace('Repository', 'RepositoryMetadata');
      }
      return name;
    }),
    new RenameRootFields((operation, fieldName) => {
      if (fieldName.includes('repository')) {
        return fieldName.replace('repository', 'repositoryMetadata');
      }
      return fieldName;
    })
  ]);
}
This will let you use all the apis in a unified manner but you can't query across the different APIs. To do that, you'll have to stitch together fields. I suggest you dive into the links above, they explain this in more depth.
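To give a flavour of what stitching fields across APIs looks like, here is a hedged sketch using the same graphql-tools version as above. The ContentfulPost type, its repoName field, and the repositoryMetadata root field are hypothetical names for illustration, not real schema members:

// Extend a (hypothetical) Contentful type with a field served by GitHub.
const linkTypeDefs = `
  extend type ContentfulPost {
    repository: RepositoryMetadata
  }
`;

const schema = mergeSchemas({
  schemas: [gitHubSchema, contentfulSchema, linkTypeDefs],
  resolvers: {
    ContentfulPost: {
      repository: {
        // Ensure the field needed for the lookup is always fetched.
        fragment: `... on ContentfulPost { repoName }`,
        resolve(post, args, context, info) {
          // Delegate the sub-query to the GitHub schema.
          return info.mergeInfo.delegateToSchema({
            schema: gitHubSchema,
            operation: 'query',
            fieldName: 'repositoryMetadata',
            args: {name: post.repoName},
            context,
            info,
          });
        },
      },
    },
  },
});

With that in place, a single query can walk from a Contentful entry into the matching GitHub data in one round trip to your endpoint.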
I'm trying to instrument my React web app using https://docs.aws.amazon.com/xray/latest/devguide/scorekeep-client.html
I am using an axios interceptor, but I'm unable to get it instrumented. Any further ideas?
Here's the axios interceptor code you'll need for X-Ray. Axios does not use the base HTTP library from Node, so you'll need to include this patcher.
I recently wrote a sample app to be published; here is the snippet I used.
Hopefully this helps.
const xray = require('aws-xray-sdk-core');

const captureAxios = function (axios) {
  // Add a request interceptor that opens a subsegment per outgoing call
  axios.interceptors.request.use(function (config) {
    var parent = xray.getSegment();
    var subsegment = parent.addNewSubsegment(config.baseURL + config.url.substr(1));
    subsegment.namespace = 'remote';

    let root = parent.segment ? parent.segment : parent;
    let header = 'Root=' + root.trace_id + ';Parent=' + subsegment.id + ';Sampled=' + (!root.notTraced ? '1' : '0');

    // Propagate the trace header on outgoing requests
    config.headers.get = { 'x-amzn-trace-id': header };
    config.headers.post = { 'x-amzn-trace-id': header };

    xray.setSegment(subsegment);
    return config;
  }, function (error) {
    var subsegment = xray.getSegment().addNewSubsegment('Intercept request error');
    subsegment.close(error);
    return Promise.reject(error);
  });

  // Add a response interceptor that records the result and closes the subsegment
  axios.interceptors.response.use(function (response) {
    var subsegment = xray.getSegment();
    const res = { statusCode: response.status, headers: response.headers };
    subsegment.addRemoteRequestData(response.request, res, true);
    subsegment.close();
    return response;
  }, function (error) {
    var subsegment = xray.getSegment();
    subsegment.close(error);
    return Promise.reject(error);
  });
};

module.exports = captureAxios;
Usage
Just pass in an initialized instance of Axios.
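For example, something like this (the require path is whatever you saved the snippet as; this is a sketch, not part of the published sample):

const axios = require('axios');
const captureAxios = require('./captureAxios'); // path is an assumption

// Patch the default instance...
captureAxios(axios);

// ...or a custom one.
const client = axios.create({ baseURL: 'https://api.example.com' });
captureAxios(client);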
For React, you'll have to tell me a bit more about what your setup is and what you're trying to accomplish. X-Ray only cares about the routes in your application - typically interceptors are set up on the routes to collect data and create (and close) the root segment (see the X-Ray SDK for Node Express here). For browser-based integration, we're still discussing possible options on the X-Ray end.
I'm in the middle of learning Cycle.js and ran into a challenge. I have a component that will get a result from an HTTP call, and I'd like to persist this response in IndexedDB. However, I feel that the request for persistence is the responsibility of another component.
The questions I have are:
Is this a use case for a custom driver that persists HTTP responses to IndexedDB?
How does another component access the response stream for a request it did not make?
When I try to select the category from the HTTP source, nothing gets logged to the console. I'm using xstream, so the streams should be hot and I expect debug to output. What's going on here?
Below is my component that makes the HTTP call:
import { Feed } from './feed'

export function RssList ({HTTP, props}, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({
      url: url,
      method: 'GET',
      category: 'rss'
    }))

  const response$ = HTTP
    .select('rss')
    .flatten()
    .map(feedAdapter)

  const vDom$ = response$
    .map(Feed)
    .startWith('')

  return {
    DOM: vDom$,
    HTTP: request$
  }
}
Here is my attempt at accessing the response at the app level:
export function main (sources) {
  const urlSource = url$(sources)
  const rssSink = rss$(sources, urlSource.value)

  const vDom$ = xs.combine(urlSource.DOM, rssSink.DOM)
    .map(([urlInput, rssList]) =>
      <div>
        {urlInput}
        {rssList}
      </div>
    )

  sources.HTTP.select('rss').flatten().debug() // nothing happens here

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP
  }
}
Selecting a category in the main (the parent) component is the correct approach, and is supported.
The only reason why sources.HTTP.select('rss').flatten().debug() doesn't log anything is because that's not how debug works. It doesn't "subscribe" to the stream and create side effects. debug is essentially like a map operator that uses an identity function (always takes x as input and outputs x), but with a logging operation as a side effect. So you either need to replace .debug() with .addListener({next: x => console.log(x)}) or use the stream that .debug() outputs and hook it with the operator pipeline that goes to sinks. In other words, debug is an in-between logging side effect, not a destination logging side effect.
Question #1: Custom HTTP->IDB Driver: It depends on the nature of the project; for a simple example I used a general Cycle.js IDB driver. See the example below or the codesandbox.io example.
Question #2: Components Sharing Streams: Since components and main share the same source/sink API, you can link the output (sink) of one component to the input (source) of another. See the example below or the codesandbox.io example.
Question #3: debug and Logging: As the authoritative (literally) André Staltz pointed out, debug needs to be inserted into a completed stream cycle, i.e. an already subscribed/listened stream.
In your example you can put debug in your RssList component:
const response$ = HTTP
  .select('rss')
  .flatten()
  .map(feedAdapter)
  .debug()
OR add a listener to your main example:
sources.HTTP.select('rss').flatten().debug()
  .addListener({next: x => console.log(x)})
OR, what I like to do, is include a log driver:
run(main, {
  DOM: makeDOMDriver('#app'),
  HTTP: makeHTTPDriver(),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
Then I'll just duplicate a stream and send it to the log sink:
const url$ = props.url$
const http$ = url$.map(url => ({url: url, method: 'GET', category: 'rss'}))
const log$ = url$

return {
  DOM: vdom$,
  HTTP: http$,
  log: log$,
}
Here's some example code for sending HTTP response to IndexedDB storage, using two components that share the data and a general IndexedDB driver:
function main(sources) {
  const header$ = xs.of(div('RSS Feed:'))
  const rssSink = RssList(sources) // input: HTTP select and props
                                   // output: VDOM and data for IDB storage

  const vDom$ = xs.combine(header$, rssSink.DOM) // build VDOM
    .map(([header, rssList]) => div([header, rssList]))

  const idbSink = IdbSink(sources, rssSink.IDB) // output: store and put HTTP response

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP, // send HTTP request
    IDB: idbSink.put,   // send response to IDB store
    log: idbSink.get,   // get and log data stored in IDB
  }
}

function RssList({ HTTP, props }, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({url: url, method: 'GET', category: 'rss'}))

  const response$ = HTTP.select('rss').flatten().map(feedAdapter)

  const idb$ = response$

  const vDom$ = response$
    .map(Feed)
    .startWith(div('', '...loading'))

  return {
    DOM: vDom$,
    HTTP: request$,
    IDB: { response: idb$ },
  }
}

function Feed (feed) {
  return div('> ' + feed)
}

function IdbSink(sources, idb) {
  return {
    get: sources.IDB.store('rss').getAll()
      .map(obj => (obj['0'] && obj['0'].feed) || 'unknown'),
    put: idb.response
      .map(feedinfo => $put('rss', { feed: feedinfo }))
  }
}

run(main, {
  props: () => ({ url$: xs.of('http://lorem-rss.herokuapp.com/feed') }),
  DOM: makeDOMDriver('#root'),
  HTTP: makeHTTPDriver(),
  IDB: makeIdbDriver('rss-db', 1, upgradeDb => {
    upgradeDb.createObjectStore('rss', { keyPath: 'feed' })
  }),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
This is a contrived example, simply to explore the issues raised. Codesandbox.io example.
I'm trying to test an endpoint from my Express application using Jest. I'm migrating from Mocha to try out Jest to improve the speed. However, my Jest test run does not exit. I'm at a loss...
process.env.NODE_ENV = 'test';

const app = require('../../../index');
const request = require('supertest')(app);

it('should serve the apple-app-site-association file /assetlinks.json GET', async () => {
  const response = await request.get('/apple-app-site-association')
  expect(response.statusCode).toBe(200);
});
The only thing I can think of that would cause this failure is that you might be missing the package babel-preset-env.
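If that turns out to be it, installing the preset and enabling it in your Babel config should be enough for async/await to compile. A sketch of the Babel 6 style .babelrc this assumes:

{
  "presets": ["env"]
}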
In any case there are two other ways to use supertest:
it('should serve the apple-app-site-association file /assetlinks.json GET', () => {
  return request.get('/apple-app-site-association').expect(200)
})
or
it('should serve the apple-app-site-association file /assetlinks.json GET', (done) => {
  request.get('/apple-app-site-association').then((response) => {
    expect(response.statusCode).toBe(200);
    done()
  })
})
async is the fancy solution, but it is also the one that has more requirements. If you manage to find out what the issue was, let me know :).
(Reference for my answer: http://www.albertgao.xyz/2017/05/24/how-to-test-expressjs-with-jest-and-supertest/)
it("should serve the apple-app-site-association file /assetlinks.json GET", async () => {
await request
.get("/apple-app-site-association")
.send()
.expect(200);
});
If your configuration setup is correct, this code should work.
I'm trying to test a REST API wrapped in an AngularJS service using Jasmine async testing. I know, I should be, and am, running tests on the API on the server, and using a mock $httpBackend to test the service, but I'd still like to know how to do async tests where I need to.
The problem I'm having is my deferreds (returned by $http) never seem to resolve. Going past the service and trying to simply use $http has the same problem. What am I doing wrong here:
describe('Jasmine + Angular + $http', function() {
  var injector;

  beforeEach(function() {
    injector = angular.injector(['ng']);
  });

  it('can be tested', function() {
    injector.invoke(function($http) {
      var complete = false;

      runs(function() {
        $http.get('...').then(function(res) {
          complete = true;
        });
      });

      waitsFor(function() {
        return complete;
      }, 'query to complete', 5000);

      runs(function() {
        expect(complete).toEqual(true);
      });
    });
  });
});
(I'm using grunt-contrib-jasmine to run the tests)
The HTTP requests will not fire unless you call $httpBackend.flush().
More information can be found here: http://docs.angularjs.org/api/ngMock.$httpBackend
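For example, with the ngMock module loaded (angular-mocks), a minimal sketch of the flushing pattern looks like this; the URL and response body here are made up:

describe('Jasmine + Angular + $http', function() {
  var $http, $httpBackend;

  beforeEach(function() {
    // Loading ngMock swaps in the mock $httpBackend.
    var injector = angular.injector(['ng', 'ngMock']);
    $http = injector.get('$http');
    $httpBackend = injector.get('$httpBackend');
  });

  it('can be tested', function() {
    var complete = false;
    $httpBackend.whenGET('/api/thing').respond(200, {ok: true});

    $http.get('/api/thing').then(function() {
      complete = true;
    });

    // flush() fires the pending request and runs the digest,
    // so the promise actually resolves inside the test.
    $httpBackend.flush();
    expect(complete).toBe(true);
  });
});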