Mocking runtime config value with sinon - config

I am adding some config values before the Hapi server starts. The application works fine, but in tests I cannot use config.get(); I can get around it with proxyquire. So I was wondering:
Is adding config values "dynamically" bad design?
Is there a way I can use config.get() in such a situation?
Any alternative approach?
// initialize.js
const config = require('config');

async function startServe() {
  const someConfigVal = await callAPIToGetSomeJSONObject();
  config.dynamicValue = someConfigVal;
  server.start();
}
// doSomething.js
const config = require('config');

function doesWork() {
  const valFromConfig = config.dynamicValue.X;
  // In tests I can use proxyquire by creating a fake config object.
  ...
}

function doesNotWork() {
  const valFromConfig = config.get('dynamicValue.X');
  // Does not work with sinon mocking, as this value does not exist in config when the test runs.
  // sinon.stub(config, 'get').withArgs('dynamicValue.X').returns(someVal);
  ...
}

Context: testing.
Is adding config values "dynamically" bad design? => No. I have done it before: the test code changes the configuration file default.json mid-test to check whether the function under test behaves as expected. I have used several config utilities this way.
Is there a way I can use config.get() in such a situation? => Yes. For sinon usage, see the example below, which uses Mocha. You need to define the stub/mock before the function under test uses it, and do not forget to restore the stub/mock afterwards. There is also official documentation related to this, Altering configuration values for testing at runtime, which does not use sinon; a rough sketch of that follows after the example below.
const config = require('config');
const sinon = require('sinon');
const { expect } = require('chai');

// Example: simple function under test.
function other() {
  const valFromConfig = config.get('dynamicValue.X');
  return valFromConfig;
}

describe('Config', function () {
  it('without stub or mock.', function () {
    // Config dynamicValue.X does not exist.
    // Expect an error to be thrown.
    try {
      other();
      expect.fail('expect never get here');
    } catch (error) {
      expect(error.message).to.equal('Configuration property "dynamicValue.X" is not defined');
    }
  });

  it('get using stub.', function () {
    // Create the stub.
    const stubConfigGet = sinon.stub(config, 'get');
    stubConfigGet.withArgs('dynamicValue.X').returns(false);
    // Call the function under test.
    const test = other();
    // Validate the result.
    expect(test).to.equal(false);
    expect(stubConfigGet.calledOnce).to.equal(true);
    // Restore the stub.
    stubConfigGet.restore();
  });

  it('get using mock.', function () {
    // Create the mock.
    const mockConfig = sinon.mock(config);
    mockConfig.expects('get').once().withArgs('dynamicValue.X').returns(false);
    // Call the function under test.
    const test = other();
    // Validate the result.
    expect(test).to.equal(false);
    // Verify (and restore) the mock.
    expect(mockConfig.verify()).to.equal(true);
  });
});
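For the route without sinon, here is a minimal sketch of mutating config at runtime in a test. It assumes node-config's ALLOW_CONFIG_MUTATIONS environment variable so config stays mutable after the first get() call; the injected value mirrors what initialize.js does and is only an illustration.
// Run the suite with mutations allowed, e.g.: ALLOW_CONFIG_MUTATIONS=true npx mocha
const config = require('config');
const { expect } = require('chai');

// Same function under test as in the example above.
function other() {
  return config.get('dynamicValue.X');
}

describe('Config without sinon', function () {
  before(function () {
    // Inject the runtime value directly, just like initialize.js does before server.start().
    config.dynamicValue = { X: false };
  });

  it('get returns the injected value.', function () {
    expect(other()).to.equal(false);
  });
});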
Hope this helps.

Related

Redefine $fetch in nuxt3 with global onRequest handler

Is it possible to use a global onRequest handler with $fetch in Nuxt 3, to add specific data to each request?
With Nuxt 2 and axios it was simple:
// /plugins/axios.js
export default function ({ $axios, store, req }) {
  $axios.onRequest((config) => {
    if (config.data) {
      config.data.test = '123';
    } else {
      config.data = { test: '123' };
    }
    return config;
  });
}
But how do I achieve the same goal with Nuxt 3 and $fetch?
OK, so the Nuxt 3 $fetch documentation says:
Nuxt uses ofetch to expose globally the $fetch helper...
When we jump into the ofetch documentation we can see the Interceptors section. This gives us some options to do what you are trying to achieve. My suggestion is this:
Create an http composable (or any other name you wish):
// composables/use-http.js
const opts = {
  async onRequest({ request, options }) {
    // Add your specific data here
    options.query = { t: '1234' }
    options.headers = { 'Authorization': 'my_token' }
  }
}

export default () => $fetch.create(opts)
And here we are making use of the onRequest interceptor from ofetch:
onRequest is called as soon as ofetch is being called, allowing to modify options or just do simple logging.
There you can add any data you want; if you need to, you can add logic to pass parameters to this composable, and so on. A body-modifying variant is sketched after the usage example below.
Now, to actually fetch the data (use the composable):
const http = useHttp() // useHttp is auto-imported
const data = await http('/url') // will trigger the interceptor
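To mirror the original axios plugin, which merged test: '123' into every request body, a variant of the same composable could modify options.body in the interceptor. This is only a sketch and assumes callers pass plain-object bodies (FormData or string bodies would need different handling):
// composables/use-http.js: variant that adds a field to every request body
const opts = {
  async onRequest({ request, options }) {
    // Merge the extra field into whatever body the caller provided, if any.
    options.body = { ...(options.body || {}), test: '123' }
  }
}

export default () => $fetch.create(opts)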

How to directly test a Solidity Library with Hardhat/Chai

I'm trying to test a Solidity library directly using Hardhat and Chai.
This is an example library I would like to test:
library LibTest {
    function testFunc() public view returns (bool) {
        return true;
    }
}
and this is how I'm trying to test it.
beforeEach(async () => {
  const LibTest = await ethers.getContractFactory("LibTest");
  const libTest = await LibTest.deploy();
  await libTest.deployed();
})

describe('Testing test()', function () {
  it("is working testFunc ?", async function () {
    console.log(await libTest.testFunc());
  })
})
But I have the error message:
ReferenceError: libTest is not defined
I read everything I could in the Chai docs and Hardhat docs but can't find any solution.
I would suggest using a fixture and also utilizing Waffle, deploying the library contract once and saving a snapshot of the contract:
const { loadFixture } = require('@nomicfoundation/hardhat-network-helpers');
const { expect } = require('chai');
const { ethers, waffle } = require('hardhat');
const { deployContract } = waffle;
const LibArtifact = require('../artifacts/contracts/lib.sol/LibTest.json');

describe("Lib tests", function () {
  // We define a fixture to reuse the same setup in every test.
  // We use loadFixture to run this setup once, snapshot that state,
  // and reset Hardhat Network to that snapshot in every test.
  async function deployOnceFixture() {
    const [owner, ...otherAccounts] = await ethers.getSigners();
    const lib = await deployContract(owner, LibArtifact);
    return { lib, owner, otherAccounts };
  }

  describe("Testing test()", function () {
    it("is working testFunc ?", async function () {
      const { lib } = await loadFixture(deployOnceFixture);
      expect(await lib.testFunc()).to.be.true;
    });
  });
});
Instructions to add the Waffle plugin:
Hardhat-waffle
Basically, you need to install the libraries:
npm install --save-dev @nomiclabs/hardhat-waffle 'ethereum-waffle@^3.0.0' @nomiclabs/hardhat-ethers 'ethers@^5.0.0'
And then import hardhat-waffle in the hardhat.config.js file:
require("@nomiclabs/hardhat-waffle");
Also, notice I put the test file in a test directory, so I needed to go back one folder to find the artifacts generated by running npx hardhat compile.
For convenience I pushed the solution in a Github repo: https://github.com/Tahlil/run-solidity-lib
The best way I have found to go about this is to create a LibTest.sol contract that invokes and tests the library itself, and then run abstracted tests in JS/TS that invoke the LibTest contract, connecting the Lib.sol contract to it during deployment in Hardhat.
const Lib = await ethers.getContractFactory("Lib");
const lib = await Lib.deploy();

const LibTest = await ethers.getContractFactory("LibTest", {
  libraries: {
    Lib: lib.address,
  },
});
const libTest = await LibTest.deploy();
// later: console.log(await libTest.testLibFunc());
LibTest.sol:
import "./Lib.sol";

library LibTest {
    function testLibFunc() public view returns (bool) {
        bool response = Lib.testFunc();
        return response;
    }
}
Were you able to find a better method?

Unit Testing the NestJS Routes are defined

I found this question about determining the routes. The first answer is exactly what I need, and it works:
import { Controller, Get, Request } from "@nestjs/common";
import { Request as ExpressRequest, Router } from "express";

@Get()
root(@Request() req: ExpressRequest) {
  const router = req.app._router as Router;
  return {
    routes: router.stack
      .map(layer => {
        if (layer.route) {
          const path = layer.route?.path;
          const method = layer.route?.stack[0].method;
          return `${method.toUpperCase()} ${path}`;
        }
      })
      .filter(item => item !== undefined)
  };
}
I want to be able to unit test this.
My end-to-end test works fine:
it('/api (GET) test expected routes', async done => {
  const ResponseData = await request(app.getHttpServer())
    .get('/api')
    .set('Accept', 'application/json');
  expect(ResponseData.status).toBe(200);
  expect(ResponseData.headers['content-type']).toContain('json');
  expect(ResponseData.body.routes.length).toBeGreaterThan(2);
  done(); // Call this to finish the test
});
The problem I am having is how to create and pass the Request argument that root() needs in a unit test. ExpressRequest is not a class or anything I can simply instantiate and assign values to; it is currently a large type definition. I assume there must be an easy way to create one, but I have not found it yet.
You can make use of the @golevelup/ts-jest package to help create mocks of objects. It can take an interface as a generic and return an entire jest mock that is compatible with the type.
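A sketch of what that could look like for the root() handler above. The controller class name, import path, and fake router layers are illustrative assumptions, not something from the original answer:
import { createMock } from '@golevelup/ts-jest';
import { Request as ExpressRequest } from 'express';
// Hypothetical path/name for the controller that declares root().
import { AppController } from './app.controller';

describe('AppController (unit)', () => {
  it('lists the registered routes', () => {
    const controller = new AppController();

    // createMock takes the interface as a generic and returns a deep jest mock;
    // the partial passed in overrides just the parts the handler actually reads.
    const req = createMock<ExpressRequest>({
      app: {
        _router: {
          stack: [
            { route: { path: '/api', stack: [{ method: 'get' }] } },
            {}, // a middleware layer without a route gets filtered out
          ],
        },
      } as any,
    });

    expect(controller.root(req)).toEqual({ routes: ['GET /api'] });
  });
});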

using mock data while on development with vue + vuex

I'm working on a Vue app which also uses vuex.
Everything is set up and working correctly as expected, but I'd like to improve it so that I can work on it without actually calling the API endpoints (mainly to avoid rate limits).
I created a mock folder and placed some files in there.
How do I manage to use those mocks in development, and the real API endpoints in the production build, without making a mess in my code?
I created a repo with as little code as possible.
It includes Vue + Vuex, a single smart component in charge of reading from the store, and a dumb component to display it.
In short, I'm looking for a way to change this:
const actions = {
  async fetchTodos({ commit }) {
    let response;
    if (process.env.NODE_ENV === "development") {
      response = { data: todos };
    } else {
      response = await axios.get("https://jsonplaceholder.typicode.com/todos");
    }
    commit("setTodos", response.data);
  }
};
with something that is easier to maintain and doesn't increase the bundle size.
I thought about mocking the whole actions object, which seemed to be OK, but how do I avoid bundling my mock files at that point?
How do you manage your front-end environment to avoid this kind of problem?
What I did is encapsulate the whole API in another class/object. That single point of entry then switches between the mock and the real API:
// store.js
const api = require('./api');

const actions = {
  async fetchTodos({ commit }) {
    // You can use api.getTodos() instead, or another naming convention.
    const response = await api.get('todos');
    commit("setTodos", response.data);
  },
};

// api.js
const realapi = require('./realapi');
const mockapi = require('./mockapi');
module.exports = process.env.NODE_ENV === 'production' ? realapi : mockapi;

// mockapi/index.js
const todos = loadTodos();
module.exports = {
  async get(path) {
    switch (path) {
      case 'todos':
        return { data: todos };
      // etc.
    }
  }
};

// realapi/index.js
const axios = require('axios');
const root = 'https://jsonplaceholder.typicode.com/';
module.exports = {
  get(path) {
    return axios.get(root + path);
  }
};
Builders like Webpack can optimize the build and remove the whole mock API part in production builds based on the environment; see the sketch below for one structure that makes this easy.
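A minimal sketch of that idea, assuming the bundler statically replaces process.env.NODE_ENV at build time (webpack's DefinePlugin, which Vue CLI configures by default). Branching around the require itself lets the dead branch be eliminated, so the mock module never reaches the production bundle:
// api.js: variant that lets the bundler drop the mock entirely in production
let api;
if (process.env.NODE_ENV === 'production') {
  api = require('./realapi');
} else {
  // This branch is statically false in production builds, so ./mockapi
  // (and the mock data it loads) is left out of the production bundle.
  api = require('./mockapi');
}
module.exports = api;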

How Do I use a node.js module with Next.js?

Do I need to use Express with Next.js?
I'm trying to add this code into a Next.js application (from the npm module example code: pdf2json):
let fs = require('fs');
var PDFParser = require("pdf2json");

let pdfParser = new PDFParser(this, 1);

pdfParser.on("pdfParser_dataError", errData =>
  console.error(errData.parserError));
pdfParser.on("pdfParser_dataReady", pdfData => {
  fs.writeFile("./sometxt.txt", pdfParser.getRawTextContent());
});

pdfParser.loadPDF("./page1.pdf");
You can require it conditionally by testing if it is the server:
static async getInitialProps({ isServer }) {
  var fs;
  if (isServer) {
    fs = require('fs');
    // ...and do whatever you want
  }
}
and do not forget to tell webpack not to send the module to the client side by changing package.json like so:
"browser": {
  "fs": false
}
otherwise it can produce errors.
The thing that's probably biting you is that most of your code must work on both the client and the server. You can write server-only code by creating a getInitialProps() method and checking whether it is passed an opts.req; if so, you know the code is running server-side and you can hit the filesystem:
import React from 'react'

const doServerSideStuff = () => {
  let fs = require('fs');
  var PDFParser = require("pdf2json");

  let pdfParser = new PDFParser(this, 1);

  pdfParser.on("pdfParser_dataError", errData =>
    console.error(errData.parserError));
  pdfParser.on("pdfParser_dataReady", pdfData => {
    fs.writeFile("./sometxt.txt", pdfParser.getRawTextContent());
  });

  pdfParser.loadPDF("./page1.pdf");
}

export default class extends React.Component {
  static async getInitialProps ({ req }) {
    if (req) {
      doServerSideStuff();
    }
    return {};
  }

  render () {
    return <div> Hello World </div>
  }
}
This isn't really a complete example yet: you should really make doServerSideStuff() async (or return a promise) and then await it in getInitialProps, eventually returning props that represent the result of the parsing and saving. Also, handle fs.writeFile errors. But hopefully it's enough to get you going in the right direction. A rough sketch of that async version follows below.
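Here is a sketch of that async shape, wrapping the pdf2json events in a promise so getInitialProps can await it; the helper name, file paths, and the returned prop are illustrative assumptions:
// Sketch: resolve with the raw text once pdf2json finishes parsing.
const parsePdf = (path) =>
  new Promise((resolve, reject) => {
    const PDFParser = require("pdf2json");
    const pdfParser = new PDFParser(this, 1);
    pdfParser.on("pdfParser_dataError", errData => reject(errData.parserError));
    pdfParser.on("pdfParser_dataReady", () => resolve(pdfParser.getRawTextContent()));
    pdfParser.loadPDF(path);
  });

// Inside the page component:
// static async getInitialProps({ req }) {
//   if (req) {
//     const text = await parsePdf("./page1.pdf");
//     return { text }; // prop name is just an example
//   }
//   return {};
// }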
See the docs for some more info on this.
Or you could just use Express like you suggested. There is a good tutorial and example code that should help you get started if you decide to go that route.
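If you go the Express route, a minimal custom server for Next.js looks roughly like this; the /api/parse route is just a hypothetical place to run Node-only code such as the PDF parsing:
// server.js: minimal custom Express server for Next.js
const express = require('express');
const next = require('next');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  // Server-only endpoint: fs, pdf2json, etc. can run here safely.
  server.get('/api/parse', (req, res) => {
    res.json({ ok: true }); // call the PDF parsing here
  });

  // Hand everything else to Next.js.
  server.all('*', (req, res) => handle(req, res));

  server.listen(3000, () => console.log('Ready on http://localhost:3000'));
});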