Distinguish knex client from knex transaction - orm

Knex promise-based transactions can be used like a regular knex client:
const db = knex.transaction() // or just const db = require('knex')(knexOptions)
db('books').insert(books); // it works in both cases
I need to accept a knex object as an argument of my function and execute a transaction inside it. Can I somehow distinguish whether a knex client or a knex transaction was passed as the argument? If it is not a transaction, I want to wrap the client object with knex.transaction.
Could I use something like a db.isTransaction check?
function myFunc(db) {
  if (!db.isTransaction) {
    db = knex.transaction()
  }
  db('books').insert(books);
}

You can use the isTransaction property to tell whether the object is a knex transaction or a regular client:
function myFunction(db) {
  if (db.isTransaction) {
    console.log('Inside transaction')
    return db.from('table_1').select()
  } else {
    console.log('Wrapping client in a transaction')
    return db.transaction(trx => myFunction(trx))
  }
}
In both cases, your query is executed within a transaction.
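For illustration, here is a minimal usage sketch of myFunction (the connection options are placeholders, not from the question):
// Minimal usage sketch; the connection options are placeholders
const knex = require('knex')({ client: 'pg', connection: process.env.DATABASE_URL })

// Called with a plain client: myFunction wraps the query in a new transaction
myFunction(knex).then(rows => console.log(rows))

// Called with an existing transaction: the query reuses it
knex.transaction(trx => myFunction(trx))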

Related

How to mock when using Typeorm Transaction Decorator in NestJS application unit tests?

@Transaction()
async updateAndCreate(
  someEntity: SomeEntity,
  entityStatus: SOME_STATUS,
  someMsg: SOME_MESSAGE,
  additionalInfo: any,
  @TransactionManager() em?: EntityManager
) {
  someEntity.LastStatus = entityStatus
  await em.save(SomeEntity, someEntity)
  return em.save(History, {
    someEntityId: someEntity.id,
    message: someMsg,
    status: entityStatus,
    additionalInfo,
  })
}
I am trying to create a method that runs a series of operations in a transaction, and I tried to write a unit test for that method.
If @Transaction() is used, a Connection-related error occurs.
If @TransactionManager() is used, an error occurs because the corresponding entity manager cannot be found.
Is there a way to test the success case without creating an in-memory DB and a direct connection?
You can mock the decorator like this:
jest.mock('typeorm/decorator/transaction/Transaction', () => ({
  Transaction() {
    return jest.fn();
  },
}));
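With the decorator mocked out, the method can then be exercised by passing a stubbed EntityManager through the optional em parameter. A rough test sketch (SomeService and the argument values are placeholder names, not from the original post):
// Hypothetical test sketch; SomeService and the argument values are placeholders
const mockManager = {
  save: jest.fn().mockResolvedValue({}),
};

it('saves the entity and a history record', async () => {
  const service = new SomeService();
  const someEntity = { id: 1 };

  await service.updateAndCreate(someEntity, 'SOME_STATUS', 'SOME_MESSAGE', {}, mockManager);

  expect(mockManager.save).toHaveBeenCalledTimes(2);
});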

Node.js mssql multiple concurrent connections to SQL servers interfering with each other

I am using mssql in my Node.js express application to make connections to many different databases across many different SQL servers.
I have constructed the following example to demonstrate the general structure of my code:
app.get('/api/example', async (request, response) => {
  // FYI I may be using await incorrectly here since I'm new to it, just using it for code simplicity
  let results1 = await GetDataFromSqlServerA()
  let results2 = await GetDataFromSqlServerB()
  response.status(200).send([results1, results2])
});
function GetDataFromSqlServerA() {
  return new Promise(function(resolve, reject) {
    let sql = require("mssql")
    let sqlConnectionDetails = {
      user: 'test',
      password: 'foobar',
      server: 'SQLServerA',
      database: 'DatabaseA'
    }
    sql.connect(sqlConnectionDetails, function (error) {
      let sqlRequest = new sql.Request()
      let queryText = 'SELECT * FROM TableA'
      sqlRequest.query(queryText, function (error, results) {
        sql.close()
        resolve(results)
      })
    })
  })
}
function GetDataFromSqlServerB() {
  return new Promise(function(resolve, reject) {
    let sql = require("mssql")
    let sqlConnectionDetails = {
      user: 'test',
      password: 'foobar',
      server: 'SQLServerB',
      database: 'DatabaseB'
    }
    sql.connect(sqlConnectionDetails, function (error) {
      let sqlRequest = new sql.Request()
      let queryText = 'SELECT * FROM TableB'
      sqlRequest.query(queryText, function (error, results) {
        sql.close()
        resolve(results)
      })
    })
  })
}
I have a request which looks for data from two separate SQL servers asynchronously. The first SQL call executes OK, but the second fails with Invalid object name 'TableB'. It cannot find the table because the second call picks up the connection details from the first call for some reason. It's pointing at the wrong SQL server and database!
I would have thought this is not possible because the functions are within their own scopes. The first SQL call should know nothing about the second and vice-versa - or at least I would have thought.
I have also tried defining one instance of sql globally, but the same issues occur.
I can have a single function make a SQL connection to server A, make a request to server A, disconnect from server A, make a SQL connection to server B, and finally make a request to server B. However, when asynchronicity comes into play and SQL is being triggered by concurrent requests, I have issues.
Hoping I am just being dumb since I am new to asynchronous code, please enlighten me! Thank you!
Thought I would update this in case anyone else runs into similar trouble. @AlwaysLearning provided me with a link to a section of the mssql npm documentation that explains connection pooling, and doing so when access to multiple databases is necessary. I would recommend reading through it.
It turns out that the .connect() function of mssql operates on a global pool. If the global pool is already connected when .connect() is called, it resolves to the already connected pool. This was causing me trouble because the following would occur in my application when two requests are sent in quick succession:
Request 1 would connect to database A
Request 2 would try to connect to database B
Since database A is already connected in the global pool, Request 2 picks up the database A connection
Request 2's SQL query fails because it is not valid for database A
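To make the failure mode concrete, here is a simplified sketch of that race (the config objects are assumed to have the same shape as in the question):
// Simplified sketch of the race on mssql's single global pool
const sql = require('mssql')

async function demonstrateCollision(configA, configB) {
  await sql.connect(configA) // creates the global pool, connected to SQLServerA/DatabaseA
  await sql.connect(configB) // pool already connected: resolves with the existing SQLServerA pool
  // this query is sent to DatabaseA, so 'TableB' cannot be found
  return new sql.Request().query('SELECT * FROM TableB')
}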
To get around this, you need to develop some connection pool management code. This way you can ensure there are separate connection pools for each database connection that is necessary. In addition, this is an easy way to make your queries faster because you are not reconnecting to the database every time you want to make a request.
The npm documentation provides a couple example helper files, but I found them to not be ideal for me and I couldn't compile all the code in my environment. I customised the code into the following:
mssql-connection-pooling.js:
const { ConnectionPool } = require('mssql')
const pools = {}

// create a new connection pool
function CreatePool(config) {
  let key = JSON.stringify(config)
  if (GetPool(key))
    throw new Error('Pool already exists')
  pools[key] = (new ConnectionPool(config)).connect()
  return pools[key]
}

// get a connection pool from all pools
function GetPool(name) {
  if (pools[name])
    return pools[name]
  else
    return null
}

// if the pool already exists, return it, otherwise create it
function GetCreateIfNotExistPool(config) {
  let key = JSON.stringify(config)
  let pool = GetPool(key)
  if (pool)
    return pool
  else
    return CreatePool(config)
}

// close a single pool
function ClosePool(config) {
  let key = JSON.stringify(config)
  if (pools[key]) {
    const poolPromise = pools[key]
    delete pools[key]
    // the stored value is the connect() promise, so close the resolved pool
    poolPromise.then(pool => pool.close())
    return true
  }
  return false
}

// close all the pools
function CloseAllPools() {
  Object.keys(pools).forEach((key) => {
    const poolPromise = pools[key]
    delete pools[key]
    poolPromise.then(pool => pool.close())
  })
  return true
}

module.exports = {
  ClosePool,
  CloseAllPools,
  CreatePool,
  GetPool,
  GetCreateIfNotExistPool
}
I created the GetCreateIfNotExistPool() function, to which you supply a database connection configuration object. The function checks whether there is an open connection stored in the pools for that connection configuration. If there is, it simply returns the connection pool; if not, it creates the pool and then returns it.
Example of usage:
const sql = require("mssql");
let mssql = require('./mssql-connection-pooling.js')

let exampleDBConfigA = {
  user: 'test',
  password: 'password',
  server: 'SqlServerA',
  database: 'DatabaseA'
};
let exampleDBConfigB = {
  user: 'test',
  password: 'password',
  server: 'SqlServerB',
  database: 'DatabaseB'
};
...
// Request 1
try {
  let sqlPool = await mssql.GetCreateIfNotExistPool(exampleDBConfigA)
  let request = new sql.Request(sqlPool)
  // query code
}
catch (error) {
  // error handling
}
...
// Request 2
try {
  let sqlPool = await mssql.GetCreateIfNotExistPool(exampleDBConfigB)
  let request = new sql.Request(sqlPool)
  // query code
}
catch (error) {
  // error handling
}
In this example, requests 1 and 2 can be called concurrently just fine without their connections interfering!
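For completeness, the helper functions from the original question can be reworked on top of the pool manager along these lines (a sketch reusing the question's names; no sql.close() call is needed per request, since the pool stays open for reuse):
// Sketch: GetDataFromSqlServerA rewritten on top of the pool manager
const sql = require('mssql')
const mssql = require('./mssql-connection-pooling.js')

async function GetDataFromSqlServerA() {
  const sqlConnectionDetails = {
    user: 'test',
    password: 'foobar',
    server: 'SQLServerA',
    database: 'DatabaseA'
  }
  const pool = await mssql.GetCreateIfNotExistPool(sqlConnectionDetails)
  const result = await new sql.Request(pool).query('SELECT * FROM TableA')
  return result.recordset
}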

Problems testing middleware in Laravel with Closure $next

I have this middleware on my app that checks the user role for a route:
public function handle($request, Closure $next, ...$roles)
{
    if (in_array($request->user()->rol, $roles)) {
        return $next($request);
    } else {
        return redirect()->action('SecurityController@noAutorizado');
    }
}
And I'm trying to write a test for this middleware (PHPUnit):
public function testUsuarioLogadoPuedeAccederAPantallaUsuarios()
{
    $user = UsuariosTestFixtures::unAsignador();
    $this->actingAs($user);
    $request = Request::create('/usuarios', 'GET');
    $middleware = new CheckRole();
    $response = $middleware->handle($request, Closure $next, $user->getRole(), function () {});
    $this->assertEquals($response, true);
}
But I'm receiving this error: Argument 2 passed to App\Http\Middleware\CheckRole::handle() must be an instance of Closure, null given
I don't know how I'm supposed to pass the "Closure $next" argument to $middleware->handle.
I've tried this:
public function testUsuarioLogadoPuedeAccederAPantallaUsuarios(Closure $next){...}
But it returns an error: Too few arguments to function UsuarioControllerTest::testUsuarioLogadoPuedeAccederAPantallaUsuarios(), 0 passed in C:\www\APPS\catsa\vendor\phpunit\phpunit\src\Framework\TestCase.php
What's the solution?
Thanks a lot!
A Closure in PHP is simply an anonymous function, so you need to pass a function as the second argument of your handle method.
In the context of Laravel middleware, the $next closure represents the rest of the pipeline of steps that the request goes through.
Obviously you can't (and don't need to) execute this pipeline during a test. What you need is just a function that returns a value you can test in an assertion.
What you can do is something like this:
// ... setup code here
$middleware = new CheckRole();
$roles = ['role1', 'role2']; // change this to the desired roles
$result = $middleware->handle($request, function ($request) {
    return 'success';
}, ...$roles); // handle() declares ...$roles, so spread the array
$this->assertEquals('success', $result);
So, what is happening here?
If everything goes as planned (the user has the required role), the $next closure is executed and it returns success; on the other hand, if the user doesn't have the required role, the code takes the other path and it returns a RedirectResponse.
Finally, the assertion checks if success is returned, and it reports a failure if that doesn't happen.

Return value from vuex mutation? (id for newly created object)

I'm trying to create an object in one part of the vuex store and then pass its id to another object, and I'm not sure how to properly do that since mutations can't return anything (in this case, the id).
Two store objects look like this:
// store/report.js
const state = {
  name: 'Untitled Report',
  subReportIds: []
};
// store/subReport.js
const state = { ... }
And I'd like this action to create a blank report, then a blank subreport, and then assign the subreport id to the newly created report. (Subreports are independent entities and can be used by multiple reports, hence the different area in the store.)
const actions = {
  createNewReport({ state, commit }) {
    commit(mutationTypes.CREATE_NEW_REPORT)
    // below doesn't work - i can't get return from mutation
    let newSubreportId = commit(mutationTypes.ADD_NEW_SUBREPORT)
    // if this worked, i'd then do something like
    commit(mutationTypes.ADD_SUBREPORT_TO_REPORT, newSubreportId)
  }
};
How can I achieve the above?
The best way to accomplish this, in my opinion, is to dispatch actions instead of committing the mutations directly. If you look at the methods in the Vuex source, commit executes with no return value (it is a void), while dispatch returns whatever you return from the action (which is a function).
For my actions, I always return a promise so that I can compose them like you mention above. Here is an example.
fetchSomething ({ commit }) {
  return mockApiGetIds()
    .then(response => {
      commit({
        type: SOME_MUTATION,
        ids: response
      });
      return response;
    });
},
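Applied to the question, createNewReport can then compose a subreport action that resolves with the new id. A rough sketch (the mutation names follow the question; addNewSubreport and generateId are assumptions for illustration):
// Sketch: composing actions so the new subreport id flows back to the report
const actions = {
  addNewSubreport({ commit }) {
    const id = generateId() // assumption: however ids are produced in your app
    commit(mutationTypes.ADD_NEW_SUBREPORT, { id })
    return Promise.resolve(id)
  },
  createNewReport({ commit, dispatch }) {
    commit(mutationTypes.CREATE_NEW_REPORT)
    return dispatch('addNewSubreport').then(newSubreportId => {
      commit(mutationTypes.ADD_SUBREPORT_TO_REPORT, newSubreportId)
    })
  }
};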
Disclaimer: I don't know if it is truly a good idea, but at least it seems to work, and to me it feels prettier than having to use actions and promises, or generating the id in the action...
You can pass an argument with your mutation. To return a value from a mutation (like a newly created id), I write it to a placeholder on that argument:
someMutation(state, arg) {
  //...
  arg.out = {
    status: "succeed"
  }
}
//...
this.$store.commit('someMutation', arg);
if (arg.out.status !== "succeed") console.log("ERROR");
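Applied to the question, the ADD_NEW_SUBREPORT mutation could write the new id into such a placeholder. A sketch (the payload shape, generateId and the subReports state are assumptions for illustration):
// Sketch: "returning" the new id through a placeholder on the payload
const mutations = {
  [mutationTypes.ADD_NEW_SUBREPORT](state, payload) {
    const subReport = { id: generateId() } // assumption: however ids are produced
    state.subReports.push(subReport)
    payload.out = { id: subReport.id }
  }
};

// in the createNewReport action:
const payload = {};
commit(mutationTypes.ADD_NEW_SUBREPORT, payload);
commit(mutationTypes.ADD_SUBREPORT_TO_REPORT, payload.out.id);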

What is the role of exec() and next() call in cascade delete in mongoose middleware?

I'm new to using mongoose middleware and don't know if I'm following it well. Here is the purpose: after saving a department, I want to look up its university and save the departmentId inside the university document.
DepartmentSchema.post('save', function(next) {
  var departmentId = this._id;
  University.findOne({
    _id: this.university
  }, function(err, university) {
    if (!university.departments) {
      university.departments = [];
    }
    university.departments.push(new ObjectId(departmentId));
    university.save(function(err) {
      if (err) return console.log('err-->' + err);
      // saved!
    });
  });
});
This is working fine, but I'm not sure why the cascade-style delete example in Mongoose uses exec() and next() calls. Could you please tell me the purpose of those calls? I haven't been able to find relevant documentation, and I just want to make sure I'm not missing anything.
clientSchema.pre('remove', function(next) {
  // 'this' is the client being removed. Provide callbacks here if you want
  // to be notified of the calls' result.
  Sweepstakes.remove({
    client_id: this._id
  }).exec();
  Submission.remove({
    client_id: this._id
  }).exec();
  next();
});
Post middleware doesn't get a reference to the next function, so you can't do any flow control there. It is actually passed the department that just got saved, so your code can be something like this:
DepartmentSchema.post('save', function(department) {
  var departmentId = department._id;
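For reference, the complete hook from the question, adjusted to use the passed document, would look roughly like this (a sketch reusing the question's own logic):
// Sketch: the question's post-save hook using the document argument instead of `next`
DepartmentSchema.post('save', function(department) {
  University.findOne({ _id: department.university }, function(err, university) {
    if (err) return console.log('err-->' + err);
    if (!university.departments) {
      university.departments = [];
    }
    university.departments.push(department._id);
    university.save(function(err) {
      if (err) return console.log('err-->' + err);
      // saved!
    });
  });
});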
In pre middleware you have access to the next middleware in the order of execution, which is the order of definition on a particular hook.
// hook two middlewares before the execution of the save method
schema.pre('save', pre1);
schema.pre('save', pre2);

function pre1(next) {
  // next is a reference to pre2 here
  next()
}

function pre2(next) {
  // next will reference the hooked method, in this case 'save'
  next(new Error('something went wrong'));
}

// somewhere else in the code
MyModel.save(function(err, doc) {
  // it'll get the error passed from pre2
});
Mongoose also gives you the ability to execute pre middlewares in parallel; in that case all the middlewares are executed in parallel, but the hooked method will not execute until done has been called from each middleware, as the sketch below shows.
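If I remember the older (Mongoose 4.x era) API correctly, a parallel middleware is declared by passing true as the second argument; a sketch under that assumption (doSomethingAsync is a placeholder):
// Sketch of parallel pre middleware (older Mongoose API, assumed here)
schema.pre('save', true, function(next, done) {
  next(); // lets the next middleware start immediately
  doSomethingAsync(function() {
    done(); // 'save' is blocked until every parallel middleware has called done()
  });
});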
As for the exec() function: there are two ways of executing a query in Mongoose. Either pass a callback to the query, or chain it with exec(): User.remove(criteria, callback) or User.remove(criteria).exec(callback). If you don't pass a callback to the query, it returns a query object and it won't execute unless you chain it with exec().
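A short illustration of the difference (assuming an existing User model and a criteria object):
// Assumes an existing User model and a criteria object
User.remove(criteria, function(err) { /* query executes immediately */ });

const query = User.remove(criteria); // nothing is sent to MongoDB yet
query.exec(function(err) { /* query executes now */ });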