Database error handling with Sequelize when column doesn't exist? - express

I'm trying to handle different errors that might show up when inserting into a MySQL database.
I'm using Sequelize with Express.
My foo.js model file looks like this:
module.exports = (sequelize, type) => {
  return sequelize.define('event', {
    id: {
      type: type.INTEGER,
      primaryKey: true,
      autoIncrement: true
    },
    name: {
      type: type.STRING,
    }
  }, {
    freezeTableName: true,
    rejectOnEmpty: true,
  })
}
and my route file (or whatever you want to call it) looks like this:
const Sequelize = require('sequelize')
const fooModel = require('../../models/Foo')
const router = require('express').Router();
const auth = require('../auth');
const bodyParser = require('body-parser');

const sequelize = new Sequelize('username', 'password', 'db', {
  host: 'localhost',
  dialect: 'mysql'
})

const Foo = fooModel(sequelize, Sequelize);

router.use(bodyParser.json({limit: '100mb'}));
router.use(bodyParser.urlencoded({ extended: true, limit: '100mb', parameterLimit: 1000000 }));

sequelize.sync({force: true})
  .then(() => {
    console.log('Worked');
  });

router.post('/', (req, res, next) => {
  if (Object.keys(req.body).length > 0) {
    return Foo.create({
      Name: req.body.Name
    }).then((result) => {
      if (result) {
        return res.status(200).json(result);
      } else {
        return res.status(400).json({'error': 'Could not create record.'});
      }
    }).catch(Sequelize.DatabaseError, function (err) {
      return res.status(400).json(err);
    }).catch(function (err) {
      res.send(err);
    })
  } else {
    return res.status(400).json({'error': 'error'});
  }
});

module.exports = router;
Whenever I try to post to the route with something like:
{
  "name": "test",
  "foo": "bar"
}
Sequelize accepts the body and puts "test" in the "name" column, and ignores the "foo" column, because the "foo" column does not exist. Meaning, all I get back once it's posted is:
{
  "id": "123",
  "createdAt": "2020-01-23 13:37:00",
  "updatedAt": "2020-01-23 13:37:00"
}
And not an error as I expect.
What I'm trying to do is catch that error (which I currently don't receive) whenever I try to post to a column that doesn't exist, basically replicating normal MySQL error behaviour.
Could someone point me in the right direction on what I'm missing?

In my experience, it would be better to avoid this particular problem by validating the fields on the client side.
But you can trap such a condition in JavaScript. You won't get a DB exception because Sequelize isn't sending your unrecognized attributes to the database.
if (!Foo.attributes.hasOwnProperty('foo')) {
  // some error handling here, for invalid field.
}
You could write a utility function to iterate through the attributes of req.body and send an appropriate error to the response.
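For example, a rough sketch (the helper name is made up, and depending on your Sequelize version the attribute map may be exposed as rawAttributes rather than attributes):

// Sketch: reject request body keys that are not defined on the model.
// Assumption: newer Sequelize exposes the attribute map as Model.rawAttributes.
function findUnknownFields(model, body) {
  const attributes = model.rawAttributes || model.attributes;
  return Object.keys(body).filter((key) => !(key in attributes));
}

router.post('/', (req, res, next) => {
  const unknown = findUnknownFields(Foo, req.body);
  if (unknown.length > 0) {
    // replicate a "column does not exist" style error
    return res.status(400).json({ error: 'Unknown fields: ' + unknown.join(', ') });
  }
  // ...continue with Foo.create(...) as before
});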
FWIW, you'll find that Name is also invalid, because your model specifies (lower-case) name.
hth

Related

Store Ability in Express Session?

I have seen the express example, where an ability is stored via middleware in the req object. It then uses the following method to evaluate the permissions:
ForbiddenError.from(req.ability).throwUnlessCan('read', article);
I want to achieve a similar thing. My idea is to save the ability inside an express session that is shared with socket.io websockets via req.session = socket.handshake.session. My approach is the following: I make a request from the frontend application to get the rules and update the ability on the frontend, and the backend saves the ability inside the express session:
// abilities.js file
import { Ability } from '@casl/ability';

export const defineAbilitiesFor = (rules) => {
  return new Ability(rules);
};

export default defineAbilitiesFor;
// handler for express route to get permissions from the frontend
export const getPermissions = async (req, res) => {
  ...
  rules.push({
    action: ['view'],
    subject: views,
  });
  // manage all own processes
  rules.push({
    action: ['manage'],
    subject: 'Process',
    conditions: {
      userId: req.kauth.grant.access_token.content.sub,
    },
  });
  // store ability in session
  req.session.rules = defineAbilitiesFor(rules);
  const token = jwt.sign({ token: packRules(rules) }, 'secret');
  if (token) {
    return res.status(200).json(token);
  } else {
    return res.status(400).json('Error');
  }
  ...
Then when a websocket request happens, I want to check in the backend if the user has the permissions to do that action:
ForbiddenError.from(socket.handshake.session.rules).throwUnlessCan('view', 'Process');
However, this throws the following error:
TypeError: this.ability.relevantRuleFor is not a function
at ForbiddenError.throwUnlessCan
The session object seems to have the correct ability object. When I console.log socket.handshake.session.rules, I get the following output:
{
  h: false,
  l: {},
  p: {},
  '$': [
    { action: [Array], subject: 'Process', conditions: [Object] },
    { action: [Array], subject: [Array] },
    { action: [Array], subject: 'Process', conditions: [Object] }
  ],
  m: {}
}
The can function and everything else I tried also weren't working. I think storing the plain rules as an object inside the session and then updating the ability class before each request would work, but I don't want to do that. I want to store the ability right inside the session, so that I only have to execute the throwUnlessCan or can functions.
Is this even possible and if so, how would you do this?
Thanks so far.
Instead of storing the whole Ability instance, you need to store only its rules! rules is a plain JS array of objects, so it can be easily serialized. So, change the code to this:
export const getPermissions = async (req, res) => {
  ...
  rules.push({
    action: ['view'],
    subject: views,
  });
  // manage all own processes
  rules.push({
    action: ['manage'],
    subject: 'Process',
    conditions: {
      userId: req.kauth.grant.access_token.content.sub,
    },
  });
  // store ability RULES in session
  req.session.rules = rules;
  const token = jwt.sign({
    token: packRules(rules) // packRules accepts an array of RawRule! not an Ability instance
  }, 'secret');
  if (token) {
    return res.status(200).json(token);
  } else {
    return res.status(400).json('Error');
  }
To use Ability in other handlers add a middleware:
function defineAbility(req, res, next) {
  if (req.session.rules) {
    req.ability = new Ability(req.session.rules);
    next();
  } else {
    // handle the case when there are no rules in the session yet
  }
}

// later
app.get('/api/users', defineAbility, (req, res) => {
  req.ability.can(...);
  // or
  ForbiddenError.from(req.ability).throwUnlessCan(...);
})
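The same idea applies on the socket.io side: rebuild the Ability from the serialized rules stored in the session before calling throwUnlessCan. A sketch, where the io/socket wiring and the event name are just placeholders:

import { Ability, ForbiddenError } from '@casl/ability';

// Sketch: the event name and handshake/session wiring are placeholders.
io.on('connection', (socket) => {
  socket.on('viewProcess', (payload) => {
    const ability = new Ability(socket.handshake.session.rules || []);
    ForbiddenError.from(ability).throwUnlessCan('view', 'Process');
    // ...handle the event now that the permission check passed
  });
});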

cannot use the find functions using sequelize-typescript

I have set up passport and sequelize-typescript for my project. In the passport setup, I use a strategy, for example Google, like this:
passport.use(
  new GoogleStrategy(
    {
      clientID: process.env.GOOGLE_AUTH_CLIENT_ID,
      clientSecret: process.env.GOOGLE_AUTH_CLIENT_SECRET,
      callbackURL: process.env.GOOGLE_AUTH_CALLBACK_URL,
      profileFields: ['id', 'displayName', 'photos', 'email'],
      enableProof: true
    },
    function (accessToken, refreshToken, profile, done) {
      console.log(profile)
      const { name, email, picture } = profile._json;
      User.findOne({where: {id: profile.id}})
        .then(user => {
          console.log(user)
          if (user === null) {
            const { name, email, picture } = profile._json;
            // new User({
            //   id: profile.id,
            //   name: name,
            //   email: email,
            //   pictureUrl: picture,
            // })
          }
        })
      done(null, profile)
    }
  )
)
When I try to use functions such as findOrCreate() or findOne(), I receive a typescript error that says:
[ERROR] 23:29:01 ⨯ Unable to compile TypeScript:
src/passport_strategies.ts:45:18 - error TS2339: Property 'findOne' does not exist on type 'typeof User'.
45 User.findOne({where: {id: profile.id}})
I also get the same error for the part commented out in the first code snippet. The model user I have created is declared like this:
export class User extends Model<User> {} (it has the columns set in the file), with Model being imported from sequelize-typescript.
Here is where sequelize is created:
export const sequelize = new Sequelize({
  "username": c.username,
  "password": c.password,
  "database": c.database,
  "host": c.host,
  dialect: 'postgres',
  storage: ':memory:',
  models: [__dirname + '/models']
});
I tried checking other examples that are on the internet but they all have the same setup and I couldn't figure out why I'm getting this error. Not sure if this helps at all but I'm using postgres dialect.
I suspect that it is a version mismatch.
sequelize-typescript@2 is for sequelize@6.2 and above, and sequelize-typescript@1 is for sequelize@5 and above.
For educational purposes, I also suggest implementing TypeScript with Sequelize without the sequelize-typescript package, just to understand what the package itself provides: https://sequelize.org/master/manual/typescript.html
Also, with all due respect, I'd point out that the @Table decorator is needed if you are using the latest version.
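For reference, a decorated model for sequelize-typescript v1 might look roughly like this (a sketch; the columns are illustrative, and the Model generic changes slightly in v2):

import { Table, Column, Model, PrimaryKey } from 'sequelize-typescript';

// Sketch: illustrative columns only; adjust to your actual schema.
@Table
export class User extends Model<User> {
  @PrimaryKey
  @Column
  id!: string;

  @Column
  name!: string;

  @Column
  email!: string;

  @Column
  pictureUrl!: string;
}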

Validate request body separately from request as a whole

I have a question for validating a PUT request. The body of the request is an array of objects. I want the request to succeed if the body contains an array of at least length one, but I also need to do a separate validation on each object in the array and pass that back in the response. So my put body would be:
[1, 2, {id: "thirdObject"}]
The response should be 200 even though the first two items are not even objects. The request just needs to succeed if an array of length 1 is passed in the body. The response needs to be something like:
[{id: firstObject, status: 400, error: should be object}, {id: secondObject, status: 400, error: should be object}, { id: thirdObject, status: 204 }]
Currently I am validating the body as such with fluent schema:
body: S.array().items(myObjectSchema)
.minItems(1)
Which will result in a 400 if any of the items in the body don’t match the myObjectSchema. Was wondering if you have any idea how to achieve this?
The validation doesn't tell you whether a schema is successful (e.g. { id: thirdObject, status: 204 }), so you need to manage that yourself.
To do that, you need to create an error handler that reads the validation error and merges it with the request body:
const fastify = require('fastify')()
const S = require('fluent-schema')

fastify.put('/', {
  handler: () => { /** this will never be executed if the schema validation fails */ },
  schema: {
    body: S.array().items(S.object()).minItems(1)
  }
})
const errorHandler = (error, request, reply) => {
  const { validation, validationContext } = error
  // check if we have a validation error
  if (validation) {
    // here the validation error
    console.log(validation)
    // here the body
    console.log(request.body)
    reply.send(validation)
  } else {
    reply.send(error)
  }
}

fastify.setErrorHandler(errorHandler)

fastify.inject({
  method: 'PUT',
  url: '/',
  payload: [1, 2, { id: 'thirdObject' }]
}, (_, res) => {
  console.log(res.json())
})
This will log:
[
  {
    keyword: 'type',
    dataPath: '[0]',
    schemaPath: '#/items/type',
    params: { type: 'object' },
    message: 'should be object'
  },
  {
    keyword: 'type',
    dataPath: '[1]',
    schemaPath: '#/items/type',
    params: { type: 'object' },
    message: 'should be object'
  }
]
[ 1, 2, { id: 'thirdObject' } ]
As you can see, thanks to validation[].dataPath you are able to tell which elements of the body array are not valid and merge that data to return your info.
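For example, a small helper along these lines could turn that into the per-item report the question asks for (a sketch; the helper name and response shape are illustrative):

// Sketch: build one entry per body item, marking the ones that the
// validator reported as invalid via dataPath ('[0]', '[1]', ...).
function buildReport(body, validation) {
  return body.map((item, index) => {
    const error = validation.find((v) => v.dataPath === `[${index}]`)
    return error
      ? { id: item && item.id, status: 400, error: error.message }
      : { id: item.id, status: 204 }
  })
}

// inside the error handler above:
// reply.send(buildReport(request.body, validation))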
Note that the handler will not be executed in this scenario. If you need to execute it regardless of the validation, you should do the validation in a preHandler hook and skip the default schema validation checks (since they block the request):
Edit:
const fastify = require('fastify')()
const S = require('fluent-schema')

let bodyValidator
fastify.decorateRequest('hasError', function () {
  if (!bodyValidator) {
    bodyValidator = fastify.schemaCompiler(S.array().items(S.object()).minItems(1).valueOf())
  }
  const valid = bodyValidator(this.body)
  if (!valid) {
    return bodyValidator.errors
  }
  return false // no validation errors
})

fastify.addHook('preHandler', (request, reply, done) => {
  const errors = request.hasError()
  if (errors) {
    console.log(errors)
    // shows the same errors as before
    // you can merge here, or set request.errors = errors to let the handler read them
    reply.send('here merge errors and request.body')
    return
  }
  done() // needed to continue if you don't reply.send
})

fastify.put('/', { schema: { body: S.array() } }, (req, reply) => {
  console.log('handler')
  reply.send('handler')
})

fastify.inject({
  method: 'PUT',
  url: '/',
  payload: [1, 2, { id: 'thirdObject' }]
}, (_, res) => {
  console.log(res.json())
})
I don't know the schema syntax you are using, but using draft 7 of the JSON Schema (https://json-schema.org/specification-links.html, and see also https://json-schema.org/understanding-json-schema for some reference material), you can do:
{
  "type": "array",
  "minItems": 1
}
If you want to ensure that at least one, but not necessarily all items match your object type, then add the "contains" keyword:
{
  ...,
  "contains": ... reference to your object schema here
}
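For instance, assuming your object schema only requires an id string, the whole body schema could look like this:

{
  "type": "array",
  "minItems": 1,
  "contains": {
    "type": "object",
    "required": ["id"],
    "properties": {
      "id": { "type": "string" }
    }
  }
}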

mongoose query return array instead of object

I'm facing a strange problem with a mongoose query. When I do db.collection.find() it should return an object, as expected, and that is what I get in the mongo shell.
But when I do a similar query in my express router endpoint, I get an array instead of an object, like:
[
  {
    "dishes": [
      "5eca615117611c0480320c12",
      "5eca615117611c0480320c15"
    ],
    "_id": "5ecae7eb2e746b312cfdf59e",
    "user": "5ec644d06715633270d0414d",
    ...
  }
]
which causes an error in my frontend React application. Here is the schema in my favorite model:
var favoriteSchema = new Schema(
  {
    dishes: [
      {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'Dish',
        unique: true,
      },
    ],
    user: {
      type: mongoose.Schema.Types.ObjectId,
      ref: 'User',
    },
  },
  {
    timestamps: true,
  }
);
And here is my express endpoint:
.get((req, res, next) => {
  Favorites.find({})
    .then(
      (favorite) => {
        res.statusCode = 200;
        res.setHeader('Content-Type', 'application/json');
        res.json(favorite);
        console.log(favorite);
      },
      (err) => next(err)
    )
    .catch((err) => next(err));
})
I'd be grateful if anyone could help me figure this out.
You might want to use findOne with mongoose if you are looking for a single result or null; if you use find, you get back an array of results.
Bear in mind that you should handle the case where favorite is null (when the document you are looking for can't be found). At that point you might want to return a different response.
.find({parameter}) returns all the documents from the database that match the given parameter.
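A sketch of the endpoint using findOne and handling the null case (the user filter and req.user._id are assumptions, adjust to however you identify the current user):

// Sketch: return a single favorites document, or a 404 when none exists.
.get((req, res, next) => {
  Favorites.findOne({ user: req.user._id }) // assumption: one favorites doc per user
    .then((favorite) => {
      if (!favorite) {
        return res.status(404).json({ error: 'No favorites found' });
      }
      res.statusCode = 200;
      res.setHeader('Content-Type', 'application/json');
      res.json(favorite);
    })
    .catch((err) => next(err));
})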

Undefined args on a mutation, using apollo-server

I'm working with apollo-server; everything works as expected, but the mutation arguments are undefined when the mutation is called from the frontend.
const express = require('express');
const morgan = require('morgan');
const { ApolloServer, gql } = require('apollo-server-express');
const mongoose = require('mongoose');
require('dotenv').config();

const app = express();

const typeDefs = gql`
  type msgFields {
    email: String!
    textarea: String!
    createdAt: String!
  }
  input MsgFieldsInput {
    email: String!
    textarea: String!
    createdAt: String!
  }
  type Query {
    formContact: msgFields!
  }
  type Mutation {
    createMsg(email: String!, textarea: String!, createdAt: String!): String!
  }
`;

const resolvers = {
  Query: {
    formContact: () => {
      return {
        email: 'test@mail.com',
        textarea: 'checking Checking checking Checking checking Checking'
      }
    }
  },
  Mutation: {
    createMsg: (args) => {
      console.log(args); // => undefined here
      return 'Worked';
    }
  }
}

const server = new ApolloServer({
  typeDefs,
  resolvers
});

app.use(morgan('dev'));
server.applyMiddleware({app})

mongoose.connect(process.env.MONGO_URL, { useNewUrlParser: true })
  .then(() => {
    app.listen({port: 4000}, () => {
      console.log(`Server and DB ready at http://localhost:4000${server.graphqlPath}`)
    });
  })
  .catch(err => {
    throw err;
  })
This is what I send from /graphql:
mutation {
  createMsg(email: "test@mail.com", textarea: "testing textarea", createdAt: "19-05-2018")
}
The resolver signature is as follows: (parent, args, context, info) where:
parent: The object that contains the result returned from the resolver on the parent field, or, in the case of a top-level Query field, the rootValue passed from the server configuration. This argument enables the nested nature of GraphQL queries.
args: An object with the arguments passed into the field in the query. For example, if the field was called with query{ key(arg: "you meant") }, the args object would be: { "arg": "you meant" }.
context: This is an object shared by all resolvers in a particular query, and is used to contain per-request state, including authentication information, dataloader instances, and anything else that should be taken into account when resolving the query. Read this section for an explanation of when and how to use context.
info: This argument contains information about the execution state of the query, including the field name, path to the field from the root, and more. It's only documented in the GraphQL.js source code, but is extended with additional functionality by other modules, like apollo-cache-control.
The arguments are passed to the resolver as the second parameter, not the first. See the docs for additional details.
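So the question's resolver just needs to read the arguments from the second parameter, for example:

Mutation: {
  // the first parameter is the parent/root value, the second holds the field arguments
  createMsg: (parent, args) => {
    console.log(args); // => { email: '...', textarea: '...', createdAt: '...' }
    return 'Worked';
  }
}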