How do I annotate an endpoint in NestJS for OpenAPI that takes multipart form data (file upload)?

My NestJS server has an endpoint that accepts files along with additional form data.
For example, I pass a file and a user_id of the file creator in the form.
NestJS Swagger needs to be told explicitly that the body contains the file and that the endpoint consumes multipart/form-data. This is not documented in the NestJS docs: https://docs.nestjs.com/openapi/types-and-parameters#types-and-parameters.

Luckily, some bug reports led to discussion about how to handle this use case.
Looking at these two discussions:
https://github.com/nestjs/swagger/issues/167
https://github.com/nestjs/swagger/issues/417
I was able to put together the following.
I added the annotation using a DTO.
The two critical parts are:
In the DTO, add:
@ApiProperty({
  type: 'file',
  properties: {
    file: {
      type: 'string',
      format: 'binary',
    },
  },
})
public readonly file: any;

@IsString()
public readonly user_id: string;
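Pieced together, the full DTO might look like this (a sketch; the class name UploadFileDto matches the generated schema shown below, and @IsString comes from class-validator):
import { ApiProperty } from '@nestjs/swagger';
import { IsString } from 'class-validator';

export class UploadFileDto {
  @ApiProperty({
    type: 'file',
    properties: {
      file: { type: 'string', format: 'binary' },
    },
  })
  public readonly file: any;

  @IsString()
  public readonly user_id: string;
}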
In the controller, add:
@ApiConsumes('multipart/form-data')
This gets me a working endpoint, and this OpenAPI JSON:
{
"/users/files":{
"post":{
"operationId":"UsersController_addPrivateFile",
"summary":"...",
"parameters":[
],
"requestBody":{
"required":true,
"content":{
"multipart/form-data":{
"schema":{
"$ref":"#/components/schemas/UploadFileDto"
}
}
}
}
}
}
}
...
{
"UploadFileDto":{
"type":"object",
"properties":{
"file":{
"type":"file",
"properties":{
"file":{
"type":"string",
"format":"binary"
}
},
"description":"...",
"example":"'file': <any-kind-of-binary-file>"
},
"user_id":{
"type":"string",
"description":"...",
"example":"cus_IPqRS333voIGbS"
}
},
"required":[
"file",
"user_id"
]
}
}

Here is what I find a cleaner approach:
// imports covering the whole listing below
import { Body, CallHandler, Controller, ExecutionContext, Injectable, NestInterceptor, Post, UseInterceptors } from '@nestjs/common';
import { ApiBody, ApiConsumes, ApiProperty, ApiPropertyOptions } from '@nestjs/swagger';
import { FileInterceptor } from '@nestjs/platform-express';
import { Observable } from 'rxjs';

@Injectable()
class FileToBodyInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    const ctx = context.switchToHttp();
    const req = ctx.getRequest();
    if (req.body && req.file?.fieldname) {
      const { fieldname } = req.file;
      if (!req.body[fieldname]) {
        // copy the file parsed by FileInterceptor into the body,
        // so the DTO received via @Body() also contains it
        req.body[fieldname] = req.file;
      }
    }
    return next.handle();
  }
}
const ApiFile = (options?: ApiPropertyOptions): PropertyDecorator => (
  target: Object, propertyKey: string | symbol,
) => {
  ApiProperty({
    ...options, // forward any extra options such as description or example
    type: 'file',
    properties: {
      [propertyKey]: {
        type: 'string',
        format: 'binary',
      },
    },
  })(target, propertyKey);
};
class UserImageDTO {
  @ApiFile()
  file: Express.Multer.File; // you can name it something else like image or photo

  @ApiProperty()
  user_id: string;
}
@Controller('users')
export class UsersController {
  @ApiBody({ type: UserImageDTO })
  // @ApiResponse({ type: ... }) // some DTO to annotate the response
  @Post('files')
  @ApiConsumes('multipart/form-data')
  @UseInterceptors(
    FileInterceptor('file'), // this should match the file property name in the DTO
    FileToBodyInterceptor, // this injects the file into the body object
  )
  async addFile(@Body() userImage: UserImageDTO): Promise<void> { // if you return something to the client, put it here
    console.log({ userImage }); // all the fields and the file
    console.log(userImage.file); // the file is here
    // ... your logic
  }
}
FileToBodyInterceptor and ApiFile are general-purpose; I wish they were part of NestJS.
You probably need to install @types/multer to have the Express.Multer.File type.
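To sanity-check the endpoint, here is a minimal sketch of a client call with supertest (assuming a bootstrapped Nest app in an async test; the field names match UserImageDTO above):
import * as request from 'supertest';

// send multipart/form-data: one file part plus the user_id form field
await request(app.getHttpServer())
  .post('/users/files')
  .field('user_id', 'cus_IPqRS333voIGbS')
  .attach('file', Buffer.from('some file content'), 'example.txt')
  .expect(201);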

Related

KendoUI SignalR grid not updating if data is posted via ajax in my ASP.NET Core 5 web application

I have a KendoUI grid which uses SignalR. Whilst the grid itself works fine if you update the data inline or incell, it doesn't work if I update the data in a form which uses ajax to post it to my controller.
It was my understanding that, if I injected my hub into my controller and then called (whichever I needed: create, update or destroy):
await _fixtureHub.Clients.All.SendAsync("create", model);
or
await _fixtureHub.Clients.All.SendAsync("update", model);
it would tell the clients that the data had been changed/created and the grid would update to reflect that change. This isn't happening, however, and I'm wondering what I've done wrong or am missing.
To start, here is my SignalR-bound grid:
$('#fixture_grid').kendoGrid({
dataSource: {
schema: {
model: {
id: "Id",
fields: {
Created_Date: {
type: "date"
},
Commencement_Date: {
type: "date"
}
}
}
},
type: "signalr",
autoSync: true,
transport: {
signalr: {
promise: fixture_hub_start,
hub: fixture_hub,
server: {
read: "read",
update: "update",
create: "create",
destroy: "destroy"
},
client: {
read: "read",
update: "update",
create: "create",
destroy: "destroy"
}
}
}
},
autoBind: true,
sortable: true,
editable: false,
scrollable: true,
columns: [
{
field: "Created_Date",
title: "Created",
format: "{0:dd/MM/yyyy}"
},
{
field: "Commencement_Date",
title: "Commencement",
format: "{0:dd/MM/yyyy}"
},
{
field: "Charterer",
template: "#if(Charterer !=null){# #=Charterer_Name# #} else { #--# }#"
},
{
field: "Region",
template: "#if(Region !=null){# #=Region_Name# #} else { #--# }#"
}
]
});
Here is the relative hub for that grid:
var fixture_url = "/fixtureHub";
var fixture_hub = new signalR.HubConnectionBuilder().withUrl(fixture_url, {
transport: signalR.HttpTransportType.LongPolling
}).build();
var fixture_hub_start = fixture_hub.start({
json: true
});
Here is the KendoUI wizard with form integration which I update the grid with. This form can process either a creation or an update; this is achieved by checking the Id that is passed in, whereby 0 equals new and >0 is existing.
function wizard_fixture() {
let wizard_name = "#wizard-fixture";
//Load Wizard
$(wizard_name).kendoWizard({
pager: true,
loadOnDemand: true,
reloadOnSelect: false,
contentPosition: "right",
validateForms: true,
deferred: true,
actionBar: true,
stepper: {
indicator: true,
label: true,
linear: true
},
steps: [
{
title: "Step 01",
buttons: [
{
name: "custom",
text: "Save & Continue",
click: function () {
let wizard = $(wizard_name).data("kendoWizard");
var validatable = $(wizard_name).kendoValidator().data("kendoValidator");
if (validatable.validate()) {
$.ajax({
type: "POST",
traditional: true,
url: "/Home/Process_Fixture",
data: $(wizard_name).serialize(),
success: function (result) {
// ...do stuff
},
error: function (xhr, status, error) {
console.log("error")
}
});
}
}
}
],
form: {
formData: {
Id: fixtureviewmodel.Id,
Created_User: fixtureviewmodel.Created_User,
Created_Date: fixtureviewmodel.Created_Date,
Connected: fixtureviewmodel.Connected
},
items: [
{
field: "Fixture_Id",
label: "Id",
editor: "<input type='text' name='Id' id='Fixture_Id' /> "
},
{
field: "Created_User",
label: "Created user",
editor: "<input type='text' name='Created_User' id='Created_User_Fixture' />"
},
{
field: "Created_Date",
id: 'Created_Date_Fixture',
label: "Created date",
editor: "DatePicker",
}
]
}
},
],
});
}
I've shortened this to demonstrate the custom button and the ajax posting that happens to Process_Fixture. Here is my controller which handles that:
public async Task<JsonResult> Process_Fixture(Fixture model)
{
if (model.Id == 0)
{
if (ModelState.IsValid)
{
var fixture = await _fixture.CreateAsync(model);
// Update connected clients
await _fixtureHub.Clients.All.SendAsync("create", model);
return Json(new { success = true, data = fixture.Id, operation = "create" });
}
return Json(new { success = false });
}
else
{
var fixture = await _fixture.UpdateAsync(model);
await _fixtureHub.Clients.All.SendAsync("update", model);
return Json(new { success = true, data = fixture.Id, operation = "update" });
}
}
As you can see, I have injected my hub and called the "create" message on it, which I believed would force the grid to update with whatever had changed or been created.
Here is the hub itself:
public class FixtureHub : DynamicHub
{
private readonly IRepository<Fixture> _fixtures;
private readonly IRepository<ViewGridFixtures> _viewFixtures;
public FixtureHub(IRepository<Fixture> fixtures, IRepository<ViewGridFixtures> viewFixtures)
{
_fixtures = fixtures;
_viewFixtures = viewFixtures;
}
public override Task OnConnectedAsync()
{
Groups.AddToGroupAsync(Context.ConnectionId, GetGroupName());
return base.OnConnectedAsync();
}
public override Task OnDisconnectedAsync(Exception e)
{
Groups.RemoveFromGroupAsync(Context.ConnectionId, GetGroupName());
return base.OnDisconnectedAsync(e);
}
public class ReadRequestData
{
public int ViewId { get; set; }
}
public IQueryable<ViewGridFixtures> Read()
{
IQueryable<ViewGridFixtures> data = _viewFixtures.GetAll();
return data;
}
public string GetGroupName()
{
return GetRemoteIpAddress();
}
public string GetRemoteIpAddress()
{
return Context.GetHttpContext()?.Connection.RemoteIpAddress.ToString();
}
}
I need some help here in understanding how I can tell the hub that the update/create/destroy has been called and it needs to do something. At the moment, I feel like injecting the hub and then calling Clients.All.SendAsync isn't the right way. With ajax it seems to be ignored, and I wonder if the two technologies are working against each other.

Custom directive to check list length for input types

I tried my best to write a custom directive in apollo-server-express to validate that an input type field of type [Int] does not exceed a maximum length, but I don't know if it's the right way to do it. I'd appreciate it if somebody could help me correct any mistakes in the code below.
// schema.js
directive @listLength(max: Int) on INPUT_FIELD_DEFINITION
input FiltersInput {
filters: Filters
}
input Filters {
keys: [Int] @listLength(max: 10000)
}
// Custom directive
import { SchemaDirectiveVisitor } from 'apollo-server-express';
import {
GraphQLList,
GraphQLScalarType,
GraphQLInt,
Kind,
DirectiveLocation,
GraphQLDirective
} from "graphql";
export class ListLengthDirective extends SchemaDirectiveVisitor {
static getDirectiveDeclaration(directiveName) {
return new GraphQLDirective({
name: directiveName,
locations: [DirectiveLocation.INPUT_FIELD_DEFINITION],
args: {
max: { type: GraphQLInt },
}
});
}
// Replace field.type with a custom GraphQLScalarType that enforces the
// length restriction.
wrapType(field) {
const fieldName = field.astNode.name.value;
const { type } = field;
if (field.type instanceof GraphQLList) {
field.type = new LimitedLengthType(fieldName, type, this.args.max);
} else {
throw new Error(`Not a list type: ${field.type}`);
}
}
visitInputFieldDefinition(field) {
this.wrapType(field);
}
}
class LimitedLengthType extends GraphQLScalarType {
constructor(name, type, maxLength) {
super({
name,
serialize(value) {
return type.serialize(value);
},
parseValue(value) {
value = type.serialize(value);
return type.parseValue(value);
},
parseLiteral(ast) {
switch (ast.kind) {
case Kind.LIST:
if (ast.values.length > maxLength) {
throw {
code: 400,
message: `'${name}' parameter cannot exceed ${maxLength} values`,
};
}
const arrayOfInts = ast.values.map(valueObj => parseInt(valueObj['value']));
return arrayOfInts;
}
throw new Error('AST kind should be a ListValue of Ints')
},
});
}
}
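For completeness, this is how I wire the directive up when creating the server (using apollo-server-express v2's schemaDirectives option; the require path is illustrative):
const { ApolloServer } = require('apollo-server-express');
const { ListLengthDirective } = require('./list-length-directive'); // illustrative path

const server = new ApolloServer({
  typeDefs, // the schema above, including the directive declaration
  resolvers,
  schemaDirectives: {
    listLength: ListLengthDirective, // key must match @listLength in the schema
  },
});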
Does this look right?
Thanks

GraphQL queries with tables join using Node.js

I am learning GraphQL so I built a little project. Let's say I have 2 models, User and Comment.
const Comment = Model.define('Comment', {
content: {
type: DataType.TEXT,
allowNull: false,
validate: {
notEmpty: true,
},
},
});
const User = Model.define('User', {
name: {
type: DataType.STRING,
allowNull: false,
validate: {
notEmpty: true,
},
},
phone: DataType.STRING,
picture: DataType.STRING,
});
The relations are one-to-many, where a user can have many comments.
I have built the schema like this:
const UserType = new GraphQLObjectType({
name: 'User',
fields: () => ({
id: {
type: GraphQLString
},
name: {
type: GraphQLString
},
phone: {
type: GraphQLString
},
comments: {
type: new GraphQLList(CommentType),
resolve: user => user.getComments()
}
})
});
And the query:
const user = {
type: UserType,
args: {
id: {
type: new GraphQLNonNull(GraphQLString)
}
},
resolve: (_, { id }) => User.findById(id)
};
Executing the query for a user and his comments is done with 1 request, like so:
{
  user(id: "1") {
    comments {
      content
    }
  }
}
As I understand it, the client will get the results using one request; this is the benefit of using GraphQL. But the server will execute two database queries: one for the user and another one for his comments.
My question is: what are the best practices for building the GraphQL schema and types, and combining joins between tables, so that the server can also resolve the query with a single database request?
The concept you are referring to is called batching. There are several libraries out there that offer this. For example:
Dataloader: generic utility maintained by Facebook that provides "a consistent API over various backends and reduce requests to those backends via batching and caching"
join-monster: "A GraphQL-to-SQL query execution layer for batch data fetching."
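For example, here is a minimal sketch of batching the comments lookup with DataLoader, assuming the Sequelize models above and the default UserId foreign key:
const DataLoader = require('dataloader');

// the batch function receives every user id requested in one tick
// and issues a single SQL query for all of their comments
const commentsByUserId = new DataLoader(async (userIds) => {
  const comments = await Comment.findAll({ where: { UserId: userIds } });
  // DataLoader expects results in the same order as the input keys
  return userIds.map(id => comments.filter(c => c.UserId === id));
});

// in UserType, the comments field then becomes:
// resolve: user => commentsByUserId.load(user.id)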
To anyone using .NET and the GraphQL for .NET package: I have made an extension method that converts the GraphQL query into Entity Framework Includes.
public static class ResolveFieldContextExtensions
{
public static string GetIncludeString(this ResolveFieldContext<object> source)
{
return string.Join(',', GetIncludePaths(source.FieldAst));
}
private static IEnumerable<Field> GetChildren(IHaveSelectionSet root)
{
return root.SelectionSet.Selections.Cast<Field>()
.Where(x => x.SelectionSet.Selections.Any());
}
private static IEnumerable<string> GetIncludePaths(IHaveSelectionSet root)
{
var q = new Queue<Tuple<string, Field>>();
foreach (var child in GetChildren(root))
q.Enqueue(new Tuple<string, Field>(child.Name.ToPascalCase(), child));
while (q.Any())
{
var node = q.Dequeue();
var children = GetChildren(node.Item2).ToList();
if (children.Any())
{
foreach (var child in children)
q.Enqueue(new Tuple<string, Field>
(node.Item1 + "." + child.Name.ToPascalCase(), child));
}
else
{
yield return node.Item1;
}
        }
    }
}
Let's say we have the following query:
query {
getHistory {
id
product {
id
category {
id
subCategory {
id
}
subAnything {
id
}
}
}
}
}
We can create a variable in "resolve" method of the field:
var include = context.GetIncludeString();
which generates the following string:
"Product.Category.SubCategory,Product.Category.SubAnything"
and pass it to Entity Framework:
public Task<TEntity> Get(TKey id, string include)
{
    // must be typed as IQueryable<TEntity>, since Include() returns
    // an IQueryable rather than a DbSet
    IQueryable<TEntity> query = Context.Set<TEntity>();
    if (!string.IsNullOrEmpty(include))
    {
        query = include.Split(',', StringSplitOptions.RemoveEmptyEntries)
            .Aggregate(query, (q, p) => q.Include(p));
    }
    return query.SingleOrDefaultAsync(c => c.Id.Equals(id));
}

How to use jsonschema for Loopback remoteMethod?

In my app I want to define JSON schemas for a custom API.
For example from: http://docs.strongloop.com/display/public/LB/Remote+methods#Remotemethods-Example
module.exports = function(Person){
Person.greet = function(msg, cb) {
cb(null, 'Greetings... ' + msg);
}
Person.remoteMethod(
'greet',
{
accepts: <generate definitions from jsonschema>,
returns: <generate definitions from jsonschema>
}
);
};
How do I do that?
Is this the right way?
MY SOLUTION - validation decorator + remote method params with object type
var validate = require('jsonschema').validate;
var bySchema = function (schema) {
return function (func) {
return function () {
var data = arguments[0],
callback = arguments[1];
var result = validate(data, schema);
if (result.errors.length > 0) {
// some errors in request body
callback(null, {
success: false,
error: 'schema validation error',
});
return;
}
return func.apply(this, arguments);
};
};
};
var defaultRemoteArguments = {
accepts: {
arg: 'data',
type: 'object',
http: function(ctx) {
return ctx.req.body;
}
},
returns: {
arg: 'data',
type: 'object',
root: true
}
};
Example:
Auth.login = bySchema(require('<path to schema json for this request>'))
(function(data, cb) {
// process request
});
Auth.remoteMethod('login', defaultRemoteArguments);
With this solution the loopback-explorer contrib module will not be very useful, because the request/response are plain objects, not described fields...
The correct way to do it is to set the type in the returns attribute to the model name.
In your case you would write:
Person.remoteMethod(
'greet',
{
...
returns: {type:'Person', ...}
}
);
You need to modify your output to match the format accepted by the returns property.
...
returns: [{arg: "key1", type: "string"}, {arg: "key2", type: "object"}, ...];
...

i18next - All languages in one .json file

How can I make i18next load all languages from just one file?
I managed to do it by putting each language in a separate file (translation-en.json, translation-no.json, etc.), and I also managed to load languages with the resStore option, but putting it all in one .json file is really not documented anywhere (I've searched for 4+ hours now).
My js code:
i18n.init({
debug: true,
lng: 'en',
resGetPath: 'translation.json'
},
function(t) {
console.log(t('test'));
});
My translation.json file:
{
  "en": {
    "translation": {
      "test": "some string"
    }
  },
  "no": {
    "translation": {
      "test": "litt tekst"
    }
  }
}
OK, so I managed to "hack" it by putting an object into a separate .js file, including it with a script tag and loading it using resStore, but that just can't be the best way to use this lib.
Assume that your translation.json has been loaded and assigned to a variable named resStore:
var resStore = {
en: {
translation: {
test: "some string"
}
},
no: {
translation: {
test: "litt tekst"
}
}
};
Next, you can override default ajax loading functionality with your customLoad function. An example might look like this:
var options = {
lng: 'en',
load: 'current',
lowerCaseLng: true,
fallbackLng: false,
resGetPath: 'i18n/__lng__/__ns__.json',
customLoad: function(lng, ns, options, loadComplete) {
var data = resStore[lng][ns];
loadComplete(null, data); // or loadComplete('some error'); if failed
},
ns: {
namespaces: ['translation'],
defaultNs: 'translation'
}
};
i18n.init(options, function(t) {
t('test'); // will get "some string"
});
Update (Mar 20, 2015):
You can simply pass your resource store with the resStore option:
i18n.init({ resStore: resources });
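So a complete init call could look like this (a sketch using the same 1.x-era API as above, where resources holds the object from translation.json):
i18n.init({
  lng: 'en',
  fallbackLng: 'no',
  resStore: resources // the whole multi-language object
}, function(t) {
  console.log(t('test')); // "some string"
});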