NestJS - file upload with fastify-multipart - file-upload

I am trying to upload multiple files with NestJS using the fastify adapter. I can do so following the tutorial in this link - article on upload.
This does the job of uploading files using fastify-multipart, but I couldn't make use of the request validations before uploading.
For example, here is my rule-file model (which I later want to save to Postgres):
import {IsUUID, Length, IsEnum, IsString, Matches, IsOptional} from "class-validator";
import { FileExtEnum } from "./enums/file-ext.enum";
import { Updatable } from "./updatable.model";
import {Expose, Type} from "class-transformer";
export class RuleFile {
@Expose()
@IsUUID("4", { always: true })
id: string;
@Expose()
@Length(2, 50, {
always: true,
each: true,
context: {
errorCode: "REQ-000",
message: `Filename should be between 2 and 50 characters`,
},
})
fileNames: string[];
@Expose()
@IsEnum(FileExtEnum, { always: true, each: true })
fileExts: string[];
@IsOptional({ each: true, message: 'File is corrupted' })
@Type(() => Buffer)
file: Buffer;
}
export class RuleFileDetail extends RuleFile implements Updatable {
@IsString()
@Matches(/[aA]{1}[\w]{6}/)
recUpdUser: string;
}
I wanted to validate the multipart request and see if these fields are set properly.
I could not make it work with an event-subscription-based approach. Here are a few things I tried. Adding an interceptor to check the request:
@Injectable()
export class FileUploadValidationInterceptor implements NestInterceptor {
intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
const req: FastifyRequest = context.switchToHttp().getRequest();
console.log('inside interceptor', req.body);
// content type comes as multipart/form-data;boundary=----. We don't need to validate the boundary
// TODO: handle split errors based on semicolon
const contentType = req.headers['content-type'].split(APP_CONSTANTS.CHAR.SEMI_COLON)[0];
console.log(APP_CONSTANTS.REGEX.MULTIPART_CONTENT_TYPE.test(contentType));
const isHeaderMultipart = contentType != null ?
this.headerValidation(contentType) : this.throwError(contentType);
// CANNOT check for req.file() inside this, as it throws undefined
return next.handle();
}
headerValidation(contentType) {
return APP_CONSTANTS.REGEX.MULTIPART_CONTENT_TYPE.test(contentType) ? true : this.throwError(contentType);
}
throwError(contentType: string) {
throw AppConfigService.getCustomError('FID-HEADERS', `Request header does not contain multipart type:
Provided incorrect type - ${contentType}`);
}
}
I wasn't able to check req.file() in the above interceptor - it throws as undefined. I tried to follow the fastify-multipart documentation, but I wasn't able to get the request data in a preHandler as shown there either:
fastify.post('/', async function (req, reply) {
// process a single file
// also, consider that if you allow uploading multiple files
// you must consume all files otherwise the promise will never fulfill
const data = await req.file()
data.file // stream
data.fields // other parsed parts
data.fieldname
data.filename
data.encoding
data.mimetype
// to accumulate the file in memory! Be careful!
//
// await data.toBuffer() // Buffer
//
// or
await pump(data.file, fs.createWriteStream(data.filename))
})
I also tried registering a preHandler hook of my own like this (executed as an IIFE):
(async function bootstrap() {
const appConfig = AppConfigService.getAppCommonConfig();
const fastifyInstance = SERVERADAPTERINSTANCE.configureFastifyServer();
// @ts-ignore
const fastifyAdapter = new FastifyAdapter(fastifyInstance);
app = await NestFactory.create<NestFastifyApplication>(
AppModule,
fastifyAdapter
).catch((err) => {
console.log("err in creating adapter", err);
process.exit(1);
});
.....
app.useGlobalPipes(
new ValidationPipe({
errorHttpStatusCode: 500,
transform: true,
validationError: {
target: true,
value: true,
},
exceptionFactory: (errors: ValidationError[]) => {
// send it to the global exception filter
AppConfigService.validationExceptionFactory(errors);
},
}),
);
app.register(require('fastify-multipart'), {
limits: {
fieldNameSize: 100, // Max field name size in bytes
fieldSize: 1000000, // Max field value size in bytes
fields: 10, // Max number of non-file fields
fileSize: 100000000000, // For multipart forms, the max file size
files: 3, // Max number of file fields
headerPairs: 2000, // Max number of header key=>value pairs
},
});
(app.getHttpAdapter().getInstance() as FastifyInstance).addHook('onRoute', (routeOptions) => {
console.log('all urls:', routeOptions.url);
if(routeOptions.url.includes('upload')) {
// The registration actually works, but I can't use req.file() in the preHandler
console.log('###########################');
app.getHttpAdapter().getInstance().addHook('preHandler', FilePrehandlerService.fileHandler);
}
});
SERVERADAPTERINSTANCE.configureSecurity(app);
//Connect to database
await SERVERADAPTERINSTANCE.configureDbConn(app);
app.useStaticAssets({
root: join(__dirname, "..", "public"),
prefix: "/public/",
});
app.setViewEngine({
engine: {
handlebars: require("handlebars"),
},
templates: join(__dirname, "..", "views"),
});
await app.listen(appConfig.port, appConfig.host, () => {
console.log(`Server listening on port - ${appConfig.port}`);
});
})();
Here is the preHandler:
export class FilePrehandlerService {
constructor() {}
static fileHandler = async (req, reply) => {
console.log('coming inside prehandler');
console.log('req is a multipart req',await req.file);
const data = await req.file();
console.log('data received -filename:', data.filename);
console.log('data received- fieldname:', data.fieldname);
console.log('data received- fields:', data.fields);
return;
};
}
This pattern of registering a preHandler and getting the file works in a bare fastify application; I tried it.
Bare fastify server:
export class FileController {
constructor() {}
async testHandler(req: FastifyRequest, reply: FastifyReply) {
reply.send('test reading done');
}
async fileReadHandler(req, reply: FastifyReply) {
const data = await req.file();
console.log('field val:', data.fields);
console.log('field filename:', data.filename);
console.log('field fieldname:', data.fieldname);
reply.send('done');
}
}
export const FILE_CONTROLLER_INSTANCE = new FileController();
This is my route file
const testRoute: RouteOptions<Server, IncomingMessage, ServerResponse, RouteGenericInterface, unknown> = {
method: 'GET',
url: '/test',
handler: TESTCONTROLLER_INSTANCE.testMethodRouteHandler,
};
const fileRoute: RouteOptions = {
method: 'GET',
url: '/fileTest',
preHandler: fileInterceptor,
handler: FILE_CONTROLLER_INSTANCE.testHandler,
};
const fileUploadRoute: RouteOptions = {
method: 'POST',
url: '/fileUpload',
preHandler: fileInterceptor,
handler: FILE_CONTROLLER_INSTANCE.fileReadHandler,
};
const apiRoutes = [testRoute, fileRoute, fileUploadRoute];
export default apiRoutes;
Could someone let me know the right way to get the field names and validate them before the service is called in NestJS?

Well, I have done something like this and it works great for me. Maybe it can work for you too.
// main.ts
import multipart from "fastify-multipart";
const app = await NestFactory.create<NestFastifyApplication>(
AppModule,
new FastifyAdapter(),
);
app.register(multipart);
// upload.guard.ts
import {
Injectable,
CanActivate,
ExecutionContext,
BadRequestException,
} from "#nestjs/common";
import { FastifyRequest } from "fastify";
@Injectable()
export class UploadGuard implements CanActivate {
public async canActivate(ctx: ExecutionContext): Promise<boolean> {
const req = ctx.switchToHttp().getRequest() as FastifyRequest;
const isMultipart = req.isMultipart();
if (!isMultipart)
throw new BadRequestException("multipart/form-data expected.");
const file = await req.file();
if (!file) throw new BadRequestException("file expected");
req.incomingFile = file;
return true;
}
}
// file.decorator.ts
import { createParamDecorator, ExecutionContext } from "@nestjs/common";
import { FastifyRequest } from "fastify";
export const File = createParamDecorator(
(_data: unknown, ctx: ExecutionContext) => {
const req = ctx.switchToHttp().getRequest() as FastifyRequest;
const file = req.incomingFile;
return file
},
);
// post controller
#Post("upload")
#UseGuards(UploadGuard)
uploadFile(#File() file: Storage.MultipartFile) {
console.log(file); // logs MultipartFile from "fastify-multipart"
return "File uploaded"
}
And finally my typing file:
declare global {
namespace Storage {
interface MultipartFile {
toBuffer: () => Promise<Buffer>;
file: NodeJS.ReadableStream;
filepath: string;
fieldname: string;
filename: string;
encoding: string;
mimetype: string;
fields: import("fastify-multipart").MultipartFields;
}
}
}
declare module "fastify" {
interface FastifyRequest {
incomingFile: Storage.MultipartFile;
}
}
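The question also asked about validating the parsed fields before the service runs. A rough sketch of how that could be done on top of the guard above (this is my addition, not part of the original answer; it assumes class-transformer and class-validator are installed and that non-file parts parsed by fastify-multipart expose their text on a value property):
import type { MultipartFields } from 'fastify-multipart';
import { plainToInstance } from 'class-transformer'; // plainToClass in older class-transformer versions
import { validateOrReject } from 'class-validator';
import { RuleFile } from './rule-file.model'; // the model from the question; path is assumed

// Hypothetical helper: collects the non-file multipart parts into a plain object,
// maps it onto RuleFile and runs class-validator before any service is called.
export async function validateRuleFileFields(fields: MultipartFields): Promise<RuleFile> {
  const plain: Record<string, unknown> = {};
  for (const [name, part] of Object.entries(fields)) {
    if (part && !Array.isArray(part) && 'value' in part) {
      plain[name] = (part as { value: unknown }).value;
    }
  }
  const dto = plainToInstance(RuleFile, plain);
  await validateOrReject(dto); // throws the usual ValidationError[] on failure
  return dto;
}
The controller could then call this with file.fields (the parsed parts exposed by the File decorator above) before handing anything to a service.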

So I found a simpler alternative. I started using fastify-multer. I used it along with this awesome lib, which let me use multer with fastify - @webundsoehne/nest-fastify-file-upload.
These are the changes I made. I registered the multer content parser:
app.register(multer({
dest: path.join(process.cwd() + '/upload'),
limits: {
fields: 5, // Number of non-file fields allowed
files: 1,
fileSize: 2097152, // 2 MB
},
}).contentParser);
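For context, a minimal sketch of where that registration might sit in main.ts (my reconstruction, not the answer's code; the fastify-multer import style and the bootstrap shape are assumptions):
// Hypothetical main.ts excerpt - assumes fastify-multer's default export and
// esModuleInterop-style imports; adjust to your project's setup.
import { NestFactory } from '@nestjs/core';
import { FastifyAdapter, NestFastifyApplication } from '@nestjs/platform-fastify';
import multer from 'fastify-multer';
import * as path from 'path';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(AppModule, new FastifyAdapter());
  // register multer's content parser so fastify accepts multipart bodies
  app.register(multer({ dest: path.join(process.cwd(), 'upload') }).contentParser);
  await app.listen(3000);
}
bootstrap();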
Then in the controller I use it as the NestJS docs say. This actually makes fastify work with multer:
@UseInterceptors(FileUploadValidationInterceptor, FileInterceptor('file'))
@Post('/multerSample')
async multerUploadFiles(@UploadedFile() file, @Body() ruleFileCreate: RuleFileCreate) {
console.log('data sent', ruleFileCreate);
console.log(file);
// getting the original name of the file - no matter what
ruleFileCreate.originalName = file.originalname;
return await this.fileService.fileUpload(file.buffer, ruleFileCreate);
}
BONUS - storing the file locally and storing it in the DB - please refer to the github link
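For the local-storage half of that bonus, here is a minimal sketch (my addition, not the linked repo's code; it assumes the multer memory buffer and originalname from the controller above):
import { promises as fs } from 'fs';
import * as path from 'path';

// Hypothetical helper: writes the multer-provided buffer into a local upload
// folder and returns the path, which could then be persisted to Postgres
// alongside the RuleFileCreate metadata.
export async function saveToLocal(buffer: Buffer, originalName: string): Promise<string> {
  const uploadDir = path.join(process.cwd(), 'upload');
  await fs.mkdir(uploadDir, { recursive: true }); // ensure the folder exists
  const target = path.join(uploadDir, `${Date.now()}-${originalName}`);
  await fs.writeFile(target, buffer);
  return target;
}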

Related

Handle image upload for CKEditor 5 with a graphql backend

Here's my current setup that is resulting in TypeError: undefined is not a function:
import client from 'GraphQl/apolloClient'
import { ADD_POST_IMAGE } from 'GraphQl/News/Mutations'
function MyCustomUploadAdapterPlugin(editor) {
editor.plugins.get('FileRepository').createUploadAdapter = (loader) => {
return new MyUploadAdapter(loader)
}
}
class MyUploadAdapter {
constructor(props) {
// CKEditor 5's FileLoader instance.
this.loader = props
this.mutation = client.mutate({ mutation: ADD_POST_IMAGE })
}
// Starts the upload process.
upload() {
return new Promise((resolve, reject) => {
this._sendRequest()
})
}
// Prepares the data and sends the request.
_sendRequest() {
const [addPost, { error }] = this.mutation
this.loader.file.then(async (result) => {
const { data: response } = await addPost({
variables: { data: { image: result } },
})
console.log(response)
})
}
}
export default MyCustomUploadAdapterPlugin
I'm trying to set up a custom upload adapter for the React CKEditor 5 plugin.
Since I have a GraphQL backend, I plan to use mutations for the upload.

Fetching API for Articles with NextJS and Strapi

I would like some help on an API issue.
I have been trying to link each Article page based on the content I have created in Strapi CMS on my local server.
The API endpoint from which I manage to gather data is 'http://localhost:1337/api/articles?populate=*'.
Here is my code:
// lib/api.js
export class ApiError extends Error {
constructor(url, status) {
super(`'${url}' returned ${status}`);
if(Error.captureStackTrace) {
Error.captureStackTrace(this, ApiError);
}
this.name = 'ApiError';
this.status = status;
}
}
export async function fetchJson(url, options) {
const response = await fetch(url, options);
if(!response.ok) {
throw new ApiError(url, response.status);
}
return await response.json();
}
// lib/articles.js
import { fetchJson } from "./api";
const API_URL = process.env.API_URL;
// Gets a single article
export async function getArticle(id) {
const article = await fetchJson(`${API_URL}/api/article/${id}`);
return stripArticle(article);
}
// Gets all articles
export async function getArticles() {
const articles = await fetchJson(`${API_URL}/api/articles`);
return articles.map(stripArticle);
}
function stripArticle(article) {
return {
id: article.id,
title: article.attributes.Title,
content: article.attributes.Content,
pictureUrl: API_URL + article.attributes.Photo.formats.thumbnail.url,
}
}
Article Page:
//article/[id].js
import Page from "../../components/Page";
import { getArticle, getArticles } from "../../lib/articles";
import ReactMarkdown from 'react-markdown';
import Moment from 'react-moment';
export async function getStaticProps({ params }) {
const article = await getArticle(params.id)
return {
props: { article },
unstable_revalidate: 1,
}
}
export default function Article({ article }) {
return (
<Page title={article.Title}>
<ReactMarkdown source={article.Content} />
<p>
<Moment from="MM Do YYYY">{article.CreatedAt}</Moment>
</p>
</Page>
)
}
export async function getStaticPaths() {
const articles = await getArticles()
return {
paths: articles.map((article) => ({
params: { id: article.id.toString() }, // Number convert to string
})),
fallback: 'blocking', // Client is blocked until the new page is ready.
};
}
I get an error: TypeError: articles.map is not a function.
If there is a better way to format and write the code, do let me know, as I have been trying to find which is best.
Thanks for the help in advance.

Cancelling upload request before destroying object makes mobx-state-tree throw Cannot modify [dead] errors

I have a React Native app where I want to upload some files using Axios.
I've made a mobx-state-tree store for file uploads, and each file has its own CancelTokenSource, which is sent to the Axios network call.
When an upload is in progress, I try to cancel the upload, and then destroy the item.
The simplest way is as I show below: destroying the item in the store and having a beforeDestroy() hook that cancels the upload. But that approach makes mobx-state-tree show the error in the screenshot.
I've also tried calling the file.cancelTokenSource.cancel() explicitly before destroying the item. Same error. I suspect that the operation is not fully cancelled when the cancel() returns, but since it's not an async function, I cannot await its completion.
When I just call the cancel() without destroying, it cancels just fine, so I'm pretty sure that it's a timing issue, where the destroy(file) is called too soon, before cancel() has cleaned up after itself.
What to do here?
file-upload-store.ts
import { destroy, flow, Instance, types } from 'mobx-state-tree'
import { FileUpload, IFileUpload } from '../entities/file-upload/file-upload'
import { getApi } from '../../store-environment'
/**
* Store for handling the FileUpload
*/
export const FileUploadStore = types
.model('FileUploadStore')
.props({
files: types.array(FileUpload),
})
.actions((self) => {
const api = getApi(self)
const add = (uri: string, name: string, type: string, size: number) => {
const file = FileUpload.create({
uri,
name,
type,
size,
})
self.files.push(file)
upload(file)
}
const remove = (file: IFileUpload) => {
destroy(file)
}
const cancel = (file: IFileUpload) => {
// also tried this - with no luck
// file.cancelTokenSource.cancel()
destroy(file)
}
const upload = flow(function* (file: IFileUpload) {
file.status = 'pending'
file.uploadedBytes = 0
const { uri, name, type } = file
try {
const id = yield api.uploadFile(uri, name, type, file.setProgress, file.cancelTokenSource.token)
file.status = 'completed'
file.fileUploadId = id
} catch (error) {
file.status = 'failed'
file.error = error.message
}
})
return {
afterCreate() {
// Avoid persistence
self.files.clear()
},
remove,
cancel,
retry: upload,
add,
}
})
export type IFileUploadStore = Instance<typeof FileUploadStore>
file-upload.ts
import { Instance, SnapshotIn, types } from 'mobx-state-tree'
import { CancelToken } from 'apisauce'
/**
* FileUpload contains the particular data of a file, and some flags describing its status.
*/
export const FileUpload = types
.model('FileUpload')
.props({
name: types.string,
type: types.string,
uri: types.string,
size: types.number,
// set if an error occurs
error: types.maybe(types.string),
status: types.optional(types.enumeration(['pending', 'completed', 'failed']), 'pending'),
// updated by progressCallback
uploadedBytes: types.optional(types.number, 0),
// assigned when response from backend is received
fileUploadId: types.maybe(types.string),
})
.volatile(() => ({
cancelTokenSource: CancelToken.source(),
}))
.actions((self) => ({
setProgress(event: ProgressEvent) {
self.uploadedBytes = event.loaded
},
beforeDestroy() {
self.cancelTokenSource?.cancel()
},
}))
export interface IFileUpload extends Instance<typeof FileUpload> {}
// SnapshotIn, used for creating input to store: {Model}.create({})
export interface IFileUploadSnapshotIn extends SnapshotIn<typeof FileUpload> {}
You are destroying the FileUpload node and cancelling the axios request nicely, but cancelling the request will throw an error, so you need to make sure that your FileUpload node is still alive before you try to update it in the catch.
import { destroy, flow, Instance, types, isAlive } from 'mobx-state-tree'
// ...
const upload = flow(function* (file: IFileUpload) {
const { uri, name, type } = file
file.status = "pending"
file.uploadedBytes = 0
try {
const id = yield api.uploadFile(
uri,
name,
type,
file.setProgress,
file.cancelTokenSource.token
)
file.status = "completed"
file.fileUploadId = id
} catch (error) {
if (isAlive(file)) {
file.status = "failed"
file.error = error.message
}
}
})

Cannot read property 'context' of undefined - GraphQL

I am using Typescript, Express, TypeORM, GraphQL and TypeGraphQL to build a small app that allows the user to login.
However, when I hit my test query bye on the GraphQL playground, I get:
Cannot read property 'context' of undefined
"TypeError: Cannot read property 'context' of undefined",
" at new exports.isAuth // isAuth is a JS file I wrote
MyContext.js
import { Request, Response } from "express";
export interface MyContext {
req: Request;
res: Response;
payload?: { userId: string };
}
isAuth.js
import { MiddlewareFn } from "type-graphql";
import { verify } from "jsonwebtoken";
import { MyContext } from "./MyContext";
export const isAuth: MiddlewareFn<MyContext> = ({ context }, next) => {
const authorization = context.req.headers["authorization"];
if (!authorization) {
throw new Error("not authorized");
}
...
UserResolver
@Query(() => String)
@UseMiddleware(isAuth)
bye(@Ctx() { payload }: MyContext) {
console.log(payload);
return `your user id is: ${payload!.userId}`;
}
I am not sure why the context is undefined in the file isAuth.js.
SOLVED thanks to: https://github.com/MichalLytek/type-graphql/issues/433
1) Go into ./tsconfig.json
2) Change "target": "es5" to "target": "es6"

relay subscription onNext not triggered on react-native

I have a subscription set up, but onNext is not getting triggered. I am not sure why, since this is my first time implementing a subscription and the docs were not much help with the issue.
Here are the code implementations:
import {
graphql,
requestSubscription
} from 'react-relay'
import environment from '../network';
const subscription = graphql`
subscription chatCreatedSubscription{
chatCreated{
id
initiate_time
update_time
support_id
category_id
email
name
}
}
`;
function chatCreated(callback) {
const variables = {};
requestSubscription(environment, {
subscription,
variables,
onNext: () => {
console.log("onNext");
callback()
},
updater: () => {
console.log("updater");
}
});
}
module.exports = chatCreated;
and here is my network for the subscription
import { Environment, Network, RecordSource, Store } from "relay-runtime";
import Expo from "expo";
import { SubscriptionClient } from "subscriptions-transport-ws";
import { WebSocketLink } from 'apollo-link-ws';
import { execute } from 'apollo-link';
import accessHelper from "../helper/accessToken";
const networkSubscriptions = async (operation, variables) => {
let token = await accessHelper();
if (token != null || token != undefined) {
const subscriptionClient = new SubscriptionClient("ws://localhost:3000/graphql",
{
reconnect: true,
connectionParams: {
Authorization: token,
},
});
execute(new WebSocketLink(subscriptionClient), {
query: operation.text,
variables,
});
}
}
const network = Network.create(fetchQuery, networkSubscriptions);
const store = new Store(new RecordSource());
const environment = new Environment({
network,
store
});
export default environment;
The subscription is called in a componentDidMount method on a component. It executes, but the onNext method inside the subscription is never triggered when new information is added to what the subscription is listening to.
So I figured out that my issue was the network JS not being set up properly, plus the version of subscriptions-transport-ws. I added version 0.8.3 of the package and made the following changes to my network file:
const networkSubscriptions = async (config, variables, cacheConfig, observer) => {
const query = config.text;
let token = await accessHelper();
if (token != null || token != undefined) {
const subscriptionClient = new SubscriptionClient(`ws://${api}/graphql`,
{
reconnect: true,
connectionParams: {
Authorization: token,
},
});
subscriptionClient.subscribe({ query, variables }, (error, result) => {
observer.onNext({ data: result })
})
return {
dispose: subscriptionClient.unsubscribe
};
}
}
I hope this helps if you get stuck with the same issue as mine.