Next-Auth Store Session in Redis

I'm coming from Express and have never used next-auth before, so I'm unsure how to store the user session in a Redis database.
In Express, I would have done the following:
import express from 'express';
import session from 'express-session';
import connectRedis from 'connect-redis';
import Redis from 'ioredis';
import { __prod__, COOKIE_NAME } from './constants';

const main = async () => {
  const app = express();
  const RedisStore = connectRedis(session);
  const redis = new Redis(process.env.REDIS_URL);

  app.use(
    session({
      name: 'qid',
      store: new RedisStore({
        client: redis,
        disableTouch: true,
        ttl: 1000 * 60 * 60 * 24 * 365, // 1 year
      }),
      cookie: {
        maxAge: 1000 * 60 * 60 * 24 * 365, // 1 year
        httpOnly: true,
        sameSite: 'lax',
        secure: __prod__,
        domain: __prod__ ? process.env.DOMAIN : undefined,
      },
      saveUninitialized: false,
      secret: process.env.SESSION_SECRET,
      resave: false,
    }),
  );
};

main();
[...nextauth].ts
import NextAuth, { type NextAuthOptions } from "next-auth";
import CredentialsProvider from "next-auth/providers/credentials";
import { PrismaAdapter } from "@next-auth/prisma-adapter";
import { prisma } from "../../../server/db/client";

export const authOptions: NextAuthOptions = {
  callbacks: {
    session({ session, user }) {
      if (session.user) {
        session.user.id = user.id;
      }
      return session;
    },
  },
  adapter: PrismaAdapter(prisma),
  providers: [
    CredentialsProvider({
      async authorize(credentials, req) {
        //
      },
    }),
  ],
};

export default NextAuth(authOptions);
I can't find any Redis implementations for NextAuth, other than Upstash being used for caching, but not for sessions.

I made an adapter for NextAuth that uses ioredis to store data in the Redis Hash data structure.
In the Upstash adapter, the data is stored with JSON.stringify; in my adapter I use a Hash instead, so it is easier to extend the User object.
You can take a look at this repository.
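To make the difference concrete, here is a minimal sketch (not the adapter itself) of keeping a user object in a Redis hash with ioredis; the key name and fields are illustrative only:

import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL);

// Each user property is its own hash field instead of one JSON blob,
// so adding a new property later is a single HSET rather than
// read -> JSON.parse -> mutate -> JSON.stringify -> write.
async function saveUser(user) {
  await redis.hset(`user:${user.id}`, {
    id: user.id,
    name: user.name,
    email: user.email,
  });
}

async function getUser(id) {
  const data = await redis.hgetall(`user:${id}`);
  return Object.keys(data).length > 0 ? data : null;
}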


Getting Adapter not added. Use RxDB.plugin(require('pouchdb-adapter-[adaptername]'); in React Native

RxError: RxError:
RxDatabase.create(): Adapter not added. Use RxDB.plugin(require('pouchdb-adapter-[adaptername]');
Given parameters: {
adapter:"asyncstorage"}
database.js // my code

import RxDB from 'rxdb';
import schema from './ramsSchema';

RxDB.plugin(require('pouchdb-adapter-asyncstorage').default);
RxDB.plugin(require('pouchdb-adapter-http'));

const syncURL = 'couchDB url';

// initializes the RxDB instance if the DB already exists, otherwise creates a new one, and returns it
export async function initializeDB(dbName, password) {
  const db = await RxDB.create({
    name: dbName.toLowerCase(),
    adapter: 'asyncstorage',
    password: 'rams#1234',
    multiInstance: false,
    ignoreDuplicate: true,
  });

  const collection = await db.collection({
    name: 'rams',
    schema,
  });

  collection.sync({
    remote: syncURL + dbName.toLowerCase() + '/',
    options: {
      live: true,
      retry: true,
    },
  });

  return db;
}
How can I fix this?
The createRxDatabase function is used like this in the documentation:
import {
  createRxDatabase
} from 'rxdb';
import { getRxStorageDexie } from 'rxdb/plugins/dexie';

// create a database
const db = await createRxDatabase({
  name: 'heroesdb', // the name of the database
  storage: getRxStorageDexie()
});
By the way, PouchDB is deprecated in the RxDB documentation.
The RxStorage PouchDB is used like this:
import { createRxDatabase } from 'rxdb';
import { getRxStoragePouch, addPouchPlugin } from 'rxdb/plugins/pouchdb';

addPouchPlugin(require('pouchdb-adapter-idb'));

const db = await createRxDatabase({
  name: 'exampledb',
  storage: getRxStoragePouch(
    'idb',
    {
      /**
       * other pouchdb specific options
       * @link https://pouchdb.com/api.html#create_database
       */
    }
  )
});
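Applied to the initializeDB function from the question, a minimal sketch with the newer API and the PouchDB storage wrapper (assuming an RxDB version where getRxStoragePouch is still available; the replication setup is left out because it has its own plugin/API in newer versions):

import { createRxDatabase } from 'rxdb';
import { getRxStoragePouch, addPouchPlugin } from 'rxdb/plugins/pouchdb';
import schema from './ramsSchema';

// register the PouchDB adapters with the storage wrapper instead of RxDB.plugin()
addPouchPlugin(require('pouchdb-adapter-asyncstorage').default);
addPouchPlugin(require('pouchdb-adapter-http'));

export async function initializeDB(dbName, password) {
  const db = await createRxDatabase({
    name: dbName.toLowerCase(),
    storage: getRxStoragePouch('asyncstorage'),
    password, // only honored when the encryption plugin is set up
    multiInstance: false,
    ignoreDuplicate: true,
  });

  // db.collection() was replaced by addCollections()
  await db.addCollections({
    rams: { schema },
  });

  return db;
}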

How can I keep the TTL alive in Redis?

Currently, I'm using Redis for multiple websites (A and B).
Website A is a PHP website and website B is a Node.js website (mine).
I'm trying to share Redis session information from website A with website B.
However, when a user logs in to website A and switches to website B, website B is able to read website A's user information, but as soon as website B reads the Redis data, the Redis TTL becomes 2, so the user information expires after 2 seconds.
However, when a user does the same thing the other way around (logs in to website B and switches to website A), the session does not expire.
I tried changing the TTL and other settings, but nothing I did worked.
I'd like to know what could trigger this issue.
This is the Redis setup for website B (Node.js):
import express from "express";
import path from "path";
import mongoose from "mongoose";
import cors from "cors";
import session from "express-session";
import redisConnect from "connect-redis";
import { createClient } from "redis";
import cookieParser from "cookie-parser";
import user_routes from "./routes/user_routes.js";
import user_charts_routes from "./routes/user_charts_routes.js";
import code_visual_routes from "./routes/code_visual_routes.js";
import db_routes from "./routes/db_routes.js";
import assets_routes from "./routes/assets_routes.js";
import { redis_env, sso_env } from "./Data/env_export.js";

const {
  redis_url,
  redis_password,
  redis_domain,
  redis_prefix,
  redis_session_name,
  redis_session_secret,
} = redis_env;
const { sso_client_redirect } = sso_env;

let RedisStore = redisConnect(session);
let redisClient = createClient({
  legacyMode: true,
  url: redis_url,
  password: redis_password,
});
redisClient.connect().catch(console.error);

const session_info = {
  store: new RedisStore({
    client: redisClient,
    prefix: redis_prefix,
    ttl: 3600 * 3,
  }),
  name: redis_session_name,
  secret: redis_session_secret,
  resave: true,
  saveUninitialized: false,
  cookie: {
    path: "/",
    httpOnly: true,
    secure: false,
    domain: redis_domain,
    maxAge: 12 * 60 * 60 * 1000,
    sameSite: "lax",
  },
};

const corsOption = {
  credentials: true,
  origin: sso_client_redirect,
};

const PORT = process.env.PORT || 5000;
var app = express();

app.use(cookieParser());
app.use(cors(corsOption));
app.use(express.urlencoded({ limit: "50mb", extended: true }));
app.use(express.json({ limit: "50mb" }));
app.use(session(session_info));
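Not a fix, but one way to narrow down what resets the expiry is to watch the key's TTL while switching between the two sites. A minimal sketch with the plain node-redis client (the prefix and session id are placeholders you would take from your own cookie; redis-cli MONITOR will also show which client sends the EXPIRE):

// ttl-watch.js - standalone debugging helper, not part of the app
import { createClient } from "redis";

const client = createClient({ url: process.env.REDIS_URL });
await client.connect();

// replace with <redis_prefix> + the session id from the browser cookie
const key = "myprefix:SESSION_ID";

// poll the remaining lifetime every second; if it suddenly drops to 2,
// whatever ran at that moment (a request on site A or B) rewrote the expiry
setInterval(async () => {
  console.log(key, "ttl =", await client.ttl(key));
}, 1000);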

How to test email sending with MailHog locally?

Set up MailHog with docker-compose like this:
version: '3'
services:
  mailhog:
    image: mailhog/mailhog
    ports:
      - 8025:8025
      - 1025:1025
It's possible to access localhost:8025 from the browser. Maybe the SMTP server on port 1025 also works, but I don't know how to confirm it.
In the NestJS application, the email code under test looks like this:
@Module({
  imports: [NodeMailerModule],
  providers: [MailHogEmailRepository],
  exports: [MailHogEmailRepository],
})
class MailHogEmailRepositoryModule {}

@Module({
  imports: [MailHogEmailRepositoryModule],
  providers: [
    {
      provide: EmailRepository,
      useFactory: (
        config: ConfigService,
        mailHog: MailHogEmailRepository,
      ) => {
        return mailHog;
      },
      inject: [ConfigService, MailHogEmailRepository],
    },
  ],
  exports: [EmailRepository],
})
export class EmailRepositoryModule {}
MailHogEmailRepository sends with nodemailer:
@Injectable()
export class MailHogEmailRepository implements EmailRepository {
  constructor(
    @Inject(NodeMailerToken) private readonly nodemailer: Transporter,
  ) {}

  async send(email: Email) {
    const options = {
      to: email.to,
      from: email.from,
      subject: email.subject,
    };
    await this.nodemailer.sendMail(options);
  }
}
nodemailer config:
import { Module } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { createTransport } from 'nodemailer';

export const NodeMailerToken = Symbol('nodemailer');

@Module({
  providers: [
    {
      provide: NodeMailerToken,
      useFactory: (config: ConfigService) =>
        createTransport({
          host: 'localhost',
          port: 1025,
          secure: true,
        }),
      inject: [ConfigService],
    },
  ],
  exports: [NodeMailerToken],
})
export class NodeMailerModule {}
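As a side note on the "don't know how to confirm it" part above: nodemailer can probe the SMTP connection on its own, which makes it easy to check whether port 1025 is reachable from the test process. A minimal sketch with the same host and port (note that MailHog's port 1025 speaks plain SMTP, so this sketch uses secure: false; with secure: true nodemailer expects implicit TLS):

import { createTransport } from 'nodemailer';

const transporter = createTransport({
  host: 'localhost',
  port: 1025,
  secure: false, // MailHog's port 1025 is plain SMTP, not implicit TLS
});

// verify() opens a connection and performs the SMTP handshake;
// it resolves when the server is reachable and rejects otherwise
transporter
  .verify()
  .then(() => console.log('SMTP server on :1025 is reachable'))
  .catch((err) => console.error('SMTP check failed:', err));
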
In the test source, it always times out:
import { Test, TestingModule } from '@nestjs/testing';
import request from 'supertest';
import {
  FastifyAdapter,
  NestFastifyApplication,
} from '@nestjs/platform-fastify';

describe('Test sender', () => {
  let app: NestFastifyApplication;

  beforeEach(async () => {
    const moduleFixture: TestingModule = await Test.createTestingModule({
      imports: [AppModule],
    }).compile();
    app = moduleFixture.createNestApplication(new FastifyAdapter());
    await app.init();
    await app.getHttpAdapter().getInstance().ready();
  });

  describe('/handler (POST)', () => {
    describe('should send data to mail server', () => {
      it('success', () => {
        const message = ...
        return request(app.getHttpServer())
          .post('/handler')
          .send({ message })
          .expect(200);
      });
    });
  });
});
$ npm run test
thrown: "Exceeded timeout of xxx ms for a test.
Use jest.setTimeout(newTimeout) to increase the timeout value, if this is a long-running test."
It seems the test case can't reach the MailHog server running in the Docker container. How do I set it up correctly?

ThrottlerStorageRedisService in an integration test

I'm trying to build an integration test for a module in NestJS and I'm having a problem with this package.
I created a Redis instance with Docker on my local machine, but nothing seems to work.
What am I doing wrong?
import { config } from '@clients-service/common/config';
import {
  DEFAULT_THROTTLE_TTL_SECONDS,
  DEFAULT_THROTTLE_LIMIT,
} from '@clients-service/common/constants';
import { RedisCacheModule } from '@clients-service/common/providers/redis-cache';
import { INestApplication } from '@nestjs/common';
import { Test, TestingModule } from '@nestjs/testing';
import { ThrottlerModule } from '@nestjs/throttler';
import { ThrottlerStorageRedisService } from 'nestjs-throttler-storage-redis';

const MAX_TIME = 5 * 1000;

describe('[Module] Clients Service', () => {
  jest.setTimeout(MAX_TIME);
  let app: INestApplication;

  beforeAll(async () => {
    const test = new ThrottlerStorageRedisService({
      host: config.redis.host,
      port: config.redis.port,
      password: config.redis.password,
    });

    const module: TestingModule = await Test.createTestingModule({
      imports: [
        RedisCacheModule,
        ThrottlerModule.forRoot({
          ttl: DEFAULT_THROTTLE_TTL_SECONDS,
          limit: DEFAULT_THROTTLE_LIMIT,
          storage: test,
        }),
      ],
    }).compile();

    app = module.createNestApplication();
    await app.init();
  });

  it('should be defined', () => {
    expect(app).toBeDefined();
  });
});

nuxt.js - How to cache axios call at server side for all clients

I am using a Vue + Nuxt.js application and I would like to know whether it is possible to cache an axios web service call for all clients. I have to fetch some currency reference data, and it doesn't make much sense for every client to request this data itself.
Can someone provide me with some hints or even an example? Thanks.
Here is a working solution with the latest Nuxt 2.11, using a locally defined module.
First, add a local module to nuxt.config.js:
modules: [
  "@/modules/axCache",
  ...
]
Then
// modules/axCache.js
import LRU from "lru-cache"

export default function(_moduleOptions) {
  const ONE_HOUR = 1000 * 60 * 60
  const axCache = new LRU({ maxAge: ONE_HOUR })

  this.nuxt.hook("vue-renderer:ssr:prepareContext", ssrContext => {
    ssrContext.$axCache = axCache
  })
}
and
// plugins/axios.js
import { cacheAdapterEnhancer } from "axios-extensions"
import LRU from "lru-cache"

const ONE_HOUR = 1000 * 60 * 60

export default function({ $axios, ssrContext }) {
  const defaultCache = process.server
    ? ssrContext.$axCache
    : new LRU({ maxAge: ONE_HOUR })

  const defaults = $axios.defaults
  // https://github.com/kuitos/axios-extensions
  defaults.adapter = cacheAdapterEnhancer(defaults.adapter, {
    enabledByDefault: false,
    cacheFlag: "useCache",
    defaultCache
  })
}
Note that this works on both the server and client sides and can be configured to work on only one side.
Solution found at: https://github.com/nuxt-community/axios-module/issues/99
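Since enabledByDefault is false, each request has to opt in through the useCache flag configured above (and plugins/axios.js has to be listed in the plugins array of nuxt.config.js). A minimal usage sketch; the endpoint is only an example:

// pages/index.vue
export default {
  async asyncData({ $axios }) {
    // useCache: true routes this request through the LRU cache
    // set up in plugins/axios.js
    const rates = await $axios.$get('/api/currency-rates', { useCache: true });
    return { rates };
  },
};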
Here is a newer solution for caching the whole page; you can even cache consistent API responses, like a menu, if you need to.
https://www.npmjs.com/package/nuxt-perfect-cache
npm i nuxt-perfect-cache
// nuxt.config.js
modules: [
  [
    'nuxt-perfect-cache',
    {
      disable: false,
      appendHost: true,
      ignoreConnectionErrors: false, // it's better to be true in production
      prefix: 'r-',
      url: 'redis://127.0.0.1:6379',
      getCacheData(route, context) {
        if (route !== '/') {
          return false
        }
        return { key: 'my-home-page', expire: 60 * 60 } // 1 hour
      }
    }
  ]
]
Then, to cache your API response in Redis for all clients:
asyncData(ctx) {
  return ctx.$cacheFetch({ key: 'myApiKey', expire: 60 * 2 }, () => {
    console.log('my callback called*******')
    return ctx.$axios.$get('https://jsonplaceholder.typicode.com/todos/1')
  })
}