502 Bad Gateway in Kubernetes cluster after I added MongoDB connection code - Express

I was chugging along with my Kubernetes cluster project when, after creating a User model so I could start creating users in my application, I began getting a 502 Bad Gateway error in my Postman client.
I was so focused on my ingress-nginx YAML file, staring at it for typos, rewriting it, uninstalling and reinstalling, all while still getting that error, that I decided to dig one step deeper.
Here is the current-user route handler:
import express from "express";

const router = express.Router();

router.get("/api/users/currentuser", (req, res) => {
  res.send("howdy!");
});

export { router as currentUserRouter };
Up to that point I had always been able to visit mywebsite.com/api/users/currentuser in my browser and see howdy! rendered.
But then I added some logic to my index.ts file that I did not particularly care for, taken from https://expressjs.com/en/guide/routing.html:
app.all("*", async (req, res) => {
  throw new NotFoundError();
});
Sure enough, that killed my ability to visit mywebsite.com/api/users/currentuser and see howdy! rendered; instead I was getting a 502 Bad Gateway. So I decided to just leave that part out.
But then I noticed that a huge chunk of very important code was also breaking my ability to visit that URL:
// const start = async () => {
//   try {
//     await mongoose.connect("mongodb://auth-mongo-srv:27017/auth", {
//       useNewUrlParser: true,
//       useUnifiedTopology: true,
//       useCreateIndex: true,
//     });
//     console.log("Connected to MongoDB");
//   } catch (error) {
//     app.listen(3000, () => {
//       console.log("Listening on port 3000!!!!!");
//     });
//   }
// };
// start();
All of the above is what I need to connect to my local MongoDB server and start creating users.
So I got even more granular and slowly commented code back in. The app.all() call is not a problem anymore; the problem seems to be wrapping my MongoDB connection code in a try/catch statement, but I have no idea why that would cause the problem. Any ideas, anyone?
If instead I just run it like this:
const start = async () => {
  await mongoose.connect("mongodb://auth-mongo-srv:27017/auth", {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    useCreateIndex: true,
  });
  console.log("Connected to MongoDB");
  app.listen(3000, () => {
    console.log("Listening on port 3000!!!!!");
  });
};
start();
It all works fine again.
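A likely explanation for the difference, judging only from the excerpt above: in the try/catch version, app.listen(3000, ...) sits only inside the catch branch, so when mongoose.connect succeeds nothing ever listens on port 3000, and ingress-nginx answers with a 502 because it cannot reach the pod. A minimal plain-Node sketch of that control flow (connect and listen here are stand-ins, not the real Mongoose/Express calls):

```javascript
// Stand-in for a mongoose.connect(...) call that SUCCEEDS.
const connect = async () => {};

let listening = false;
// Stand-in for app.listen(3000, ...).
const listen = () => { listening = true; };

const start = async () => {
  try {
    await connect();   // succeeds, so the catch branch never runs
    console.log("Connected to MongoDB");
  } catch (error) {
    listen();          // only reached when the connection FAILS
  }
};

start().then(() => {
  // The server never started, which matches the 502 from ingress-nginx.
  console.log("listening:", listening);
});
```

Moving the app.listen call after the await (as in the working version) starts the server whenever the connection succeeds, which is consistent with the behavior described.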


Is there any way in Nuxt3 to load a plugin once?

I'm trying to integrate Sequelize into my Nuxt 3 project. However, I couldn't figure out how to make it load only once instead of reloading every time the page was refreshed or I navigated to another route.
I couldn't find any information on the docs. Is it even possible?
~/plugins/sequelize.server.ts
import { Sequelize } from "sequelize"

export default defineNuxtPlugin(async (nuxtApp) => {
  const config = useRuntimeConfig()
  const sequelize = new Sequelize(config.dbName, config.dbUser, config.dbPass, {
    host: config.dbHost,
    port: parseInt(config.dbPort),
    dialect: 'mysql',
  })
  try {
    await sequelize.authenticate()
    // this log was executed every time I navigated to a new route
    // or refreshed the browser.
    console.log('Connection has been established successfully.');
  } catch (error) {
    console.error('Unable to connect to the database:', error);
  }
  return {
    provide: {
      db: sequelize
    }
  }
})
The OP solved the issue by removing a composable that was being initialized in a component's mounted lifecycle hook; it was just a leftover piece of code.
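Independent of that fix, when the goal is one connection per server process, a common workaround is to hoist the instance to module scope so repeated plugin runs reuse it. A minimal plain-Node sketch of that singleton pattern (getDb is a hypothetical name; the counter stands in for `new Sequelize(...)` plus `authenticate()`):

```javascript
let connections = 0; // counts how many times the expensive setup ran
let db = null;       // module-scope cache, shared across calls

// Stand-in for the plugin body: create the connection once, then reuse it.
function getDb() {
  if (!db) {
    connections += 1;          // pretend this is `new Sequelize(...)`
    db = { id: connections };
  }
  return db;
}

// Three "requests" all get the same instance:
getDb();
getDb();
getDb();
console.log("connections created:", connections); // prints "connections created: 1"
```

Note that a module-scope cache only survives within one server process; dev-mode reloads that re-evaluate the module will still re-create it.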

Redis issue on module-redis-fork

Issue
Hi everyone,
I have an issue while trying to interact with Redis under these conditions:
a Redis instance with the RediSearch module,
the node-redis client was created before a Redis module fork started,
a Redis module fork is ongoing.
The behaviour I get is that send_command stays idle until the fork stops.
When the fork ends I get this error:
debug mode ->
Redis connection is gone from end event
client error ->
AbortError: Redis connection lost and command aborted. It might have been processed.
After I get this error, commands from the same client (without creating a new one) start working fine again.
I get the same behaviour on every fork.
Additional Info:
keys: 37773168,
used_memory_human: '87.31G'
Code Example:
This is a simple Express app:
'use strict';

const express = require('express');
const Redis = require('redis');
// Redis.debug_mode = true;

const router = express.Router();
let client = null;

router.get('/redisearch/connect', async (req, res, next) => {
  const conf = {
    host: '127.0.0.1',
    port: 6379,
    db: 0,
  };
  try {
    if (!client) client = Redis.createClient(conf.port, conf.host, { db: conf.db });
    res.send('Connected');
  } catch (err) {
    res.send(err);
  }
});

router.get('/redisearch/d', async (req, res, next) => {
  const num = 10;
  const dArgs = ['testIndexName', `#ic:[${num} ${num}]`, 'GROUPBY', 1, '#d'];
  try {
    client.send_command('FT.AGGREGATE', dArgs, (err, reply) => {
      if (err) {
        res.send({ err: err });
        return; // avoid sending a second response below
      }
      res.send({ d: reply });
    });
  } catch (err) {
    res.send(err);
  }
});

module.exports = router;
This is the simplest way I have to replicate the problem.
I don't know if there is a way to force Redis to start the fork; in my case it appears after a massive search on the index followed by deleting and inserting records.
Redis itself works normally during these operations (insert/delete); I can run commands from the redis-cli.
If I create a new node-redis client instance while the fork is present, everything works normally, and when the fork goes away everything keeps working.
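Not a root-cause fix, but while the fork is blocking the client you can at least fail fast instead of leaving HTTP requests hanging. A hedged sketch (plain Node, no Redis required) of wrapping a callback-style command in a Promise with a deadline; callWithDeadline is a hypothetical helper, and neverReplies simulates send_command going idle during the fork:

```javascript
// Wrap a callback-style command in a Promise that rejects after `ms`.
function callWithDeadline(sendCommand, command, args, ms) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error(command + " timed out")), ms);
    sendCommand(command, args, (err, reply) => {
      clearTimeout(timer);
      if (err) reject(err);
      else resolve(reply);
    });
  });
}

// Simulates client.send_command while the Redis fork is ongoing: no reply ever.
const neverReplies = (command, args, callback) => {};

callWithDeadline(neverReplies, "FT.AGGREGATE", [], 50)
  .catch((err) => console.log(err.message)); // prints "FT.AGGREGATE timed out"
```

In the route handler this would let you answer with a 503 instead of keeping the request open until the fork finishes.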
Environment
Node.js Version: v14.15.1
Redis Version: 6.0.4
redisearch Version: 1.6.15
node-redis Version: 3.2
Platform: Server 128GB RAM, 8 Core, Debian

Nuxt end-to-end testing with Jest

Hello, I'm searching for a way to do component testing as well as end-to-end testing with Nuxt.
We want to be able to test components (which already works) and also check whether pages parse their URL parameters correctly, whether sitemaps are created correctly, and other page-level features and router functions.
I tried AVA, but we had already implemented the component testing with Jest, which works fine now. The Nuxt docs describe server rendering for testing with AVA, so I adapted that to Jest, but I get timeout errors; I increased the timeout to 40 seconds and still get a timeout.
Did anybody get testing to work with the Nuxt builder like in the example (https://nuxtjs.org/guide/development-tools)?
This is my end-to-end test example file:
// test.spec.js:
const { resolve } = require('path')
const { Nuxt, Builder } = require('nuxt')

// We keep the nuxt and server instance
// so we can close them at the end of the test
let nuxt = null

// Init Nuxt.js and create a server listening on localhost:4000
beforeAll(async (done) => {
  jest.setTimeout(40000)
  const config = {
    dev: false,
    rootDir: resolve(__dirname, '../..'),
    telemetry: false,
  }
  nuxt = new Nuxt(config)
  try {
    await new Builder(nuxt).build()
    nuxt.server.listen(4000, 'localhost')
  } catch (e) {
    console.log(e)
  }
  done()
}, 30000)

describe('testing nuxt', () => {
  // Example of testing only generated html
  test('Route / exists and renders HTML', async (t, done) => {
    const context = {}
    const { html } = await nuxt.server.renderRoute('/', context)
    t.true(html.includes('<h1 class="red">Hello world!</h1>'))
    jest.setTimeout(30000)
    done()
  })
})

// Close server and ask nuxt to stop listening to file changes
afterAll((t) => {
  nuxt.close()
})
My current error is:
● Test suite failed to run
Timeout - Async callback was not invoked within the 40000ms timeout specified by jest.setTimeout.Error: Timeout - Async callback was not invoked within the 40000ms timeout specified by jest.setTimeout.
Any info is very much appreciated, as I could not resolve this issue myself.
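One likely culprit in the spec above is the test signature async (t, done): t.true(...) is AVA's API, and Jest passes at most one argument, the done callback, to a test function. A plain-Node simulation of Jest's calling convention (jestLikeRunner is a toy stand-in, not Jest itself):

```javascript
// Jest invokes a test function that declares parameters with a single
// argument: the `done` callback. There is no AVA-style `t` object.
const jestLikeRunner = (fn) => fn(() => {});

jestLikeRunner((t, done) => {
  console.log(typeof t);       // prints "function"  -- this is actually `done`
  console.log(typeof done);    // prints "undefined" -- there is no second argument
  console.log(typeof t.true);  // prints "undefined" -- so t.true(html...) throws
});
```

Rewriting the assertion as expect(html).toContain('<h1 class="red">Hello world!</h1>') and dropping the extra parameters (an async test function needs no done) would be the Jest-idiomatic form; jest.setTimeout is also best called at module top level, before the hooks run, rather than inside them.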

How to write a PWA in Vue.js?

I used to write PWAs in vanilla JavaScript like this:
importScripts('/src/js/idb.js');
importScripts('/src/js/utility.js');

const CACHE_STATIC_NAME = 'static-v4';
const CACHE_DYNAMIC_NAME = 'dynamic-v2';
const STATIC_FILES = [
  '/',
  '/index.html',
  '/offline.html',
  '/src/js/app.js',
  '/src/js/feed.js',
  '/src/js/promise.js',
  '/src/js/fetch.js',
  '/src/js/idb.js',
  '/src/js/material.min.js',
  '/src/css/app.css',
  '/src/css/feed.css',
  '/src/images/main-image.jpg',
  'https://fonts.googleapis.com/css?family=Roboto:400,700',
  'https://fonts.googleapis.com/icon?family=Material+Icons',
  'https://cdnjs.cloudflare.com/ajax/libs/material-design-lite/1.3.0/material.indigo-pink.min.css'
];

self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open(CACHE_STATIC_NAME)
      .then(function(cache) {
        console.log('[Service Worker] Installing Service Worker ...');
        return cache.addAll(STATIC_FILES); // return so waitUntil actually waits
      })
  );
});

self.addEventListener('activate', function(e) {
  console.log('[Service Worker] Activating Service Worker ...');
  // clear old caches
  e.waitUntil(
    caches.keys()
      .then(function(cachedKeys) {
        return Promise.all(cachedKeys.map(function(key) {
          if (key !== CACHE_STATIC_NAME && key !== CACHE_DYNAMIC_NAME) {
            return caches.delete(key);
          }
        }));
      })
  );
  // Tell the active service worker to take control of the page immediately.
  return self.clients.claim(); // to ensure that activation completes correctly
});

// After install, the fetch event is triggered for every page request
self.addEventListener('fetch', function(event) {
  let url = 'https://pwa-training-4a918.firebaseio.com/posts.json';
  if (event.request.url === url) {
    event.respondWith(
      fetch(event.request).then(res => {
        let clonedRes = res.clone();
        // clear old data in case the new data differs from the original
        clearAllData('posts')
          .then(() => {
            return clonedRes.json();
          })
          .then(data => {
            for (let key in data) {
              writeData('posts', data[key]);
            }
          });
        return res;
      })
    );
  // Use a cache-only strategy if the request is in the static files
  } else if (STATIC_FILES.includes(event.request.url)) {
    event.respondWith(
      caches.match(event.request)
    );
  } else {
    event.respondWith(
      caches.match(event.request).then(response => {
        return response || fetch(event.request).then(response => {
          return caches.open(CACHE_DYNAMIC_NAME).then(cache => {
            cache.put(event.request, response.clone());
            return response;
          });
        });
      })
      .catch(err => {
        return caches.open(CACHE_STATIC_NAME).then(cache => {
          // show the offline page only if the failure is for an HTML page,
          // since it makes no sense to show it for a failed CSS or image request
          if (event.request.headers.get('accept').includes('text/html')) {
            return cache.match('/offline.html');
          }
        });
      })
    );
  }
});
But when trying to write my own in a Vue.js app, I installed the PWA plugin via vue add pwa, and it created a file called registerServiceWorker.js that I don't understand, because I'm not used to it.
This file contains the following:
/* eslint-disable no-console */

import { register } from 'register-service-worker'

if (process.env.NODE_ENV === 'production') {
  register(`${process.env.BASE_URL}service-worker.js`, {
    ready () {
      console.log('App is being served from cache by a service worker.')
    },
    registered () {
      console.log('Service worker has been registered.')
    },
    cached () {
      console.log('Content has been cached for offline use.')
    },
    updatefound () {
      console.log('New content is downloading.')
    },
    updated () {
      console.log('New content is available; please refresh.')
    },
    offline () {
      console.log('No internet connection found. App is running in offline mode.')
    },
    error (error) {
      console.error('Error during service worker registration:', error)
    }
  })
}
I don't know how to write my own PWA code here, or where I should do that.
I also don't know whether it will work on localhost, because from what I'm noticing it only works in production.
So my question is: how can I write a PWA in a Vue app the way I used to with vanilla JS? What steps should I take to accomplish a fully custom PWA?
Can I do that without using Workbox?
If anyone can help, I'll appreciate it.
Thanks in advance.
I (and I'm pretty sure most of us) wouldn't rewrite a service worker from scratch in any project; Workbox is the tooling recommended on Google Developers' pages as well as by the Vue CLI.
As for registerServiceWorker.js, that's boilerplate for the service worker lifecycle in your app; the logs are pretty straightforward about where each hook fires in your app's flow.
If you still want to do it from scratch, I would suggest reading https://developers.google.com/web/fundamentals/primers/service-workers/ to understand the fundamentals, because writing a service worker yourself pretty much assumes you know exactly what your app should do when updating, caching, and running offline.
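To the "can I keep my hand-written worker" part: the Vue CLI PWA plugin (per its documentation, at least for Vue CLI 3/4) supports a Workbox InjectManifest mode that builds from your own service worker file instead of generating one. A sketch of what that vue.config.js might look like, assuming your custom worker lives at src/service-worker.js (a hypothetical path):

```javascript
// vue.config.js -- hedged sketch, assuming @vue/cli-plugin-pwa is installed.
module.exports = {
  pwa: {
    // 'InjectManifest' tells Workbox to build FROM your own worker file
    // rather than generating one for you.
    workboxPluginMode: 'InjectManifest',
    workboxOptions: {
      // Your hand-written service worker (hypothetical path):
      swSrc: 'src/service-worker.js',
    },
  },
};
```

Even then, the worker is only registered in production builds by default, which matches the process.env.NODE_ENV === 'production' guard in registerServiceWorker.js; to try it locally you'd typically build and serve the dist folder over a static server.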

Express route HMR with Webpack

I'm trying to get HMR to work on the server side with an Express app, and I'm seeing some odd behavior. My simple test project:
index.ts
let httpListener: Server = null;
let AppServer = require('./AppServer').default;
const port = Config.serverPort;

if (process.env.NODE_ENV === 'dev') {
  if ((module as any).hot) {
    (module as any).hot.addDisposeHandler((data: any) => {
      httpListener.close();
      AppServer = require('./AppServer').default;
    });
    console.log('index.ts', (module as any).hot.dependencies);
    (module as any).hot.accept((err: any) => {
      console.log('HMR Error', err);
    });
  }
}

httpListener = AppServer.app.listen(port, (error: Error) => {
  if (error) {
    console.error(error);
  } else {
    console.info(`Listening on port ${port}.`);
  }
});
AppServer.ts
class AppServer {
  public app: express.Application = express();

  constructor() {
    this.app.use('/api', (new ApiRouter()).router);
  }
}

export default new AppServer();
and ApiRouter.ts
export class ApiRouter {
  public router: express.Router = express.Router();

  constructor() {
    this.router.use('/auth', (new AuthRouter()).router);
    this.router.get('/', (req, res) => {
      res.json({success: true});
    });
  }
}
Webpack bundles correctly, and HMR reports modules being updated. If I change some code in index.ts, those changes take effect. However, when I flip {success: true} to {success: false}, I see HMR update:
[HMR] Updated modules:
[HMR] - ./src/server/AppServer.ts
[HMR] - ./src/server/index.ts
[HMR] - ./src/server/api/ApiRouter.ts
but when I hit the endpoint, I get back {success: true}. So despite HMR seemingly doing the right thing, the code isn't being swapped at run time. I suspect I'm missing something fundamental about how module.hot.accept works here, but I cannot figure out where I'm going wrong.
Has anyone gotten this to work correctly?