How do I detect whether I am on the server or the client in Next.js? - express

I am using a custom Express server with Next.js. It's running within a container. I am making an HTTP request with isomorphic-fetch to get data for my render. I'd like to hit localhost when running on the server and mysite.com when running on the client, but I'm not sure of the best way to accomplish this. I can do it hackily with const isServer = typeof window === 'undefined', but that seems pretty bad.

Now (Jan 2020) it should be typeof window === 'undefined', since process.browser is deprecated.
Refer to https://github.com/zeit/next.js/issues/5354#issuecomment-520305040
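For example, a quick sketch of picking the base URL with that check (mysite.com comes from the question; the scheme, port, and /api/data path are just illustrative):
import 'isomorphic-fetch'

const isServer = typeof window === 'undefined'
// use the container-internal host during SSR and the public host in the browser
const baseUrl = isServer ? 'http://localhost:3000' : 'https://mysite.com'
fetch(`${baseUrl}/api/data`).then((res) => res.json())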

You can use process.browser to distinguish between the server environment (Node.js) and the client environment (browser).
process.browser is true on the client and undefined on the server.
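A minimal sketch of guarding browser-only work with that flag (loadClientOnlyData is just an illustrative placeholder):
if (process.browser) {
  // client: window, document, etc. are available
  loadClientOnlyData(window.location.href)
} else {
  // server: running inside Node.js during SSR
}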

Since I don't like depending on odd third-party things for this behavior (even though process.browser seems to come from Webpack), I think the preferred way is to check for the presence of appContext.ctx.req, like this:
async getInitialProps (appContext) {
  if (appContext.ctx.req) { // server?
    // server stuff
  } else {
    // client stuff
  }
}
Source: https://github.com/zeit/next.js/issues/2946

One additional note is that componentDidMount() is always called in the browser. I often load the initial data set (SEO content) in getInitialProps(), then load more in-depth data in componentDidMount().
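A rough sketch of that pattern (fetchSeoContent and fetchDetails are placeholder helpers, not real APIs):
import React from 'react'

class Page extends React.Component {
  state = { details: null }

  // runs on the server for the initial request, on the client for route changes
  static async getInitialProps({ req }) {
    const seo = await fetchSeoContent(req) // placeholder helper
    return { seo }
  }

  // runs only in the browser, after the first render
  async componentDidMount() {
    const details = await fetchDetails() // placeholder helper
    this.setState({ details })
  }

  render() {
    return <div>{this.props.seo.title}</div>
  }
}

export default Page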

getServerSideProps and getStaticProps were added in Next.js 9.3 (Mar 2020), and these functions are now recommended:
If you're using Next.js 9.3 or newer, we recommend that you use getStaticProps or getServerSideProps instead of getInitialProps.
So there is no need to detect anything; just put the server-side stuff in getServerSideProps.
import { useEffect } from 'react'

const MyPage = () => {
  useEffect(() => {
    // client side stuff
  }, [])
  return (
    <div> ... </div>
  )
}

// getServerSideProps must be exported from the page module,
// not attached to the component like getInitialProps
export async function getServerSideProps() {
  // server side stuff
  return { props: {} }
}

export default MyPage

Related

Reevaluate Nuxt.js middleware without a route change

I'm wondering if it's possible to essentially "reevaluate" the middleware conditions without actually changing the current route.
The middleware's purpose is to prevent non-logged-in users from accessing the "dashboard".
My issue is that a user could become logged in or logged out without necessarily changing route, but they wouldn't be redirected until they try to change pages.
I have a Vuex action that triggers when the user's auth state changes, but this (from what I can see) can't access the redirect or route variables.
// /mixins/auth.js
const reevaluateAuthStatus = (store, redirect, route) => {
  console.log(route)
  const redirectPolicy = route.meta.map((meta) => {
    if (meta.auth && typeof meta.auth.redirectPolicy !== 'undefined') {
      return meta.auth.redirectPolicy[0]
    }
    return []
  })
  const user = store.getters['auth/getUser']
  if (redirectPolicy.includes('LOGGEDOUT')) {
    if (user) {
      return redirect('/dashboard')
    }
  } else if (redirectPolicy.includes('LOGGEDIN')) {
    if (!user) {
      return redirect('/login')
    }
  }
}
module.exports = {
  reevaluateAuthStatus
}
// /middleware/auth.js
import { reevaluateAuthStatus } from '../mixins/auth'
export default function ({ store, redirect, route }) {
  reevaluateAuthStatus(store, redirect, route)
}
Appreciate any help on this :)
You cannot re-evaluate a middleware AFAIK, because, as stated in the documentation:
middlewares will be called [...] on the client-side when navigating to further routes
Two clean ways you can still achieve this, IMO:
use websockets, either with socket.io or something similar like Apollo Subscriptions, to have your UI take the new changes into account
export your middleware logic into some kind of callable function that you can trigger again, for example by re-running the $fetch hook or any other data-related fetching hook in Nuxt
Some uglier solutions would probably be:
setting up an internal setInterval and checking whether the current state is still valid every 5s or so
navigating to the same page you are already on with something like this.$router.go(0), as explained in the Vue Router documentation
Still, in most cases I don't think this is a big issue when the user is logged out, because they will simply be redirected as soon as they try to do something.
As for the user becoming logged in, I'm not even sure in which case that can happen if they are not doing something proactive in your SPA.
I don't know if it's relevant or not, but I solved a similar problem this way:
I have a global middleware that checks the auth status. It's a function that receives the context as a parameter.
I have a plugin that injects itself into the context (e.g. $middleware), and the middleware function is imported there.
In this plugin I define a method that calls the middleware, passing it the context (since the plugin receives the context as a parameter as well): ctx.$middleware.triggerMiddleware = () => middleware(ctx);
Now the middleware triggers on every route change as intended, but I can also call this.$middleware.triggerMiddleware() anywhere I want.
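A minimal sketch of that setup, assuming the middleware lives in ~/middleware/auth.js and that the plugin file name is your choice:
// ~/plugins/middleware-trigger.js (hypothetical file name, registered in nuxt.config.js under plugins)
import authMiddleware from '~/middleware/auth'

export default (ctx, inject) => {
  // inject('middleware', ...) exposes ctx.$middleware and this.$middleware
  inject('middleware', {
    triggerMiddleware: () => authMiddleware(ctx)
  })
}
Any component or Vuex subscription can then call this.$middleware.triggerMiddleware() when the auth state changes.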

electron.js and sql - correct way to set it up?

I am new to Electron.js - I've been reading the documentation and some similar posts here:
How do I make a database call from an Electron front end?
Secure Database Connection in ElectronJS Production App?
Electron require() is not defined
How to use preload.js properly in Electron
But it's still not super clear to me how to properly implement a secure SQL integration. Basically, I want to create a desktop database client. The app will connect to the remote DB, and users can run all kinds of predefined queries whose results will show up in the app.
The documentation says that if you are working with a remote connection you shouldn't run Node in the renderer. Should I then require the SQL module in the main process and use IPC to send data back and forth, exposing the IPC bridge through a preload script?
Thanks for the help
Short answer: yes
Long answer:
Allowing Node in your renderer poses a big security risk for your app. Best practice in this case is to pass a function to your renderer through your preload script. There are a few options you can use to do this:
Pass an ipcRenderer.invoke function wrapped in another function to your renderer in your preload. You can then invoke a call to your main process, which can either send info back through the same function or send it via window.webContents.send and listen for it on the exposed window API in your renderer. E.g.:
Preload.js:
const { contextBridge, ipcRenderer } = require('electron')

const invoke = (channel, args, cb = () => { return }) => {
  ipcRenderer.invoke(channel, args).then((res) => {
    cb(res);
  });
};

const handle = (channel, cb) => {
  ipcRenderer.on(channel, function (event, message) {
    cb(event, message);
  });
};

contextBridge.exposeInMainWorld("GlobalApi", {
  invoke: invoke,
  handle: handle
});
Renderer:
let users
// the first callback argument is the IPC event, the second is the payload
window.GlobalApi.handle("users", (event, data) => { users = data })
window.GlobalApi.invoke("get", "users")
or:
let users;
window.GlobalApi.invoke("get", "users", (data) => { users = data })
Main:
const { ipcMain } = require('electron')

// "window" is assumed to be your BrowserWindow instance
ipcMain.handle("get", async (event, path) => {
  let data = dbFunctions.get(path)
  window.webContents.send(
    path,
    data
  );
});
Create a DB interface in your preload script that exposes certain calls to your renderer which, when invoked, return the value you need from your DB. E.g.:
Renderer:
let users = window.myCoolApi.get("users");
Preload.js:
let get = function (path) {
  let data = dbFunctions.readSomeDatafromDB(path);
  return data; // Returning the function itself is a no-no, as shown below
  // return dbFunctions.readSomeDatafromDB(path); // Don't do this
}

contextBridge.exposeInMainWorld("myCoolApi", {
  get: get
});
There are more options, but these should generally ensure security as far as my knowledge goes.

Dependency Injection (for HttpFetch) at setRoot in main.js Aurelia

I am having trouble getting dependency injection working for my AuthorizerService. Obviously, dependency injection is not ready until after Aurelia "starts", but I wasn't sure how to access it.
main.js:
aurelia.container.registerInstance(HttpClient, http.c());
// set your interceptors to take cookie data and put it into the header

return aurelia.start().then(() => {
  let Authorizer = new AuthorizerService();
  aurelia.container.registerInstance(AuthorizerService, Authorizer);
  console.log('Current State: %o', Authorizer.auth);
  Authorizer.checkCookieAndPingServer().then(
    () => { console.log('Current State: %o', Authorizer.auth); aurelia.setRoot(PLATFORM.moduleName('app')); },
    () => { aurelia.setRoot(PLATFORM.moduleName('login-redirect')); }
  );
});
Now the problem is that if I do "new AuthorizerService()", then "this.http.fetch()" is not available in AuthorizerService.js.
Am I meant to pass "http.c()" (which delivers the HttpClient instance) as a parameter, like:
checkCookieAndPingServer(http.c())
or is there another way?
Can I drop "new AuthorizerService()" and just do (I made this up):
aurelia.container.getInstance(AuthorizerService);
to somehow FORCE it to do dependency injection and retrieve the registered instance of "http.c()"?
I can't just check the cookie. I have to ping the server for security, and the server will set the cookie.
I think this is all sorts of wrong, because I need a global parameter that is false by default; then the app queries the backend server and sets the root accordingly. Perhaps I should only query the backend in the "login" page? Okay, but then I would need to do "setRoot(backtoApp); aurelia.AlsoSetLoggedIn(true);" inside the login module. But when I setRoot(backtoApp), it just starts all over again.
In other words, with setRoot(login); then setRoot(backToApp); <-- the AuthorizerService instance doesn't have its proper data set (such as loggedIn=true).
EDIT: Better solution, maybe:
main.js:
return aurelia.start().then(() => {
  let Authorizer = aurelia.container.get(AuthorizerService);
  let root = Authorizer.isAuthenticated() ? PLATFORM.moduleName('app') : PLATFORM.moduleName('login');
  console.log('Current State: %o', Authorizer.auth);
  aurelia.setRoot(root);
});
Authorizer.js:
constructor(http) {
  this.http = http;
  this.auth = {
    isAuthenticated: false,
    user: {}
  }
}
"this.auth" is no longer static. No longer "static auth = { isAuthenticated: false }" which was some example code I had found.
So now "auth" gets set inside the "login" module. But this means the "login" module is briefly displayed every single time the app loads, before being redirected back via "setRoot(backToApp)".
If the class you want to get an instance of is purely based on service classes and has no dependencies on Aurelia plugins, you don't need to wait until Aurelia has started to safely invoke the container.
For your example:
aurelia.container.getInstance(AuthorizerService);
It can be
aurelia.container.get(AuthorizerService);
And you should not use new AuthorizerService(), as you have noticed in your question.
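For the container to build it, AuthorizerService also has to declare HttpClient as a dependency. A minimal sketch, assuming aurelia-fetch-client's HttpClient is what gets registered in main.js:
import { inject } from 'aurelia-framework';
import { HttpClient } from 'aurelia-fetch-client';

@inject(HttpClient)
export class AuthorizerService {
  constructor(http) {
    // receives the HttpClient instance registered via registerInstance() in main.js
    this.http = http;
    this.auth = { isAuthenticated: false, user: {} };
  }
}
With that in place, aurelia.container.get(AuthorizerService) resolves the registered HttpClient and passes it into the constructor.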

vue server side rendering and data population

I'm currently refactoring an app and converting all my base code into Vue. One of my requirements is to do server-side rendering.
I have been following the Vue SSR guide along with the HackerNews example to help me understand SSR.
However, I have a question for which I can't find any good answer, and before further development I want to make sure we are doing the right thing.
I want to know if it's good practice to have some actions in a Vuex store call an API during server-side rendering.
All the examples I have found deal with a real external endpoint that they connect to and perform requests against. But that is not the setup we have.
We have a "normal" Express app with its own endpoints; so, for example, the Express router looks a bit like this:
// This is our Posts model; it lives in the same app, so it's not an external API
app.get('/posts', (req, res) => Posts.getPosts());

// Resolve the request for SSR rendering, pretty much the same as in the [vue ssr example](https://ssr.vuejs.org/guide/routing.html#routing-with-vue-router)
app.get('*', (req, res) => {
  const context = { url: req.url }
  createApp(context).then(app => {
    renderer.renderToString(app, (err, html) => {
      if (err) {
        if (err.code === 404) {
          res.status(404).end('Page not found')
        } else {
          res.status(500).end('Internal Server Error')
        }
      } else {
        res.end(html)
      }
    })
  })
})
This part works fine on both client and server. If you request something from /posts you get your response back.
To keep the code modular we are using Vuex stores; one of the actions is called fetchPosts, and this action is responsible for fetching the current posts for the view.
...
actions: {
  fetchPosts({ commit }) {
    return $axios.get('/posts').then((response) => {
      commit('setPosts', {
        posts: response.data
      });
    });
  }
},
...
I believe this is fine for the client side, but when rendering on the server it is probably not optimal.
The reason is that axios performs an actual HTTP request, which will also have to go through the auth mechanism and is in general really poorly performant.
My question is: is this the recommended and standard way of doing this?
What other possibilities work on both server and client?
Do people actually create separate apps for the API and the rendering app?
Thanks !
I know this is an old question, but for future visitors:
The recommended way is to leverage webpack module aliases to load a server-side API for the server and a client-side API for the browser. The two have to share the same call signature, which allows the API to be "swapped".
This of course greatly improves performance, as the server-side API can do direct DB queries instead of fetching data over HTTP.
In essence your webpack.server.config.js should have:
resolve: {
  alias: {
    'create-api': './create-api-server.js'
  }
}
In your webpack.client.config.js:
resolve: {
  alias: {
    'create-api': './create-api-client.js'
  }
}
Importing create-api will now load the required api.
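As a rough sketch of what the two swappable modules might look like (the fetchPosts name and the direct Posts model import are assumptions for illustration):
// create-api-client.js - runs in the browser, talks to the Express endpoint over HTTP
import axios from 'axios'

export function fetchPosts () {
  return axios.get('/posts').then(res => res.data)
}

// create-api-server.js - runs in Node during SSR, hits the model directly
import Posts from './models/Posts' // assumed model path

export function fetchPosts () {
  return Promise.resolve(Posts.getPosts())
}
The Vuex action then simply does import { fetchPosts } from 'create-api', and webpack resolves the right implementation for each bundle.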
Have a look at https://github.com/vuejs/vue-hackernews-2.0 to see a full example.

Session Data with Durandal

I am just getting started with Durandal.js, so excuse me for the silly question...
When a user makes their first request to the app, they are asked to choose a 'profile kind', and I need it to be accessible to every other view model in the web site. I first thought of creating this property in the shell view model, but I don't know how to do it.
What is the best way to store data in a session-like manner in a Durandal SPA?
Thanks!
Create an AMD module for whatever data you need to store.
Then just require that module as a dependency in whatever other modules need it.
Sort of like this:
session module:
define(function () {
  return {
    someVariable: 'value1',
    someVariable2: 'value2'
  }
})
some other module:
define(['session'], function (session) {
  return {
    getValue1: function () {
      return session.someVariable;
    },
    obs1: ko.observable(session.someVariable2)
  }
})
EDIT:
AMD modules are there to avoid polluting the global namespace of the window object. But if you would rather not require your session as a dependency and instead access it through a global variable, that is perfectly fine.
You can declare it in your shell.js if you like and do something like:
define(function () {
  window.session = { someVariable: 'value1', someVariable2: 'value2' };
})
Then inside some other module you can access the session object like so:
define(function () {
  return {
    getValue1: function () {
      return session.someVariable;
    },
    obs1: ko.observable(session.someVariable2)
  }
})
This information will not be persisted between page refreshes; it is only in memory.
If you are looking to persist the session data, I would not persist any information on the client unless you planned on making your app an offline application.
An offline application is an app that works even without internet access. But if your app requires the user to always be connected to the internet, then I would just store the session data on the server. So, just use web services to persist and retrieve the session data.
You can tie the session on the server to the client by using a cookie.
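A minimal sketch of that idea, assuming a hypothetical /api/session endpoint and jQuery mapped as 'jquery' in your RequireJS config:
define(['jquery'], function ($) {
  var session = { profileKind: null };

  return {
    data: session,
    load: function () {
      // the server identifies the user through its session cookie
      return $.getJSON('/api/session').then(function (data) {
        session.profileKind = data.profileKind;
      });
    },
    save: function () {
      return $.post('/api/session', { profileKind: session.profileKind });
    }
  };
});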
As an alternative to Evan's answer, which is definitely the correct AMD approach... have you considered using a global object for that purpose?
In main.js:
window.myApp = {
  prop1: 'value',
  subOne: {
    prop1: 'value'
  }
  ...
}
That will allow you to access the global myApp object everywhere. I know some people will consider that global-namespace pollution and in general a bad practice, but in an SPA project (where you control window) I'd still consider this a viable approach.