Webpack Dev Server + Express Web Server - express

I'm having trouble setting up a development workflow that will do both of the following tasks simultaneously:
1. Recompile static assets (js, css) on file change. Serve these static assets. Notify the browser that the page must reload whenever assets are changed. Use react-hot-loader.
2. Use express to run a server which serves an API which the static assets will "consume". Have this express server restart any time my server files change.
Is the best practice to have webpack-dev-server serve the static assets, and have a separate web server serve the API on a different port? If so, the API calls in the javascript will need to specify the host and port, and will need to be changed before going to production. It seems all the tutorials online focus on #1, and don't set up an API server. Can anyone point me in the correct direction?
I'm not opposed to using gulp and browserify or SystemJS instead of webpack, but it seems that if I want to use react-hot-loader, I need to use webpack.

You can do something like this:
1. Create an express() proxy.
2. Create a webpack-dev-server.
3. Link the assets with an absolute URL.
4. Start both servers.
Note: make sure the two servers run on different ports.
var proxy = require('proxy-middleware');
var express = require('express');
var url = require('url');
var webpack = require('webpack');
var WebpackDevServer = require('webpack-dev-server');
var config = require('./webpack.config');

//----------------- Express proxy --------------------
var app = express();
// Forward asset requests to webpack-dev-server (assumed here to run on port 8000, i.e. config.port)
app.use('/assets', proxy(url.parse('http://localhost:8000/dist/assets')));
app.get('/api/url', function(req, res) { /* ... */ });
app.post('/api/url', function(req, res) { /* ... */ });

// ---------------Webpack-dev-server---------------------
var server = new WebpackDevServer(webpack(config), config.devServer);

// ---------------Run both servers-----------------------
server.listen(config.port, 'localhost', function(err) {});
app.listen(8080);

We have been using gulp+webpack for the last year and it has been great. We have an API Gateway which only serves up one static html file (the index.html) and everything else is just REST endpoints. Then in the index.html we reference a css file or two and a couple of scripts that load from our CDN (CloudFront).
If you run that way in production, your local setup just needs to use the webpack dev server as your "local CDN". You are correct, your JavaScript will need to know what the API URL is, since that changes between local and production. We actually have local, dev, stage, and production - once you have it set up to work in two environments it's not hard to add more (which is good!)
Now our index.html has some template variables that get filled in by the API Gateway when it serves it up. Similar to this:
<script>
  var siteConfig = {
    apiBase: '<%=apiBaseUri%>',
    cdnBase: '<%=cdnBaseUri%>'
  };
</script>
<script src="<%=cdnBaseUri%>/assets/js/bundle.min.js"></script>
Then you just pull in siteConfig when your React app is starting up and prepend those variables to any API/CDN calls. Swap the variables depending on your environment, and bam, you're done!
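For example, a minimal sketch of how that could look on the client (the apiFetch helper name and the /users endpoint are made up for illustration):
// Hypothetical helper that prefixes siteConfig.apiBase onto relative API paths
function apiFetch(path, options) {
  return fetch(siteConfig.apiBase + path, options);
}

// Somewhere in the React app
apiFetch('/users/42').then(function (res) { return res.json(); });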
A slightly simpler approach: instead of filling in those vars when the page is served up, you could do it at build time. That's how we started, but as things evolved it turned out to be easier to manage at run time.
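If you do go the build-time route, one way (a sketch, assuming API_BASE and CDN_BASE environment variables are set when you build) is webpack's DefinePlugin:
// webpack.config.js
var webpack = require('webpack');

module.exports = {
  // ...entry, output, loaders...
  plugins: [
    new webpack.DefinePlugin({
      // Values are inlined into the bundle at build time
      'process.env.API_BASE': JSON.stringify(process.env.API_BASE),
      'process.env.CDN_BASE': JSON.stringify(process.env.CDN_BASE)
    })
  ]
};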
BTW, I am pretty sure you can accomplish all of this using just webpack - gulp probably isn't necessary but it was easier for us at the time to use gulp for running unit tests, linting, etc.

Related

How to set up a prerender function as a serverless service to make ISR happen

I'm trying to host my application on AWS S3; the static files can be exported to the S3 folder with the Next.js export command. But when it comes to ISR, I don't know how to make it happen. I don't use Vercel, and I want to update the static HTML when the revalidate option is set in our code. How do I write the pre-render function and host it as a separate serverless service? Is there any way to make this happen? A simple example would help. After setting the serverless service up, how do I configure my Next.js app so it knows to call that service when ISR is configured?
When I set it up inside the Next.js app and run it locally, ISR works fine, meaning the pre-render function lives together with my ISR code, which I guess is the way the Next.js documentation wants us to do it. But now I want to host my static files on AWS S3 and just don't know how to set the config so it calls my pre-render function, because I want the pre-render function to be hosted at a separate service endpoint.

NextJs: How to use dynamic hostnames in api calls instead of .env variables

I am building a Next.js app: how can I use dynamic hostnames in API calls instead of .env variables?
For example, in my local dev environment the app sometimes runs on port 3001. How do I accommodate such scenarios by dynamically passing the appropriate hostname during requests?
I use next-absolute-url. From the docs:
This module enables you to easily get the protocol and host of your
Next.js app, both on the server and the client. Optionally, you can
set a localhost variable, which is useful for local development if you
have local lambda functions running on a different port.
import absoluteUrl from "next-absolute-url";

// req is the server-side request object (e.g. from getServerSideProps);
// in express we have req.get('host')
const { origin } = absoluteUrl(req);
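A sketch of how it might be used inside getServerSideProps (the /api/hello endpoint is just a placeholder):
import absoluteUrl from "next-absolute-url";

export async function getServerSideProps({ req }) {
  const { origin } = absoluteUrl(req); // e.g. http://localhost:3001 or https://example.com
  const res = await fetch(`${origin}/api/hello`); // placeholder API route
  const data = await res.json();
  return { props: { data } };
}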

How to use environment variables in remix run deployed on cloudflare pages

How is it possible to use environment variables in Remix when deploying to Cloudflare Pages? The documentation gives some examples for different hosting providers, but not for Cloudflare Pages. After assuming that dotenv is the way to go, I get the error SyntaxError: missing ) after argument list when running npm run dev, which executes "dev:remix": "node -r dotenv/config node_modules/.bin/remix watch".
How is it possible to use environment variables with remix in a cloudflare pages context?
For local development, create the file .dev.vars in the root directory and add your environment variables:
MY_SUPER_SECRET_TOKEN=12345
It's best to also .gitignore this file so you don't accidentally commit something for hackers to find.
On Cloudflare, set the same environment variables in the page's control panel.
Then, in your route's LoaderFunction or ActionFunction, access the environment variables through the context:
import type { ActionFunction } from "remix"; // or "@remix-run/cloudflare" on newer Remix versions

export const action: ActionFunction = async ({ request, context }) => {
  console.log(context.MY_SUPER_SECRET_TOKEN);
};

Using NuxtJS for dynamic routes without server target

I always thought that the frontend should not be bloated in size. By "frontend" I usually imagined a set of HTML, CSS and JS files, which are kind of small, especially when minified and compressed. So you can use whatever framework or library you love, your dev node_modules could be enormous in size, but after compilation you get something lightweight to be served e.g. by Nginx. Yes, I just described an SPA-like setup, not SSR where there's a server process running.
I had an experience building a website with NuxtJS, and it has only runtime logic, so no backend was required. I just did yarn generate and served all the resulting static files with Nginx.
Now I'm building an application which requires a backend (it's a separate Python process), with dynamic pages like /users/john and /users/jane. The Nuxt documentation says I can't use the generate option anymore, because such routing is dynamic (technically I could write a set of fetch functions to load users from the API at build time and generate corresponding pages, but that doesn't work well for runtime data). The only option is to use the server target of NuxtJS.
There's even a page describing how to serve a Nuxt application with Nginx. It assumes you should use the yarn start command, which starts a Node process. It works fine, dynamic content is routed, caching is performed by Nginx, but it doesn't fit the model that "frontend is lightweight". I use Docker, which means I now need to bring the huge node_modules along; the nuxt package itself is about 200 MB, which is kind of big for a small frontend app. I can run yarn install --production to save some space, but it still doesn't solve the issue that the resulting image is huge.
Previously, when I wrote some apps in React, they resulted in a single index.html which I served with Nginx. That means such dynamic routing was handled by the frontend, using react-router or similar.
To better understand my concerns, here's some rough comparison:
My old React apps: ~5 MB of disk space, 0 RAM, 0 CPU, routing is done by index.html file
My previous site with Nuxt static option: ~5 MB of disk space, 0 RAM, 0 CPU, routing is done by file system (/page1/index.html, /page2/index.html)
My current site with Nuxt server option: ~ 400 MB or even more disk space for a docker image, RAM, CPU, routing is done by Nuxt runtime
I don't really want to overcomplicate things. Allocating a lot of resources for a simple web app is too much, especially when you can solve the task with the help of a few static files.
The questions are:
Am I missing some option in NuxtJS to solve my issue?
Am I just misusing NuxtJS, and is it better to take plain VueJS with vue-router and develop the app as I described in the "previously with React" section?
I think you are making a mistake here about SPA mode.
Assume you have a page named users in your Nuxt pages directory; your folder structure would look like this:
pages/
  users/
    _name/
      index.vue
When requesting /users/john you can take john from the route params and make an axios call to your server.
After that, you can use the nuxt generate command to create your dist folder and then serve that dist folder with Nginx. Everything will work fine.
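A rough sketch of what pages/users/_name/index.vue could look like (the API URL is just a placeholder for your Python backend):
<template>
  <div>{{ user && user.name }}</div>
</template>

<script>
import axios from 'axios';

export default {
  async asyncData({ params }) {
    // params.name is "john" for /users/john
    const { data } = await axios.get('https://api.example.com/users/' + params.name);
    return { user: data };
  }
};
</script>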
Check this simple routing approach in the browser
// Home, Users and NotFound are plain Vue components you define or import yourself
const routes = {
  '/': Home,
  '/users': Users
}

new Vue({
  el: '#app',
  data: {
    currentRoute: window.location.pathname
  },
  computed: {
    ViewComponent () {
      // Pick the component for the current path, falling back to NotFound
      return routes[this.currentRoute] || NotFound
    }
  },
  render (h) { return h(this.ViewComponent) }
})
In the Users component, integrate with your Python backend.
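For instance, a minimal Users component could look like this (the /api/users endpoint is an assumption about the Python backend):
// Users – fetches data from the Python backend
const Users = {
  data () {
    return { users: [] }
  },
  async created () {
    const res = await fetch('/api/users')  // hypothetical endpoint exposed by the Python backend
    this.users = await res.json()
  },
  template: '<ul><li v-for="user in users" :key="user.id">{{ user.name }}</li></ul>'
}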
You can use SPA mode in NuxtJS (mode: 'spa', or ssr: false in the latest versions), then run yarn generate or nuxt generate. It will generate a full bundle in the dist folder that can be served with any HTTP server.
This works fine with all dynamic routes; I tested it with a simple php -S localhost:8000, which just serves a folder over HTTP.
This works thanks to a trick with 200.html: https://dev.to/adnanbabakan/deploy-a-single-page-application-with-200-html-2p0f
For my project it generated all the needed data, and the folder size is just 13 MB (with all images, fonts, etc.).
You can read more about how this kind of routing is handled in the Vue Router docs: https://router.vuejs.org/guide/essentials/history-mode.html#example-server-configurations
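For reference, a minimal nuxt.config.js for this static SPA setup might look something like this (a sketch; option names vary slightly between Nuxt versions):
// nuxt.config.js
export default {
  ssr: false,        // SPA mode ("mode: 'spa'" on older Nuxt versions)
  target: 'static',  // produce a static bundle via `nuxt generate`
  generate: {
    fallback: '200.html'  // SPA fallback used for dynamic routes like /users/john
  }
}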

Is it possible to set multiple base URLs in vue.config.js?

My vue.config.js code is as below
module.exports = {
  baseUrl: process.env.NODE_ENV === 'production' ? '/prodserver1/' : ''
}
and it's working perfectly fine when hitting the URL abc.com/prodserver1/index.html (hostname + pathname).
But I have multiple production servers where I want to deploy the same application; let's say I have one more production server named 'prodserver2'.
How can I pass multiple production server strings in the base URL so that I can run the app on either abc.com/prodserver1/index.html or abc.com/prodserver2/index.html?
Maintaining a separate application for each server is not feasible, as every minor change would need to be applied to each app every time.
The simple answer is no. You can't supply a path to a file or resource which references multiple possible locations. However, using a relative path instead should work.
The Vue CLI documentation for baseUrl first of all suggests using publicPath instead. From the publicPath description:
The value can also be set to an empty string ('') or a relative path
(./) so that all assets are linked using relative paths. This allows
the built bundle to be deployed under any public path, or used in a
file system based environment like a Cordova hybrid app.
I suggest you use the option of a relative path, so you can then serve your app from any path.
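As a sketch, the vue.config.js would then look something like this:
// vue.config.js
module.exports = {
  // Relative paths let the same build be served from /prodserver1/, /prodserver2/, etc.
  publicPath: './'
}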