how do I route multiple domains to multiple node applications? - apache

I'm used to the typical LAMP web hosting environment where you just click a few buttons in cPanel and your domain gets zoned and mapped to a folder within htdocs. I've been using Node.js a lot and it doesn't seem as simple to do the same thing. If I had multiple Node apps and I wanted to route domain1.com:80 and domain2.com:80 each to its own Node app and port, how do I go about doing that? Where do I start?

This is typically done with nginx. Nginx is a reverse proxy, a piece of software you put in front of Node.js.
server {
    listen 80;
    server_name www.domain1.com;
    root /var/www/domain1;

    location / {
        proxy_pass http://localhost:1337; # this is where your node.js app_domain1 is listening
    }
}

server {
    listen 80;
    server_name www.domain2.com;
    root /var/www/domain2;

    location / {
        proxy_pass http://localhost:1338; # this is where your node.js app_domain2 is listening
    }
}
From here: Nginx Different Domains on Same IP
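For completeness, here is a minimal sketch of the two Node apps those proxy_pass lines point at (the handlers and file names are made up; only the ports 1337 and 1338 come from the config above):

// app_domain1.js - the app the first server block proxies to
var http = require('http');

http.createServer(function (req, res) {
    res.end('hello from domain1');
}).listen(1337, '127.0.0.1');

// app_domain2.js would be identical but listen on port 1338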

I don't recommend Apache for this; nginx fits better with Node.js.
You can run the apps on, for example, ports 3000 and 3001,
and then proxy them to mydomain1:80 and mydomain2:80.
Getting mydomain1 and mydomain2 to point at your server in the first place is all about DNS, not Apache.
There's no way to run Apache/nginx and your Node HTTP server on the same port; you'll get an error.
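As a quick illustration of that last point (a minimal sketch, not tied to any particular app), trying to bind a Node server to a port that nginx or Apache already holds fails at startup:

var http = require('http');

// Assume nginx or Apache is already bound to port 80 on this machine
http.createServer(function (req, res) {
    res.end('hello');
}).listen(80, function () {
    console.log('listening on 80');
});
// Startup fails with an error like: Error: listen EADDRINUSE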
P.S. I'm not sure you can do this on a typical LAMP web host.
Hope it helps.

You can set up virtual domains in Node if you're using Express.
The code you would use to start your server would look something like this:
var express = require('express');

var app = express.createServer();

app.configure(function() {
    app.use(express.vhost('subdomain1.local', require('./subdomain1/app').app));
    app.use(express.vhost('subdomain2.local', require('./subdomain2/app').app));
    app.listen(3000);
});
Then, in each subdomain's app module, you would export the app:
var app = express.createServer();
exports.app = app;
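For illustration, a fuller ./subdomain1/app.js might look something like this (a sketch using the same Express 2.x API as above; the route and response text are made up):

var express = require('express');

var app = express.createServer();

// Routes specific to this subdomain
app.get('/', function (req, res) {
    res.send('hello from subdomain1');
});

exports.app = app;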
Here's a post to read more about vhost in Express.js.

Related

Binding media server to a different host

I am trying to reverse proxy my local home NAS.
My media server currently listens on nas:8096, but I would like to reverse proxy it so it is reachable as media.
I tried to set it up with Nginx Proxy Manager, but it doesn't seem to work:
server {
    listen 80;

    location media/ {
        proxy_pass http://192.168.0.100:8096;
    }
}
I know it's a stupid try... :D
Thanks.

I have a Vuejs/nuxt app - I need to make the same app with another style and deploy it on the same server

I have an existing VueJS/Nuxt app. The same app, but with different CSS/images, needs to be deployed on the same server under a different URL - it's the same app but for another client.
Currently what we do is pull the branch on the Linux server and execute npm run generate for the current app.
I presume we need to generate the other app into another folder.
Is this the best solution? And how do we configure the new URL to point to this new folder?
Example: the current URL is www.potato.com and the new URL will be www.potato.com/newclient.
Thanks for your help.
You can set up a reverse proxy with Nginx and launch both apps on different ports.
Let's say that potato.com will run on port 4000 and potato.com/newclient will run on port 5000.
You basically state that in your Nginx config:
server {
    server_name www.potato.com;
    listen 80;
    listen [::]:80;

    location / {
        proxy_pass http://127.0.0.1:4000;
    }

    location /newclient {
        proxy_pass http://127.0.0.1:5000;
    }
}
Now your apps can live anywhere in your file system. Just deploy them with different ports.
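If the apps are Nuxt builds, one way to pin each copy to its port is through nuxt.config.js. This is only a sketch assuming Nuxt 2; the port and base path come from the example above, and router.base is only relevant for the copy served under /newclient:

// nuxt.config.js for the /newclient copy (sketch, Nuxt 2)
module.exports = {
    server: {
        host: '127.0.0.1',
        port: process.env.PORT || 5000   // the main app would use 4000 instead
    },
    router: {
        base: '/newclient/'              // generated links then resolve under the sub-path
    }
};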

Hosting multiple express sites on different domains in RHEL server

Business has requested that we set up an environment to host multiple Express sites on different domains on the same RHEL 7.2 server, with minimal inter-site impact/interference.
We tried the vhost approach, wherein we mapped domains to different shells and then wired them up using the code below in server.js, which serves as the main entry point for all our Express sites:
var fs = require('fs'),
    path = require('path'),
    express = require('express'),
    vhost = require('vhost');

var app = express();

var virtualHosts = JSON.parse(fs.readFileSync('vhosts.json', 'utf8'));

virtualHosts.forEach(function(virtualHost) {
    app.use(express.static(path.join(__dirname, virtualHost.path)));
    app.use(vhost(virtualHost.domain, app));
});

// Listen on port 8082
app.listen(8082);
But this approach is not working out: the same content is displayed for each of these domains when we browse via hosts file entries.
Any quick suggestions are highly appreciated.
PS: We are not inclined to use the nginx proxy.
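For what it's worth, a common cause of this symptom is that every vhost entry reuses the same top-level app, so all the static middleware ends up shared across domains. A minimal sketch of one possible fix (assuming the same vhosts.json format as above) gives each domain its own sub-app:

var fs = require('fs'),
    path = require('path'),
    express = require('express'),
    vhost = require('vhost');

var app = express();
var virtualHosts = JSON.parse(fs.readFileSync('vhosts.json', 'utf8'));

virtualHosts.forEach(function(virtualHost) {
    // Each domain gets its own Express app with its own static root
    var site = express();
    site.use(express.static(path.join(__dirname, virtualHost.path)));
    app.use(vhost(virtualHost.domain, site));
});

app.listen(8082);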

Host multiple rails applications over subfolder using nginx+unicorn

I would like to host multiple Rails applications using nginx + unicorn; they are currently served using Apache + Passenger with RailsBaseURI. The only reason is that Apache needs to be reloaded after every new application is deployed. I would like to know whether adding a new application is possible with unicorn + nginx without reloading the server.
I want to deploy applications on subfolders like host-name/nginx-app1 and host-name/nginx-app2, while host-name itself points to a basic HTML page.
I read somewhere about using sockets to handle individual applications and would like some help implementing this. In my case each application is deployed only once with no further iterations. Once I deploy a new application, there should be no downtime for the applications already running.
EDIT
The config/unicorn.rb file inside the application:
working_directory "/home/ubuntu/application_1"
pid "/home/ubuntu/application_1/pids/unicorn.pid"
stderr_path "/home/ubuntu/application_1/log/unicorn.log"
stdout_path "/home/ubuntu/application_1/log/unicorn.log"
listen "/tmp/unicorn.todo.sock"
worker_processes 2
timeout 30
One way to go about it is hosting the Rails applications on Unix Domain Sockets (UDS) and having nginx read from each socket. I'm writing the logic ad hoc, so pardon any syntax errors.
e.g. Have a look at this.
http://projects.puppetlabs.com/projects/1/wiki/using_unicorn
You can host app1 using an app1.conf for unicorn, which will have a line:
listen '/var/run/app1.sock', :backlog => 512
and have multiple nginx upstreams like
upstream app1 {
    server unix:/var/run/app1.sock fail_timeout=0;
}

upstream app2 {
    server unix:/var/run/app2.sock fail_timeout=0;
}
....
and route requests (proxy_pass) from a server block based on the location or Host header:
server {
    listen 80;

    location /app1 {
        proxy_pass http://app1;
        proxy_redirect off;
    }

    location /app2 {
        proxy_pass http://app2;
        proxy_redirect off;
    }
}

How to use vhosts alongside node-http-proxy?

I'm running Nodejs and Apache alongside each other.
node-http-proxy is listening on port 80 and then forwarding requests to either Apache(:9000) or to Express(:8000).
My virtual hosts on Apache look like:
<VirtualHost 127.0.0.1>
    DocumentRoot "/localhost/myVhost"
    ServerName myVhost
</VirtualHost>
My question is, what is the "correct" way to have vhost like functionality on the Express/Nodejs side? I would prefer to not have to place each Nodejs app on its own port as is suggested here:
https://github.com/nodejitsu/node-http-proxy
(Section titled "Proxy requests using a 'Hostname Only' ProxyTable")
I noticed Connect (which as I understand it, gets bundled in Express) has some vhosts functionality. Should I be using that? If so, what would be the correct way to run it alongside node-http-proxy?
http://www.senchalabs.org/connect/middleware-vhost.html
I also noticed this other module called "Cluster", it seems to be related but I'm not sure how:
http://learnboost.github.com/cluster/
While not wanting to overwhelm, I also came across one called "Haibu"; it seems to be related, but I'm not sure if it would just be an all-out replacement for using vhosts:
https://github.com/nodejitsu/haibu
Note: I'm a front-end guy, so I'm not very familiar with a lot of server terminology
I never figured out Haibu or Cluster. But I did find a good solution that addressed my issue. To my surprise, it was actually quite simple. However, I don't know much about servers, so while this works, it may not be optimal.
I set up virtual hosts like normal on Apache
(http://httpd.apache.org/docs/2.0/vhosts/examples.html)
I installed the following on Node
Express (http://expressjs.com/)
node-http-proxy (https://github.com/nodejitsu/node-http-proxy)
Then, as a matter of personal style, I placed all my virtual hosts in a common directory (/localhost)
I then switched Apache to listen on a port other than port 80. I just happened to choose port 9000 because I had seen that used somewhere. (In httpd.conf, changed "Listen 80" to "Listen 9000".) I also had to make sure that all my virtual hosts, as defined in extra/httpd-vhosts.conf, were set to an IP-based NameVirtualHost (127.0.0.1) instead of using a port (*:80).
On the Node side, I created my app/server (aka node virtual host) that listened on port 8000 (a somewhat arbitrary choice of port number). See this link on creating a server with express: http://expressjs.com/guide.html
In my /localhost directory I then created a file called "nodeHttpProxy.js"
Using node-http-proxy, in nodeHttpProxy.js I then created a proxy server that listens on port 80. Using express, which wraps connect (http://www.senchalabs.org/connect/), I created my virtual hosts.
The nodeHttpProxy.js file looks like this:
// Module dependencies
var httpProxy = require('/usr/local/lib/node_modules/http-proxy/lib/node-http-proxy')
  , express = require('/usr/local/lib/node_modules/express/lib/express');

// Http proxy-server
httpProxy.createServer(function (req, res, proxy) {

    // Array of node host names
    var nodeVhosts = [
        'vhost1'
      , 'vhost2'
    ]
    , host = req.header('host')
    , port = nodeVhosts.indexOf(host) > -1
        ? 8000
        : 9000;

    // Now proxy the request to the right backend
    proxy.proxyRequest(req, res, {
        host: host
      , port: port
    });

}).listen(80);

// Vhosts server
express.createServer()
    .use(express.vhost('vhost1', require('./vhost1/app')))
    .use(express.vhost('vhost2', require('./vhost2/app')))
    .listen(8000);
As you can see, I will have to do two things each time I create a new Node virtual host:
add the virtual host name to my "nodeVhosts" array
define a new express virtual host with .use(express.vhost(...))
Of course, I will also have to create the actual host path/files in my /localhost directory.
Once all this is done I just need to run nodeHttpProxy.js:
node nodeHttpProxy.js
You might get some weird "EACCES" error, in which case, just run it with sudo.
It will listen on port 80, and if the host matches one of the names in the nodeVhosts array it will forward the request to that host on port 8000; otherwise it will forward the request on to that host on port 9000.
I've been giving this some thought lately as I'm tackling the same problems on my personal test environment. You are not going to be able to get around having each node application running on its own port, but you can abstract away the pain of that process. Here is what I am using now, but I hope to build an npm package around this to simplify things in the future.
Each of my node.js applications has a map file that contains the port that the application is listening on as well as a map that indicates the expected path which the application is being served on. The contents of the file look like this:
{"path": "domain.com/path", "port": 3001}
When I start my application, it will read the port from the map.json file and listen on the specified port.
var map = JSON.parse(fs.readFileSync('map.json', 'ascii'));
app.listen(map.port);
Then in my proxy setup, I iterate over each of my node.js application directories, and check for a map.json file which indicates port 80 traffic should be proxied to this application.
I use almost the exact same method to setup the proxy for our apache hosted applications as well. We use a folder based convention on the PHP websites that we are serving and it uses the following configuration:
VirtualDocumentRoot /var/www/%-2.0.%-1/%-3+/
VirtualScriptAlias /var/www/%-2.0.%-1/%-3+/cgi-bin/
This essentially allows us to map domains to folders using the following structure.
http://sub.domain.com/ = /var/www/domain.com/sub/
There is no additional configuration needed to add or remove sites. This is very close to what I am currently using to proxy both apache and node sites. I am able to add new node and new apache sites without modifying this proxy application.
proxy.js
var fs = require('fs');
var httpProxy = require('http-proxy');

var proxyTable = {};

// Map apache proxies
fs.readdirSync('/var/www/').forEach(function(domain) {
    fs.readdirSync('/var/www/' + domain).forEach(function(path) {
        var fqd = domain + '/' + path;
        proxyTable[fqd] = fqd + ':' + 8080; // Apache serves these sites on port 8080
    });
});

// Map node proxies
fs.readdirSync('/var/www-node/').forEach(function(domain) {
    var map = JSON.parse(fs.readFileSync('/var/www-node/' + domain + '/map.json', 'ascii'));
    proxyTable[map.path] = '127.0.0.1:' + map.port;
});

var options = {
    router: proxyTable
};

var proxyServer = httpProxy.createServer(options);
proxyServer.listen(80);
In the future, I will probably decouple the path from the port that the application is listening on, but this configuration allows me to build the proxy map automatically with very little work. Hopefully this helps.
I took some inspiration from #uglymunky and wrote a chef script to do this on Ubuntu.
With this script you can install express and apache with vhost support on a single server using 1 line after you pull down my chef project from github
https://github.com/toranb/ubuntu-web-server
If you have git installed and you pull it down you can kick it off like so ...
sudo ./install.sh configuration.json
This does require Ubuntu 12.04 or greater as I took advantage of an upstart script to start node when you reboot the machine
When the script is finished you will have a working Ubuntu web server with express to run any node apps you configured, alongside apache to run any wsgi apps you configured.
I'm working on an extremely minimal and to-the-point library that can be kept totally separate from your projects. Basically, the idea is to run it independently on your servers so you never have to worry about bundling it into your projects the way you would with Connect.
Take a look at the config.json file to see how simple it actually is to set up.
I was looking for something like this and did find a few things, but they didn't support everything I needed, which specifically is HTTPS, WS and WSS!
Right now the library I wrote only works for HTTP. But in the next few days I hope to have it finished and working for HTTPS, WS and WSS as well.