Hosting multiple Express sites on different domains on a RHEL server - express

The business has requested that we set up an environment to host multiple Express sites on different domains on the same RHEL 7.2 server, with minimal inter-site impact/interference.
We tried the vhost approach, wherein we mapped domains to different shells and then wired them up with the code below in server.js, which serves as the main entry point for all our Express sites:
var fs = require('fs'),
    path = require('path'),
    express = require('express'),
    vhost = require('vhost');

var app = express();
var virtualHosts = JSON.parse(fs.readFileSync('vhosts.json', 'utf8'));

virtualHosts.forEach(function (virtualHost) {
  app.use(express.static(path.join(__dirname, virtualHost.path)));
  app.use(vhost(virtualHost.domain, app));
});

// Listen on port 8082
app.listen(8082);
But this approach is not working out: for each of these domains, the same content is displayed when we browse via hosts-file entries.
Any quick suggestions are highly appreciated.
PS: We are not inclined to use the nginx proxy.
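For what it's worth, the pattern that usually fixes this is giving each domain its own sub-app, so that a site's static middleware only runs when the Host header matches, instead of mounting every express.static on the shared app (and passing that same app back into vhost). A minimal sketch, assuming the same vhosts.json format as above:

var fs = require('fs'),
    path = require('path'),
    express = require('express'),
    vhost = require('vhost');

var app = express();
var virtualHosts = JSON.parse(fs.readFileSync('vhosts.json', 'utf8'));

virtualHosts.forEach(function (virtualHost) {
  // One sub-app per domain; its static middleware is only reached when
  // the Host header matches virtualHost.domain.
  var site = express();
  site.use(express.static(path.join(__dirname, virtualHost.path)));
  app.use(vhost(virtualHost.domain, site));
});

app.listen(8082);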

Related

Domain name only works when adding Https, Node

This is my first real problem with node. I have recently converted my website to https. This is great, but my site can only be accessed via "https://www.example.com". Before converting I used port 80 (the default port) and basic express routing. I used to be able to connect by entering the IP address of my server or by typing the domain name "www.example.com". I have the DNS set to point to the IP address of the server for IPv4 and IPv6. After switching to https I am no longer able to access the server using the IP address and port 443 (I host on port 443). I want to know why I cannot access my website using the IP address plus the port number (123.456.78.90:443), and why I must be so specific and use https://www.example.com rather than just www.example.com.
var express = require('express');
var app = express();
var https = require('https');
var fs = require('fs');

var sslPath = '/etc/letsencrypt/live/www.example.site/';
var options = {
  key: fs.readFileSync(sslPath + 'privkey.pem'),
  cert: fs.readFileSync(sslPath + 'fullchain.pem')
};

app.use(express.static("public"));
app.set("view engine", "ejs");

app.get("/", function (req, res) {
  console.log("Someone Connected");
  res.render("homepage");
});

var server = https.createServer(options, app);
var io = require('socket.io').listen(server);
server.listen(443);
If your server only accepts https, then you HAVE to specify https:// in the URL. That's the only way the browser knows that's the protocol you intend to use. When you leave the https:// out of a URL in the browser's URL bar, the browser assumes http, no matter what port number you specify.
You could make your server respond to http on port 80 and automatically forward to https. Then, you could type www.example.com in the browser and it would automatically redirect to https://www.example.com.
Your server will never respond to 123.456.78.90:443, which the browser will turn into http://123.456.78.90:443, because on port 443 you're listening for https connections, not http connections (you can't listen for both on the same port).
Using the auto-redirect logic described above, you could type 123.456.78.90 into the browser and it would redirect to https://123.456.78.90, which would work.
You can either implement the auto-redirect yourself (set up a simple http server listening on port 80 and redirect all incoming requests to the same URL, but with https in front of it). Or, many hosting providers offer this feature built into their infrastructure (built into proxies or load balancers).
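A minimal sketch of the do-it-yourself variant, assuming the HTTPS server from the question is already listening on 443 (the 301 status code and the Host-header handling are just reasonable illustrative choices):

var express = require('express');

var redirectApp = express();

// Send every plain-HTTP request to the HTTPS equivalent of the same URL
redirectApp.use(function (req, res) {
  var host = (req.headers.host || '').split(':')[0]; // drop any port from the Host header
  res.redirect(301, 'https://' + host + req.url);
});

// Plain HTTP on port 80; the existing HTTPS server keeps listening on 443
redirectApp.listen(80);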

How can Apache (maybe Tomcat) redirect to a specific file

I have a reverse proxy that points to a server with a specific port number (e.g. 8443). On that server, we have a running Tomcat instance that listens on that port using HTTPS. This port has some mappings set to different services (e.g. https://myURL.com/myApp/api/myService).
What I would like to do is have Tomcat also redirect to a file. Example: https://myURL.com/myApp/download/file12354.exe points to \myServer\files\file12354.exe
My question is: is it possible to do that from the Tomcat configuration, or should I set up a normal Apache web server that forwards to Tomcat when the URL contains /api/ and to a file when it contains /download/?
The files may be over 4GB each and the server is on Windows Server 2012.
I found a way to do that.
I am now streaming the file in buffered chunks directly from Tomcat, and my tablet is downloading it with HttpsURLConnection.
byte[] buffer = new byte[8192];
int count;
InputStream is = new FileInputStream(myFile);
OutputStream os = response.getOutputStream();
// Copy the file to the response in buffered chunks
while ((count = is.read(buffer)) != -1) {
    os.write(buffer, 0, count);
}
is.close();
This works for small files. I still have some problems with SocketTimeoutException.

Varnish cache with only 1 IP?

I'm starting my adventure with Varnish Cache. I still have some questions about it that I couldn't find a straight answer to.
Can I use the Varnish cache server v3.0.6 with e.g. the Plesk web hosting control panel and have several domains (virtual hosts) on only 1 IP?
Please advise.
If Plesk allows you to set up different ports on the same IP, you can use Varnish as a frontend on port 80 or 443 (HTTPS) and let Varnish talk to a different port on the same server to retrieve the web pages.
In your VCL you can add the following backend:
backend website {
  .host = "127.0.0.1";
  .port = "8080";
}

how do I route multiple domains to multiple node applications?

I'm used to the typical LAMP web hosting environment where you just click a few buttons in cPanel and your domain gets zoned and mapped to a folder within htdocs. I've been using node.js a lot and it doesn't seem as simple to do the same thing. If I had multiple node apps and I wanted to route domain1.com:80 and domain2.com:80 each to its own node app and port, how do I go about doing that? Where do I start?
This is typically done with nginx. Nginx is a reverse proxy, a piece of software you put in front of node.js.
server {
    listen 80;
    server_name www.domain1.com;
    root /var/www/domain1;

    location / {
        proxy_pass http://localhost:1337; # this is where your node.js app_domain1 is listening
    }
}

server {
    listen 80;
    server_name www.domain2.com;
    root /var/www/domain2;

    location / {
        proxy_pass http://localhost:1338; # this is where your node.js app_domain2 is listening
    }
}
From here: Nginx Different Domains on Same IP
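For completeness, a hedged sketch of what one of the node.js backends referenced in that config might look like; the port 1337 comes from the proxy_pass line above, while the file name, route and response are purely illustrative:

// app_domain1.js -- the app nginx proxies www.domain1.com traffic to;
// app_domain2.js would be the same apart from its content and port 1338
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send('Hello from domain1');
});

// Bind to localhost so only the local nginx proxy can reach it directly
app.listen(1337, 'localhost');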
I don't recommend Apache for this; nginx fits better with node.js.
You can run the apps, for example, on ports 3000 and 3001,
then proxy them to mydomain1:80 and mydomain2:80.
Getting mydomain1 and mydomain2 onto port 80 is all about DNS, not Apache.
There's no way to run Apache/nginx and your node HTTP server on the same port; you'll get an error.
P.S. I'm not sure you can do this on a typical LAMP web host.
Hope it helps.
You can set up virtual domains in Node if you're using Express.
The code you would use to start your server would look something like this:
// Note: express.createServer() and express.vhost() come from older Express
// versions (2.x/3.x); in Express 4+ you would use express() plus the
// standalone vhost package instead.
var sys = require('sys'),
    express = require('express');

var app = express.createServer();

app.configure(function() {
  app.use(express.vhost('subdomain1.local', require('./subdomain1/app').app));
  app.use(express.vhost('subdomain2.local', require('./subdomain2/app').app));
  app.listen(3000);
});
Then you would export app in each subdomain.
var app = express.createServer();
exports.app = app;
Here's a post to read more about vhost in Express.js.

How to use vhosts alongside node-http-proxy?

I'm running Nodejs and Apache alongside each other.
node-http-proxy is listening on port 80 and then forwarding requests to either Apache(:9000) or to Express(:8000).
My virtual hosts on Apache look like:
<VirtualHost 127.0.0.1>
    DocumentRoot "/localhost/myVhost"
    ServerName myVhost
</VirtualHost>
My question is, what is the "correct" way to have vhost-like functionality on the Express/Nodejs side? I would prefer not to have to place each Nodejs app on its own port, as is suggested here:
https://github.com/nodejitsu/node-http-proxy
(Section titled "Proxy requests using a 'Hostname Only' ProxyTable")
I noticed Connect (which as I understand it, gets bundled in Express) has some vhosts functionality. Should I be using that? If so, what would be the correct way to run it alongside node-http-proxy?
http://www.senchalabs.org/connect/middleware-vhost.html
I also noticed this other module called "Cluster"; it seems to be related, but I'm not sure how:
http://learnboost.github.com/cluster/
While not wanting to overwhelm, I also came across one called "Haibu"; it seems to be related, but I'm not sure if it would just be an all-out replacement for using vhosts:
https://github.com/nodejitsu/haibu
Note: I'm a front-end guy, so I'm not very familiar with a lot of server terminology
I never figured out Haibu or Cluster. But I did find a good solution that addressed my issue. To my surprise, it was actually quite simple. However, I don't know much about servers, so while this works, it may not be optimal.
I set up virtual hosts like normal on Apache
(http://httpd.apache.org/docs/2.0/vhosts/examples.html)
I installed the following on Node
Express (http://expressjs.com/)
node-http-proxy (https://github.com/nodejitsu/node-http-proxy)
Then, as a matter of personal style, I placed all my virtual hosts in a common directory (/localhost)
I then switched Apache to listen on a port other than port 80. I just happened to choose port 9000 because I had seen that used somewhere. (In httpd.conf, I changed "Listen 80" to "Listen 9000".) I also had to make sure that all my virtual hosts, as defined in extra/httpd-vhosts.conf, were set to an IP-based NameVirtualHost (127.0.0.1) instead of using a port (*:80).
On the Node side, I created my app/server (aka node virtual host) that listened on port 8000 (a somewhat arbitrary choice of port number). See this link on creating a server with express: http://expressjs.com/guide.html
In my /localhost directory I then created a file called "nodeHttpProxy.js"
Using node-http-proxy, in nodeHttpProxy.js I then created a proxy server that listens on port 80. Using express, which wraps connect (http://www.senchalabs.org/connect/) I created my virtual hosts.
The nodeHttpProxy.js file looks like this:
// Module dependencies
var httpProxy = require('/usr/local/lib/node_modules/http-proxy/lib/node-http-proxy')
  , express = require('/usr/local/lib/node_modules/express/lib/express');

// Http proxy-server
httpProxy.createServer(function (req, res, proxy) {

  // Array of node host names
  var nodeVhosts = [
      'vhost1'
    , 'vhost2'
  ]
    , host = req.headers.host // the raw request object exposes headers directly
    , port = nodeVhosts.indexOf(host) > -1
        ? 8000
        : 9000;

  // Now proxy the request
  proxy.proxyRequest(req, res, {
      host: host
    , port: port
  });

})
.listen(80);

// Vhosts server
express.createServer()
  .use(express.vhost('vhost1', require('./vhost1/app')))
  .use(express.vhost('vhost2', require('./vhost2/app')))
  .listen(8000);
As you can see, I will have to do two things each time I create a new Node virtual host:
add the virtual host name to my "nodeVhosts" array
define a new express virtual host with a .use(express.vhost(...)) call
Of course, I will also have to create the actual host path/files in my /localhost directory.
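For illustration, a hypothetical minimal ./vhost1/app module that would satisfy the require('./vhost1/app') call in nodeHttpProxy.js might look like this (Express 2.x style to match the code above; the route and response are made up):

// ./vhost1/app.js
var express = require('/usr/local/lib/node_modules/express/lib/express');

var app = express.createServer();

app.get('/', function (req, res) {
  res.send('Hello from vhost1');
});

// Export the server itself so express.vhost('vhost1', require('./vhost1/app'))
// can mount it
module.exports = app;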
Once all this is done I just need to run nodeHttpProxy.js:
node nodeHttpProxy.js
You might get some weird "EACCESS" error, in which case, just run as sudo.
It will listen on port 80, and if the host matches one of the names in the nodeVhosts array it will forward the request to that host on port 8000; otherwise it will forward the request on to that host on port 9000.
I've been giving this some thought lately as I'm tackling the same problems in my personal test environment. You are not going to be able to get around having each node application run on its own port, but you can abstract away the pain of that process. Here is what I am using now, but I hope to build an npm package around this to simplify things in the future.
Each of my node.js applications has a map file that contains the port the application is listening on, as well as the path on which the application is expected to be served. The contents of the file look like this:
{"path": "domain.com/path", "port": 3001}
When I start my application, it will read the port from the map.json file and listen on the specified port.
var map = JSON.parse(fs.readFileSync('map.json', 'ascii')); // parse the JSON so map.port is usable
app.listen(map.port);
Then in my proxy setup, I iterate over each of my node.js application directories, and check for a map.json file which indicates port 80 traffic should be proxied to this application.
I use almost exactly the same method to set up the proxy for our Apache-hosted applications as well. We use a folder-based convention for the PHP websites that we are serving, with the following configuration:
VirtualDocumentRoot /var/www/%-2.0.%-1/%-3+/
VirtualScriptAlias /var/www/%-2.0.%-1/%-3+/cgi-bin/
This essentially allows us to map domains to folders using the following structure.
http://sub.domain.com/ = /var/www/domain.com/sub/
There is no additional configuration needed to add or remove sites. This is very close to what I am currently using to proxy both apache and node sites. I am able to add new node and new apache sites without modifying this proxy application.
proxy.js
var fs = require('fs');
var httpProxy = require('http-proxy');

var proxyTable = {};

// Map apache proxies
fs.readdirSync('/var/www/').forEach(function(domain) {
  fs.readdirSync('/var/www/' + domain).forEach(function(path) {
    var fqd = domain + '/' + path;
    proxyTable[fqd] = fqd + ':' + 8080;
  });
});

// Map node proxies
fs.readdirSync('/var/www-node/').forEach(function(domain) {
  // each app directory holds a map.json with its public path and port
  var map = JSON.parse(fs.readFileSync('/var/www-node/' + domain + '/map.json', 'ascii'));
  proxyTable[map.path] = '127.0.0.1:' + map.port;
});

var options = {
  router: proxyTable
};

var proxyServer = httpProxy.createServer(options);
proxyServer.listen(80);
In the future, I will probably decouple the path from the port that the application is listening on, but this configuration allows me to build the proxy map automatically with very little work. Hopefully this helps.
I took some inspiration from #uglymunky and wrote a chef script to do this on Ubuntu.
With this script you can install Express and Apache with vhost support on a single server using one line, after you pull down my chef project from GitHub:
https://github.com/toranb/ubuntu-web-server
If you have git installed and have pulled it down, you can kick it off like so:
sudo ./install.sh configuration.json
This does require Ubuntu 12.04 or greater, as I took advantage of an Upstart script to start node when you reboot the machine.
When the script is finished you will have a working Ubuntu web server with Express to run any node apps you configured, alongside Apache to run any WSGI apps you configured.
I'm working on an extremely minimal and to-the-point library that can be kept totally separate from your projects. Basically, the idea is to run it independently on your servers and never worry about having to bundle it into your projects the way you would with Connect.
Take a look at the config.json file to see how simple it actually is to set up.
I was looking for this and did find a few things, but they didn't support everything I needed, which is specifically HTTPS, WS and WSS!
Right now the library I wrote only works for HTTP, but in the next few days I hope to have it finished and working for HTTPS, WS and WSS as well.