Can the restund server be hosted alongside my dedicated hosting server? - apache

I want to implement a restund server for WebRTC audio on my website. I want one user to be able to talk to all the other users on the platform (if anyone knows an easier way to do this than running a restund server, please let me know; it would help me out a lot).
But before I go and try to get restund working, I was wondering whether it can be installed alongside the Apache HTTP server on the dedicated machine I use to host my website.

Well, STUN/TURN services listen on ports 3478 and 5349 by default, which should not conflict with the ports Apache needs for HTTP (e.g. 80, 443, 8080). So yes, this should be possible.
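For reference, the port-relevant part of a restund configuration might look like the sketch below (the directive names follow restund's sample config and the address/realm values are placeholders, so verify them against the config shipped with your version):

    # /etc/restund.conf (sketch, placeholder values)
    realm           example.com
    # STUN/TURN over UDP and TCP
    udp_listen      0.0.0.0:3478
    tcp_listen      0.0.0.0:3478
    # TURN over TLS would go on 5349 and needs a certificate

Apache keeps 80/443 and restund keeps 3478/5349, so both can bind on the same machine; just remember to open the extra ports (and the UDP relay range, if you enable TURN relaying) in your firewall.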

Related

How is a user able to connect via the TURN server but not directly to the WebRTC machine?

I was curious why a client cannot connect directly to a machine running a WebRTC server but can do so via a TURN server. Both the TURN and WebRTC instances are in the same AWS VPC.
Could be a lot of things.
Assuming your TURN configuration file is correct, and since you note that both AWS instances have public IPs, it's possible that the instance running the TURN server does not have all the firewall ports open that the TURN server needs: https://stackoverflow.com/a/59212004/8201657
Or, maybe it's a DNS issue and the domain of your TURN server is unknown to your peer, so it is not able to access it.
Or, maybe you are attempting to connect via WebRTC but not securely. WebRTC requires a secure connection (HTTPS).
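On the firewall point specifically: if you are running coturn (or a similar TURN server) with default ports, the openings you typically need look roughly like this (a sketch using ufw; on AWS the same rules go into the instance's security group, and the relay range depends on min-port/max-port in your turnserver.conf):

    sudo ufw allow 3478/tcp          # STUN/TURN
    sudo ufw allow 3478/udp
    sudo ufw allow 5349/tcp          # TURN over TLS
    sudo ufw allow 5349/udp
    sudo ufw allow 49152:65535/udp   # default relay port range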

WebRTC call between two networks connected to the same server

I currently have the following network setup and would like to be able to make WebRTC calls between the two clients in different networks.
I enabled IPv4 forwarding on the openSUSE Leap 15.2 server, and both devices have either 192.168.2.1 or 192.168.4.1 as their default gateway. The web application and the signaling service are also hosted on this server.
With the firewall disabled the call works as expected, but with the firewall on, the call no longer works. I thought about hosting a Coturn STUN/TURN server on this server, as I've read that you should provision one if you run into trouble with a firewall.
Is a setup like this doable with, let's say, Coturn, and what would the configuration look like for a scenario like this?
I ended up solving it as I describe in my GitHub issue for this matter.
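For anyone who lands here without following the issue, a minimal Coturn configuration for a relay sitting on both subnets could look roughly like the sketch below (generic placeholder values, not the actual configuration from the linked issue; the two addresses are the server's interfaces from the question):

    # /etc/turnserver.conf (sketch, placeholder credentials)
    listening-ip=192.168.2.1
    listening-ip=192.168.4.1
    listening-port=3478
    fingerprint
    lt-cred-mech
    user=webrtc:changeme
    realm=example.local

You would still need to allow 3478 (TCP/UDP) and the relay port range through the server's firewall, which was the original sticking point.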

Is nginx needed if Express is used

I have a Node.js web application with Express running on a DigitalOcean droplet. The Node.js application provides back-end APIs. I have two React front-ends on different domains that use the APIs. The front-ends can be hosted on the same server, but my developer tells me I should use another server to host the front-ends, such as Cloudflare.
I have read that nginx can host multiple sites on the same server (i.e. host my front-ends on the same server), but I am unsure whether this is good practice, as I then may not be able to use Cloudflare.
In terms of security, could someone tell me if I need nginx, and what my options are, please?
Thanks
This is a very open-ended question, but I will try to answer it:
In terms of security, could someone tell me if I need nginx, and what my options are, please?
You will need Nginx (or Apache) in any scenario, with one server or multiple, using Express or not. Express is only an application framework for building routes; you still need a service that responds to network requests, which is what Nginx and Apache do. You could avoid using Nginx, but then your users would have to make requests directly to the port where you started Express, for example http://my-site.com:3000/welcome. In terms of security, it is better to hide the port number and use an Nginx reverse proxy, so that your users only need to go to http://my-site.com/welcome.
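A reverse proxy of that kind is only a few lines of Nginx configuration; a sketch, assuming Express listens on port 3000 and my-site.com is your domain as in the example above:

    server {
        listen 80;
        server_name my-site.com;

        location / {
            # forward everything to the Express app
            proxy_pass http://127.0.0.1:3000;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

Users then go to http://my-site.com/welcome and never see the port, and you can keep port 3000 closed to the outside world so Express is only reachable through Nginx.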
my developer tells me I should use another server to host the front-ends, such as Cloudflare
Cloudflare does not offer hosting services as far as I know. It does offer a CDN to host a few files, but not a full site. You would need another DigitalOcean instance to do so. In a Cloudflare forum post I found: "Cloudflare is not a host. Cloudflare’s basic service is a DNS provider, where you simply point to your existing host.".
I have read that nginx can host multiple sites on the same server
Yes, Nginx (and Apache too) can host multiple sites on the same server, under different names or the same one, either as separate domains (www.my-backend.com, www.my-frontend.com) or as subdomains (www.backend.my-site.com, www.my-site.com).
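Hosting several sites is just a matter of separate server blocks with different server_name values, for example (placeholder domains and paths):

    server {
        listen 80;
        server_name www.my-frontend.com;
        # static React build
        root /var/www/frontend;
    }

    server {
        listen 80;
        server_name www.my-backend.com;
        location / {
            # the Express API
            proxy_pass http://127.0.0.1:3000;
        }
    }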
... but unsure if this is good practice
Whether or not it is good practice, I think it is very common. A few valid reasons to keep them on separate servers would be:
Because you want the back-end API to keep working if the front-end fails.
Because you want to balance network traffic.
Because you want to keep them separated.
It is definitely not a bad practice if both applications are highly related.

Hosting Slim Framework Rest API in Windows

I created an API using the Slim framework, but the services are not accessible to the public as they are limited to localhost. How do I host the services on a live server so that they can be accessed from anywhere?
Please, can someone help me?
This question requires more detail in order to answer properly.
If you are hosting your API on a Windows server, then it is likely you have configured some kind of "WAMP" stack, correct? Or maybe you are serving PHP through IIS? These are important questions, because we need to know what port you have bound your web application server to, which leads us to the next question...
Where is the server that runs your application hosted, and to which port is the application bound?
Ultimately, a public, external IP will need to be either:
a. NAT'ed to the internal IP of your web server instance
b. Port-forwarded to the internal IP of the server running your web application
Still, we are making a lot of assumptions here because getting a web application "accessible from anywhere" will require different work depending on your environment.
Here is the most basic example:
You are at home, running this API on your Windows workstation, and would like to be able to hit it from a remote location.
Ensure Windows firewall allows inbound traffic to the port on which your application is running (probably port 80/HTTP, maybe 443/HTTPS).
Log into your ISP's router and configure port-forwarding to ensure inbound traffic on, say, port 80, is routed to the internal IP of the workstation running the API.
That's all there is to it.
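For step 1, the firewall rule can be added from an elevated command prompt, for example (a sketch; adjust the port if your WAMP/IIS stack listens somewhere else):

    netsh advfirewall firewall add rule name="Slim API inbound" dir=in action=allow protocol=TCP localport=80

Step 2 depends entirely on your router's interface: forward external port 80 (or 443) to your workstation's internal IP on the same port.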
Keep in mind that this also assumes that your ISP even allows you to expose your own web server to the internet on port 80 (or 443). Also, since we know nothing about your environment, this is all pure conjecture. Please provide more information if you would like a real answer.
The most traditional way to host the Slim Framework would be through Apache. Install Apache and make sure you have the proper network settings to allow inbound connections, but more information about your setup might be needed for proper guidance.
http://httpd.apache.org/docs/2.4/platform/windows.html
When Apache is installed and working, you need to set up rewrite rules for the URLs; information on that can be found at http://docs.slimframework.com/routing/rewrite/.
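With mod_rewrite enabled, the rules from that page amount to an .htaccess in your public directory along these lines (check the linked documentation for the Slim version you are using):

    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [QSA,L]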
Your question is on the verge of being off topic (it probably is), but read up on what questions can and cannot be asked here on Stack Overflow. Hope I could help.

How to put up an off-the-shelf https to http gateway?

I have an HTTP server which is on our internal network and accessible only from inside it. I would like to put up another server that would listen on an HTTPS port accessible from outside and forward the requests to that HTTP server (and send back the responses via HTTPS). I know that there are several ways to do this with some programming involved (I myself made a temporary solution with Tomcat and a very simple servlet I wrote), but is there a way to do the same by just plugging together ready-made parts (like Apache + modules)?
This is the sort of use-case that stunnel is designed for. There is a specific example of using stunnel to wrap an HTTP server.
You should consider whether this is really a good idea, though. Web applications designed for use inside a corporate firewall are often fairly lax about security. Merely encrypting the connections prevents casual eavesdropping, but does not secure the site. If an attacker finds your outward facing server and starts connecting to it, they can still try to find exploitable flaws in the web service (SQL injection, cross-site scripting, etc).
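If you do go the stunnel route, the configuration for that example is short; a sketch with placeholder paths and host names:

    ; /etc/stunnel/stunnel.conf (sketch)
    cert = /etc/stunnel/stunnel.pem

    [https-gateway]
    accept  = 443
    connect = internal-http-host:80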
With Apache, look into mod_proxy.
Apache 2.2 mod_proxy docs
Apache 2.0 mod_proxy docs
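With mod_ssl, mod_proxy and mod_proxy_http enabled, the gateway boils down to a virtual host along these lines (a sketch; host names and certificate paths are placeholders):

    <VirtualHost *:443>
        ServerName gateway.example.com
        SSLEngine on
        SSLCertificateFile    /etc/ssl/certs/gateway.crt
        SSLCertificateKeyFile /etc/ssl/private/gateway.key

        # pass external HTTPS requests to the internal HTTP server
        ProxyPreserveHost On
        ProxyPass        / http://internal-http-host/
        ProxyPassReverse / http://internal-http-host/
    </VirtualHost>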