Videochat without using webrtc?

I say "without" using WebRTC because WebRTC currently supports very few codecs, and I would like to make a videochat using a custom or experimental codec. I'm starting by implementing basic UDP sockets and WebSockets, but how do I implement something like SDP? Since the IP addresses of the clients will keep changing, how do I maintain the connection? Can anyone point me to a decent way of implementing this?

Related

React Native - Connecting to remote WebRTC stream

We have a mobile application that has historically used RTSP streaming to let users watch a live stream, currently published via Wowza Streaming Engine. We have had a need to lower stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation or examples regarding the implementation of a React Native WebRTC viewer that connects to a remote stream.
Does anyone out there have any documentation or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all of its examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example demonstrating someone on a phone connecting to a remote streaming server with a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server doing WebRTC with the proper signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose it yourself, or choose the platform you need.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own specific signaling (e.g. FreeSWITCH, Kurento, etc.) or with no signaling at all (e.g. mediasoup). Few will have a React Native version, as media streaming is not really something that lives on the JavaScript/UI side, but you can do something with the react-native-webrtc lib.
Twilio supports a lot of platforms and could be a good start if you're looking for a ready-to-use solution.
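Whichever server you pick, the viewer side looks roughly the same, since react-native-webrtc mirrors the standard RTCPeerConnection API. The sketch below is only an illustration: the SIGNALING_URL and the "offer"/"answer"/"candidate" message names are made up for the example, because every server defines its own signaling protocol.

```typescript
// Minimal receive-only viewer sketch (not a full app).
// Assumption: the media server speaks a simple JSON-over-WebSocket signaling
// protocol at SIGNALING_URL -- the real message shapes depend on the server.
const SIGNALING_URL = "wss://media.example.com/signaling"; // hypothetical

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});
const ws = new WebSocket(SIGNALING_URL);

// Receive-only: we only want to watch the remote stream, not publish one.
pc.addTransceiver("video", { direction: "recvonly" });
pc.addTransceiver("audio", { direction: "recvonly" });

pc.onicecandidate = (e) => {
  if (e.candidate) ws.send(JSON.stringify({ type: "candidate", candidate: e.candidate }));
};
pc.ontrack = (e) => {
  // Attach e.streams[0] to an RTCView (react-native-webrtc) or a <video> element.
};

ws.onopen = async () => {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  ws.send(JSON.stringify({ type: "offer", sdp: offer.sdp }));
};

ws.onmessage = async (msg) => {
  const data = JSON.parse(msg.data as string);
  if (data.type === "answer") {
    await pc.setRemoteDescription({ type: "answer", sdp: data.sdp });
  } else if (data.type === "candidate") {
    await pc.addIceCandidate(data.candidate);
  }
};
```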

What are the pros and cons of implementing webRTC?

I would like to implement a video/audio call feature in the browser. The goal is to allow two users to communicate remotely without having to install anything third-party (when I say third party, I'm talking about software or a browser extension).
I know WebRTC, which is very popular today and free. However, it is very difficult to implement and the documentation is difficult to understand (not very easy for a beginner).
Here is the official WebRTC documentation, and honestly, where do you even start? https://webrtc.org/start/
If you have experience with WebRTC, could you share the positive or negative points? This would be very useful for the community.
Moreover, if you have experience with another library, I think it would be interesting to hear about it.
There is no way to develop a call service in a website today other than by using WebRTC.
The alternatives are:
Use WebRTC
Use Flash (which is... dead)
Use a plugin (which is... dying as a mechanism in browsers)
Use an app you download (not exactly a service in a website)
Node.js is the way to go, but you will need to learn some new technology, especially when it comes to the backend.
The servers you will need are:
1. The traditional web application server
2. A signaling server (the one you plan on using Node.js for - you can use that for the web application server as well; see the sketch below)
3. A STUN/TURN server (for NAT traversal)
4. Maybe a media server, depending on your use case
For some alternative open source and commercial products, you can check the WebRTC Developer Tools Landscape.
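As a concrete illustration of the signaling server in point 2 above: it can be little more than a WebSocket relay that forwards SDP offers/answers and ICE candidates between peers. This is a minimal sketch assuming the ws npm package and a naive room model taken from the URL; a real server also needs authentication, room lifecycle management and error handling.

```typescript
// Bare-bones signaling relay (assumes the "ws" npm package).
import { WebSocketServer, WebSocket } from "ws";

const rooms = new Map<string, Set<WebSocket>>();
const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket, req) => {
  // Room id taken from the URL, e.g. ws://host:8080/?room=abc (an arbitrary scheme for this sketch).
  const room = new URL(req.url ?? "/", "http://localhost").searchParams.get("room") ?? "default";
  const peers = rooms.get(room) ?? new Set<WebSocket>();
  peers.add(socket);
  rooms.set(room, peers);

  // Relay every message (SDP offers/answers, ICE candidates) to the other peer(s);
  // the server never needs to understand the SDP itself.
  socket.on("message", (data) => {
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) peer.send(data.toString());
    }
  });

  socket.on("close", () => peers.delete(socket));
});
```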

Streaming RTSP to WebRTC using Kurento

I have been testing out Kurento for a while now.
I have gone through one2many sample, and got everything working.
Now I would like to do the same, but have the "presenter" be an RTSP source.
I don't have much experience with RTSP, so I might be missing something. I have looked over several samples and they all use the PlayerEndpoint, which receives an rtsp://... address.
For my implementation, I would rather the camera access a Kurento URL in order to initiate the RTSP stream.
Since I have very limited experience with RTSP, I'm not sure if this is possible and if it's a common practice.
If not, what are the alternatives in a case where I don't know the RTSP URI in advance and don't have a UI to input it at runtime?
If you follow this example project, you can make Kurento stream RTSP to WebRTC on the fly.
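For reference, the pull model those samples use comes down to a few kurento-client calls: a PlayerEndpoint consumes the rtsp:// URI and is connected to a WebRtcEndpoint. The sketch below assumes the kurento-client npm package (which ships without TypeScript typings), a Kurento Media Server at a placeholder ws:// address, and that the browser's SDP offer arrives through your own signaling; ICE candidate exchange is omitted, and it does not cover the push case where the camera initiates the stream towards Kurento.

```typescript
// RTSP -> WebRTC pipeline sketch using kurento-client (no bundled typings).
import kurento from "kurento-client";

const KMS_URI = "ws://localhost:8888/kurento"; // placeholder KMS address

async function startRtspToWebRtc(rtspUri: string, sdpOffer: string): Promise<string> {
  const client = await kurento(KMS_URI);
  const pipeline = await client.create("MediaPipeline");

  const player = await pipeline.create("PlayerEndpoint", { uri: rtspUri });
  const webRtc = await pipeline.create("WebRtcEndpoint");

  await player.connect(webRtc);                           // RTSP media feeds the WebRTC endpoint
  const sdpAnswer = await webRtc.processOffer(sdpOffer);  // answer to return to the browser
  await webRtc.gatherCandidates();                        // candidate events not shown here
  await player.play();

  return sdpAnswer;
}
```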

SIP-WebRTC gateway/bridge: Kurento OR openwebrtc OR Intel CS for webrtc

I am researching the implementation of a WebRTC-SIP gateway/bridge, that is, for example, making a WebRTC call to a SIP endpoint via a SIP server like Asterisk. I know that Asterisk already supports this, but I need an intermediary server for various needs like logging, recording, and integration with local auth/signalling and other app modules. I looked at Kurento, OpenWebRTC (Ericsson) and the lesser-known Intel Collaboration Suite for WebRTC.
I need a server-side solution to interact with my Node application server. Specifically, the server API should be able to generate an SDP for an RTP endpoint and convert WebRTC SDP to the more generic SDP used by legacy SIP servers, or have a way to bridge these two endpoints. I feel comfortable that this is possible with Kurento (I saw a post on this), except that I am not aware of any jsSIP/sipML5 kind of API for Kurento. Kurento itself is not meant to provide signalling. For example, if the SDP generated by Kurento for the RtpEndpoint has to be used in a SIP call/INVITE, how would one implement it? For that matter, how would one initiate a SIP INVITE from Kurento? Are there third-party modules to do this?
Has anyone used the any of the servers listed above for a similar use case?
This is a programming question. I am looking for server APIs to implement a WebRTC to SIP gateway/bridge for media transcoding (if required), SDP transformation and SIP signalling.
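To make the question concrete, this is roughly the flow I picture on the Kurento side, assuming kurento-client from Node; sendSipInvite is just a placeholder for whatever SIP stack would carry the INVITE, which is exactly the part I am unsure how to implement:

```typescript
// Hypothetical WebRTC <-> SIP/RTP bridge sketch (kurento-client assumed;
// sendSipInvite is a placeholder, not a real API).
import kurento from "kurento-client";

async function bridgeWebRtcToSip(webRtcSdpOffer: string): Promise<string> {
  const client = await kurento("ws://localhost:8888/kurento"); // placeholder KMS
  const pipeline = await client.create("MediaPipeline");

  const webRtc = await pipeline.create("WebRtcEndpoint");
  const rtp = await pipeline.create("RtpEndpoint");

  // Media bridge: WebRTC leg <-> plain RTP leg.
  await webRtc.connect(rtp);
  await rtp.connect(webRtc);

  const webRtcAnswer = await webRtc.processOffer(webRtcSdpOffer); // goes back to the browser
  const rtpOffer = await rtp.generateOffer();                     // plain SDP for the SIP INVITE body?

  // The missing piece: hand rtpOffer to a SIP stack as the INVITE body,
  // then feed the SDP from the 200 OK back into the RtpEndpoint.
  const sipAnswerSdp = await sendSipInvite(rtpOffer); // placeholder
  await rtp.processAnswer(sipAnswerSdp);
  await webRtc.gatherCandidates();

  return webRtcAnswer;
}

declare function sendSipInvite(sdp: string): Promise<string>; // hypothetical SIP leg
```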

iOS client app with Mac server

I am attempting to build a client/server game architecture and would like to begin testing the game using my local Mac as the server. I have found several articles on Bonjour, but that is for local network traffic only. My goal is to make this application work over the Internet, connecting to a hosted application on a static address to facilitate turn data. However, I'm at a loss as to which Cocoa APIs to use for this purpose. Should I use NSConnection, NSStream subclasses, or good ol' C sockets and whatnot? The game state is already built in Objective-C and is ready to be set in motion once I have the server facilities ready. Any insight?
NSConnection, NSStream and C sockets are built for different needs. You need to specify the needs of your game and the kind of service in order to get more help. If you want to develop a client-server application that relies on the Internet and not on the local network, Bonjour will not be able to help.
C sockets, and the Cocoa APIs that wrap around them, are intended to operate on an open network stream between the client and the server. The advantage of having an open stream is that the server can send data to the client without the client having requested it. NSURLConnection in Cocoa works differently: with it, you perform HTTP requests and receive formatted responses from a server.
If your application is based on HTTP requests, I recommend you take a look at NSURLConnection, or at AFNetworking as a third-party alternative. If your application relies on open streams, I recommend you take a look at CFNetwork from Apple (a C wrapper around BSD sockets that originates from the days Macs had Carbon, with great performance, stability and versatility) and GCDAsyncSocket, a third-party library wrapped around BSD sockets that supports Grand Central Dispatch, is Objective-C ready, and does the job wonderfully.
I hope I helped.
I suggest you use sockets, since they're not hard to use and are a standard approach. I've even written an asynchronous wrapper class around BSD sockets: https://github.com/H2CO3/TCPHelper
This is for simple, one-to-one TCP protocol connections, supporting both IPv4 and IPv6. You can send and receive raw NSData and possibly build a protocol around it.
Foundation classes such as NSURLConnection are not particularly meant for this purpose; rather, they're for interacting with standard HTTP servers (and I suppose you don't want to implement a full HTTP server for a game).
NSNetServices may suit you, as may CFNetwork, but the latter is a bit harder to use. If you'd like to use Foundation classes, I'd recommend NSNetServices.
Hope it helps.
There are many different ways to accomplish this. It really depends on how you'll be passing the data and what it will be used for.
First, I would set up a hostname that you can use for development purposes with your game. You can use something like http://dyn.com/dns/ to set this up for your Mac. Then you can enable a compiler setting to switch between the development and production URLs.
Next, I would recommend using TCP sockets for your game (using CocoaAsyncSocket - https://github.com/robbiehanson/CocoaAsyncSocket). This approach should work fine for your use case. Since you are passing turn-based data (and since all of that data is vital) I would not recommend UDP sockets (though those would work if you were solely passing position, video, or audio data where a dropped packet might not matter).
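For the server half on the Mac (or the eventual hosted machine), any plain TCP listener will do for a CocoaAsyncSocket client to connect to. Below is a minimal sketch using Node's built-in net module; the 4-byte length-prefix framing, the JSON payload and the port number are conventions invented for this example, so the Cocoa client would have to read and write the same framing.

```typescript
// Minimal turn-data TCP server sketch (Node's built-in "net" module).
// Framing: 4-byte big-endian length prefix followed by a UTF-8 JSON payload.
import * as net from "net";

const PORT = 5555; // arbitrary choice for the sketch

const server = net.createServer((socket) => {
  let buffer = Buffer.alloc(0);

  socket.on("data", (chunk) => {
    buffer = Buffer.concat([buffer, chunk]);

    // Extract every complete length-prefixed message currently buffered.
    while (buffer.length >= 4) {
      const length = buffer.readUInt32BE(0);
      if (buffer.length < 4 + length) break; // wait for the rest of the message
      const payload = buffer.subarray(4, 4 + length).toString("utf8");
      buffer = buffer.subarray(4 + length);

      const turn = JSON.parse(payload); // e.g. { player: "A", move: ... }
      // ...validate the turn, update game state, then acknowledge it:
      const reply = Buffer.from(JSON.stringify({ ok: true, turn }), "utf8");
      const header = Buffer.alloc(4);
      header.writeUInt32BE(reply.length, 0);
      socket.write(Buffer.concat([header, reply]));
    }
  });
});

server.listen(PORT, () => console.log(`turn server listening on ${PORT}`));
```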