How to develop a high-performance video chat and conference app using good open-source frameworks - WebRTC

I see WebRTC is the best way to develop it. But there are some paid frameworks on the market for establishing video chat between a wide range of clients: Web-Web, Web-Mobile (iOS, Android, Windows, etc.).
Web-Web communication flow is very simple to implement. Now I want the same for Web-to-Mobile and vice versa, without using any external frameworks built on top of native WebRTC. Please suggest the best approach to achieve this.

The latest Chrome on Android is WebRTC friendly, which means that if you have a web app that implements WebRTC, it will work in Chrome on Android.
If you decide to create your own native app that implements WebRTC, here are some great sources:
iOS WebRTC: https://webrtc.org/native-code/ios/
Android WebRTC: https://webrtc.org/native-code/android/
Following the instructions in each allows you to build the native WebRTC frameworks, which you can later import into your native projects.
The WebRTC APIs are similar to the ones you are using in your web application, but you will need to do more documentation reading, since you are using the official framework built from source rather than a third-party library.
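As a rough sketch of the flow those APIs share, here is the browser-side offer setup; sendToSignalingServer is a placeholder for whatever signaling channel you build, not a real API, and the native iOS/Android PeerConnection classes mirror the same steps:

// Minimal browser-side WebRTC offer flow (sketch).
// sendToSignalingServer() is a placeholder for your own signaling channel.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(stream => {
    // Attach the local camera/microphone tracks to the connection.
    stream.getTracks().forEach(track => pc.addTrack(track, stream));
    return pc.createOffer();
  })
  .then(offer => pc.setLocalDescription(offer))
  .then(() => {
    // Hand the offer to the remote peer (web or mobile) via signaling.
    sendToSignalingServer({ type: 'offer', sdp: pc.localDescription });
  });

// ICE candidates are also exchanged over the signaling channel.
pc.onicecandidate = event => {
  if (event.candidate) {
    sendToSignalingServer({ type: 'candidate', candidate: event.candidate });
  }
};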

Before starting, you need to review and test the platform to make sure it works well for all your target user categories. You can do that by reviewing references and also by testing some existing apps for the user types you plan to support.
Since you mentioned a wide range of clients, you need to identify the limitations of the WebRTC technology. You can also evaluate other technologies: for example, you could reliably serve most client types with mobile and web apps that use RTMP.

Related

How do I make live video chat for my website?

I want to add a video chat option to my website. Please guide me on how to do this task and what is required. Also, how much would it cost to build it for my website, including maintenance (like servers, etc.)?
You are looking for something like rtchub.com
If you want it free, you can develop it yourself, using WebRTC:
WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via simple APIs. The WebRTC components have been optimized to best serve this purpose.
See WebRTC Tutorial
On the client side you use JavaScript (e.g. jQuery), and clients communicate directly browser to browser, but you need a server part and a signaling mechanism; you can use, for example, SignalR or Node.js.
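To sketch that server part (assuming Node.js with the ws WebSocket package; SignalR would fill the same role on .NET), a minimal relay just forwards each peer's offers, answers, and ICE candidates to the other connected peers:

// Minimal signaling relay sketch using the 'ws' npm package.
// The JSON message format is an assumption; any envelope works.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', socket => {
  socket.on('message', message => {
    // Relay each signaling message to every other connected peer.
    wss.clients.forEach(client => {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    });
  });
});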
As an example, you can look at my site: SignalRTC.
P.S. WebRTC works only on selected browsers, for example Chrome and Firefox; unfortunately not on IE or Edge.

Usage of already deployed REST service in mobile application using IBM Mobile First

I've published a native Android application which uses a service that responds in JSON format. Now I want to develop a hybrid application using the IBM MobileFirst platform, and I want to use the same service in this case as well. I'm not able to find the mechanism to do so. Can anyone please suggest a solution?
I assume that you intend to build your hybrid app using HTML, CSS and JavaScript.
You could directly call the service using standard JavaScript:
var xhr = new XMLHttpRequest();
xhr.open('GET', serviceUrl);
xhr.send();
But you will probably use a framework such as AngularJS - such frameworks really do pay off in the long run - in which case you have nicer APIs:
$http.get(serviceUrl).then(doSomeWork);
However this raw JavaScript approach does not exploit the MobileFirst programming model. We tend to find that using a Mobile Gateway architectural pattern, where MobileFirst adapters act as the gateway, pays off as your application becomes more complex. The adapters provide a security model and can implement aggregation and filtering so that precious mobile bandwidth is used more efficiently.
Hence we recommend that you build adapters, as described in the links given by Idan, and then you use the MobileFirst API to call the adapters:
WL.Client.invokeProcedure( ... );
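For illustration only, a hybrid-app adapter call could look like the following; the adapter and procedure names here are hypothetical:

// Sketch of invoking a MobileFirst adapter procedure from a hybrid app.
// 'MyRESTAdapter' and 'getItems' are placeholder names.
var invocationData = {
  adapter: 'MyRESTAdapter',
  procedure: 'getItems',
  parameters: []
};

WL.Client.invokeProcedure(invocationData, {
  onSuccess: function (response) {
    // response.invocationResult holds the JSON returned by the adapter.
    console.log(response.invocationResult);
  },
  onFailure: function (error) {
    console.log('Adapter call failed', error);
  }
});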
Starting with MobileFirst Platform Foundation 7.0, both the JS framework (for hybrid apps) and the native SDK (for iOS and Android) provide REST support. You can accomplish this using either JavaScript adapters or Java adapters.
Read the following tutorials explaining how to use MFP adapters:
Server-side development
If you're also interested in Java adapters, take a look at these videos:
Getting familiar with IBM MobileFirst Platform Foundation Java Adapters [Part 1]
Getting familiar with IBM MobileFirst Platform Foundation Java Adapters [Part 2]

PJSIP - (Web)RTC integration

The PJPROJECT libraries are organized as follows:
Base libraries (PJLIB/PJLIB-UTIL/PJSIP/PJNATH/PJMEDIA)
APIs (PJSUA/PJSUA2)
I'm trying to develop a new API based on PJSUA but using RTC native libraries (as far as I know, the term WebRTC is more related to the Web API) instead of PJMEDIA.
However, according to the official docs, I understand that the RTC native libraries are used for signalling and media.
Is it possible to only use the media part of the RTC libraries? If yes, where can I find resources to integrate the RTC libraries with PJSIP?
Thanks,
Mickael
CSipSimple (an Android SIP client) has bolted part of WebRTC into PJSIP as a set of patches: https://code.google.com/p/csipsimple/source/browse/trunk/CSipSimple/jni/pjsip/patches/002pjsip-webrtc-aec.diff

IBM Worklight - Any advantages for when developing a Mobile web app?

Going through the IBM Worklight product documentation, the product looks great for building hybrid or native applications. However, for building a mobile web app (with responsive web design), what specific advantages can one get from Worklight?
For the Mobile Web environment, I don't think there is much left at this time.
However, you do still:
get to use Worklight Adapters and their extensive integration abilities, which make it easier to connect to various back ends
use Cordova to access some native device capabilities (see the sketch after this list)
use the WL Client JavaScript API
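As a concrete illustration of the Cordova point above, here is a minimal sketch using the standard cordova-plugin-camera plugin; the options shown and the 'photo' image element are assumptions:

// Sketch: taking a photo via the Cordova camera plugin from a hybrid app.
// Requires cordova-plugin-camera to be installed in the project.
navigator.camera.getPicture(
  function (imageData) {
    // imageData is a base64-encoded string because DATA_URL is requested.
    // Assumes an <img id="photo"> element exists in the page.
    document.getElementById('photo').src = 'data:image/jpeg;base64,' + imageData;
  },
  function (message) {
    console.log('Camera failed: ' + message);
  },
  { quality: 50, destinationType: Camera.DestinationType.DATA_URL }
);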

Browser <-> Client Hardware API?

Are there any initiatives to implement or agree upon a standard API for connectivity between web browsers and client hardware?
Example: the iPhone has a GPS/camera/accelerometer in it. It'd be very cool if my web app could communicate with them (rather than me having to write a thick Objective-C application).
The closest thing I've seen to that is the Android phone API, which lets your programs access its hardware (relatively) painlessly. Google is pushing for it to become the new standard, but it's hardly the same thing as a web app (which, by most definitions, runs entirely in your browser?).
The upcoming version of Firefox has an API to read your latitude/longitude off a GPS device.
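That capability was standardized as the W3C Geolocation API (shipped in Firefox 3.5); in supporting browsers it can be used like this:

// W3C Geolocation API: ask the browser for the device's position.
if (navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    function (position) {
      console.log('Latitude:', position.coords.latitude);
      console.log('Longitude:', position.coords.longitude);
    },
    function (error) {
      console.log('Position unavailable:', error.message);
    }
  );
}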
To add to my own question: Yahoo provides a geolocation service called FireEagle that could act as a mediator and provide similar functionality.
In essence, the phone communicates with a central Yahoo server, updating its location. Your web app can then determine your approximate location from that central server.