Unsubscribe my own video in opentok - webrtc

Hi, I am using OpenTok.
When I call
publisher = OT.initPublisher();
session.publish(publisher);
my own video is visible to me. I want to see only the other participants' videos, not my own, while my video remains visible to everyone else in the session.
How can I make this happen?
Also, can I make the other participants' videos full-screen?
Please help...

You can try this...
var publisher = OT.initPublisher('myPublisherDiv', {display: 'none'});
session.publish(publisher);
Alternatively, inspect the video element with Firebug or the Chrome dev tools to see if it has an existing class or id, and then hide it with CSS, e.g.:
.video-class {
    display: none;
}
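If it helps, here is a rough sketch of how the pieces can fit together with the OpenTok.js v2 API (the container ids 'myPublisherDiv' and 'subscribersDiv' are assumptions of this sketch):

// Publish your camera into a container that you hide with CSS (e.g. display: none).
var publisher = OT.initPublisher('myPublisherDiv');

// streamCreated only fires for other participants' streams, never your own,
// so subscribing here shows everyone except yourself.
session.on('streamCreated', function (event) {
    session.subscribe(event.stream, 'subscribersDiv', {
        insertMode: 'append',
        width: '100%',
        height: '100%'
    });
});

session.publish(publisher);

For the full-screen part, sizing the subscriber container to fill the viewport (or calling requestFullscreen() on it) should get you there.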

Related

Embed Reddit Video with Sound

I'm trying to embed Reddit videos with sound from a URL. So far I've figured out that you can get a soundless video by doing:
Link is https://www.reddit.com/r/aww/comments/jien07/living_the_good_life/
Go to https://www.reddit.com/r/aww/comments/jien07/living_the_good_life.json
Find fallback_url, in this case it's https://v.redd.it/4ymh7g5fzfv51/DASH_720.mp4
The issue is that this video doesn't have any sound. How can I get it with sound? I've tried Share -> Embed, but the code it gives me doesn't work when I put it into a page, and it contains more information than I need; I just want the video.
Yes, there have been some bugs with Reddit's embedded video for a while. The console shows the error Uncaught TypeError: Cannot read property 'getItem' of null. It happens because the iframe script cannot access localStorage (localStorage is null), and Reddit does not handle the null case:
RedditVideoPlayer.prototype._prepareLocalStorage = function() {
    // typeof null is "object", so this check passes even when localStorage is null,
    // and the getItem() calls below then throw.
    if (typeof localStorage != "undefined") {
        var e = localStorage.getItem(this.storageNamespace + ".volume") || this.options.volume
          , t = localStorage.getItem(this.storageNamespace + ".muted") || this.options.muted;
        this._lastVolume = parseFloat(e),
        t.toString().toLowerCase() == "true" ? this.mute() : this.unmute()
    }
}
There are also posts on Reddit pointing out the problem, e.g. https://www.reddit.com/r/bugs/comments/jhx80w/embed_imagevideo_code_is_broken/, but there has been no official response.
I also found that the iframe works on mobile, because localStorage is not used on mobile devices.
On desktop, for now, you can instead use the Reddit DASH URL through a cross-origin proxy on your site.
I found where to get the video with sound. Although this is an old question, some may still stumble upon it.
You should take it out of the post's JSON:
secure_media.reddit_video.hls_url
If you open this URL directly in the browser, it will just download, but if you load it in a player, everything works fine: video with sound.
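A rough sketch of that approach (not production code): fetch the post's .json, pull hls_url out of secure_media.reddit_video, and hand it to hls.js so that the audio track is included. The exact JSON path and the hls.js dependency are assumptions here, and you may run into CORS restrictions when fetching reddit.com from the browser:

// Assumes hls.js is loaded on the page and videoElement is a <video> element.
async function playRedditVideo(postUrl, videoElement) {
    const response = await fetch(postUrl.replace(/\/?$/, '.json'));
    const listing = await response.json();
    const hlsUrl = listing[0].data.children[0].data.secure_media.reddit_video.hls_url;

    if (Hls.isSupported()) {
        const hls = new Hls();
        hls.loadSource(hlsUrl);
        hls.attachMedia(videoElement);
    } else {
        // Safari can play HLS natively.
        videoElement.src = hlsUrl;
    }
}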

Skip videos in youtube using Selenium

Say there are 10 videos playing one after another on YouTube; some of them show the "Skip video" option while others just don't. How can I skip the videos that offer the skip option and let the others play as they are? Also, YouTube uses Flash, so what API do I need to use for automating it? Thanks.
Note: this is an idea which you would have to turn into code yourself.
if (<skip button is available>) {
    click Skip button
}
// Replace <locator> with an actual locator, e.g. By.cssSelector(...); note that
// findElement() throws if there is no match, while findElements() returns an empty list.
WebElement skipButton = driver.findElement(<locator>);
if (skipButton.isDisplayed()) {
    skipButton.click();
}
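For illustration only, here is a rough equivalent using the Node selenium-webdriver bindings; the CSS class used for the skip button is an assumption and is likely to change:

const { Builder, By } = require('selenium-webdriver');

(async () => {
    const driver = await new Builder().forBrowser('chrome').build();
    await driver.get('https://www.youtube.com/watch?v=VIDEO_ID'); // placeholder URL

    // findElements() returns an empty list instead of throwing when there is no match.
    const skipButtons = await driver.findElements(By.css('.ytp-ad-skip-button'));
    if (skipButtons.length > 0 && await skipButtons[0].isDisplayed()) {
        await skipButtons[0].click();
    }

    await driver.quit();
})();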

Get video progress of OSMF Player

A few years ago, Adobe had an OSMF video player wiki, and one of its examples showed how to execute something for a specific amount of time while the video is playing, using the JavaScript APIs.
The wiki is no longer online, so I can no longer access the example. How do I get the current video time? I can currently get the player state, but not the video time. My code is below:
function onJavaScriptBridgeCreated(playerId) {
    var player = document.getElementById(playerId);
    playerswf = document.getElementById(playerId);
    var state = player.getState();
    if (state == "playing") {
        isplaying = 1;
        completeFunc();
    } else {
        isplaying = 0;
    }
}
I think you can do it like this:
player = document.getElementById(playerId);
player.addEventListener("currentTimeChange", "onCurrentTimeChange");

function onCurrentTimeChange(time, playerId) {
    document.getElementById("currentTime").innerHTML = time;
}
This code is part of a working OSMF example, which you can find here. There are many other examples attached to the source code, which you can download from SourceForge.
Hope that helps.

Hide controls - use play / pause

Is there a way to hide the controls and still be able to play/pause the video?
I've noticed that when the controls are visible the player works fine, and I haven't seen any other problems.
I tried to find an answer in the docs but had no luck. So, is there a way to do this?
Many thanks and great work here!
Video.js works like the HTML5 video element: if you turn off the controls, the only way to control the video is through the API. The easiest way to set up what I think you're describing is the following.
videojs('my_video_id').on('click', function () {
    if (this.paused()) {
        this.play();
    } else {
        this.pause();
    }
});
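If you want the controls hidden from the start, a minimal sketch (assuming the same my_video_id element; the my_play_button element is a hypothetical button of your own) is to initialize with the controls option turned off and drive playback purely through the API:

// Initialize without controls; playback is then only controllable via the API.
var player = videojs('my_video_id', { controls: false });

// For example, toggle playback from your own button instead of the built-in controls.
document.getElementById('my_play_button').addEventListener('click', function () {
    player.paused() ? player.play() : player.pause();
});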

Screen sharing with WebRTC?

We're exploring WebRTC but have seen conflicting information on what is possible and supported today.
With WebRTC, is it possible to recreate a screen sharing service similar to join.me or WebEx where:
You can share a portion of the screen
You can give control to the other party
No downloads are necessary
Is this possible today with any of the WebRTC browsers? How about Chrome on iOS?
The chrome.tabCapture API is available for Chrome apps and extensions.
This makes it possible to capture the visible area of the tab as a stream which can be used locally or shared via RTCPeerConnection's addStream().
For more information see the WebRTC Tab Content Capture proposal.
Screensharing was initially supported for 'normal' web pages using getUserMedia with the chromeMediaSource constraint – but this has been disallowed.
EDIT 1 April 2015: Updated, since screen sharing is now only supported by Chrome in Chrome apps and extensions.
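For reference, a minimal sketch of how that might look in an extension's background page (a Manifest V2-style sketch; it assumes the "tabCapture" permission in the manifest and that capture is triggered by the user clicking the extension's browser action):

chrome.browserAction.onClicked.addListener(function () {
    chrome.tabCapture.capture({ audio: false, video: true }, function (stream) {
        if (!stream) {
            console.error('Tab capture failed:', chrome.runtime.lastError);
            return;
        }
        // Share the captured tab with a remote peer, as described above.
        var pc = new RTCPeerConnection();
        pc.addStream(stream); // addStream() per the answer; newer code would use addTrack()
        // ...continue with the usual offer/answer signalling.
    });
});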
You probably know that screen capture (not tabCapture) is available in Chrome Canary (26+). We recently published a demo at https://screensharing.azurewebsites.net.
Note that you need to run it under https:// and use the following constraint:
video: {
    mandatory: {
        chromeMediaSource: 'screen'
    }
}
You can also find an example here: https://html5-demos.appspot.com/static/getusermedia/screenshare.html
I know I am answering a bit late, but I hope it helps those who stumble upon this page, if not the OP.
At the moment, both Firefox and Chrome support sharing the entire screen, or part of it (an application window you select), with peers through WebRTC as a MediaStream, just like your camera/microphone feed, so there is no option to let the other party take control of your desktop yet. Other than that, there is another catch: your website has to run over HTTPS, and in both Firefox and Chrome the users will have to install extensions.
You can give it a try in Muaz Khan's screen-sharing demo; the page links to the required extensions too.
P.S.: If you do not want to install an extension to run the demo in Firefox (there is no way around extensions in Chrome), you just need to modify two flags:
go to about:config
set media.getusermedia.screensharing.enabled to true.
add *.webrtc-experiment.com to the media.getusermedia.screensharing.allowed_domains flag.
refresh the demo page and click the share-screen button.
To the best of my knowledge, it's not possible right now with any of the browsers, though the Google Chrome team has said that they eventually intend to support this scenario (see the "Screensharing" bullet point on their roadmap), and I suspect this means that other browsers will eventually follow, presumably with IE and Safari bringing up the rear. But all of that is probably somewhere past February, which is when they're supposed to finalize the current WebRTC standard and ship production bits. (Hopefully Microsoft's last-minute spanner in the works doesn't screw that up.) It's possible that I've missed something recent, but I've been following the project pretty closely, and I don't think screen sharing has even made it into Chrome Canary yet, let alone dev/beta/prod. Opera is the only browser that has been keeping pace with Chrome on its WebRTC implementation (Firefox seems to be about six months behind), and I haven't seen anything from that team about screen sharing either.
I've been told that there is one way to do it right now: write your own web camera driver, so that your local screen appears to the WebRTC getUserMedia() API as just another video source. I don't know whether anybody has done this, and of course it would require installing the driver on the machine in question. By the time all is said and done, it would probably be easier to just use VNC or something along those lines.
Nowadays you can do it natively like this (note that Safari differs from Chrome in how it handles audio capture):
const constraints = { video: true, audio: false }; // adjust as needed
navigator.mediaDevices.getDisplayMedia(constraints).then((stream) => {
    // use the captured screen stream here...
});
It is possible. I have worked on this and built a demo for screen sharing. While sharing, the watcher can access your mouse and keyboard: if they move their mouse, your mouse also moves, and if they type on their keyboard, it is typed into your PC.
View this code; it covers the screen-share part...
These days you can share the screen with this; you don't need any extensions.
const getLocalScreenCaptureStream = async () => {
    try {
        // cursor: 'always' asks the browser to include the mouse cursor in the capture.
        const constraints = { video: { cursor: 'always' }, audio: false };
        const screenCaptureStream = await navigator.mediaDevices.getDisplayMedia(constraints);
        return screenCaptureStream;
    } catch (error) {
        console.error('failed to get local screen', error);
    }
};
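A hypothetical usage example, assuming an existing RTCPeerConnection named peerConnection and a <video id="preview"> element for a local preview:

(async () => {
    const stream = await getLocalScreenCaptureStream();
    if (stream) {
        // Show a local preview of what is being shared.
        document.querySelector('video#preview').srcObject = stream;
        // Send the screen tracks to the remote peer.
        stream.getTracks().forEach((track) => peerConnection.addTrack(track, stream));
    }
})();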