In my web app I am using Flash to get a live stream from my webcam and then display it on my web page using the SWF file generated after publishing the Flash file. The problem I am facing is that the video is not shown in good resolution. I tried many settings but nothing worked. Here is my code for getting the video in Flash:
var camera:Camera;
var video:Video;
var bandwidth:int = 100;
var quality:int = 100;
camera = Camera.getCamera();
// width, height, fps (the requested fps is clamped to what the camera supports)
camera.setMode(190, 130, 10000);
// convert kilobits per second to bytes per second for setQuality
camera.setQuality(bandwidth * 1024 / 8, quality);
video = new Video(camera.width * 2.5, camera.height * 2.4);
video.attachCamera(camera);
video.smoothing = true; // "video.smoothing;" alone has no effect
addChild(video);
Can anyone please tell me what I am doing wrong here? Any solutions to get the video in high resolution? Any help will be appreciated.
The issue is fixed. I was making a minor mistake: giving the Video object a greater width and height than the camera's width and height. Removing that scaling brought the video resolution back on track. The change in code is:
Changed the line of code
video = new Video(camera.width * 2.5, camera.height *2.4);
to
video = new Video(camera.width, camera.height);
I am attempting to use an HTML5 video player to preview a local video before uploading it. The video is stored as a webm blob, which I convert into a data URL so that it can be sent as a JSON payload within a Chrome extension.
The video plays great, but I can't seem to scrub the time at all. The time increments, but trying to use the controls just does nothing. The control circle is completely to the left of the control bar, and doesn't increment as the video progresses. I do not see the total time either.
Once the video completes, the control bar works as expected. The total time is present, and the video can be scrubbed. Playing it at this point causes the control circle to proceed from left->right as expected.
The weird part is that I can set currentTime on the video before it finishes and it works great.
Here's my construction. There's nothing fancy here:
const video = document.createElement('video')
video.controls = true
video.preload = 'auto'
video.currentTime = 30 // This works, somehow
video.id = 'video'
video.width = 800
const source = document.createElement('source')
source.type = 'video/webm'
source.src = resp.recordingDataUrl // This looks like data:video/webm;base64,GkXfo59Ch.....
video.appendChild(source)
document.getElementById('root').appendChild(video)
Am I missing something necessary to scrub the time? I have noticed this when browsing videos on the web (usually Reddit), which makes me wonder if there's a webm bug of some sort.
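I can't say for certain what breaks seeking here, but base64 data: URLs for media are a known rough edge in Chrome. One workaround worth trying (the helper name below is mine, not from the question) is to decode the data URL back into a Blob and hand the element an object URL instead:

```javascript
// Hypothetical helper (name is mine): decode a base64 data URL into a Blob,
// so the <video> can be fed an object URL instead of a data: URL.
function dataUrlToBlob(dataUrl) {
  const [header, base64] = dataUrl.split(',');
  const mime = header.match(/data:(.*?);base64/)[1];
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return new Blob([bytes], { type: mime });
}

// Usage in the extension page (resp.recordingDataUrl as in the question):
// source.src = URL.createObjectURL(dataUrlToBlob(resp.recordingDataUrl));
```

Object URLs keep the payload out of the src attribute entirely, which sidesteps any data: URL size or seeking quirks.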
I'm running a VueJS application that displays a full-screen story of videos. I don't create as many video tags as there are media in my story: I just change the video component's sources each time I play a new video.
But it looks like Safari (desktop & mobile) still does not cache HTML video once loaded: when I play a previous media item again, Safari downloads the asset again instead of getting it from the cache like Chrome does.
The same issue has already been reported here, but still has no correct answer.
Safari even stops downloading the final bytes of the video (producing a sort of timeout) when we go back and forth quickly in the story, so the story looks stuck.
Here's an example link.
Does anyone know a good alternative that avoids re-downloading the video data on each play in Safari?
Partial solution
I found a workaround that works pretty well if the videos are small (all my videos are less than 3 MB).
The trick is to use the JS fetch API to download the full video, then stream it into the video tag.
// Fetch the full video file as a Blob
const videoRequest = fetch("/path/to/video.mp4")
  .then(response => response.blob());

// Point the <video> element at an object URL backed by the Blob
videoRequest.then(blob => {
  video.src = window.URL.createObjectURL(blob);
});
Unlike the video src attribute, the fetch API will get the video data from the cache if the same video was already fetched before.
Here is a codepen demo that can be tested in Safari desktop/mobile (when NOT in private mode).
Pro: videos are now pulled from the cache in Safari!
Con: you can't start the video until the full data has been downloaded. That's why this solution should only be used for small videos (say < 5 MB); otherwise your users may wait a while before being able to play the video.
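On top of the fetch workaround, a small memoization layer can guarantee that each video is fetched at most once per page, independent of what the browser cache does. This is an illustrative sketch (the function and cache names are made up); note it caches the promise, so even concurrent plays of the same path share a single download:

```javascript
// Illustrative sketch: memoize object URLs per video path, so each
// video is fetched at most once per page, regardless of browser caching.
const blobUrlCache = new Map();

function getVideoUrl(path, fetchImpl = fetch, createUrl = URL.createObjectURL) {
  if (!blobUrlCache.has(path)) {
    // Cache the promise itself, so concurrent calls for the same path
    // also share one download instead of racing.
    const urlPromise = fetchImpl(path)
      .then(response => response.blob())
      .then(blob => createUrl(blob));
    blobUrlCache.set(path, urlPromise);
  }
  return blobUrlCache.get(path);
}

// Usage: getVideoUrl('/path/to/video.mp4').then(url => { video.src = url; });
```

Remember to call URL.revokeObjectURL on entries you evict, or the blobs stay alive for the lifetime of the page.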
I am trying to test WebRTC and want to display my own stream as well as the peer's stream. I currently have a simple shim to obtain the camera's stream and pipe it into a video element; however, the frame rate is extremely low. The strange thing is that I can try examples from the WebRTC site and they work flawlessly: the video is smooth and there are no problems. I go to the console and my code resembles theirs. What could be happening? I tried to create a fiddle and also run the code within Brackets, but it still performs horribly.
video = document.getElementById('usr-cam');
navigator.mediaDevices.getUserMedia({
  video: {
    width: { exact: 320 },
    height: { exact: 240 }
  }
})
.then(function(stream) {
  if (navigator.mozGetUserMedia) {
    video.mozSrcObject = stream;
  } else {
    video.srcObject = stream;
  }
})
.catch(function(e) {
  alert(e);
});
That's pretty much everything I do. Take into account that I am using the new navigator.mediaDevices API instead of navigator.getUserMedia(), but I don't see how that would matter, since: 1. I am using a shim provided by the WebRTC group, named adapter.js, which they themselves use. 2. I don't see how the way you obtain the video stream would affect performance.
Alright, I feel very stupid for this one... I was deceived by the fact that the video element updates the displayed image without you having to do anything but pipe the output stream in; the image does update, just at really long intervals, making it seem as if the video is lagging. What I forgot to do was actually play() the video, or add autoplay as an attribute... it works well now.
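A minimal sketch of the fix described above (the helper name is illustrative): assigning the stream is not enough, the element must actually be played.

```javascript
// Sketch of the fix: assign the stream AND start playback.
// (Helper name is mine; modern browsers all support srcObject,
// so the moz-prefixed branch is no longer needed.)
function attachStream(videoEl, stream) {
  videoEl.srcObject = stream;
  videoEl.autoplay = true;   // belt and braces: autoplay attribute...
  return videoEl.play();     // ...plus an explicit play() call
}
```

Without play() or autoplay, the element only repaints frames sporadically, which looks exactly like a low frame rate.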
I am new to Flash and I am using the Red5 recorder to record video. The camera I use is an iMac camera, around 2 megapixels. While I am recording, the video looks fine; after recording is done, I save the video to vzaar. But when I fetch the video from vzaar, it is badly pixelated. How can I make the video look fine?
cam.setQuality(0,90);
I put this code in the .as file, but it didn't work.
How can I improve the quality of the video? Can you please suggest something?
Thanks in advance :)
You can read the documentation of the Camera class for a better understanding of its properties.
Anyway, here is a code suggestion to get the best webcam quality:
var camera:Camera = Camera.getCamera();
// width and height of the video you are recording, and fps (frames per second)
camera.setMode(400, 400, 15);
// set the bandwidth to 0, which means it will use all it can,
// and the quality to 100, which is the best it can get
camera.setQuality(0, 100);
This article will help you get better video quality with Red5 and Flash:
http://www.technogumbo.com/tutorials/Recording-Decent-Quality-Video-And-Audio-With-Flash-and-Red5/index.php
I am using Opentok JavaScript WebRTC library to host a 1-to-1 video chat (peer-to-peer).
I can see my peer's video and hear the audio flawlessly.
My wish is to record the audio/video of the other chat party (the remote side). For this purpose, I'm using RecordRTC.
I was able to record the video of the other chat participant (the video is output to an HTML video element), but so far I have not succeeded in recording audio (a dead-silent .wav file is as far as I could get). I am using Chrome Canary (30.0.1554.0). This is my method:
var clientVideo = $('#peerdiv video')[0];//peer's video (html element)
var serverVideo = $('#myselfdiv video')[0];//my video (html element)
var context = new webkitAudioContext();
var clientStream = context.createMediaStreamSource(clientVideo.webRTCStream);
var serverStream = context.createMediaStreamSource(serverVideo.webRTCStream);
webRTCStream is a custom property I assigned to the HTMLVideoElement object by modifying the source of the OpenTok JS library. It contains the MediaStream object linked to the respective < video > element.
var recorder = RecordRTC({
video: clientVideo,
stream: clientStream
});
recorder.recordAudio();
recorder.recordVideo();
The video is recorded. An audio file is also created, and it has a length close to the video's length; however, it is completely silent (and yes, there was a lot of noise being made on the other side during recording).
I've tested this with the video element that displays my own webcam's video stream (and audio), and it worked: both audio and video were recorded:
...
var recorder = RecordRTC({
video: serverVideo,
stream: serverStream
});
...
Is there something special about streams originating from a remote location? Any guidance on this issue would be very helpful.
The same issue occurs in the following situations:
If the audio is not stereo (dual-channel), i.e. it is mono audio
If the number of audio input channels is not equal to the number of audio output channels
If the audio input device is not the default device selected in Chrome
I'm still trying to find the actual issue.
I added this experiment for testing purposes... see the console...
https://webrtc-experiment.appspot.com/demos/remote-stream-recording.html
Updated at: Saturday, 1 February 2014, 09:22:04 PKT
Remote audio recording is not supported, and this issue is considered low-priority wontfix:
Support feeding remote WebRTC MediaStreamTrack output to WebAudio
Connect WebRTC MediaStreamTrack output to Web Audio API
Updated at March 28, 2016
Remote audio+video recording is now supported in RecordRTC, since Chrome version 49+.
Firefox, on the other hand, can only record remote audio.
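Given the Chrome 49+ support noted above, the standard MediaRecorder API is another route for capturing a remote stream. This is a sketch under my own naming (real code should check MediaRecorder.isTypeSupported before picking a mimeType):

```javascript
// Minimal sketch (function name and options are mine): record any
// MediaStream, local or remote, with the standard MediaRecorder API.
function recordRemoteStream(stream, RecorderImpl = MediaRecorder) {
  const chunks = [];
  const recorder = new RecorderImpl(stream, { mimeType: 'video/webm' });
  recorder.ondataavailable = event => chunks.push(event.data);
  recorder.start();
  // Later: recorder.stop(), then build a file with
  // new Blob(chunks, { type: 'video/webm' })
  return { recorder, chunks };
}
```

The RecorderImpl parameter is only there to make the sketch testable outside a browser; in page code you would call recordRemoteStream(stream) directly.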
If Chrome/WebRTC/Opus outputs mono audio by default, and if that is the problem here, I see two options:
Make Opus output stereo (not sure how).
Make the RecordRTC/Recorderjs code work with mono audio.
Or does anyone know any other recording library that works?
This actually works fine now in Firefox. I am using Firefox 29.0.1, and the Audio API can now work with audio stream sources grabbed from remote parties over a peer connection.
To test, go to Muaz Khan's experiment page. I am not sure in which version of Firefox this rolled out, but I would like to thank the team for cranking it out!
The Chrome bug was moved to the Audio API team; see the cr bug to track progress.