IMFSourceReader.ReadSample never hits the callback when reading on stream 1; calls to stream 0 work fine (webcam)

Some background: I am reworking an application that used DirectShow to use Windows Media Foundation instead. In DirectShow I had the UVC camera's still pins working fine. However, after switching to a SourceReader in WMF, stream 0 (the live video stream) works, but when I use the same interface to request samples on stream 1 I never receive anything. This is with the following call:
hr = StreamReader.ReadSample(
    1,
    MediaFoundation.ReadWrite.MF_SOURCE_READER_CONTROL_FLAG.None,
    IntPtr.Zero,
    IntPtr.Zero,
    IntPtr.Zero,
    IntPtr.Zero);
If I switch it to
hr = StreamReader.ReadSample(
    1,
    MediaFoundation.ReadWrite.MF_SOURCE_READER_CONTROL_FLAG.Drain,
    IntPtr.Zero,
    IntPtr.Zero,
    IntPtr.Zero,
    IntPtr.Zero);
I receive only null IMFSamples. I have checked the state of hr and it is always S_OK. During this time I am also making the same call on stream 0, and it works fine. The only error or flag I get is StreamTick on the first frame of stream 0.
I'm not entirely sure where to go from here; if anyone has suggestions, I'm open to them.
Edit 1:
I do have both streams selected and the desired media types set with SetCurrentMediaType.

There is an IMFSourceReader::SetStreamSelection method to configure the reader and tell it to expect, or not expect, samples on specific streams. You need to make sure you have selected the stream before reading from it.

Related

Browser web cam stream has extremely low performance/frame rate

I am trying to test WebRTC and want to display my own stream as well as the peer's stream. I currently have a simple shim to obtain the camera's stream and pipe it into a video element; however, the frame rate is extremely low. The strange thing is that I can try examples from the WebRTC site and they work flawlessly: the video is smooth and there are no problems. I go to the console and my code resembles theirs. What could be happening? I tried both creating a fiddle and running the code within Brackets, but it still performs horribly.
video = document.getElementById('usr-cam');
navigator.mediaDevices.getUserMedia({
    video: {
        width: { exact: 320 },
        height: { exact: 240 }
    }
})
.then(function (stream) {
    if (navigator.mozGetUserMedia) {
        video.mozSrcObject = stream;
    } else {
        video.srcObject = stream;
    }
})
.catch(function (e) {
    alert(e);
});
That is pretty much everything I do. Take into account that I am using the new navigator.mediaDevices API instead of navigator.getUserMedia(), but I don't see how that would matter, since: 1. I am using the shim provided by the WebRTC group, named adapter.js, which they themselves use. 2. I don't think how you obtain the video stream would affect performance.
Alright, I feel very stupid for this one... I was deceived by the fact that the video element updates the displayed image without you having to do anything but pipe in the output stream. That means the image does update, but only at really long intervals, making it seem as if the video is lagging. What I forgot to do was actually call play() on the video, or add autoplay as an attribute... it works well now.
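For reference, the fix described above can be sketched as follows. This is a minimal sketch, assuming `video` and `stream` are the element and MediaStream from the question's code; `attachAndPlay` is a hypothetical helper name.

```javascript
// Attach the stream, then explicitly start playback so frames render
// continuously (instead of only updating at long intervals).
function attachAndPlay(video, stream) {
    video.srcObject = stream;   // modern browsers accept the MediaStream directly
    video.muted = true;         // many browser autoplay policies require muted playback
    return video.play();        // play() returns a Promise in modern browsers
}
```

Alternatively, set the `autoplay` (and `muted`) attributes on the `<video>` element in the markup instead of calling play() from script.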

mediaStreamSource doesn't work; using an audio object with the media stream doesn't loop it through mediaElementSource

I am receiving a stream through the RTCPeerConnection but can't get it to work properly. Using the mediaStreamSource doesn't work (I read that there was a bug in Chrome?), so I tried a workaround: using webkitURL.createObjectURL to put the stream in an audio object. This works, but not with the mediaElementSource. When I first create the mediaElementSource and then put a data URL of a local file in the audio element's src, it works as normal, but whenever I add the stream to the element, it starts playing plainly (as if the mediaElementSource had not been made and it were just a normal audio element without the AudioContext). When I then put another data URL from a local file in the element, it plays through the AudioContext again. Does anyone have any idea?
Edit: found the Chrome issue.
I guess I will use Canary then. Other solutions are welcome :)
Edit 2: sadly, that doesn't work...
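In case it helps others hitting this thread later: on browsers where the createMediaStreamSource bug no longer reproduces, the createObjectURL/audio-element workaround is unnecessary and the remote stream can go straight into the Web Audio graph. A hypothetical sketch, assuming `audioCtx` is an existing AudioContext and `remoteStream` is the MediaStream received from the RTCPeerConnection; `routeStreamThroughContext` is a made-up helper name:

```javascript
// Feed the remote MediaStream directly into the Web Audio graph,
// skipping the <audio> element and createObjectURL entirely.
function routeStreamThroughContext(audioCtx, remoteStream) {
    var source = audioCtx.createMediaStreamSource(remoteStream);
    source.connect(audioCtx.destination); // audio now flows through the AudioContext
    return source;                        // keep a reference so it isn't garbage-collected
}
```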

How to connect/disconnect the camera device using getUserMedia and WebRTC

I am creating an audio/video chat application using WebRTC and Node.js. I need to mute and unmute the camera device.
Presently I am able to disconnect, and the other party can no longer see me, but the problem is that it doesn't disconnect the camera itself: the camera remains active and connected, as I can see its light is still on.
I need help disconnecting the camera when muted and reconnecting it when unmuted. I want the same behaviour as in a Skype video call.
It varies a bit between Firefox and Chrome. These steps, in this order, work for me:
1) Set the src property on your video element to the empty string ''.
2) Make sure the stop method exists before calling it as a function. Firefox's streams don't have it, and if you try to call it anyway, your code will throw an error.
if (localStream && localStream.stop) {
    localStream.stop();
}
3) After you call localStream.stop() (or not), set localStream = null. (This may not be strictly necessary, but it can't hurt to let the object be garbage-collected. And when the user asks to start the camera again, you can check the variable to see whether you need to clean up the previous stream before starting a new one.)
When you get your media, keep the localStream in a variable in your success callback. Then, when you want to stop the stream, you can call localStream.stop().
To start again, simply call your getUserMedia() method again.
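The answers above can be sketched with the modern track API, which works the same way in Firefox and Chrome. This is a sketch, assuming `localStream` is the MediaStream saved in the getUserMedia success callback; `stopCamera` is a hypothetical helper name. MediaStream.stop() was removed from the spec, so each track is stopped instead:

```javascript
// Release the camera by stopping every track on the stream;
// once all tracks are stopped, the camera light goes off.
function stopCamera(stream) {
    if (!stream) {
        return null;            // nothing to stop
    }
    stream.getTracks().forEach(function (track) {
        track.stop();           // releases the underlying device
    });
    return null;                // caller does: localStream = stopCamera(localStream);
}
```

To unmute, request a fresh stream with navigator.mediaDevices.getUserMedia(constraints) and attach it again.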

How to get multiple cams connected

I am trying to connect multiple webcams at the same time, but it brings up the Video Source dialog to select the webcam. Is there any way to bypass it? Can I pass the webcam info explicitly?
Here is the code that produces the problem:
Dim infoReturn As VariantType
infoReturn = SendMessage(hHwnd, WM_CAP_DRIVER_CONNECT, iDevice, 0)
If SendMessage(hHwnd, WM_CAP_DRIVER_CONNECT, iDevice, 0) Then
    'Set the preview scale
    Call SendMessage(hHwnd, WM_CAP_SET_SCALE, True, 0)
WM_CAP_DRIVER_CONNECT is the message that shows the dialog.
any help will be appreciated.
Thanks
You send WM_CAP_DRIVER_CONNECT twice; you don't need to. This is the ancient Video for Windows API: you have no flexibility to override its default behavior, nor does it support the full range of video capture sources.
Perhaps you should look into switching to the newer APIs (DirectShow/Media Foundation) instead.

Canon EDSDK: device busy, can't cancel half-completed image transfers

In the EDSDK, if an error occurs during image transfer (i.e. connection lost, app error, ...), then when the application restarts and reconnects to the device, it is impossible to take a new picture, because EdsSendCommand(hdev, kEdsCameraCommand_TakePicture, 0) always returns EDS_ERR_DEVICE_BUSY.
This is because the previous image data remains in camera memory, even if I switch off the camera. The only solution is to remove the battery!
Does anyone have a better (software) solution to send a kind of RESET, release all image data in memory waiting for transfer, and let the camera return to normal operating conditions?
I run the SDK in the following mode. I first set the property:
EdsdkWrapper.PropID_SaveTo : EdsdkWrapper.EdsSaveTo.Host
and then take the picture with:
EdsSendCommand EdsdkWrapper.CameraCommand_TakePicture
Question: how do I reset the camera after an abrupt termination so that it forgets about all the half-completed transfers? Note that I don't have the IntPtr reference after the abrupt termination
(i.e. I can't call EdsdkWrapper.EdsDownloadComplete(imageReference)).
This is probably because your program exits before the EdsTerminate() function gets executed. You should make sure this function runs before your program terminates. A cruder solution would be to have a separate program that just calls EdsTerminate; you could run it before restarting your application whenever an abrupt termination has occurred.