Canon EDSDK Device Busy when changing property - edsdk

I have a little problem with the EDSDK.
I am trying to write a function that switches from live view to movie recording.
Everything is fine with live view, and I can take pictures.
But when I want to start movie recording, I get EDS_ERR_DEVICE_BUSY every time I try to change a property (save to the camera instead of the PC). I have tried everything, including a retry loop, with no luck. And it is the only place where I get this error.
Here's my code:
EdsDataType dataType;
EdsUInt32 dataSize;
EdsUInt32 enregistrement;
err = EdsGetPropertySize(camera, kEdsPropID_SaveTo, 0, &dataType, &dataSize);
err = EdsGetPropertyData(camera, kEdsPropID_SaveTo, 0, dataSize, &enregistrement);
EdsUInt32 saveTo = kEdsSaveTo_Camera;
if(enregistrement != kEdsSaveTo_Camera){
err = EdsSetPropertyData(camera, kEdsPropID_SaveTo, 0, sizeof(saveTo), &saveTo);
if(err != EDS_ERR_OK){
printf("Erreur de changement d'emplacement de sauvegarde, arret de l'enregistrement\n");
printf("err : 0x%X\n", err);
return err;
}
}
//Switch the shooting mode
EdsUInt32 mode = 20; //Mode 20 corresponds to movie recording
EdsSetPropertyData(camera, kEdsPropID_AEMode, 0, sizeof(mode), &mode);
//Values used to start and stop the recording
EdsUInt32 debutEnregistrement = 4;
EdsUInt32 finEnregistrement = 0;
err = EdsSetPropertyData(camera, kEdsPropID_Record, 0, sizeof(debutEnregistrement), &debutEnregistrement);
if(err != EDS_ERR_OK){
printf("Erreur lors du lancement de l'enregistrement");
return err;
}
//Wait here before stopping the recording
err = EdsSetPropertyData(camera, kEdsPropID_Record, 0, sizeof(finEnregistrement), &finEnregistrement);
if(err != EDS_ERR_OK)
printf("Erreur lors de l'arret de l'enregistrement");
//L'enregistrement est fini, vérification des evenement
EdsGetEvent();
If you have a solution, I'll take it. Thanks.

Regardless of what the docs say, the EDSDK sometimes returns EDS_ERR_DEVICE_BUSY from EdsSetPropertyData() when the actual error is a bad input parameter. For example, try setting kEdsPropID_Av with a value of decimal 50 (0x32), which is not in the table given in the documentation. On my EOS 5Ds, this returns EDS_ERR_DEVICE_BUSY no matter how many retries are attempted. Passing a legal value, e.g. 0x33 (for f/6.3), succeeds first time. The bug is 100% reproducible here.
So, if you get this "device busy" error when setting properties, check the input values you are passing with a fine-toothed comb.
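If you want to guard against this programmatically, one option is to ask the camera which values are currently valid for a property before setting it. A rough sketch (the EdsPropertyDesc field names follow the EDSDK header I have; verify them against your SDK version):
EdsError SetPropertyIfSupported(EdsCameraRef camera, EdsPropertyID propID, EdsUInt32 value)
{
    //Query the list of values the camera will currently accept for this property
    EdsPropertyDesc desc = {0};
    EdsError err = EdsGetPropertyDesc(camera, propID, &desc);
    if(err != EDS_ERR_OK)
        return err;
    //Only set the property if the requested value is in that list
    for(EdsInt32 i = 0; i < desc.numElements; ++i){
        if((EdsUInt32)desc.propDesc[i] == value)
            return EdsSetPropertyData(camera, propID, 0, sizeof(value), &value);
    }
    return EDS_ERR_INVALID_PARAMETER; //value is not currently offered by the camera
}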

You can try this; I don't know if you already have, but it has worked for me so far for all of my property changes.
Just after you open the session, use the EdsSetPropertyEventHandler function.
Begin by stopping your live view by setting kEdsPropID_Evf_OutputDevice to TFT, if I recall correctly (the value that is not PC).
In the callback of the property event handler, switch on the event, and when it is for the property kEdsPropID_Evf_OutputDevice, call your function that switches to movie mode.
The callback will be invoked once the property change has actually been made, so you won't get any device-busy or not-ready errors. But be careful: the callback function has to be static for this to work, so from it you will only be able to call other static functions and use static variables.
If you didn't understand, I can explain it to you in French, which is far easier for me ^^
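A rough sketch of that callback approach (plain EDSDK C API; startMovieMode() is a placeholder for your own movie-mode routine):
#include "EDSDK.h"

void startMovieMode(EdsCameraRef camera); //placeholder for your movie-mode code

EdsError EDSCALLBACK OnPropertyEvent(EdsPropertyEvent event, EdsPropertyID propertyID, EdsUInt32 param, EdsVoid* context)
{
    //Fires once the camera has actually applied the property change
    if(event == kEdsPropertyEvent_PropertyChanged && propertyID == kEdsPropID_Evf_OutputDevice){
        startMovieMode((EdsCameraRef)context);
    }
    return EDS_ERR_OK;
}

//Right after EdsOpenSession(camera):
//err = EdsSetPropertyEventHandler(camera, kEdsPropertyEvent_PropertyChanged, OnPropertyEvent, camera);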

Related

EDSDK Live view problems using canon example

I need to make a function to get the live view image from a Canon camera, but when I try to use the example code provided by Canon I get some errors in the variables. So I modified the code so that it could work. But the function EDSDK.EdsGetPropertyData returns error code 98 (0x62 in hex), which corresponds to EDS_ERR_INVALID_POINTER. I would like to know what is wrong in my code and how I can fix it.
IntPtr outData;
public void startLiveview(IntPtr camera)
{
uint err = EDSDK.EDS_ERR_OK;
EDSDK.EdsDataType device = new EDSDK.EdsDataType();
int size;
EDSDK.EdsOpenSession(camera);
// Get the output device for the live view image (from the Canon sample):
//   EdsUInt32 device;
//   err = EdsGetPropertyData(camera, kEdsPropID_Evf_OutputDevice, 0, sizeof(device), &device);
// PC live view starts by setting the PC as the output device for the live view image:
//   if (err == EDS_ERR_OK) { device |= kEdsEvfOutputDevice_PC; ... }
err = EDSDK.EdsGetPropertySize(camera, EDSDK.PropID_Evf_OutputDevice, 0, out device, out size);
MessageBox.Show("Error result:"+err.ToString());
MessageBox.Show(device.ToString());
MessageBox.Show(size.ToString());
err = EDSDK.EdsGetPropertyData(camera, EDSDK.PropID_Evf_OutputDevice, 0, size, outData);
MessageBox.Show(outData.ToString());
if (err == EDSDK.EDS_ERR_OK)
{
//type2.GetType();
//device |= EDSDK.PropID_Evf_OutputDevice;
// err = EDSDK.EdsSetPropertyData(camera, EDSDK.PropID_Evf_OutputDevice, 0, sizeof(device), &device);
}
else
{
MessageBox.Show("Error result:" + err.ToString());
}
EDSDK.EdsCloseSession(camera);
}
EdsGetPropertyData expects the outData pointer to point at allocated memory, but you are passing a zero (null) pointer.
You need to allocate memory first and then call EdsGetPropertyData, i.e. something like this:
outData = System.Runtime.InteropServices.Marshal.AllocHGlobal(size);
err = EDSDK.EdsGetPropertyData(camera, EDSDK.PropID_Evf_OutputDevice, 0, size, outData);
Once you are done, you must release that allocated memory or you'll have a memory leak:
System.Runtime.InteropServices.Marshal.FreeHGlobal(outData);
In the C# examples of the Canon SDK (since version 13.x, I believe) you should find methods that already implement EdsGetPropertyData methods for several data types. Why not use those instead of writing it yourself?

Lua API unload luaL_loadfile

I am working on an in-game Lua scripter that lets users write and load custom Lua scripts.
I am already done with the implementation and have registered my functions:
L = luaL_newstate()
luaL_openlibs(L)
lua_register(L, "luaFunction", cFunction)
lua_register(L, "luaFunction2", cFunction2)
lua_register(L, "luaFunctionN", cFunctionN)
What am I trying to do now?
The ability to execute/kill a script on a button press; see this picture for more clarity: https://i.stack.imgur.com/eJQzg.png
All scripts from the scripter need access to Lib.lua, so I do:
luaL_loadfile(L, "Lib.lua")
lua_pcall(L, 0, 0, 0)
Then, to load a script, I could use the same thing, and that is fine until the user wants to kill/unload the script.
luaL_loadfile(L, "script.lua")
lua_pcall(L, 0, 0, 0)
I have dug through many threads about the Lua API, and there is currently no good answer for this problem.
People talk about lua_newthread, which I have already tried to implement, but with no success.
T = lua_newthread(L)
luaL_loadfile(L, "script.lua")
lua_pcall(L, 0, 0, 0)
This returns >> Attempt to call a nil value. It looks like the new thread has no access to Lib.lua.
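For comparison, here is a minimal sketch of how lua_newthread is usually driven (illustrative only, Lua 5.1 API): the thread shares globals with L, including functions registered with lua_register, but the chunk is loaded onto and resumed on T rather than L.
lua_State* T = lua_newthread(L); /* shares the global environment with L */
if (luaL_loadfile(T, "script.lua") != 0) {
    printf("load error: %s\n", lua_tostring(T, -1));
} else {
    int status = lua_resume(T, 0); /* Lua 5.1 signature; 5.2+ adds a 'from' parameter */
    if (status != 0 && status != LUA_YIELD)
        printf("runtime error: %s\n", lua_tostring(T, -1));
}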
There is also another problem. When I register a function I use:
lua_register(L, "luaFunction", cFunction)
Even if I am able to create a thread, the functions registered on the state L from luaL_newstate seem to need the handle L to work, while the handle of the thread is T.
Example of a C function passed to Lua:
static int add (lua_State *L) {
    double a = lua_tonumber(L, -2); /* get first argument */
    double b = lua_tonumber(L, -1); /* get second argument */
    lua_pushnumber(L, a+b); /* push result */
    return 1; /* number of results */
}
Any kind of help is appreciated.
Regards,
Ascer

Problems connecting to the input pins of GMFBridge Sink Filter

I am experiencing a strange problem when trying to use the GMFBridge filter with the output of an Euresys UxH264 card.
I am trying to integrate this card into our solution, which relies on GMFBridge to handle continuous capture to multiple files, performing muxing and file splitting without having to stop the capture graph.
This card captures video and audio from analog inputs. It provides a DirectShow filter exposing both a raw stream of the video input and a hardware-encoded H.264 stream. The audio stream is provided as an uncompressed stream only.
When I attempt to directly connect any of the output pins of the Euresys source filters to the input pins of the GMFBridge Sink, they get rejected, with the code VFW_E_NO_ALLOCATOR. (In the past I have successfully connected both H.264 and raw audio streams to the bridge).
Grasping at straws, I plugged in a pair of SampleGrabber filters between the Euresys card filters and the bridge sink filter, and, just like that, the connections between the sample grabbers and the sink were accepted.
However, I am not getting any packets on the other side of the bridge (the muxing graph). I inspected the running capture graph with GraphStudioNext, and somehow the sample grabbers appear detached from my graph, even though I got successful confirmations when I connected them to the source filter!
Here's the source code creating the graph.
void EuresysSourceBox::BuildGraph() {
HRESULT hRes;
CComPtr<IGraphBuilder> pGraph;
COM_CALL(pGraph.CoCreateInstance(CLSID_FilterGraph));
#ifdef REGISTER_IN_ROT
_rotEntry1 = FilterTools::RegisterGraphInROT(IntPtr(pGraph), "euresys graph");
#endif
// [*Video Source*]
String^ filterName = "Ux H.264 Visual Source";
Guid category = _GUIDToGuid((GUID)AM_KSCATEGORY_CAPTURE);
FilterMonikerList^ videoSourceList = FilterTools::GetFilterMonikersByName(category, filterName);
CComPtr<IBaseFilter> pVideoSource;
int monikerIndex = config->BoardIndex; // a filter instance will be retrieved for every installed board
clr_scoped_ptr<CComPtr<IMoniker>>^ ppVideoSourceMoniker = videoSourceList[monikerIndex];
COM_CALL((*ppVideoSourceMoniker->get())->BindToObject(NULL, NULL, IID_IBaseFilter, (void**)&pVideoSource));
COM_CALL(pGraph->AddFilter(pVideoSource, L"VideoSource"));
// [Video Source]
//
// [*Audio Source*]
filterName = "Ux H.264 Audio Encoder";
FilterMonikerList^ audioSourceList = FilterTools::GetFilterMonikersByName(category, filterName);
CComPtr<IBaseFilter> pAudioSource;
clr_scoped_ptr<CComPtr<IMoniker>>^ ppAudioSourceMoniker = audioSourceList[monikerIndex];
COM_CALL((*ppAudioSourceMoniker->get())->BindToObject(NULL, NULL, IID_IBaseFilter, (void**)&pAudioSource));
COM_CALL(pGraph->AddFilter(pAudioSource, L"AudioSource"));
CComPtr<IPin> pVideoCompressedOutPin(FilterTools::GetPin(pVideoSource, "Encoded"));
CComPtr<IPin> pAudioOutPin(FilterTools::GetPin(pAudioSource, "Audio"));
CComPtr<IBaseFilter> pSampleGrabber;
COM_CALL(pSampleGrabber.CoCreateInstance(CLSID_SampleGrabber));
COM_CALL(pGraph->AddFilter(pSampleGrabber, L"SampleGrabber"));
CComPtr<IPin> pSampleGrabberInPin(FilterTools::GetPin(pSampleGrabber, "Input"));
COM_CALL(pGraph->ConnectDirect(pVideoCompressedOutPin, pSampleGrabberInPin, NULL)); // DOES NOT FAIL!!
CComPtr<IBaseFilter> pSampleGrabber2;
COM_CALL(pSampleGrabber2.CoCreateInstance(CLSID_SampleGrabber));
COM_CALL(pGraph->AddFilter(pSampleGrabber2, L"SampleGrabber2"));
CComPtr<IPin> pSampleGrabber2InPin(FilterTools::GetPin(pSampleGrabber2, "Input"));
COM_CALL(pGraph->ConnectDirect(pAudioOutPin, pSampleGrabber2InPin, NULL)); // DOES NOT FAIL!!
// [Video Source]---
// |-->[*Bridge Sink*]
// [Audio Source]---
CComPtr<IPin> pSampleGrabberOutPin(FilterTools::GetPin(pSampleGrabber, "Output"));
CComPtr<IPin> pSampleGrabber2OutPin(FilterTools::GetPin(pSampleGrabber2, "Output"));
CreateGraphBridge(
IntPtr(pGraph),
IntPtr(pSampleGrabberOutPin),
IntPtr(pSampleGrabber2OutPin)
);
// Root graph to parent object
_ppCaptureGraph.reset(new CComPtr<IGraphBuilder>(pGraph));
}
COM_CALL is my HRESULT-checking macro; it raises a managed exception if the result is anything other than S_OK. So the connections between the pins succeeded, but here is how disjointed the graph looks while it is running:
So, I have three questions:
1) What could VFW_E_NO_ALLOCATOR mean in this instance? (the source filter can be successfully connected to other components such as LAV Video decoder or ffdshow video decoder).
2) Is there a known workaround to circumvent the VFW_E_NO_ALLOCATOR problem?
3) Is it possible that a filter gets disconnected at runtime as it seems to be happening in my case?
I found a reference by Geraint Davies giving a reason as to why the GMFBridge sink filter may be rejecting the connection.
It looks as though the parser is insisting on using its own allocator -- this is common with parsers where the output samples are merely pointers into the input samples. However, the bridge cannot implement suspend mode without using its own allocator, so a copy is required.
With this information, I decided to create an ultra simple CTransformFilter filter that simply accepts the allocator offered by the bridge and copies to the output sample whatever comes in from the input sample. I am not 100% sure that what I did was right, but it is working now. I could successfully plug-in the Euresys card as part of my capture infrastructure.
For reference, if anyone experiences something similar, here is the code of the filter I created:
class SampleCopyGeneratorFilter : public CTransformFilter {
protected:
HRESULT CheckInputType(const CMediaType* mtIn) override { return S_OK; }
HRESULT GetMediaType(int iPosition, CMediaType* pMediaType) override;
HRESULT CheckTransform(const CMediaType *mtIn, const CMediaType *mtOut) override { return S_OK; }
HRESULT DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pProp) override;
HRESULT Transform(IMediaSample *pSource, IMediaSample *pDest) override;
public:
SampleCopyGeneratorFilter();
};
//--------------------------------------------------------------------------------------------------------------------
SampleCopyGeneratorFilter::SampleCopyGeneratorFilter()
: CTransformFilter(NAME("SampleCopyGeneratorFilter"), NULL, GUID_NULL)
{
}
HRESULT SampleCopyGeneratorFilter::GetMediaType(int iPosition, CMediaType* pMediaType) {
HRESULT hRes;
ASSERT(m_pInput->IsConnected());
if( iPosition < 0 )
return E_INVALIDARG;
CComPtr<IPin> connectedTo;
COM_CALL(m_pInput->ConnectedTo(&connectedTo));
CComPtr<IEnumMediaTypes> pMTEnumerator;
COM_CALL(connectedTo->EnumMediaTypes(&pMTEnumerator));
AM_MEDIA_TYPE* pIteratedMediaType;
for( int i = 0; i <= iPosition; i++ ) {
if( pMTEnumerator->Next(1, &pIteratedMediaType, NULL) != S_OK )
return VFW_S_NO_MORE_ITEMS;
if( i == iPosition )
*pMediaType = *pIteratedMediaType;
DeleteMediaType(pIteratedMediaType);
}
return S_OK;
}
HRESULT SampleCopyGeneratorFilter::DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pProp) {
HRESULT hRes;
AM_MEDIA_TYPE mt;
COM_CALL(m_pInput->ConnectionMediaType(&mt));
try {
BITMAPINFOHEADER* pBMI = HEADER(mt.pbFormat);
pProp->cbBuffer = DIBSIZE(*pBMI); // format is compressed, uncompressed size should be enough
if( !pProp->cbAlign )
pProp->cbAlign = 1;
pProp->cbPrefix = 0;
pProp->cBuffers = 4;
ALLOCATOR_PROPERTIES actualProperties;
COM_CALL(pAlloc->SetProperties(pProp, &actualProperties));
if( pProp->cbBuffer > actualProperties.cbBuffer )
return E_FAIL;
return S_OK;
} finally{
FreeMediaType(mt);
}
}
HRESULT SampleCopyGeneratorFilter::Transform(IMediaSample *pSource, IMediaSample *pDest) {
HRESULT hRes;
BYTE* pBufferIn;
BYTE* pBufferOut;
long destSize = pDest->GetSize();
long dataLen = pSource->GetActualDataLength();
if( dataLen > destSize )
return VFW_E_BUFFER_OVERFLOW;
COM_CALL(pSource->GetPointer(&pBufferIn));
COM_CALL(pDest->GetPointer(&pBufferOut));
memcpy(pBufferOut, pBufferIn, dataLen);
pDest->SetActualDataLength(dataLen);
pDest->SetSyncPoint(pSource->IsSyncPoint() == S_OK);
return S_OK;
}
Here is how I inserted the filter in the capture graph:
CComPtr<IPin> pAACEncoderOutPin(FilterTools::GetPin(pAACEncoder, "XForm Out"));
CComPtr<IPin> pVideoSourceCompressedOutPin(FilterTools::GetPin(pVideoSource, "Encoded"));
CComPtr<IBaseFilter> pSampleCopier;
pSampleCopier = new SampleCopyGeneratorFilter();
COM_CALL(pGraph->AddFilter(pSampleCopier, L"SampleCopier"));
CComPtr<IPin> pSampleCopierInPin(FilterTools::GetPin(pSampleCopier, "XForm In"));
COM_CALL(pGraph->ConnectDirect(pVideoSourceCompressedOutPin, pSampleCopierInPin, NULL));
CComPtr<IPin> pSampleCopierOutPin(FilterTools::GetPin(pSampleCopier, "XForm Out"));
CreateGraphBridge(
IntPtr(pGraph),
IntPtr(pSampleCopierOutPin),
IntPtr(pAACEncoderOutPin)
);
Now, I still have no idea why inserting the sample grabber instead did not work and resulted in detached graphs. I corroborated this quirk by examining the graphs with Graphedit Plus too. If anyone can offer me an explanation, I would be very grateful indeed.

Using a Filter Audio Unit Effect in iOS5

I'm trying to use a remote IO connection and route the audio input through the built-in filter effect (iOS 5 only) and then back out to the hardware. I can make it route straight from the input to the output, but I can't get the filter to work. I'm not sure whether it's the filter audio unit or the routing that I've got wrong.
This bit is just my attempt at setting up the filter and changing the routing so that the data is processed by it.
Any help is appreciated.
// ******* BEGIN FILTER ********
NSLog(#"Begin filter");
// Creates Audio Component Description - Output Filter
AudioComponentDescription filterCompDesc;
filterCompDesc .componentType = kAudioUnitType_Effect;
filterCompDesc.componentSubType = kAudioUnitSubType_LowPassFilter;
filterCompDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
filterCompDesc.componentFlags = 1;
filterCompDesc.componentFlagsMask = 1;
// Create Filter Unit
AudioUnit lpFilterUnit;
AudioComponent filterComponent = AudioComponentFindNext(NULL, &filterCompDesc);
setupErr = AudioComponentInstanceNew(filterComponent, &lpFilterUnit);
NSAssert(setupErr == noErr, @"No instance of filter");
AudioUnitElement bus2 = 2;
setupErr = AudioUnitSetProperty(lpFilterUnit, kAudioUnitSubType_LowPassFilter, kAudioUnitScope_Output, bus2, &oneFlag, sizeof(oneFlag));
AudioUnitElement bus3 = 3;
setupErr = AudioUnitSetProperty(lpFilterUnit, kAudioUnitSubType_LowPassFilter, kAudioUnitScope_Input, bus3, &oneFlag, sizeof(oneFlag));
// ******** END FILTER ******** //
AudioUnitConnection hardInToLP;
hardInToLP.sourceAudioUnit = remoteIOunit;
hardInToLP.sourceOutputNumber = 1;
hardInToLP.destInputNumber = 3;
setupErr = AudioUnitSetProperty (
remoteIOunit, // connection destination
kAudioUnitProperty_MakeConnection, // property key
kAudioUnitScope_Input, // destination scope
bus3, // destination element
&hardInToLP, // connection definition
sizeof (hardInToLP)
);
AudioUnitConnection LPToHardOut;
LPToHardOut.sourceAudioUnit = lpFilterUnit;
LPToHardOut.sourceOutputNumber = 1;
LPToHardOut.destInputNumber = 3;
setupErr = AudioUnitSetProperty (
remoteIOunit, // connection destination
kAudioUnitProperty_MakeConnection, // property key
kAudioUnitScope_Input, // destination scope
bus3, // destination element
&hardInToLP, // connection definition
sizeof (hardInToLP)
);
/*
// Sets up the Audio Units Connection - new instance called connection
AudioUnitConnection connection;
// Connect Audio Input's out to Audio Out's in
connection.sourceAudioUnit = remoteIOunit;
connection.sourceOutputNumber = bus1;
connection.destInputNumber = bus0;
setupErr = AudioUnitSetProperty(remoteIOunit, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, bus0, &connection, sizeof(connection));
*/
NSAssert(setupErr == noErr, @"No RIO connection");
A couple things going on here:
You're gonna help yourself a lot if you do an assert (or some sort of check-error-and-log-it) after every call that can return an OSStatus. That way you'll figure out how far you're getting. Probably also want to log the actual OSStatus value when it's != noErr, and then look it up (start in "Audio Unit Component Services Reference" in Xcode documentation viewer).
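Something like the usual Core Audio check-and-log helper covers that (a sketch; the 4-char-code unpacking is a common idiom, not an Apple API):
#include <ctype.h>
#include <stdio.h>
#include <CoreFoundation/CoreFoundation.h>

static void CheckError(OSStatus error, const char *operation)
{
    if (error == noErr) return;
    char str[20];
    // Many Core Audio errors are four printable characters packed into the OSStatus
    *(UInt32 *)(str + 1) = CFSwapInt32HostToBig((UInt32)error);
    if (isprint(str[1]) && isprint(str[2]) && isprint(str[3]) && isprint(str[4])) {
        str[0] = str[5] = '\'';
        str[6] = '\0';
    } else {
        sprintf(str, "%d", (int)error);   // otherwise print it as a plain integer
    }
    fprintf(stderr, "Error: %s (%s)\n", operation, str);
}
// e.g. CheckError(AudioComponentInstanceNew(filterComponent, &lpFilterUnit), "create filter unit");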
After you create the filter AudioUnit, I don't get what you're doing with the AudioUnitSetProperty() calls. The second parameter should be the name of a property (something that starts with kAudioUnitProperty...). That's almost certainly returning an error right there.
remoteIOunit only has two buses, and they have special meanings: bus 1 is input from the mic, bus 0 is output to the hardware. Trying to connect to remoteIO's input scope on bus 3 is probably going to be another error.
Suggest you roll back to when you had audio pass-through working. That would mean you had just remoteIO, and a connection from output scope / bus 1 to input scope / bus 0.
Then create the filter unit. Change your connections so you connect the following (sketched in code just below):
remoteIO output scope bus 1 to filter input scope bus 0
filter output scope bus 0 to remoteIO input scope bus 0
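Roughly, those two connections look like this (a sketch with error checks omitted; lpFilterUnit and remoteIOunit are assumed to already exist):
// remoteIO output scope, bus 1 (mic) --> filter input scope, bus 0
AudioUnitConnection micToFilter;
micToFilter.sourceAudioUnit    = remoteIOunit;
micToFilter.sourceOutputNumber = 1;            // remoteIO bus 1 = input from hardware
micToFilter.destInputNumber    = 0;
setupErr = AudioUnitSetProperty(lpFilterUnit,  // the destination unit receives the connection
                                kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input,
                                0,              // filter input bus 0
                                &micToFilter, sizeof(micToFilter));

// filter output scope, bus 0 --> remoteIO input scope, bus 0 (hardware output)
AudioUnitConnection filterToSpeaker;
filterToSpeaker.sourceAudioUnit    = lpFilterUnit;
filterToSpeaker.sourceOutputNumber = 0;
filterToSpeaker.destInputNumber    = 0;
setupErr = AudioUnitSetProperty(remoteIOunit,
                                kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input,
                                0,              // remoteIO bus 0 = output to hardware
                                &filterToSpeaker, sizeof(filterToSpeaker));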
The other thing that's going to be a problem is that all these iOS 5 filters seem to want to use floating-point LPCM formats, which is not the canonical format your other units will default to. You may have to get the stream format from the filter unit (input or output scope are probably the same?) and then set that as the format that remoteIO output scope / bus 1 produces and remoteIO input scope / bus 0 accepts. Another option would be to introduce AUConverter units before and after the filter unit.
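For the format problem, the idea would be something like this (again a sketch, not tested: read the filter's stream format and impose it on remoteIO's client-facing buses):
AudioStreamBasicDescription filterFormat;
UInt32 size = sizeof(filterFormat);
setupErr = AudioUnitGetProperty(lpFilterUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &filterFormat, &size);

// Make remoteIO produce (output scope, bus 1) and accept (input scope, bus 0)
// the same floating-point format the filter wants.
setupErr = AudioUnitSetProperty(remoteIOunit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output, 1, &filterFormat, sizeof(filterFormat));
setupErr = AudioUnitSetProperty(remoteIOunit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &filterFormat, sizeof(filterFormat));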
The first answer given here just saved me a lot more frustration. Nowhere does the Apple documentation tell you that the stream formats for the Effect units require floating point. I couldn't figure out why it kept failing to play my audio properly until I read this post. I followed the advice above, retrieved the stream format from the low-pass filter unit, and used that to set up the two converter units that I created (i.e. I set the output format of the pre-filter converter and the input format of the post-filter converter). Once I did that and connected all the nodes together, it started working as expected.
I'm trying to use a low-pass filter, and when trying to do as suggested (i.e. set the format) I keep getting the error "the operation could not be completed". What in this code is faulty?
After retrieving the lowpassUnit I also check for errors, but there are none.
result = AudioUnitSetProperty(lowpassUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &stereoStreamFormat, sizeof (stereoStreamFormat));
if (noErr != result)
{
NSLog(#"%#", [NSError errorWithDomain:NSOSStatusErrorDomain code:result userInfo:nil]);
return;
}
PS: If anyone knows of proper Audio Unit documentation, please share, as the official documentation is really lacking.

DirectShow's PushSource filters cause IMediaControl::Run to return S_FALSE

I'm messing around with the PushSource sample filter shipped with the DirectShow SDK and I'm having the following problem:
When I call IMediaControl::Run(), it returns S_FALSE which means "the graph is preparing to run, but some filters have not completed the transition to a running state". MSDN suggests to then call IMediaControl::GetState() and wait for the transition to finish.
And so, I call IMediaControl::GetState(INFINITE, ...) which is supposed to solve the problem.
However, to the contrary, it returns VFW_S_STATE_INTERMEDIATE even though I've specified an infinite waiting time.
I've tried all three variations (Bitmap, Bitmap Set and Desktop) and they all behave the same way, which initially led me to believe there is a bug in there somewhere.
However, then, I tried using IFilterGraph::AddSourceFilter to do the same and it did the same thing, which must mean it's my rendering code that is the problem:
CoInitialize(0);
IGraphBuilder *graph = 0;
assert(S_OK == CoCreateInstance(CLSID_FilterGraph, 0, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void**)&graph));
IBaseFilter *pushSource = 0;
graph->AddSourceFilter(L"sample.bmp", L"Source", &pushSource);
IPin *srcOut = 0;
assert(S_OK == GetPin(pushSource, PINDIR_OUTPUT, &srcOut));
graph->Render(srcOut);
IMediaControl *c = 0;
IMediaEvent *pEvent;
assert(S_OK == graph->QueryInterface(IID_IMediaControl, (void**)&c));
assert(S_OK == graph->QueryInterface(IID_IMediaEvent, (void**)&pEvent));
HRESULT hr = c->Run();
if(hr != S_OK)
{
if(hr == S_FALSE)
{
OAFilterState state;
hr = c->GetState(INFINITE, &state);
assert(hr == S_OK );
}
}
long code;
assert(S_OK == pEvent->WaitForCompletion(INFINITE, &code));
Anyone knows how to fix this?
IBaseFilter *pushSource = 0;
graph->AddSourceFilter(L"sample.bmp", L"Source", &pushSource);
AddSourceFilter adds a default source filter; I don't think it will add your PushSource sample filter.
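To get the sample filter itself into the graph, you would create it by its CLSID and add it explicitly, then render its output pin, using the same GetPin helper as above. A rough sketch (CLSID_PushSourceBitmap is whatever name the sample's GUID header defines; check its source, and the filter must be registered for CoCreateInstance to find it):
IBaseFilter *pushSource = 0;
HRESULT hr = CoCreateInstance(CLSID_PushSourceBitmap, 0, CLSCTX_INPROC_SERVER,
                              IID_IBaseFilter, (void**)&pushSource);
if (SUCCEEDED(hr))
{
    graph->AddFilter(pushSource, L"PushSource");   // add the filter explicitly
    IPin *srcOut = 0;
    if (SUCCEEDED(GetPin(pushSource, PINDIR_OUTPUT, &srcOut)))
        graph->Render(srcOut);                     // then render from its output pin
}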
I would recommend to add the graph to the ROT, so you can inspect it with graphedit.
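For reference, a sketch of the usual ROT helper (essentially the one from the DirectShow docs); call it once after building the graph, keep the returned cookie, and revoke it when you are done:
#include <dshow.h>
#include <strsafe.h>

HRESULT AddToRot(IUnknown *pUnkGraph, DWORD *pdwRegister)
{
    IMoniker *pMoniker = NULL;
    IRunningObjectTable *pROT = NULL;
    if (FAILED(GetRunningObjectTable(0, &pROT)))
        return E_FAIL;

    // GraphEdit looks for items named "FilterGraph %08x pid %08x"
    WCHAR wsz[256];
    StringCchPrintfW(wsz, 256, L"FilterGraph %08x pid %08x",
                     (DWORD_PTR)pUnkGraph, GetCurrentProcessId());

    HRESULT hr = CreateItemMoniker(L"!", wsz, &pMoniker);
    if (SUCCEEDED(hr))
    {
        hr = pROT->Register(ROTFLAGS_REGISTRATIONKEEPSALIVE, pUnkGraph,
                            pMoniker, pdwRegister);
        pMoniker->Release();
    }
    pROT->Release();
    return hr;
}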
And what happens if you don't call GetState()?
hr = pMediaControl->Run();
if(FAILED(hr)) {
/// handle error
}
long evCode=0;
while (evCode == 0)
{
pEvent->WaitForCompletion(1000, &evCode);
/// other code
}
Open GraphEditPlus, add your filter, render its pin and press Run. Then you'll see the state of each filter separately, so you'll see which filter didn't run and why.