Detecting whether an external display is activated or not - objective-c

I'd like to be able to detect whether the display of the computer my app is running on is currently active or shut down. I need this for media center software so I know whether I have to activate the display before starting movie playback.
So far I have tried this code:
CGError err0 = CGDisplayNoErr;
CGError err1 = CGDisplayNoErr;
CGDisplayCount dspCount = 0;
err0 = CGGetActiveDisplayList(0, NULL, &dspCount);
CGDisplayCount onlineCount = 0;
err1 = CGGetOnlineDisplayList(0, NULL, &onlineCount);
// Error handling omitted for clarity ;)
NSLog(@"Found %d active and %d online displays", dspCount, onlineCount);
But this code always outputs the same result. When I try it on my Mac mini with the display turned off, I get the following output:
Found 1 active and 1 online displays
The display is not in standby mode, as I disconnect its power when it is not in use. I also tried this on my MacBook, which has an internal and an external display. There it returns:
Found 2 active and 2 online displays
Here it is the same: I deactivate the display and disconnect its power, but it is still reported as active.
The display on the Mac mini is a TV set connected with a DVI-to-HDMI cable. The display on the MacBook is connected with a DVI-to-VGA adapter.
I hope somebody has an idea how to solve this. Thanks in advance.

It sounds like you want to know whether any connected display is asleep or not?
Have you looked at the CGDisplayIsAsleep function?
http://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/Quartz_Services_Ref/Reference/reference.html#//apple_ref/c/func/CGDisplayIsAsleep
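For reference, a minimal sketch of checking each online display with that function might look like this (error handling is kept to a minimum and the fixed-size array is just for brevity):
#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Minimal sketch: list the online displays and log whether each one
// currently reports itself as asleep.
static void LogDisplaySleepState(void)
{
    CGDirectDisplayID displays[8];
    CGDisplayCount displayCount = 0;
    if (CGGetOnlineDisplayList(8, displays, &displayCount) == kCGErrorSuccess) {
        for (CGDisplayCount i = 0; i < displayCount; i++) {
            NSLog(@"Display %u asleep: %d",
                  (unsigned int)displays[i], (int)CGDisplayIsAsleep(displays[i]));
        }
    }
}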

To close this open question: my final finding was that as soon as an external monitor is connected to the computer, the methods above report it as present. This is the case even when the monitor is powered off and disconnected from its power source.
So as far as I can tell there is no way to find out what I would like to know :(
Since I control the event that activates the monitor from my application (in my case it's a TV that I control through a USB-to-IR box), I can track the state of the monitor that way. The only downside is that if the application crashes, I lose the state. But that's the best solution I could find.

Related

Take pictures using camera shutter when receiving live view data

I am developing with the EDSDK.
However, if I press the camera's shutter (the physical button) while receiving live view data (EVF mode), the picture is not taken. Is this normal?
My camera model is 200D II.
What I'm trying to do is as follows, and it's very simple.
My software activates the camera through EDSDK and receives live view data.
The person behind the camera takes a picture by pressing the camera shutter, and my software shows the picture on the screen.
The questions are as follows.
How can I take pictures using the physical camera shutter button while receiving live view data (EVF)?
HDMI connections are not considered because there are features that need to be controlled directly through the EDSDK.
Thank you.
Below is what I added after Johannes Bildstein's answer.
As Johannes Bildstein answered, I inserted the following code to unlock the UI.
But the problem still isn't solved.
if (!MainCamera.IsLiveViewOn) {
    MainCamera.StartLiveView();
    MainCamera.UILock(false);
}
An error occurs when I try to unlock the UI before getting EVF data (and the shutter still doesn't work).
If I unlock the UI after receiving EVF data:
When the dial is in photo mode: EVF data comes in, but the shutter still does not work.
When the dial is in video mode: EVF data does not come in because of a BUSY error. Is this a conflict caused by unlocking the UI? We have checked your answer and the SDK documentation and tried many approaches, but the problem is still unresolved. We are currently testing with the more recent model, the 200D II.
"| EvfOutputDevice.Camera " should be added like below!
public void StartLiveView()
{
    CheckState();
    if (!IsLiveViewOn) SetSetting(PropertyID.Evf_OutputDevice, (int)(EvfOutputDevice.PC | EvfOutputDevice.Camera));
}
The camera "UI" is probably locked. The EDSDK does that automatically when connecting and before doing certain commands. You can unlock it with EdsSendStatusCommand using kEdsCameraStatusCommand_UIUnLock.
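To make that concrete, a rough P/Invoke sketch of that call might look like the following. A wrapper method like the question's UILock(false) presumably maps to the same thing; cameraRef is assumed to be the handle of the already-opened camera session, and the constant value should be double-checked against the EDSDKTypes.h that ships with your SDK version.
using System;
using System.Runtime.InteropServices;

static class EdsdkInterop
{
    // Value as commonly defined in EDSDKTypes.h; verify against your SDK version.
    const uint kEdsCameraStatusCommand_UIUnLock = 0x00000001;

    [DllImport("EDSDK.dll")]
    public static extern uint EdsSendStatusCommand(IntPtr inCameraRef, uint inStatusCommand, int inParam);

    public static uint UnlockUI(IntPtr cameraRef)
    {
        // Returns an EdsError code; 0 (EDS_ERR_OK) means the UI lock was released
        // and the physical shutter button should be accepted again.
        return EdsSendStatusCommand(cameraRef, kEdsCameraStatusCommand_UIUnLock, 0);
    }
}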

Sending Mouse Input to Another Application Without Focus

I'll try to describe my problem and question as clearly as possible,
I am attempting to create a program that would allow you to plug two keyboards and mice into a single computer and play two copies of the same game at the same time by splitting their inputs to the different application windows. I found no good tool for this online, so I'm making my own.
From all my research and testing it seems like SendMessage and PostMessage are my best bet. I've been intercepting WM_INPUT messages and sending them to the desired application and so far this has been working... to an extent.
I have two videos to help you understand the problem I'm experiencing:
The first simply shows my program working as I would expect when sending input to Chrome. As I scroll around and click on the InputMapper window you can see Chrome reacting to the superimposed cursor as if I were scrolling right over it.
Video 1: Nice
The second video shows me doing the exact same thing with a game client, and for some reason it does not quite work. The game simply doesn't respond to the dummy cursor, but it does respond to the real cursor, even though the window technically doesn't have focus (which tells me that PostMessage is working, but the game refuses to acknowledge clicks at the location of the dummy cursor instead of the real cursor). Video 2: Not Quite
So my question is simple: I am using the same code in both scenarios, but the two applications responded differently and it didn't quite work for the game client. Does anyone have any idea why I would be getting this behavior?
It seems like the game is for whatever reason still registering clicks as being at the location of the real cursor and not the location of the L_PARAM that I send it.
To give an example of my code for handling mouse click events, here you go:
private void m_MouseButtonDown(object sender, InputDevice.MouseControlEventArgs e)
{
    if (e.Mouse.deviceHandle != MouseHandle1)
        return;

    uint L_Param_Window = (uint)(cursorPosX + ((int)cursorPosY << 16));
    uint W_Param = 0;

    switch (e.Mouse.buttonMessage)
    {
        case WM_LBUTTONDOWN:
            W_Param = (uint)MouseButtons.LEFT;
            break;
        case WM_MBUTTONDOWN:
            W_Param = (uint)MouseButtons.MIDDLE;
            break;
        case WM_RBUTTONDOWN:
            W_Param = (uint)MouseButtons.RIGHT;
            break;
    }

    // Send Messages
    PostMessage(applicationHandle1, e.Mouse.buttonMessage, (IntPtr)W_Param, (IntPtr)L_Param_Window);
}
Thank you in advance.
Edit: fixed the video links because I noticed I'm a dummy
Edit 2: Updated because I've improved the program a bit and hopefully my problem is more clear
Also: I have a theory that maybe this happens because of how the cursor's image changes in the game client, and some underlying aspect of that is breaking what I'm trying to do.
You can send keyboard or mouse messages to a specific window handle with SendMessage, but the problem is identifying the origin of the message: which device sent it? To distinguish between input devices, you need to use the Raw Input API. That requires some work...
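For what it's worth, here is a minimal sketch of registering a window for raw mouse input; after this call the window receives WM_INPUT messages whose RAWINPUT data carries an hDevice handle identifying the physical mouse (which looks like the same kind of handle as the question's e.Mouse.deviceHandle). The constants and struct layout follow the Win32 Raw Input documentation; the helper class and method names are just for illustration.
using System;
using System.Runtime.InteropServices;

static class RawInputRegistration
{
    [StructLayout(LayoutKind.Sequential)]
    struct RAWINPUTDEVICE
    {
        public ushort usUsagePage;
        public ushort usUsage;
        public uint dwFlags;
        public IntPtr hwndTarget;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool RegisterRawInputDevices(
        RAWINPUTDEVICE[] pRawInputDevices, uint uiNumDevices, uint cbSize);

    // Deliver WM_INPUT even when the registered window is not in the foreground.
    const uint RIDEV_INPUTSINK = 0x00000100;

    public static void RegisterForMouseInput(IntPtr windowHandle)
    {
        var devices = new[]
        {
            new RAWINPUTDEVICE
            {
                usUsagePage = 0x01,          // generic desktop controls
                usUsage = 0x02,              // mouse
                dwFlags = RIDEV_INPUTSINK,
                hwndTarget = windowHandle
            }
        };

        if (!RegisterRawInputDevices(devices, (uint)devices.Length,
                (uint)Marshal.SizeOf(typeof(RAWINPUTDEVICE))))
        {
            throw new InvalidOperationException("RegisterRawInputDevices failed");
        }
    }
}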

Screen record with sound - AVFoundation? Desktop Mac

I am trying to create two things, both for desktop Mac, and both involve recording the screen and/or audio.
The first, which is my main priority right now, is a song identifier. The second is a screen capture (with audio) tool.
I was thinking of using AVFoundation. I don't see any audio recording capabilities though, just playback - https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVAudioPlayerClassReference/index.html#//apple_ref/doc/uid/TP40008067
Is it possible to record system audio somehow?
Thanks
I've used this document in the past to figure out the live screen recording part. https://developer.apple.com/library/mac/qa/qa1740/_index.html
You'll probably also find the code snippet in the AVCaptureSession overview useful.
The gist of it is that AVCaptureSession is the object that controls all your inputs and outputs for a given capture session. In this case the input would be AVCaptureScreenInput, and I believe for audio you want an AVCaptureDeviceInput of type audio. There is a way to get the list of all available devices for an AVCaptureDevice of a specific type. Then you add an AVCaptureMovieFileOutput to your session's outputs.
I know that's a little high level, but that technical Q&A as well as looking into getting particular input types should help.
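Roughly, a sketch of that setup in Objective-C might look like the following. Note that the audio input here is whatever default audio capture device the system reports (typically a microphone), not system audio; capturing system audio usually requires a loopback audio device. Error handling is minimal.
#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Screen input from the main display.
AVCaptureScreenInput *screenInput =
    [[AVCaptureScreenInput alloc] initWithDisplayID:CGMainDisplayID()];
if ([session canAddInput:screenInput]) {
    [session addInput:screenInput];
}

// Audio input from the default audio capture device.
NSError *error = nil;
AVCaptureDevice *audioDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput =
    [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (audioInput && [session canAddInput:audioInput]) {
    [session addInput:audioInput];
}

// Movie file output for the captured video and audio.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}

[session startRunning];

// To write to disk, call -startRecordingToOutputFileURL:recordingDelegate: on
// movieOutput with an object that conforms to AVCaptureFileOutputRecordingDelegate.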

How do I detect that Windows 10 Mobile transitioned to Continuum mode?

Is there a way to detect that Windows 10 Mobile has transitioned into Continuum mode?
The message box on Windows Phone does not look anything like the one on desktop, and our designers want parity. I want to write our own version, but I only want it to be used on the phone - I want the default one on desktop, or when the app transitions to Continuum on the phone.
Any ideas?
I could not find anything on the web nor find any API that allows me to detect it.
I may be wrong, but I don't think there is an API for Continuum. The idea of Continuum for phones is that you go from a fixed display size to something variable. The best way to detect this is to use visual state triggers or to check whether the size of the window has changed.
By also checking that the device family, AnalyticsInfo.VersionInfo.DeviceFamily, is Windows.Mobile, you'll know that you're on a phone device which is currently in Continuum mode.
To detect whether the app is running in Continuum mode you'll need to check two things: the DeviceFamily and the UserInteractionMode.
public static bool IsInContinuum()
{
    if (DeviceFamily() == DeviceFamilyType.Mobile && UIViewSettings.GetForCurrentView().UserInteractionMode == UserInteractionMode.Mouse)
        return true;
    else
        return false;
}
Quote from this post:
"With Continuum, “touch” will always be returned when your app is on the mobile device, and “mouse” will always be returned when your app is on the connected display."
So you'll need to check whether the app is running in Continuum in the SizeChanged event.
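For example, a sketch of wiring that check into the window's SizeChanged event might look like this (IsInContinuum is the helper above; UpdateMessageBoxStyle is a hypothetical method in your app):
// Re-evaluate the Continuum check whenever the window size changes.
Window.Current.SizeChanged += (sender, args) =>
{
    bool inContinuum = IsInContinuum();

    // Hypothetical app-specific reaction: use the custom phone-style
    // message box only when not running in Continuum.
    UpdateMessageBoxStyle(useCustomPhoneStyle: !inContinuum);
};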
According to the MSDN documentation below, there is no specific trigger for detecting the Windows 10 Mobile Continuum feature.
Continuum for Universal Apps
To find a solution for mobile apps, you can rely on adaptive UI: watch for screen resolution changes via Window.Current.SizeChanged and combine that with the device family from AnalyticsInfo.VersionInfo.DeviceFamily to check whether the device is in Continuum mode.

Adobe AIR, detect when a webcam is unplugged?

I have tried checking activity, fps, and status; nothing is triggered when an active camera is unplugged in AS3/AIR. Has anyone found a working way? In my case I have a kiosk running AIR 2.7 with two webcams. In some cases a USB webcam might be unplugged and plugged back in. I have been trying to find a way to detect when it's unplugged so I can restart it. Ideas?
Unfortunately I have no USB camera to test this with (I only have the built-in iSight).
You could try ActivityEvent and set the motion level to some low value.
ActivityEvent fires when motion is detected by the camera. I believe that when the camera is physically disconnected, the activity event should also trigger, since no further activity will be detected.
Here's an example:
import flash.media.Camera;
import flash.display.Stage;
import flash.media.Video;
import flash.events.ActivityEvent;
import flash.events.StatusEvent;

var camera:Camera = Camera.getCamera();
camera.setMode(stage.stageWidth, stage.stageHeight, 25);
camera.addEventListener(ActivityEvent.ACTIVITY, activityEventHandler, false, 0, true);
camera.setMotionLevel(3);

var video:Video = new Video();
video.width = stage.stageWidth;
video.height = stage.stageHeight;
video.attachCamera(camera);
addChild(video);

function activityEventHandler(a:ActivityEvent):void {
    trace('Motion detected: ' + a.activating);
}
Note:
setMotionLevel's default value is 50, so if you set it to e.g. 3 the camera still notices small changes, even eye blinking. That would help you detect whether there is any kind of motion at all; if no motion is detected, the camera is probably dead.
You could even use a motion level of 1, but that value is very sensitive, and even the slightest change in room lighting is probably detected as motion.
Let me know if that helps; it would be interesting to hear how this works in practice with a real USB camera.