How to find out if a video is currently playing in full screen on OS X - objective-c

We have an application for Mac OS X that needs to know when the user is watching a movie in full screen to change its behavior.
Are there any programmatic system "hooks" that allow a native Objective-C application to know when full-screen playback has started?

You can get a list of all windows by using the CGWindow API, as in the Son of Grab sample.
From there, you can look at the window levels to figure out which windows are full screen, but I am not aware of any way to detect video playback specifically, since different apps (VLC, QuickTime Player, etc.) each use slightly different methods. Of course, you could hard-code specific process names and assume that they are playing video whenever they have a full-screen window.
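A minimal sketch of that process-name heuristic, assuming a single main display; the player names are purely illustrative:

    #import <Cocoa/Cocoa.h>

    // Returns YES when a known player process owns an on-screen window that
    // covers the whole main screen. Compare sizes only: CGWindow bounds use
    // a flipped coordinate space, so origins may not match NSScreen's.
    static BOOL IsVideoLikelyFullScreen(void) {
        NSSet *players = [NSSet setWithObjects:@"VLC", @"QuickTime Player", nil];
        CGSize screenSize = [[NSScreen mainScreen] frame].size;
        NSArray *windows = CFBridgingRelease(CGWindowListCopyWindowInfo(
            kCGWindowListOptionOnScreenOnly, kCGNullWindowID));
        for (NSDictionary *info in windows) {
            CGRect bounds;
            CGRectMakeWithDictionaryRepresentation(
                (__bridge CFDictionaryRef)info[(id)kCGWindowBounds], &bounds);
            if ([players containsObject:info[(id)kCGWindowOwnerName]] &&
                CGSizeEqualToSize(bounds.size, screenSize)) {
                return YES;
            }
        }
        return NO;
    }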

Related

Agora Live Stream Dual Camera

I am currently trying to produce an Android app that can do live broadcasting. Does Agora have the functionality to access both the rear and front cameras of the broadcaster at the same time? If yes, which part of the code (based on Open-Live-Android) do we need to modify?
Agora does not offer a demo that directly shows the code you are looking for, but if you can get frames from both cameras (which some devices may not support), you can take a look at this demo app: https://github.com/AgoraIO/Advanced-Video/tree/dev/win-screenshare/Screensharing/Agora-Screen-Sharing-Android. In this demo app, the SDK sends the camera view and the screen-share view at the same time. To achieve that, the screen share runs as a standalone service. Following similar logic, you can change the screen-sharing part to the second camera view.

How can I detect/observe when third party app launches a full screen process?

I would like to build a helper app for gamers, and to add some extra functionality I would like to observe/time certain third-party games' behavior, more specifically when a game actually launches its full-screen process.
For example: my app is a system-tray app, and the game has a "launcher" app with lobby and menu screens. Once the game launches the extra process, OS X will usually (optionally) switch resolutions, and my app would be notified somehow. Then I can start a timer. Once the game match finishes, either because the process closes or because the game is no longer full screen, my app gets a second notification and I can stop the timer.
Are there official Apple APIs that provide any way to observe/poll for an app going full screen and/or launching additional windows that I can reliably assume to be the actual game screen?
I doubt you're going to find a completely comprehensive solution. There are many ways for apps to achieve a full-screen experience and most don't provide a notification about that fact.
A full-screen app can modify the presentationOptions of NSApplication to hide the Dock and menu bar. Another app can use key-value observing to monitor its application object's currentSystemPresentationOptions property, which will reflect the current system status.
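For example, a minimal KVO sketch of that approach (the method name and the Dock/menu-bar heuristic are illustrative):

    // Call once, e.g. from applicationDidFinishLaunching:.
    - (void)startWatchingPresentationOptions {
        [NSApp addObserver:self
                forKeyPath:@"currentSystemPresentationOptions"
                   options:NSKeyValueObservingOptionNew
                   context:NULL];
    }

    - (void)observeValueForKeyPath:(NSString *)keyPath
                          ofObject:(id)object
                            change:(NSDictionary *)change
                           context:(void *)context {
        NSApplicationPresentationOptions opts =
            [change[NSKeyValueChangeNewKey] unsignedIntegerValue];
        // A hidden (or auto-hidden) Dock and menu bar suggest that the
        // frontmost app has gone full screen.
        BOOL fullScreenLikely =
            (opts & (NSApplicationPresentationHideDock |
                     NSApplicationPresentationAutoHideDock)) != 0 &&
            (opts & (NSApplicationPresentationHideMenuBar |
                     NSApplicationPresentationAutoHideMenuBar)) != 0;
        NSLog(@"Full screen likely: %d", fullScreenLikely);
    }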
A full-screen app can capture the displays (although Apple discourages this technique). You can try to detect this by calling CGDisplayIsCaptured(), although it's been deprecated since 10.9 with no replacement. It may be possible that, if you register a callback with CGDisplayRegisterReconfigurationCallback(), you'll get called when something captures the display. However, capturing the display is sort of about preventing other processes from noticing such changes, so maybe not. In that case, you'd have to poll. You might also poll for the current display mode; changing the mode is the primary reason why a game would capture the display in the first place.
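A sketch of that callback route; as noted above, whether a capture actually fires the callback is an open question you would have to test:

    #import <Cocoa/Cocoa.h>

    static void DisplayReconfigured(CGDirectDisplayID display,
                                    CGDisplayChangeSummaryFlags flags,
                                    void *userInfo) {
        // A mode switch is the usual side effect of a game taking over a display.
        if (flags & kCGDisplaySetModeFlag) {
    #pragma clang diagnostic push
    #pragma clang diagnostic ignored "-Wdeprecated-declarations"
            // Deprecated since 10.9, but still the direct way to ask.
            BOOL captured = CGDisplayIsCaptured(display) ? YES : NO;
    #pragma clang diagnostic pop
            NSLog(@"Display %u changed mode; captured: %d", display, captured);
        }
    }

    // During app startup:
    CGDisplayRegisterReconfigurationCallback(DisplayReconfigured, NULL);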
A full-screen game could also just create a borderless window the size of the screen and set its window level to be in front of the Dock and menu bar (and other apps' windows). There's not really a notification about this. You could detect it using the CGWindowList API, but you would have to poll. For example, you could call CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID) and iterate through the dictionaries looking for one the size of the screen and at a window level above kCGStatusWindowLevel.
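A polling sketch of that window-list check (the one-second interval is arbitrary, and the block-based timer API requires 10.12+):

    #import <Cocoa/Cocoa.h>

    // YES when some on-screen window covers the main screen at a level
    // above the status window level (i.e., in front of the menu bar).
    static BOOL FullScreenGameWindowExists(void) {
        CGSize screenSize = [[NSScreen mainScreen] frame].size;
        NSArray *windows = CFBridgingRelease(CGWindowListCopyWindowInfo(
            kCGWindowListOptionOnScreenOnly, kCGNullWindowID));
        for (NSDictionary *info in windows) {
            CGRect bounds;
            CGRectMakeWithDictionaryRepresentation(
                (__bridge CFDictionaryRef)info[(id)kCGWindowBounds], &bounds);
            if ([info[(id)kCGWindowLayer] intValue] > kCGStatusWindowLevel &&
                CGSizeEqualToSize(bounds.size, screenSize)) {
                return YES;
            }
        }
        return NO;
    }

    // Poll once a second:
    [NSTimer scheduledTimerWithTimeInterval:1.0 repeats:YES block:^(NSTimer *t) {
        NSLog(@"Full-screen game window: %d", FullScreenGameWindowExists());
    }];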
(You might be able to use the Accessibility API to get a notification when the frontmost window changes, so you'd only have to poll when that happens.)
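A sketch of that idea, assuming your helper has been granted Accessibility trust (check with AXIsProcessTrusted()); the function names and log text are illustrative:

    #import <Cocoa/Cocoa.h>

    static void FocusChanged(AXObserverRef observer, AXUIElementRef element,
                             CFStringRef notification, void *refcon) {
        // Re-run your CGWindowList check here instead of on a fixed timer.
        NSLog(@"Frontmost window changed; re-check the window list.");
    }

    // Watch one app (e.g., the game's PID) for focused-window changes.
    static void WatchAppWindows(pid_t pid) {
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, FocusChanged, &observer) == kAXErrorSuccess) {
            AXUIElementRef app = AXUIElementCreateApplication(pid);
            AXObserverAddNotification(observer, app,
                                      kAXFocusedWindowChangedNotification, NULL);
            CFRunLoopAddSource(CFRunLoopGetCurrent(),
                               AXObserverGetRunLoopSource(observer),
                               kCFRunLoopDefaultMode);
            CFRelease(app);
        }
    }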
You cannot observe a notification if there is none. So first you need to know whether the app you want to observe actually sends an observable notification. You cannot "hook" into other apps without their deliberate consent.

Playing a muted video without interrupting background music

My design calls for a video playing in the background of my login screen, exactly like 6snap has.
I would like to avoid the default behavior of stopping the user's music when the video starts to play. My video does not have sound.
I'm using:
<MediaElement Source="MyVideo.mp4" />
I tried setting IsMuted="True", which didn't help. Does anyone have an idea how 6snap managed it?
Edit: I'm currently trying the animated-GIF route, using the ImageTools third-party library with my converted MP4, and it works fine. However, my 9-second 640x1136 3 MB video became a 41 MB GIF, so I have to reduce the quality drastically. Still trying to find a better way if possible.
You won't be able to do that with Background Audio and MediaElement, because, as MSDN says:
When a MediaElement control plays audio or video content, any background sounds or media already playing are halted. The app launches the playback experience when the user taps the control. Only one MediaElement control can operate at a time.
It doesn't matter that your video has no sound: when you start playback, all background sounds/media are halted.
I'm not sure how the app you mentioned achieved that, but maybe you can try DirectX/XNA - though I've not tried this and don't know whether it would help.

Camera lens overlay - WindowsRT

I'm creating an app where I would like the user to be able to take their own photos. However, I'd like to apply an overlay showing where the face should be in the picture (in order for the app to work best).
So when the camera is launched from the app, I'd like a faint outline to be visible on the screen. That way the user can line up their face inside the outline.
Where would I even look to see how that is done?
Start with MSDN or with the Nokia Developer's library
Nokia just released an Imaging SDK for Windows Phone 8 which might be useful.

Is it possible to capture touch events in the background on a jailbroken iOS device?

I have an installation project in mind that involves a hacked iPad - I'd like to have a background process running that records all touch events, regardless of which app is running in the foreground, and sends them out via OSC.
Note that this is using a jailbroken iPad with root access, and users will be alerted about not entering any sensitive data. But I'm not an iOS developer so I'm not sure if this is even possible. I'd appreciate any kind of input/suggestions.
[edit] Since someone questioned my motive behind this question, I'll try to explain a bit. To be specific, I'd like to build a mechanical system with an Arduino that emulates the user's touch input on the iPad, but I do not want to limit them to using an app that does nothing else but record touch events.
There are three options:
Use the IOHIDFamily subsystem to capture all the touch events; see the sketch after this list. This will do most of the processing for you: the only things you'll need to do are fetch the events using a HID client, get their types, and, for touch events, read their position, radius, and whatever else you need.
Use the MultitouchSupport framework. This way you will have to process the digitizer data frames manually, which is tricky.
Use a MobileSubstrate hook to hook the already existing HID client inside SpringBoard.
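A heavily hedged sketch of the first option. IOHIDFamily's event system is a private API, so every declaration and constant below comes from community-dumped headers and may differ between iOS versions; treat all of it as an assumption to verify on your firmware.

    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    // Private types and functions, re-declared from dumped IOHIDFamily headers.
    typedef struct __IOHIDEventSystem *IOHIDEventSystemRef;
    typedef struct __IOHIDEvent *IOHIDEventRef;
    typedef struct __IOHIDService *IOHIDServiceRef;
    typedef void (*IOHIDEventSystemCallback)(void *target, void *refcon,
                                             IOHIDServiceRef service,
                                             IOHIDEventRef event);
    extern IOHIDEventSystemRef IOHIDEventSystemCreate(CFAllocatorRef allocator);
    extern Boolean IOHIDEventSystemOpen(IOHIDEventSystemRef system,
                                        IOHIDEventSystemCallback callback,
                                        void *target, void *refcon, void *unused);
    extern int IOHIDEventGetType(IOHIDEventRef event);
    extern float IOHIDEventGetFloatValue(IOHIDEventRef event, int field);

    // Constants from the dumped headers (assumptions; verify on your firmware).
    #define kIOHIDEventTypeDigitizer   11
    #define kIOHIDEventFieldDigitizerX ((kIOHIDEventTypeDigitizer << 16) | 0)
    #define kIOHIDEventFieldDigitizerY ((kIOHIDEventTypeDigitizer << 16) | 1)

    static void HandleEvent(void *target, void *refcon,
                            IOHIDServiceRef service, IOHIDEventRef event) {
        if (IOHIDEventGetType(event) == kIOHIDEventTypeDigitizer) {
            float x = IOHIDEventGetFloatValue(event, kIOHIDEventFieldDigitizerX);
            float y = IOHIDEventGetFloatValue(event, kIOHIDEventFieldDigitizerY);
            // Normalized coordinates; forward them over OSC here.
            printf("touch at %.3f, %.3f\n", x, y);
        }
    }

    int main(void) {
        IOHIDEventSystemRef system = IOHIDEventSystemCreate(kCFAllocatorDefault);
        IOHIDEventSystemOpen(system, HandleEvent, NULL, NULL, NULL);
        CFRunLoopRun(); // keep the background process alive to receive events
        return 0;
    }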