I received this request: https://github.com/Simbul/baker/issues/291
So I tried to find out whether the UIWebView fires any event on the Objective-C side when a video enters and exits fullscreen, so that I can disable and then re-enable the orientation lock.
I'm aware this is possible in JavaScript:
var player = document.getElementsByTagName("video")[0];
player.addEventListener('play', videoPlayHandler, false);
player.addEventListener('pause', videoPauseHandler, false);
but I would like to avoid injecting code into the page to detect it, since Baker is a framework and we allow arbitrary content to run inside it.
Thanks in advance. :)
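Edit: one workaround I'm considering that stays on the native side (an untested sketch using public UIKit notifications, not an official fullscreen-video event) is to watch for the extra UIWindow that the fullscreen player is presented in:

// Untested sketch: the fullscreen video player lives in its own UIWindow,
// so window visibility notifications can act as a proxy for
// "entered/exited fullscreen".
- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(windowDidBecomeVisible:)
                                                 name:UIWindowDidBecomeVisibleNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(windowDidBecomeHidden:)
                                                 name:UIWindowDidBecomeHiddenNotification
                                               object:nil];
}

- (void)windowDidBecomeVisible:(NSNotification *)note {
    if (note.object != self.view.window) {
        // Another window appeared on top of ours: most likely the
        // fullscreen video player. Unlock the orientation here.
    }
}

- (void)windowDidBecomeHidden:(NSNotification *)note {
    if (note.object != self.view.window) {
        // The extra window went away: restore the orientation lock here.
    }
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

Note that alerts and other system windows can also trigger these notifications, so this is a heuristic rather than a precise fullscreen signal.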
Related
I am learning app development, so your guidance is very important to me. I have a main activity and two fragments: one for the list items and another for "now playing". When a list item is clicked, it passes the URL to the second (now playing) fragment, which starts playing the music. Everything works, but I want to do this in a background/foreground service with notification controls, and I want to use ExoPlayer, or whichever player is better for streaming different types of audio. Any help with achieving this would be appreciated. Thank you.
My design calls for a video playing in the background of my login screen, exactly like 6snap has.
I would like to avoid the default behavior of stopping the user's music when the video starts to play. My video does not have sound.
I'm using:
<MediaElement Source="MyVideo.mp4" />
I tried setting IsMuted="True", which didn't help. Does anyone have an idea how 6snap managed it?
Edit: I'm currently trying the animated GIF route. Using the third-party ImageTools library after converting my MP4, it works fine, but my 9-second 640x1136 3 MB video became a 41 MB GIF, so I have to reduce the quality drastically. Still trying to find a better way if possible.
You won't be able to do that with Background Audio and MediaElement, because, as MSDN says:
When a MediaElement control plays audio or video content, any background sounds or media already playing are halted. The app launches the playback experience when the user taps the control. Only one MediaElement control can operate at a time.
It doesn't matter that you have no sound: when you start to play, all background sounds/media are halted.
I'm not sure how the app you mentioned achieved that, but maybe you can try DirectX/XNA, though I've not tried this myself and don't know whether it would help.
Before iOS7 came out, we noticed an issue:
Music remote controls from the earbuds or the springboard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app code become the first responder to remote-control events, do our work, and then pass the events on to the Music app. However, we found that the events were consumed by the first responder, with no way to push them back down the responder chain.
We tried to become first responder and block remote-control events altogether while we are in solo-ambient (sketched below). This worked fine on iOS6, and still works for earbud controls on iOS7, but it fails with iOS7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent: completely, which is where we put our blocking code.
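For reference, the blocking approach was along these lines (a minimal sketch of our attempt, not a recommended pattern; as noted, iOS7's Control Center bypasses it):

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Ask the system to route remote-control events to the responder chain.
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES; // UIResponder returns NO by default
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Swallow the event: do nothing and don't forward it.
        // Worked on iOS6, and for earbud controls on iOS7, but iOS7's
        // Control Center never calls this handler.
    }
}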
I read something elsewhere that:
You can't block the music app. Your app can become one, though (Apple
won't like that), and then Control Center would control yours.
But I found no documentation whatsoever about Control Center.
And, as said above, Control Center does not go through the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote-control event handling is so your app can be controlled by
Control Center, the earbuds, etc. It is not so that your app can eat
said controls, preventing control of other apps from those sources. It
only worked in iOS6 because of a bug in iOS, now fixed in iOS7.
Does this mean that what we were using relied on that bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with the remote controls and Control Center?
Where can we find up-to-date documentation of remote-control and control center?
The remote control was mysteriously fixed after clean-building everything against the iOS7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS6's UIEventSubtypeRemoteControlTogglePlayPause.
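A minimal handler covering both generations of events looks like this (a sketch; play, pause, and togglePlayPause stand in for whatever your player actually exposes):

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause: // iOS6-style toggle
            [self togglePlayPause];
            break;
        case UIEventSubtypeRemoteControlPlay:            // sent by iOS7 Control Center
            [self play];
            break;
        case UIEventSubtypeRemoteControlPause:           // sent by iOS7 Control Center
            [self pause];
            break;
        default:
            break;
    }
}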
I've looked through the Apple documentation but have seen no mention of how to do this, never mind whether it's even possible. I'd like an iPhone/iPad to begin video recording automatically when a certain view is loaded, and to stop and save when the view is dismissed. Is there any way to do this, or am I just going to have to use the normal UI for video recording?
Use AVCaptureSession
https://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html#//apple_ref/occ/cl/AVCaptureSession
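A minimal sketch (untested; error handling, session presets, and the audio input are omitted) that starts recording when the view appears and stops when it disappears, assuming session and movieOutput properties on the view controller and conformance to AVCaptureFileOutputRecordingDelegate:

#import <AVFoundation/AVFoundation.h>

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [self.session addInput:input];

    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self.session addOutput:self.movieOutput];
    [self.session startRunning];

    // Record to a temporary file; move it somewhere permanent in the delegate callback.
    NSURL *url = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
    [self.movieOutput startRecordingToOutputFileURL:url recordingDelegate:self];
}

- (void)viewWillDisappear:(BOOL)animated {
    [self.movieOutput stopRecording]; // delegate callback fires once the file is finalized
    [super viewWillDisappear:animated];
}

// AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    [self.session stopRunning];
    // Save the recorded file here, e.g. to the camera roll or your app's documents.
}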
I've created an animation which runs inside of the Google Earth plugin (browser) and I'd like to somehow encode this animation into a video format that I can upload to YouTube or a related video site. Are there any tools out there to help me do this?
EDIT: more detail
This animation changes depending on user input, so it needs to be scalable. The user would click a "download video" button, after which a server would convert the animation.
You can use FRAPS to record a video of the animation running on your machine.