iOS: Segmenting a video - Objective-C

I want to segment a video into 9 sections, jumble the sections around, and then play them together. I am currently trying to use AVPlayer to do this. I can get all the sections to load on the simulator, but only the first 4 will load on an actual phone. I guess the resources are topping out.
My question is: is AVPlayer the best framework to use for this? It seems wasteful to create a player item, player, and player layer for each segment (which I think is why the resources are topping out). Or is there a lower-level framework I can use to load one video and display certain segments of it on certain areas of the screen?
Thanks

Related

How to display the screen of one player to all other players in Godot (GDScript)?

I'm using the Godot engine to develop a multiplayer LAN Wi-Fi game. At some point the game will give a player a task to solve; the task is a mini-game that has some random aspects. One player should control and solve this task while the other players just watch, unable to control anything. So I want to know: how can I display exactly what's happening on that player's screen to the rest of the players?
What I would recommend is to record that player's game screen and stream the recording live to the other players, though this is going to take some bandwidth:
(https://github.com/henriquelalves/GodotRecorder)
Also, when sending a screen recording, it is just a matrix array or a pool byte array in Godot, I think.
Another way is to get the player's movement and location and set the camera to that exact location. You could also just use two cameras and a split screen to see the current player and the other player.

Show subtitles with an AVFoundation AVPlayer on OS X

I'm trying to display subtitles when playing video using AVFoundation on OS X.
I've looked through the documentation and I can't find a way to enable a subtitle track. The API contains multiple references to subtitle tracks, which leads me to believe that it's supported.
On iOS the method -[AVPlayerItem selectMediaOption:inMediaSelectionGroup:] is used to enable subtitle tracks. This method isn't available in the 10.7 SDK. Is there another way to show subtitles?
EDIT:
EDIT: QuickTime Player X has subtitle support; for example, on opening this movie the subtitle submenu offers a choice of language, and subtitles are displayed when English is chosen. This leads me to believe that they're included in the API...
I ran into this same issue myself. I found that unfortunately the only way to do it, other than switching to QTKit, is to make a separate subtitle layer (a CATextLayer) and position it appropriately as a sublayer of the player layer. The idea is that you set up a periodic time observer to fire every second or so and update the subtitles, along with (and this is optional) some UI element you might have that shows the elapsed time in the video.
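A minimal sketch of the periodic-observer approach described above. It assumes you already have an AVPlayer (`player`) and its AVPlayerLayer (`playerLayer`); the `-subtitleForTime:` method is hypothetical, standing in for a lookup into your parsed .srt entries:

```objc
// Overlay layer for subtitle text, positioned near the bottom of the video.
CATextLayer *subtitleLayer = [CATextLayer layer];
subtitleLayer.frame = CGRectMake(0.0, playerLayer.bounds.size.height - 60.0,
                                 playerLayer.bounds.size.width, 50.0);
subtitleLayer.alignmentMode = kCAAlignmentCenter;
subtitleLayer.fontSize = 24.0;
[playerLayer addSublayer:subtitleLayer];

// Fire once a second on the main queue and update the subtitle text.
__block typeof(self) blockSelf = self;
[player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                     queue:dispatch_get_main_queue()
                                usingBlock:^(CMTime time) {
    // -subtitleForTime: is assumed to return the NSString for this timestamp
    // (or nil) from the parsed SubRip entries.
    subtitleLayer.string = [blockSelf subtitleForTime:time];
}];
```

The one-second interval matches the granularity suggested above; you can tighten it (e.g. CMTimeMake(1, 4)) if subtitle cues need finer alignment.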
I created a basic SubRip (.srt) file parser class; you can find it here: https://github.com/sstigler/SubRip-for-Mac . Be sure to check the wiki for documentation. The class is available under the terms of the BSD license.
Another challenge you might run into is how to dynamically adjust the height of the CATextLayer to account for varying lengths of subtitles, and varying widths of the containing view (if you choose to make it user-resizable). I found a great CALayoutManager subclass that does this, and made some revisions to it in order to get it to work for what I was trying to do: https://github.com/sstigler/height-for-width .
I hope this helps.
One way to add subtitles is to add an .scc subtitle file as an AVMediaTypeClosedCaption track using an AVMutableCompositionTrack; the player will then control it (I tried this and it worked). Alternatively, you can add a text file (QuickTime text, or any text) as an AVMediaTypeText track and the player will show the subtitles. I don't know which subtitle file format the AVMediaTypeSubtitle track is meant for.
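A sketch of the closed-caption approach just described, assuming `videoAsset` and `captionAsset` are AVAssets you have already loaded (the caption asset containing an .scc-derived closed-caption track):

```objc
AVMutableComposition *composition = [AVMutableComposition composition];

// Copy the source video track into the composition.
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *srcVideo =
    [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:srcVideo
                     atTime:kCMTimeZero
                      error:NULL];

// Add the closed-caption track alongside it; the player handles display.
AVMutableCompositionTrack *ccTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeClosedCaption
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *srcCC =
    [[captionAsset tracksWithMediaType:AVMediaTypeClosedCaption] objectAtIndex:0];
[ccTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, captionAsset.duration)
                 ofTrack:srcCC
                  atTime:kCMTimeZero
                   error:NULL];

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
```

Error handling is elided here; in real code, pass an NSError ** to each insertTimeRange: call and check the return value.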

Is it possible to play multiple clips using presentMoviePlayerViewControllerAnimated?

I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?
Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item after the previous one completes. This, however, will introduce a small gap between the videos caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer used to play a number of items in sequence. See its reference for more.
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back. Note that this works only with locally stored videos, not with remote content (streaming or progressive download). Then use AVPlayer for the playback.
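The AVQueuePlayer option (the second one above) is the least code; a minimal sketch, assuming `url1` and `url2` point at the two clips and `self.view` is the hosting view:

```objc
// Build the two items and queue them up for gapless sequential playback.
AVPlayerItem *first  = [AVPlayerItem playerItemWithURL:url1];
AVPlayerItem *second = [AVPlayerItem playerItemWithURL:url2];
AVQueuePlayer *queuePlayer =
    [AVQueuePlayer queuePlayerWithItems:
        [NSArray arrayWithObjects:first, second, nil]];

// Display the queue player's output in a layer and start playback.
AVPlayerLayer *playerLayer =
    [AVPlayerLayer playerLayerWithPlayer:queuePlayer];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];
[queuePlayer play];
```

Note that AVQueuePlayer has no built-in chrome; unlike MPMoviePlayerViewController you supply your own transport controls.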
It's not possible. If the video assets are in the local file system, consider AVComposition.

iOS app question: video effects

I am trying to piece together a solution to let users take and edit videos in an app. I have seen the 8mm app and am wondering how they did it... and made it so smooth.
At first I was thinking the effects might have been a series of PNGs strung together like an animated GIF and then placed on top of the real video, but then I am at a loss for how to merge the images into the video. Also, the app is so smooth that I think it has to be using some low-level Core Media framework, but I'm not sure.
Any ideas or advice on where to begin?
Thanks
AVFoundation combined with OpenGL ES 2.0 (with shaders) provides great performance for adding effects to camera / video in realtime (and even better with iOS 5, but I can't say too much due to the NDA).
You should probably read most of the AVFoundation documentation to start with, because there is a lot going on. One method that might be of interest is this one:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
which allows you to work directly with blocks of data representing video information coming from the camera. You can then modify this data to change the video information, for example by adding additional content or pictures on top of the video frame. You can use OpenGL ES to do this processing.
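A sketch of what that delegate callback typically looks like: lock the frame's pixel buffer and get at the raw bytes. In practice you would hand the buffer to OpenGL ES (e.g. upload it as a texture) rather than touch pixels on the CPU; the BGRA format is an assumption that holds only if you configured the capture output that way:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Extract the pixel buffer holding this video frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *baseAddress  = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    // baseAddress now points at bytesPerRow * height bytes of pixel data
    // (BGRA, assuming kCVPixelFormatType_32BGRA was set on the output).
    // Process or upload the frame here.

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```

This callback runs on the dispatch queue you passed to -setSampleBufferDelegate:queue:, so any work here must keep up with the frame rate or frames will be dropped.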

Playing Audio and Video of a mp4 file separately using AVFoundation Framework

I have developed a media player using AVFoundation for iOS. I am using AVPlayer to play audio-video files (e.g. an mp4 file). It seems quite simple to play the file using AVPlayer by directly calling the play and pause APIs.
Now I want to separate the audio and video and play them as individual entities simultaneously. I want to do this because, I may do some editing to the audio or video track, and then play the file.
I can separate the two using AVAssetTracks, but I don't know how to play the tracks. Also, I would like to play the two tracks simultaneously so that no A/V sync problem occurs.
Please guide me on how to achieve this, i.e. audio and video rendering with no A/V sync problem.
Thanks.
The easiest way to achieve this would be to have multiple player items. I would create a player item with all the tracks in their original form (i.e. the mp4 file). Then create another asset using AVMutableComposition (a subclass of AVAsset). This allows you to put only certain tracks into the composition (e.g. the audio track only). When you want to play the audio only, play the player item (via -[AVPlayer replaceCurrentItemWithPlayerItem:]) that holds the mutable composition with the audio-only track. When you want to play the video only, play the player item that holds the mutable composition with the video-only track. When you want to play both in sync, play the player item with the original asset.
I'm assuming you want to play the edited versions in sync. For that you will have to create another AVMutableComposition and add all of the edited tracks, then create a player item from that composition and set it as the player's current item.
If all you are trying to do is edit the different tracks, and you never have to play them by themselves, you can do this with one single AVMutableComposition. Just add an audio and a video AVMutableCompositionTrack and edit to your heart's content. They will always play in sync no matter how much you edit them separately (assuming your editing logic is correct). Just make sure you don't try to edit while playing; for that, you must create a copy and edit the copy.
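A sketch of the audio-only composition described above, assuming `asset` is the original AVAsset (the mp4) and `player` is your existing AVPlayer:

```objc
// Build a composition containing only the source asset's audio track.
AVMutableComposition *audioOnly = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack =
    [audioOnly addMutableTrackWithMediaType:AVMediaTypeAudio
                           preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *srcAudio =
    [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:srcAudio
                     atTime:kCMTimeZero
                      error:NULL];

// Swap the audio-only item in as the player's current item.
AVPlayerItem *audioItem = [AVPlayerItem playerItemWithAsset:audioOnly];
[player replaceCurrentItemWithPlayerItem:audioItem];
```

The video-only variant is identical except it copies the AVMediaTypeVideo track instead; keeping both compositions around lets you switch between the three player items at will.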