I would like to place a CCLabelBMFont object over a playing video. How can I achieve this? I am using MPMoviePlayerController to play the video. Also, can I specify the duration, start time, and end time for which the label will appear?
I have some code that uses CreateJS/EaselJS to create a MovieClip containing a Tween that contains an mp4 video. MovieClip has a method called 'gotoAndPlay' that you can use to move the playhead to a certain frame number on the timeline. When I use this method to change the play position, the tweens work, but not the Tween that contains the mp4 movie: that object does not load and results in a blank video tag on the page, except for the first play-through of the clip. Once the mp4 video has been played, it doesn't play again if the position is set back to it through gotoAndPlay. Any ideas on how to fix this, or on what might be going wrong?
In ActionScript animations, FLV movies can be locked to the timeline. But in HTML Canvas animations, MP4 movies are not really fully-fledged "Animate" objects. They look the same for the most part but the integration is not as tight as in Flash.
Since the videos exist outside of the Canvas, you'll need to use jQuery or JavaScript to address them. This can be done by using the Code Snippets in the HTML5 Canvas - Components - Video folder.
As an advance warning, "seeking" to different locations in an MP4 video the way you described is not as reliable as it was in Flash. Browsers like Internet Explorer don't handle seeking well and will likely crash. If frame-by-frame accuracy is important, you may find the best visual results by avoiding the video component and converting your movie to an actual MovieClip in Animate CC, which will increase your file size significantly.
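Concretely, the snippet approach boils down to grabbing the injected `<video>` tag and driving it directly from JavaScript. Below is a minimal sketch, assuming the component's element can be found by id ("myVideo" and `seekAndPlay` are my placeholder names, not something Animate guarantees):

```javascript
// Sketch: re-seek and replay the <video> element that the video
// component injects into the DOM. "myVideo" is an assumed id -- use
// whatever instance name you gave the component in Animate.
function seekAndPlay(video, seconds) {
  var target = seconds;
  // Clamp the target time to the video's duration once it is known;
  // duration is NaN until metadata has loaded.
  if (!isNaN(video.duration)) {
    target = Math.min(Math.max(0, seconds), video.duration);
  }
  video.currentTime = target;
  return video.play(); // returns a Promise in modern browsers
}

// Typical call site, e.g. from a frame script that mirrors gotoAndPlay:
// seekAndPlay(document.getElementById("myVideo"), 2.5);
```

Because the element lives outside the Canvas, this works the same whether the clip is on its first play-through or being revisited later.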
I'm working on an iOS movie editor project. In this editor, I use MPMoviePlayer to show the video file selected by the user.
I use custom controls, and I have a UISlider that lets the user move the player's currentTime position. When the user touches the slider, the movie is paused and its currentTime changes along with the UISlider's value.
Everything works perfectly, but now I need to let the user hear the sound at this currentTime position.
For those who know iMovie: when you move your mouse over a movie event, you see the image and hear the sound at that position, and that's what I'd like in my editor.
I've tried calling the player's play method with an NSTimer to stop it after 0.2 seconds, but the result is kind of messy.
Has anyone managed to do something like this?
Thanks !
Best regards.
Seeking takes time; that's why you've ended up using a timer. The real problem here is that MPMoviePlayerController, while convenient because it gives you controls, is a blunt instrument; it's just a massively simplified convenience layer built on top of AVFoundation. But you don't need the built-in controls, so I would suggest throwing away your entire implementation and getting down to the real stuff, using AVFoundation instead (AVPlayer etc.). That gives you a coherent way to seek and be notified when the seek has completed (seekToTime:completionHandler:), so you'll be able to start playing as soon as possible. Plus, AVFoundation is the level where you'll be doing all your "editing" anyway.
I have video that uses different timecode tracks when we export our QTs. Is there a way to have that timecode displayed? It always shows as a 00:00:00:00-formatted number.
http://i.stack.imgur.com/3FDgF.png
HTML5 video does not report frame information, so you would have to implement this yourself based on the current time and frame rate of the current video clip. There are a lot of pitfalls here if you are supporting many different frame rates, so you'll have to do your calculations carefully, in either a JavaScript timer or the HTML5 timeupdate event handler. If you want, you could then display the timecode over the video by absolutely positioning a div over the video element.
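As a sketch of that calculation (`toTimecode` and `pad` are names I made up for this example, and this handles non-drop-frame rates only; drop-frame rates like 29.97 would need NTSC-specific correction on top):

```javascript
// Sketch: convert a playback time in seconds plus a known frame rate
// into an HH:MM:SS:FF string. Non-drop-frame only; 29.97/59.94 material
// needs drop-frame correction that is deliberately omitted here.
function pad(n) {
  return (n < 10 ? "0" : "") + n;
}

function toTimecode(seconds, fps) {
  var fpsInt = Math.round(fps);
  var totalFrames = Math.floor(seconds * fps);
  var frames = totalFrames % fpsInt;
  var totalSeconds = Math.floor(totalFrames / fpsInt);
  var ss = totalSeconds % 60;
  var mm = Math.floor(totalSeconds / 60) % 60;
  var hh = Math.floor(totalSeconds / 3600);
  return pad(hh) + ":" + pad(mm) + ":" + pad(ss) + ":" + pad(frames);
}

// Wired to the player it might look like:
// video.addEventListener("timeupdate", function () {
//   overlayDiv.textContent = toTimecode(video.currentTime, 25);
// });
```

Note that timeupdate fires only a few times per second, so for frame-accurate display you would poll with a timer instead.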
I have a video clip that works like an intro video when the application starts. The video contains many objects that come from all directions and combine with each other.
I have two ways to do this:
I start the video in paused mode. The video goes forward or backward with left-right finger movement, much like the playback strip of an MPMoviePlayerController. (The video remains paused and never actually plays.)
I make every object separate and, inside a UIScrollView, move them as the video describes, with the help of the - (void)scrollViewDidScroll:(UIScrollView *)scrollView method.
The 2nd way is too complicated because the video is long and has nearly 45 objects. So how do I go about the 1st option?
You should export your movie as a series of images. Then use UIImageView's animationImages property (e.g. imageView.animationImages = listOfImages), bind a UIGestureRecognizer to your image view, and display one image instead of another depending on the direction of the swipe.
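Whatever the platform, the heart of this approach is mapping a drag distance to a frame index in the exported image sequence. A platform-neutral sketch of that mapping (shown in JavaScript purely for illustration; every name here is hypothetical):

```javascript
// Sketch: map a horizontal drag distance (pixels) to a frame index in an
// exported image sequence. pixelsPerFrame controls scrub sensitivity.
function frameForDrag(startFrame, dragPixels, pixelsPerFrame, frameCount) {
  var offset = Math.round(dragPixels / pixelsPerFrame);
  var frame = startFrame + offset;
  // Clamp to the valid range [0, frameCount - 1].
  return Math.min(Math.max(0, frame), frameCount - 1);
}
```

On iOS the same arithmetic would run inside the gesture recognizer's handler, with the result used to pick which image from the sequence to display.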
I would like to open an mp4 web asset in an iOS app at a specific set of time offsets, like chapters; for example, 2:31 into a 10-minute video.
Any ideas on how to accomplish this?
Use an AVAssetReader and set a time range before starting to read frames:
// Read from startTime through to the end of the asset
CMTimeRange timeRange = CMTimeRangeMake(startTime, kCMTimePositiveInfinity);
assetReader.timeRange = timeRange;
Presumably you have an MPMoviePlayerController which you're already using to play the movie; I suggest looking at the initialPlaybackTime and endPlaybackTime properties on the controller to set the start and end points for movie playback.