API method for video frame in video.js

Is there a way to get the current frame number and a way to seek to a given frame? It would be similar to the currentTime method

Related

How to put limit on the size of video being recorded in iOS app?

In my app I need to record video and share it on my app's network. I need to put a limit on the size of the video, NOT on its duration. I had a look at the documentation and found a duration property on which I can put a check, but nothing for the size of the video. Is there any way this can be achieved?
I am using "MPMoviePlayerController"

Display Custom Embedded Timecode instead of Standard Timecode

I have video that uses different timecode tracks when we export our QuickTimes. Is there a way to make that embedded timecode get displayed? It always shows a 00:00:00:00-formatted number.
HTML5 video does not report frame information, so you would have to implement this yourself from the current time and the frame rate of the current clip (roughly, frame = floor(currentTime * frameRate)). There are a lot of pitfalls here if you are supporting many different frame rates, so you'll have to do your calculations carefully, driven either by a JavaScript timer or by the HTML5 timeupdate event. If you want, you could then display the timecode over the video by absolutely positioning a div over the video element.

how to layer CCLabel over video

I would like to place a CCLabelBMFont object over a playing video. How can I achieve this? I am using the MPMoviePlayerController to play the video. Also, can I specify the duration, start time and end time that the label will appear?

Animation iOS with sound file

I have some animations which need to appear on screen at very specific times, which are stored in an SQLite database. What I am planning to do is use an NSTimer to keep time and pop the animation when each specific time is reached.
I'm thinking of using an NSTimer to count through the duration of the sound file and animation, then pop an on-screen animation when certain points are reached. The problem is that the stored timings are very precise, e.g. 55.715000 seconds, and they need to sync with an audio track that will be played along with the animation.
Firstly, is this even possible? Secondly, how can I compare such precise timings? The problem I seem to be facing is that the code can't run quickly enough, and the time jumps by more than 0.001 of a second.
I have no knowledge of OpenGL ES or Cocos2d, and learning these is not really feasible on my timescale.
If your visuals need to be exactly in sync with the audio (for example a music app where animations have to appear on the beats) then you need to use the following approach. It works on very old iPhone hardware and there is basically nothing that can go wrong at runtime.
Pre-render each "frame" of your visuals so that each one is stored as a full screen image. This could be done on the desktop or you can run the render logic on the phone and then capture the output to a file as pixels.
Once each frame is saved as an array of pixels, play the audio via the standard AVAudioPlayer APIs on iOS. This API takes care of the audio playback and reports a time that you will use to determine which video frame to display.
Once audio is playing, get the playback time and multiply it by your video frame rate (equivalently, divide it by the frame duration) to determine which image from an array of N images to display. Get the image data and wrap it up in a CGImageRef/UIImage; this will blit the image data to the screen in an optimal way.
If you would like to see working source code for this approach, take a look at AVSync. This example code shows the implementation that I used in my own app called iPractice, available on the App Store. It is very fast and can run at 30FPS even on the old iPhone 3G.
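A minimal sketch of that lookup, assuming a 30 FPS clip, an AVAudioPlayer property named audioPlayer, an array of pre-rendered UIImages named frameImages, and a UIImageView named frameView (all hypothetical names, not taken from the AVSync source). Call it from whatever timer or display callback drives your redraw:
- (void)updateVideoFrame
{
    // AVAudioPlayer reports the current playback position in seconds
    NSTimeInterval now = self.audioPlayer.currentTime;
    // frame index = elapsed time multiplied by the frame rate
    NSUInteger frameIndex = (NSUInteger)(now * 30.0);
    if (frameIndex < self.frameImages.count) {
        self.frameView.image = self.frameImages[frameIndex];
    }
}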
Regardless of how accurate your timings are, you aren't going to get the device to display more frequently than about 60Hz (16.7 ms per frame). As I see it, you have at least two options here:
1) Use a CADisplayLink callback to check the playback progress of the audio and trigger each animation as it becomes timely.
Display link timers are created in a similar way to regular NSTimers:
- (void)viewDidLoad
{
    // ...
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(scheduleAnimations:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}

- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    // get playback time from player object
    // iterate queue of timings
    // if the timestamp of a timing is equal to or less than the playback time,
    // create its animation for immediate execution and remove it from the queue
}
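One hedged way those comments might be filled in, assuming an AVAudioPlayer property named audioPlayer and a mutable array of NSNumber timestamps named pendingTimings, sorted ascending (both hypothetical names):
- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    NSTimeInterval playbackTime = self.audioPlayer.currentTime;
    // The timings are sorted ascending, so only the head of the queue can be due
    while (self.pendingTimings.count > 0 &&
           [self.pendingTimings[0] doubleValue] <= playbackTime) {
        [self.pendingTimings removeObjectAtIndex:0];
        // fire the animation associated with this timing immediately
        [self runAnimationAtTime:playbackTime]; // hypothetical helper
    }
}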
2) Create a collection of animations, setting their respective beginTime properties to the appropriate trigger time (or, if using implicit or block-based animations, use the delay parameter).
[CATransaction begin];
// iterate collection of timings
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"key"];
// beginTime is measured against the layer's clock
animation.beginTime = /* time to trigger animation */;
animation.fillMode = kCAFillModeBackwards; // hold the initial value until beginTime
animation.removedOnCompletion = NO;
[layer addAnimation:animation forKey:nil];
[CATransaction commit];
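For example, to trigger a fade-in 55.715 s after playback starts (a sketch; it assumes the audio and the scheduling begin at the same moment, and that label is the view whose layer is being animated):
CFTimeInterval trigger = 55.715; // timing loaded from the SQLite database
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @0.0;
fade.toValue = @1.0;
fade.duration = 0.25;
// beginTime is in the layer's timebase, so offset from "now"
fade.beginTime = CACurrentMediaTime() + trigger;
fade.fillMode = kCAFillModeBackwards; // hold fromValue until beginTime
[label.layer addAnimation:fade forKey:@"fadeIn"];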

Best way to play video frame by frame - zoomed

I need some advice on playing video frame by frame...
Right now I shoot a video and extract all of its frames using MPMoviePlayerController's thumbnailImageAtTime: for each frame.
The video could be zoomed as well. I am zooming by extracting the frames as mentioned above, then resizing and cropping them.
This would be great except that thumbnailImageAtTime: seems to be very slow. My videos will be less than 30 seconds long, most of the time only a few seconds, and it takes about 20 seconds on an iPhone 4S to grab 60 frames. If you think this should be faster I can post the code I am using, but it is pretty straightforward. I am performing it on a background thread so the UI is not affected.
I have been looking at AVFoundation to grab the frames, but have read that it is not exact, and I need all 30 fps.
I am really looking for advice on the best way to do this. I need to be able to use a slider and buttons to move from frame to frame, backwards and forwards, as well as jump to a specific frame. As I said, the video might be digitally zoomed as well.
Should I skip extracting frames and just use the video file, moving from frame to frame? If so, what is the best way to do that, given that MPMoviePlayerController doesn't seem to let me move to an exact frame easily? Also, if I just use the video file, what is the best way to zoom? Can I go through each frame of an asset, resize and crop it, then save it back to the video file? Is that the best way? Can I achieve everything I want to do using AVFoundation?
I have been trying things for about a week now, and I do have everything working by extracting the frames with MPMoviePlayerController...the speed is just unacceptable. If I could extract the frames very quickly, that solution would be the best in my opinion. I might mention I only have to extract the frames once, not each time the user clicks on the video...if that makes a difference.
I hope this all makes sense and sorry for rambling. Any help would be much appreciated!
After a bit of research, I have decided to go with AVFoundation to play the video frame by frame rather than extracting the frames. It works great.
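A hedged sketch of that approach, assuming a local videoURL and a nominal 30 FPS (in practice, read the real rate from the video track's nominalFrameRate):
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:videoURL];
AVPlayer *player = [AVPlayer playerWithItem:item];
[player pause];
// step the paused playhead by exactly one frame in either direction
[item stepByCount:1];   // forward one frame
[item stepByCount:-1];  // back one frame
// jump to an exact frame: seek with zero tolerance
Float64 fps = 30.0;
CMTime frameTime = CMTimeMakeWithSeconds(42 / fps, 600); // e.g. frame 42
[player seekToTime:frameTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
For digital zoom, one option is to scale and translate the AVPlayerLayer showing the player (it is an ordinary CALayer) inside a clipping superlayer or scroll view, rather than re-encoding cropped frames.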