Android 2.3 camera HAL picture capture issue

I'm working on an Android 2.3 project and have some questions about the Camera HAL implementation.
Capturing a picture takes a long time, about 4 s. How do I reduce it? The camera sensor on this project is the OV3640. When autofocus completes, instead of sleeping for 1 s, I try to wait for two VSYNCs and capture the third frame, following the OV3640 spec for still capture.
My second question: how do I calculate the VSYNC time?
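For reference, the frame period (VSYNC to VSYNC) of a sensor like the OV3640 can be estimated from its pixel clock and total line/frame timing. A minimal sketch; the register values below are hypothetical examples, so read the real PCLK, horizontal total size, and vertical total size from your driver's register setup for the capture mode:

```javascript
// Frame (VSYNC) period = total pixels per frame / pixel clock.
// All three values are hypothetical examples, not real OV3640 settings.
const pclkHz = 56_000_000; // pixel clock in Hz (assumed)
const hts = 2376;          // horizontal total size: pixels per line, incl. blanking (assumed)
const vts = 1568;          // vertical total size: lines per frame, incl. blanking (assumed)

const framePeriodMs = (hts * vts) / pclkHz * 1000;
const fps = 1000 / framePeriodMs;

console.log(framePeriodMs.toFixed(2) + " ms per frame");
console.log(fps.toFixed(2) + " fps");
```

With numbers in this ballpark (roughly 15 fps in the capture mode), waiting two VSYNCs and grabbing the third frame would cost on the order of 200 ms, much less than a fixed 1 s sleep.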

Related

CreateJS MovieClip performance issue

I'm using Adobe Animate HTML5 to create a board game that runs on a Smart TV (a low-performance machine).
All my previous games were done in AS3.
I quickly found there is no longer any way to create a Sprite (a movie clip with only one frame).
After building my board game (no code yet, just elements), which is basically movie clips nested inside other movie clips, all single-frame, I checked the FPS on an LG TV: it had dropped from 60 to 20, on a static image.
After some research, I found that the advance method in the MovieClip class constantly checks whether the frame needs updating.
I added a check: if a MovieClip's total frame count equals 1, switch it to single-frame mode. This brought performance back to 60 FPS.
Who do I go to to have this checked, and maybe fixed or added as a feature, in the CreateJS code?
Thanks
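The workaround described above can be sketched as a small helper that walks the display list and switches one-frame clips into EaselJS's built-in SINGLE_FRAME mode, so their per-tick frame advancement stops running. This is a sketch, not the actual CreateJS patch; the function name is made up, while `totalFrames`, `mode`, `stop()`, and `createjs.MovieClip.SINGLE_FRAME` are standard EaselJS APIs (whether changing `mode` after construction fully disables the advance logic may depend on your EaselJS version):

```javascript
// Walk a display-list container and put every one-frame MovieClip into
// SINGLE_FRAME mode so it behaves like a static Sprite.
function optimizeSingleFrameClips(container) {
  container.children.forEach(function (child) {
    if (child instanceof createjs.MovieClip) {
      if (child.totalFrames <= 1) {
        child.mode = createjs.MovieClip.SINGLE_FRAME;
        child.stop();
      }
      optimizeSingleFrameClips(child); // recurse into nested clips
    } else if (child.children) {
      optimizeSingleFrameClips(child); // plain Containers
    }
  });
}
```

You would call this once on the Animate-exported root clip after it is instantiated, e.g. `optimizeSingleFrameClips(exportRoot);`.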
Code issues or suggestions for CreateJS can be logged here: https://github.com/CreateJS/EaselJS/issues. All the best.
Inside the HTML, in the script section, there is this line:
createjs.Ticker.addEventListener("tick", stage);
Remove it and call the update manually whenever something changes:
stage.update();
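One caveat with removing the tick listener entirely: if anything still animates (tweens, sprite sheets), nothing will redraw. A common middle ground, sketched here under the assumption that `stage` is your existing `createjs.Stage` (the helper name is made up), is to keep the ticker but only repaint after something has flagged the scene as dirty:

```javascript
// Install "render on demand": the stage repaints only after invalidate()
// has been called, instead of on every tick.
function installOnDemandRendering(stage) {
  let needsRedraw = false;

  createjs.Ticker.addEventListener("tick", function () {
    if (needsRedraw) { // repaint at most once per tick, and only if needed
      stage.update();
      needsRedraw = false;
    }
  });

  // Call the returned function from any handler that changes the display list.
  return function invalidate() { needsRedraw = true; };
}
```

Usage: `const invalidate = installOnDemandRendering(stage);` and then call `invalidate()` from click handlers, drag handlers, and so on.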

Video getting ahead of audio during recording HLS

I'm building on the Kickflip.io code base (available on GitHub), but after about 10+ seconds the video gets ahead of the audio. I verified that the audio plays at the correct rate, so it's definitely the video.
I've tried lowering the video frame rate to 10 fps, and this does nothing. Given all the audio/video sync issues reported around FFmpeg, I'm starting to wonder if there's something in the FFmpeg CocoaPod.
My guess is that the issue is in:
https://github.com/Kickflip/kickflip-ios-sdk/blob/master/Kickflip/Outputs/Muxers/HLS/KFHLSWriter.m

Does the Qt Quick 2 Scene Graph sync to vsync on EGLFS?

I am studying Qt Quick 2 in Qt 5.1. It is cool. However, I don't understand the Scene Graph feature. If I just create a Flickable, put a bunch of Images in it, and show this in a QQuickView on an embedded device with the EGLFS backend, am I using the Scene Graph or not?
If yes, why do I see tearing? The Scene Graph is supposed to be vsynced.
If not, what should I do to use it with Flickable? Or does that mean I have to implement everything from scratch?
My app basically lets the user browse an endless generated image.
Qt Quick 2 has the scene graph as its only backend, so yes, you're using it. Whether it is vsynced is another story; it mostly depends on whether the scene graph is using a render thread, and on the quality of the drivers (i.e., whether swapping buffers waits for the vsync). Two things to try:
Run your app with the environment variable QSG_RENDER_TIMING set to a non-zero value; this prints frame statistics for your application. If your frame lengths are not an exact multiple of the vsync interval, you are not getting vsync.
Run your app with the environment variable QT_QPA_EGLFS_FORCEVSYNC set, which uses the FBIO_WAITFORVSYNC ioctl to attempt to sync with the vsync.
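To interpret those frame statistics, you can check whether each reported frame length is, within some tolerance, a whole multiple of the display's vsync interval. A small sketch; the 60 Hz refresh rate, the tolerance, and the sample timings are assumptions, not values from QSG_RENDER_TIMING itself:

```javascript
const vsyncMs = 1000 / 60; // ~16.67 ms per refresh on an assumed 60 Hz panel

// True if frameMs is within toleranceMs of some whole multiple of vsyncMs.
function isVsyncAligned(frameMs, toleranceMs = 1.5) {
  const multiples = Math.round(frameMs / vsyncMs);
  return Math.abs(frameMs - multiples * vsyncMs) <= toleranceMs;
}

console.log(isVsyncAligned(16.7)); // true:  one vsync interval
console.log(isVsyncAligned(33.4)); // true:  two intervals (a dropped frame)
console.log(isVsyncAligned(21.0)); // false: not locked to vsync
```

Frame lengths that cluster around arbitrary values like 21 ms suggest the driver is not blocking on the vsync at all.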

Unity3d external camera frame rate

I am working on a live augmented reality application. So far I have worked on many AR applications for mobile devices.
Now I have to get the video signal from a Panasonic P2. The camera is a European version. I capture the signal with an AJA Io HD box, which is connected via FireWire to a Mac Pro. So far everything works great, just not in Unity.
When I start the preview in Unity, the frame buffer of the AJA Control Panel jumps to a frame rate of 59.94 fps, I guess because of some preference in Unity. Because the camera is a European version, I cannot switch it to 59.94 fps or 29.97 fps.
I checked all the settings in Unity but couldn't find anything.
Is there any way to change the frame rate at which Unity captures from an external camera?
If you're polling the camera from Unity's Update() function, then you are under the influence of VSync, which limits frame processing to 60 FPS.
You can switch VSync off by going to Edit > Project Settings > Quality and setting the VSync Count option to "Don't Sync".

Modifying video frames with QTKit and OpenGL

I am working on a project where I would like to open a video on a Mac with QTKit. That part I can do without a problem, but as the video plays, I would like to edit or modify it on the fly using OpenGL.
From what I understand, I should be able to intercept the frames and change them before they hit the display, but no matter what I do, I cannot manage it.
It sounds like you should have a look at Core Video and the display link mechanism.
You basically get a callback on a high-priority thread with the decoded frame in a CVImageBuffer, and you can do whatever you like with it (including packing it up as a texture for OpenGL processing and display).
Apple provides documentation and demo code on its developer site.