We are working with an OV5640 camera sensor. We brought the camera up on our ARM platform (based on i.MX6) and tested it using a V4L2 overlay application. Now we need to use that V4L2 application with Qt5.
Is there any example application that explains how to create a V4L2 overlay application in Qt? We want to show the overlay inside a 640x480 Qt widget.
Any reference source or methodology would be helpful.
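For reference, one simple way to get live frames into a Qt widget is to read frames from the V4L2 device and paint them in paintEvent(). Below is a rough, untested C++ sketch. It assumes /dev/video0 and that the driver supports read() I/O with RGB24 at 640x480; in practice the OV5640 usually delivers YUYV and i.MX6 capture drivers typically require mmap streaming (VIDIOC_REQBUFS/VIDIOC_QBUF/VIDIOC_STREAMON), so treat this as a skeleton, not working code.

    // Sketch: painting V4L2 frames inside a 640x480 QWidget.
    // Assumptions: /dev/video0, read() I/O, RGB24 supported by the driver.
    #include <QApplication>
    #include <QWidget>
    #include <QImage>
    #include <QPainter>
    #include <QSocketNotifier>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    class CameraWidget : public QWidget {
    public:
        CameraWidget() {
            setFixedSize(640, 480);
            fd = ::open("/dev/video0", O_RDWR);

            v4l2_format fmt = {};
            fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            fmt.fmt.pix.width = 640;
            fmt.fmt.pix.height = 480;
            fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_RGB24;  // often YUYV in reality
            fmt.fmt.pix.field = V4L2_FIELD_NONE;
            ioctl(fd, VIDIOC_S_FMT, &fmt);

            frame.resize(640 * 480 * 3);
            // Repaint whenever the driver signals a frame is readable.
            QSocketNotifier *n = new QSocketNotifier(fd, QSocketNotifier::Read, this);
            connect(n, &QSocketNotifier::activated, this, [this]() {
                ::read(fd, frame.data(), frame.size());
                update();
            });
        }
        ~CameraWidget() { ::close(fd); }

    protected:
        void paintEvent(QPaintEvent *) override {
            QImage img(reinterpret_cast<const uchar *>(frame.constData()),
                       640, 480, 640 * 3, QImage::Format_RGB888);
            QPainter p(this);
            p.drawImage(rect(), img);
        }

    private:
        int fd = -1;
        QByteArray frame;
    };

    int main(int argc, char **argv) {
        QApplication app(argc, argv);
        CameraWidget w;
        w.show();
        return app.exec();
    }

If you want to keep the true hardware overlay path (V4L2_BUF_TYPE_VIDEO_OVERLAY) instead of copying frames through the CPU, the usual trick is to leave the overlay running and position it over the widget's on-screen rectangle (via mapToGlobal()), since the overlay is composed by the hardware, not by Qt.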
Related
I'm just playing around with agora.io and WebRTC, and I want to implement a "camera tile view". I hope you understand what I mean: all the users' (small) camera feeds should be displayed in a row/grid next to each other, or in a list if there are too many users. The actively speaking user gets a border around their camera view, or something like that.
Can anybody tell me the name of this kind of view, or point me to a location where I can check some samples of it?
Best regards, Alex
The Agora SDKs provide all the APIs for building your own UI, so there is no method within the SDK for generating a tile view; you would have to build that yourself.
That being said, the Agora developer community has some open-source UI Kits that serve as a good starter template you can adjust. The Agora Web UIKit supports tile view as the default.
Vanilla JS: https://www.agora.io/en/blog/adding-video-chat-or-live-streaming-to-your-website-in-5-lines-of-code-using-the-agora-web-uikit/
React: https://agoraio-community.github.io/Web-React-UIKit/
I am working on a project that deals with audio/video playback on Android KitKat. I am able to play video using VideoView, and its MediaPlayer allows the video's audio track to be modified. I know about PlaybackParams, which can be used for pitch and tempo shifting, but it is only available from API 23.
I have heard about a library called SoundTouch that can be used for pitch and tempo shifting, but the available source code shows it working on WAV files.
I am interested in using the SoundTouch library as an audio effect library that I can register in Android's /system/etc/audio_effects.conf, placing the .so file in /system/lib.
But I don't know how to adapt SoundTouch so that it reads audio from Android's MediaPlayer, since I am planning to call this library the same way we use PresetReverb/Equalizer.
If anyone has done such an implementation, kindly help me out.
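For reference, an effect loaded via audio_effects.conf is a native shared library that exports an audio_effect_library_t under AUDIO_EFFECT_LIBRARY_INFO_SYM (see AOSP's hardware/audio_effect.h); MediaPlayer audio then reaches it through the process() callback, the same route PresetReverb/Equalizer take. The C++ sketch below is untested and heavily abridged: the descriptor, UUIDs, and EFFECT_CMD_* handling (SET_CONFIG, ENABLE, parameter get/set) are omitted, and it assumes a SoundTouch build with 16-bit integer samples.

    // Abridged sketch of wrapping SoundTouch as an Android audio effect
    // library. Not buildable as-is: descriptor, UUID and command handling
    // are omitted, and sample rate/channels are hard-coded for illustration.
    #include <hardware/audio_effect.h>
    #include <SoundTouch.h>

    struct Context {
        const effect_interface_s *itfe;  // must be first: handles are cast to this
        soundtouch::SoundTouch st;
    };

    static int32_t stProcess(effect_handle_t self,
                             audio_buffer_t *in, audio_buffer_t *out) {
        Context *c = reinterpret_cast<Context *>(self);
        // Push incoming PCM through SoundTouch, pull processed samples back
        // (assumes SOUNDTOUCH_INTEGER_SAMPLES, i.e. 16-bit PCM).
        c->st.putSamples(in->s16, in->frameCount);
        out->frameCount = c->st.receiveSamples(out->s16, out->frameCount);
        return 0;
    }

    static int32_t stCommand(effect_handle_t self, uint32_t cmd, uint32_t size,
                             void *data, uint32_t *replySize, void *reply) {
        // A real effect must handle EFFECT_CMD_INIT, EFFECT_CMD_SET_CONFIG,
        // EFFECT_CMD_ENABLE, parameter get/set, etc. Elided here.
        return 0;
    }

    static const effect_interface_s gInterface = {
        stProcess, stCommand, nullptr /* get_descriptor */, nullptr
    };

    static int32_t stCreate(const effect_uuid_t *uuid, int32_t sessionId,
                            int32_t ioId, effect_handle_t *handle) {
        Context *c = new Context();
        c->itfe = &gInterface;
        c->st.setSampleRate(44100);     // really taken from EFFECT_CMD_SET_CONFIG
        c->st.setChannels(2);
        c->st.setPitchSemiTones(3.0f);  // example pitch shift
        *handle = reinterpret_cast<effect_handle_t>(c);
        return 0;
    }

    static int32_t stRelease(effect_handle_t handle) {
        delete reinterpret_cast<Context *>(handle);
        return 0;
    }

    extern "C" __attribute__((visibility("default")))
    audio_effect_library_t AUDIO_EFFECT_LIBRARY_INFO_SYM = {
        AUDIO_EFFECT_LIBRARY_TAG, EFFECT_LIBRARY_API_VERSION,
        "SoundTouch wrapper (sketch)", "example",
        stCreate, stRelease, nullptr /* get_descriptor: required in practice */
    };

On the configuration side, the library would get an entry under libraries { } (the path to the .so) and the effect an entry under effects { } with its UUID in /system/etc/audio_effects.conf; after that it can be attached to an audio session the same way PresetReverb/Equalizer are.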
We need to open the iPhone camera to take images that will be saved to the camera roll.
I have read many examples here, and all of them open a UIImagePickerController.
Besides the fact that I can't understand why I have to open the picker view in order to open the camera, I just can't do that: I don't want the picker view, because we have built our own custom photo album, and we just need a little button in it that opens the camera to take an image, without opening any other views above it.
Is it possible to use the camera without this picker view covering my scene?
Or can I send the user to the Camera app and then bring them back to my app?
Thanks.
Instead of the high-level classes (where Apple supplies the UI element), you have to go down to a more foundational level of APIs, namely AVCaptureDevice and AVCaptureDeviceInput.
And Apple has some nice source code available in their AVCam project.
If you want to display the camera stream in your app without UIImagePickerController, then you should use the AVFoundation framework.
Here are some examples and tutorials:
take-photos-with AVFoundation
Custom camera
Displaying camera
How to draw an overlay on the mapview?
See a discussion on MacRumors.
Brief version: use the iphone-google-maps-component to work around not being able to draw an overlay on the map view.
Basically iphone-google-maps-component is:
A component that you can add to your iPhone application to access all basic features of Google Maps (similar to Android's MapView). It uses a UIWebView in the background to load the HTML/Javascript version of Google Maps, and offers a set of Objective-C methods that mimic a subset of the original Javascript methods for controlling the map. It currently supports setting the center location and zooming & panning using the touch interface. You can see it in action.
I've modified the Quartz Composer slideshow sample from Xcode to render a high-speed slide show using a custom transition.
The sample uses OpenGL (Cocoa) to render the slide show.
I would like to export this slideshow into a video.
Is there a way to use Cocoa/OpenGL to output this scene to a QuickTime video?
Or should I just reimplement the slideshow sample in Quartz Composer and use its export-to-QuickTime functionality?
Check the sample code in /Developer/Examples/Quartz Composer/Applications/QCTV for code that does exactly what you want.
I believe that in previous versions of OS X, this sample code was called Quartz Composer TV.
I don't believe Cocoa supports rendering to QuickTime video directly, and I know for a fact that OpenGL itself does not.
From what I found, Apple recommends you use Quartz Composer for rendering. Here's a link to a breakdown.
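For completeness, whichever movie-writing API you end up using (the old QuickTime C API, QTKit, or Quartz Composer's own export), the OpenGL side of a render-to-movie pipeline usually comes down to reading each finished frame back off the framebuffer. A small C++ sketch follows; writeFrameToMovie() is a hypothetical stand-in for the encoder, and only the GL calls are real API:

    // Sketch: grab each rendered OpenGL frame so it can be appended to a
    // movie. writeFrameToMovie() is hypothetical -- replace it with your
    // QuickTime/QTKit writer. Only the GL readback is real API.
    #include <OpenGL/gl.h>
    #include <vector>
    #include <cstdint>

    void writeFrameToMovie(const std::uint8_t *rgba, int width, int height);

    void captureFrame(int width, int height) {
        std::vector<std::uint8_t> pixels(std::size_t(width) * height * 4);
        glPixelStorei(GL_PACK_ALIGNMENT, 1);   // tightly packed rows
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE,
                     pixels.data());
        // OpenGL's origin is bottom-left, so rows come back vertically
        // flipped relative to most video APIs; flip before encoding.
        writeFrameToMovie(pixels.data(), width, height);
    }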