How to manually set white balance in Processing Android mode with Ketai?

I am a beginner with the Ketai library under the Processing framework in Android mode. I need to manually set the camera's white balance, but I cannot find any reference to this in either the Ketai docs or on Google.
Is this possible in Processing without digging into the native android.hardware.camera2 API (which is a bit beyond my expertise)? If not, what would be the easiest way to manually set the white balance?
Regards!
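For reference, the legacy (pre-camera2) android.hardware.Camera API does expose named white-balance presets plus an auto-white-balance lock, so one possible fallback is to set those through Camera.Parameters. The sketch below is only a hypothetical illustration of that platform API (deprecated, presets only), not a Ketai call, and it assumes the sketch can open the camera itself rather than going through KetaiCamera:

    // Hypothetical helper for a Processing Android sketch, using the deprecated
    // android.hardware.Camera API directly (assumes CAMERA permission and that
    // Ketai is not already holding the camera).
    import android.hardware.Camera;
    import java.util.List;

    void lockWhiteBalance() {
      Camera cam = Camera.open();                          // default back camera
      Camera.Parameters params = cam.getParameters();
      List<String> modes = params.getSupportedWhiteBalance();
      if (modes != null && modes.contains(Camera.Parameters.WHITE_BALANCE_DAYLIGHT)) {
        params.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_DAYLIGHT);  // fixed preset
      } else if (params.isAutoWhiteBalanceLockSupported()) {
        params.setAutoWhiteBalanceLock(true);              // or freeze the current auto WB
      }
      cam.setParameters(params);
      // ... start the preview / grab frames, then cam.release() when done
    }

Note that the legacy API only offers presets such as daylight or fluorescent (plus the lock); setting an exact colour temperature really does require camera2.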

Related

Image Detection for expo react native

I would like to include a feature in my app where you can scan a certain picture and the app recognises this image, just like the image recognition feature in Viro: https://docs.viromedia.com/docs/ar-image-recognition . So I would set a certain image, and the app only needs to recognise that one image.
I'm working with expo react native.
Does anybody have an idea how I might build this feature?
Thanks 😊
You could just have it call a Python service on the backend, pass it the image, and get back the result.
If you want to go serverless, you can even use a premade AWS Lambda function that you call with the image, and it will do the processing for you.
In the end, it's better to do this kind of processing on the server side: image processing can take time, your app could lock up, and you don't want that to happen in a mobile app.
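Whatever language the service ends up in, the shape of the call is just "POST the image, get a result back". Purely as an illustration (the /recognize endpoint and the stubbed matching step are made up; shown in plain Java using the JDK's built-in HttpServer, though the answer above suggests Python or a Lambda):

    // Hypothetical backend sketch: receive an image from the app, run recognition,
    // return a JSON result. The matching step itself is stubbed out.
    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    public class RecognizeServer {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/recognize", exchange -> {
                byte[] image = exchange.getRequestBody().readAllBytes(); // raw image bytes
                // ... run the real matching here (OpenCV template matching, a cloud
                // vision API, etc.) against the reference picture
                boolean matched = image.length > 0;                      // placeholder
                byte[] body = ("{\"match\": " + matched + "}").getBytes();
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            });
            server.start();
        }
    }

The Expo app would then just POST the camera frame (e.g. with fetch) and branch on the JSON result it gets back.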

ImageResizer adjusting white balance

I've been playing with the StudioJS wrapper for ImageResizer. Can ImageResizer adjust the white balance relative to a color? Ultimately, I want the user to be able to click an area of the image that should be white and have the rest of the image adjust accordingly.
I see the Auto Balance functionality that uses the AdvancedFilters plugin; I guess I want to take it a step further.
No, ImageResizer doesn't currently support adjusting relative to a color. However, if you wanted to implement such a feature, it would not be difficult.
You could modify AdvancedFilters.cs, inserting an IF statement near line 150, and add support for a new command, such as "&a.white=rrggbb".
Simply provide the RGB values to the LinearLevels class to perform the scaling on the image.
http://www.aforgenet.com/framework/docs/html/29bf1191-47c9-314b-eb8d-bea3f903ac28.htm
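For what it's worth, the operation itself is simple regardless of library: scale each channel so the picked reference colour maps to pure white. A rough, language-neutral sketch of that idea (plain Java here, not ImageResizer or AForge code; the hypothetical whiteBalance helper just clamps after scaling):

    // Illustration only: make the reference colour (refR, refG, refB) map to white
    // by applying a per-channel gain, clamping at 255.
    import java.awt.image.BufferedImage;

    static BufferedImage whiteBalance(BufferedImage src, int refR, int refG, int refB) {
        double gainR = 255.0 / Math.max(refR, 1);
        double gainG = 255.0 / Math.max(refG, 1);
        double gainB = 255.0 / Math.max(refB, 1);
        BufferedImage out = new BufferedImage(src.getWidth(), src.getHeight(),
                                              BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                int rgb = src.getRGB(x, y);
                int r = (int) Math.min(255, ((rgb >> 16) & 0xFF) * gainR);
                int g = (int) Math.min(255, ((rgb >> 8)  & 0xFF) * gainG);
                int b = (int) Math.min(255, ( rgb        & 0xFF) * gainB);
                out.setRGB(x, y, (r << 16) | (g << 8) | b);
            }
        }
        return out;
    }

Wiring the clicked colour into the LinearLevels filter, as suggested above, amounts to the same per-channel remapping.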
If you like the result, and want to share it, send me a pull request on GitHub or e-mail me the code snippet. If it passes testing, it might make it into the next release.
Client side, you might find this library helpful for selecting a color from an image.

Retrieving an app's DockTile (view)

I'm getting my hands dirty with a bit of ObjC by trying to write something Dock-like, with a few less visual bells and whistles. It's going pretty well. However, I've stumbled over a problem which I can't quite solve:
Retrieving an app's icon via NSRunningApplication is easy. However, some apps don't use their icon as their DockTile; they use a custom view because their DockTiles are dynamic (e.g. most torrent apps display their current up/down speeds in the Dock).
Is there any way to get this DockTile and display it in my own app?
Thanks
No, there is not. The methods which set a custom dock tile end up communicating the contents of the view directly to the Dock; it is not made available to other processes.
For what it's worth, writing a replacement for the Dock is going to be something of a hopeless task: Apple's Dock.app uses numerous private, undocumented APIs to interact with the WindowServer, some of which simply cannot be used by any process that is not the Dock.

I cannot get a QTCaptureSession to capture in a terminal application

I've got a terminal application that needs to take a webcam picture and then perform some processing on it. I'm having trouble getting it to initialize. There's a fairly complete demo with an app called MyRecorder in the Apple docs that uses QTKit, which I was able to make work fine. I was also able to modify it to grab a single frame instead of a stream.
When I move this to a terminal application, calling startRunning on the QTCaptureSession simply does nothing. There are no errors, and everything reports success, but my webcam doesn't light up and no frames are captured.
Any idea what's going on here? Are there security restrictions, or restrictions of some other kind, that would prevent the QTCaptureSession from working?
Switching to AVFoundation solved my problem. I'm still not certain what the issue was, but for now AVFoundation seems like the way to go, since it was designed to replace QTKit anyway.

Block iPad camera

Is there any way to restrict use of the iPad 2's camera to my application only, even if it requires using iTunes?
I could not find any code related to this. Some sample code would be helpful.
There's no way to achieve this. I think it could be done with quite a lot of hacking if you were developing for Cydia, but I'm not sure even then. If the user quits your application or switches away from it, the system will make the camera available to any other app that requests it.