sony-camera-api trackingFocus

I can turn tracking focus on and use actTrackingFocus. Once actTrackingFocus is set, how can I get the coordinates back from the camera so I can draw a box on the liveview image showing what the camera is focused on?

That is not possible with the existing API unfortunately.

Appreciate that this is an old question, but if you are still trying and are OK playing in Python...
The tracking focus location is (apparently) reported via the frame info packets, so you have to enable them and then decode them.
We are attempting to do this with pysony.
Use 'python src/example/pygameLiveView -i' to see the reported locations. You might need to add an 'actTrackingFocus()' call to enable tracking focus, but the locations should be rendered (boxes with triangle corners) on screen.
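If you'd rather decode the stream yourself (outside pysony), the frame info payload is a small fixed-layout binary blob. Below is a rough sketch, in Swift for concreteness, based on my reading of Sony's Camera Remote API Liveview format; the offsets, the big-endian byte order, and the FocusFrame/parseFrameInfo names are my assumptions, so double-check them against the spec.

```swift
import Foundation

// Rough sketch: decode one "frame information" payload (payload type 0x02)
// from the liveview stream. Offsets per my reading of Sony's Liveview docs;
// verify against the spec before relying on them.
struct FocusFrame {
    let topLeftX: Int, topLeftY: Int          // coordinates scaled 0...10000
    let bottomRightX: Int, bottomRightY: Int  // onto the liveview image
}

// `payload` is everything after the 8-byte common header:
// a 128-byte payload header followed by the per-frame entries.
func parseFrameInfo(_ payload: Data) -> [FocusFrame] {
    func u16(_ i: Int) -> Int { Int(payload[i]) << 8 | Int(payload[i + 1]) }
    let frameCount = u16(10)        // bytes 10-11: number of frames
    let frameSize  = u16(12)        // bytes 12-13: size of one frame entry (16)
    var frames: [FocusFrame] = []
    for n in 0..<frameCount {
        let base = 128 + n * frameSize
        frames.append(FocusFrame(topLeftX: u16(base), topLeftY: u16(base + 2),
                                 bottomRightX: u16(base + 4),
                                 bottomRightY: u16(base + 6)))
    }
    return frames
}
```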
Since none of the devs have a camera which supports tracking focus, we'd love to hear whether it works or not. :-)


How to track the location of a window belonging to another app

When screen sharing a specific window on macOS with Zoom or Skype/Teams, they draw a red or green highlight border around that window (which belongs to a different application) to indicate it is being shared. The border follows the target window in real time, through resizing, z-order changes, etc.
What macOS APIs and techniques might be used to achieve this effect?
You can find the location of windows using CGWindowListCopyWindowInfo and related APIs, which are available to sandboxed apps.
This is a very fast and efficient API, fast enough to be polled. The SonOfGrab sample code is a great platform to try out this stuff.
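For instance, the polling side might look roughly like this in Swift (someWindowID is a placeholder; you'd find the real ID by matching the owner name in the window list):

```swift
import AppKit

// Placeholder: in real code you'd find this once by matching
// kCGWindowOwnerName / kCGWindowName in a kCGWindowListOptionOnScreenOnly listing.
let someWindowID: CGWindowID = 12345

// Fetch the current bounds of another app's window by its window number.
func currentBounds(of windowID: CGWindowID) -> CGRect? {
    guard let info = CGWindowListCopyWindowInfo(.optionIncludingWindow, windowID)
            as? [[String: Any]],
          let dict = info.first?[kCGWindowBounds as String] as? NSDictionary,
          let rect = CGRect(dictionaryRepresentation: dict as CFDictionary)
    else { return nil }
    return rect
}

// Poll fast enough that the border window appears glued to the target.
Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    if let rect = currentBounds(of: someWindowID) {
        // move/resize your overlay window to `rect` here
        _ = rect
    }
}
RunLoop.main.run()
```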
You can also install a global event monitor using +[NSEvent addGlobalMonitorForEventsMatchingMask:handler:] (also available in the sandbox) to track mouse-down, drag, and mouse-up events, so you can respond immediately whenever the user starts or releases a drag. This way your response will be snappy.
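A minimal sketch of that monitor (listen-only; global monitors can observe other apps' mouse events but not modify them):

```swift
import AppKit

// React the moment the user starts or ends a drag anywhere on screen,
// instead of waiting for the next poll tick. Keep a reference to the monitor.
let monitor = NSEvent.addGlobalMonitorForEvents(
    matching: [.leftMouseDown, .leftMouseDragged, .leftMouseUp]
) { event in
    // re-query the tracked window's bounds and reposition the border now
}
```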
(Drawing a border would be done by creating your own transparent window, slightly larger than, and at the same window layer as, the window you are tracking. And then simply draw a pretty green box into it. I'm not exactly sure about setting the z-order. The details of this part would be best as a separate question.)
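That said, a rough sketch of such a border window, with targetRect standing in for whatever your window tracking last reported:

```swift
import AppKit

// Placeholder: the rect your window tracking last reported for the target.
let targetRect = CGRect(x: 100, y: 100, width: 640, height: 400)

// A transparent, click-through, borderless window slightly larger than the
// target, used purely as a canvas for the highlight border.
let border = NSWindow(contentRect: targetRect.insetBy(dx: -4, dy: -4),
                      styleMask: .borderless, backing: .buffered, defer: false)
border.isOpaque = false
border.backgroundColor = .clear
border.ignoresMouseEvents = true   // never steal clicks from the target window
border.level = .floating           // approximation; matching z-order exactly is the fiddly part
border.orderFront(nil)
// draw the green box in the window's contentView
```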

iOS custom status bar that only replaces carrier icon

I'm creating a zombie-preparedness app for iOS and I thought it would be cool to have an "Apocalypse mode" which is similar to Airplane mode in that it replaces the status bar carrier icon, except with a little mushroom cloud or something instead of the little airplane.
Apocalypse mode would just be a boolean flag in my app that disables all features requiring a data connection (only within the app, not using any private APIs or anything...). If possible, I would still like to keep the clock, battery life, Bluetooth icon, and whatever else pops up in the status bar during normal operation.
I'm looking at the MTStatusBarOverlay library to implement this feature (related Stack Overflow post here). I know there is a possibility my app could get rejected for style because of this, but my thought is that I won't stray too far from the norm and will cross my fingers that Apple doesn't jump on me for it.
My questions are:
How can I copy over the clock and battery life icons? Do I need to hook into an event, or is there a UI element I can add?
Am I going about this the right way? Would it be better to just put a transparent overlay on top of the normal status bar, with a mushroom cloud that covers only the carrier icon, instead of replacing the status bar entirely? I'm worried about variable-length carrier names...
Of course, option 3 is that I forget the idea entirely and make some sort of different background for this mode instead, but that seems lame :P
I had a go at something similar a while ago. I created a status bar overlay that accepted touch events but didn't block the status bar from receiving touches, which is crucial for App Store acceptance.
You can check out my question and my answer; keep in mind it might not be current anymore. It worked great on iOS 4, but I never tested it on 5. Worth a try though.
As for the overlay itself, I suggest covering everything up to the clock and leaving the rest transparent; that should do the job.
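For illustration, the overlay trick might look like this in Swift; the window-level arithmetic and the "mushroom-cloud" asset name are assumptions, the fixed metrics reflect the era's status bar, and Apple may well reject or ignore this on current iOS:

```swift
import UIKit

// An extra window floating just above the status bar, sized to cover only the
// carrier area, passing all touches through to the real status bar.
let overlay = UIWindow(frame: CGRect(x: 0, y: 0, width: 80, height: 20))
overlay.windowLevel = UIWindow.Level(rawValue: UIWindow.Level.statusBar.rawValue + 1)
overlay.backgroundColor = .black              // match the status bar background
overlay.isUserInteractionEnabled = false      // crucial: don't intercept status bar touches
let cloud = UIImageView(image: UIImage(named: "mushroom-cloud"))  // hypothetical asset
cloud.frame = overlay.bounds
cloud.contentMode = .scaleAspectFit
overlay.addSubview(cloud)
overlay.isHidden = false
```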

Extending Functionality of Magic Mouse: Do I Need a kext?

I recently purchased a Magic Mouse. It is fantastic and full of potential. Unfortunately, it is seriously hindered by the software support. I want to fix that. I have done quite a lot of research and these are my findings regarding the event chain thus far:
The Magic Mouse sends full multitouch events to the system.
Multitouch events are processed in the MultitouchSupport.framework (Carbon)
The events are interpreted in the framework and sent up to the system as normal events
When you scroll with one finger it sends actual scroll wheel events.
When you swipe with two fingers it sends a swipe event.
No NSTouch events are sent up to the system. You cannot use the NSTouch API to interact with the mouse.
After I discovered all of the above, I disassembled the MultitouchSupport.framework binary and, with some googling, figured out how to insert a callback of my own into the chain so I would receive the raw touch event data. If you enumerate the list of devices, you can attach a callback to each one (trackpad and mouse). This finding would enable us to create a framework for using multitouch on the mouse, but only in a single application. See my post here: Raw Multitouch Tracking.
I want to add new functionality to the mouse across the entire system, not just a single app.
In an attempt to do so, I figured out how to use Event Taps to see if the lowest-level event tap would allow me to get the raw data, interpret it, and send up my own events in its place. Unfortunately, this is not the case. The event tap, even at the HID level, is still a step above where the input is being interpreted in MultitouchSupport.framework.
See my event tap attempt here: Event Tap - Attempt Raw Multitouch.
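For reference, a listen-only HID-level tap looks roughly like this in Swift (mask trimmed to scroll events for brevity); as described above, it only ever sees the already-interpreted events:

```swift
import CoreGraphics

// Listen-only tap at the lowest public level (the HID event tap).
let mask = CGEventMask(1 << CGEventType.scrollWheel.rawValue)
let tap = CGEvent.tapCreate(
    tap: .cghidEventTap,
    place: .headInsertEventTap,
    options: .listenOnly,
    eventsOfInterest: mask,
    callback: { _, type, event, _ in
        print("event type:", type.rawValue)   // gestures arrive as unnamed type values
        return Unmanaged.passUnretained(event)
    },
    userInfo: nil)

if let tap = tap {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CFRunLoopRun()
}
```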
An interesting side note: when a multitouch event is received, such as a swipe, the default case is hit and prints out an event number of 29. The header shows 28 as being the max.
On to my question, now that you have all the information and have seen what I have tried: what would be the best approach to extending the functionality of the Magic Mouse? I know I need to insert something at a low enough level to get the input before it is processed and predefined events are dispatched. So, to boil it down to single sentence questions:
Is there some way to override the default callbacks used in MultitouchSupport.framework?
Do I need to write a kext and handle all the incoming data myself?
Is it possible to write a kext that sits on top of the kext that is handling the input now, and filters it after that kext has done all the hard work?
My first goal is to be able to dispatch a middle button click event if there are two fingers on the device when you click. Obviously there is far, far more that could be done, but this seems like a good thing to shoot for, for now.
Thanks in advance!
-Sastira
How does what is happening in MultitouchSupport.framework differ between the Magic Mouse and a glass trackpad? If it is based on IOKit device properties, I suspect you will need a KEXT that emulates a trackpad but actually communicates with the mouse. Apple have some documentation on Darwin kernel programming and kernel extensions specifically:
About Kernel Extensions
Introduction to I/O Kit Device Driver Design Guidelines
Kernel Programming Guide
(Personally, I'd love something that enabled pinch magnification and more swipe/button gestures; as it is, the Magic Mouse is a functional downgrade from the Mighty Mouse's four buttons and [albeit ever-clogging] 2D scroll wheel. Update: last year I wrote Sesamouse to do just that, and it does NOT need a kext (just a week or two staring at hex dumps :-) See my other answer for the deets and source code.)
Sorry I forgot to update this answer, but I ended up figuring out how to inject multitouch and gesture events into the system from userland via Quartz Event Services. I'm not sure how well it survived the Lion update, but you can check out the underlying source code at https://github.com/calftrail/Touch
It requires two hacks: using the private Multitouch framework to get the device input, and injecting undocumented CGEvent structures into Quartz Event Services. It was incredibly fun to figure out how to pull it off, but these days I recommend just buying a Magic Trackpad :-P
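The public half of that mechanism is plain CGEventPost; here's a minimal Swift sketch of injecting a synthetic middle click (the undocumented part in Touch is the gesture-specific CGEvent field layout, which isn't shown here):

```swift
import CoreGraphics

// Build and inject a synthetic middle-button click system-wide.
let location = CGPoint(x: 400, y: 300)   // placeholder screen position
for type in [CGEventType.otherMouseDown, .otherMouseUp] {
    if let event = CGEvent(mouseEventSource: nil, mouseType: type,
                           mouseCursorPosition: location, mouseButton: .center) {
        event.post(tap: .cghidEventTap)
    }
}
```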
I've implemented a proof of concept of a userspace customizable multi-touch event wrapper.
You can read about it here: http://aladino.dmi.unict.it/?a=multitouch (available via the Wayback Machine)
--
all the best
If you get to that point, you may want to consider making the middle click three fingers on the mouse instead of two. I've thought about this middle-click issue with the Magic Mouse, and I notice that I often leave my second finger on the mouse even though I am only pressing for a left click. So a two-finger click might be mistaken for a single left click, and it would also require more effort from the user in always having to keep the second finger off the mouse. Therefore, if it's possible to detect, three fingers would cause less confusion and fewer headaches. I wonder where the first "middle button click" solution will come from, as I am anxious for my middle-click Exposé feature to return :) Best of luck.

Display something on the screen every time an action is made

I have a problem and am not sure how to solve it. I am developing a multi-touch game and have everything working fine, except for one small issue: I want to show messages on the playing screen each time the player makes an action. For example, when his finger moves right, a message says "this finger is moving right" nicely at the bottom of the screen; when the finger moves left, it says the finger is moving left, and so on. Can anyone show me how? I am using Cocos2D, though it would presumably be easier in plain Cocoa.
Thanks a lot for any help.
You'll probably need to be more specific with your question, but for now, here's a general answer:
Handling touch events on the iPhone and Handling touch ("trackpad") events on the Mac.
You'll receive and process the events per the above, then you'll display the results somehow. For testing, you'll probably just want to log the results to the console. For the final version, you might have a label or even a custom view that draws the "instruction" in some fancier way. If the latter is the case, you'll want to read up on custom views and drawing for whichever platform you're using (or both).
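As a plain UIKit sketch of the idea (Cocos2D would use its own touch delegate and a CCLabel instead; DirectionView and the message strings are just illustrative):

```swift
import UIKit

// Track horizontal finger movement and narrate it in a label at the bottom.
class DirectionView: UIView {
    let label = UILabel(frame: CGRect(x: 0, y: 440, width: 320, height: 40))
    private var lastX: CGFloat = 0

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        label.textAlignment = .center
        addSubview(label)
    }
    required init?(coder: NSCoder) { fatalError("not used in this sketch") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        lastX = touches.first?.location(in: self).x ?? 0
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let x = touches.first?.location(in: self).x else { return }
        label.text = x > lastX ? "this finger is moving right"
                               : "this finger is moving left"
        lastX = x
    }
}
```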

Apple Magic Mouse Api

I just bought a Magic Mouse and I like it pretty much. But as a Mac developer it's even cooler. There's one problem, though: is there already an API available for it? I want to use it in one of my applications, for example to detect the user's finger positions, swipe or stretch gestures, etc.
Does anyone know if there's an API for it (and how to use it)?
The Magic Mouse does not use the NSTouch API. I have been experimenting with it and attempting to capture touch information. I've had no luck so far. The only touch method that is common to both the mouse and the trackpad is the swipeWithEvent: method. It is called for a two-finger swipe on the device only.
It seems the touch input from the mouse is being interpreted somewhere else, then forwarded on to the public API. I have yet to find the private API that is actually doing the work.
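That one shared hook, in modern Swift spelling:

```swift
import AppKit

// The only touch-ish callback the Magic Mouse triggers: the responder-chain
// swipe event, fired for a two-finger swipe on the mouse surface.
class SwipeCatchingView: NSView {
    override func swipe(with event: NSEvent) {
        // deltaX/deltaY are ±1 or 0, indicating the swipe direction.
        print("swipe deltaX:", event.deltaX, "deltaY:", event.deltaY)
    }
}
```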
Take a look here: http://www.iphonesmartapps.org/aladino/?a=multitouch
There's a fully working proof of concept using the CGEventPost method.
--
all the best!
I have not tested it, but I would be shocked if it didn't use NSTouch. NSTouch is the API you use to interact with the multi-touch trackpads on current MacBook Pros (and the new MacBooks that came out this week). You can check out the LightTable sample project to see how it is used.
It is part of AppKit, but it is a Snow Leopard-only API.
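For reference, consuming NSTouch on a device that does report touches looks like this (per the answer above, the Magic Mouse apparently never delivers these):

```swift
import AppKit

// Opt a view in to touch events and log each new trackpad touch.
class TouchLoggingView: NSView {
    override func awakeFromNib() {
        allowedTouchTypes = [.indirect]   // trackpad-style touches
                                          // (older API: acceptsTouchEvents = true)
    }

    override func touchesBegan(with event: NSEvent) {
        for touch in event.touches(matching: .began, in: self) {
            print("touch at", touch.normalizedPosition)   // 0...1 device coordinates
        }
    }
}
```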
I messed around with the app below before getting my Magic Mouse. I was surprised to find that it also tracked the multi-touch points on the mouse.
There is a link in the comments to some source code that grabs the raw data in a similar way, but there is no source for the actual app itself.
http://lericson.blogg.se/code/2009/november/multitouch-on-unibody-macbooks.html