Nextion IDE Frame layer priority

I recently purchased the Nextion 2.4" touchscreen. I have run some test code that lets me switch images on the current page, but the image overlaps the buttons when it is displayed. Is there a way to keep the buttons on the top layer at all times, that is to say, is there a way to give the buttons higher priority?
http://wiki.iteadstudio.com/Nextion_HMI_Solution

Related

How to track the location of a window belonging to another app

When screen sharing a specific window on macOS with Zoom or Skype/Teams, they draw a red or green highlight border around that window (which belongs to a different application) to indicate it is being shared. The border follows the target window in real time, through moves, resizing, z-order changes, etc.
What macOS APIs and techniques might be used to achieve this effect?
You can find the location of windows using CGWindowListCopyWindowInfo and related APIs, which are available to sandboxed apps.
This is a very fast and efficient API, fast enough to be polled. The SonOfGrab sample code is a great platform to try this stuff out.
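For example, here is a minimal Swift sketch of that lookup. It assumes you identify the target window by its owning application's name (that choice, and the function name, are purely illustrative):

```swift
import CoreGraphics
import Foundation

// Returns the on-screen bounds of the first window owned by the named app.
// Poll this on a short timer to follow the window as it moves or resizes.
func frameOfWindow(ownedBy ownerName: String) -> CGRect? {
    let options: CGWindowListOption = [.optionOnScreenOnly, .excludeDesktopElements]
    guard let info = CGWindowListCopyWindowInfo(options, kCGNullWindowID)
            as? [[String: Any]] else { return nil }

    for entry in info {
        guard let owner = entry[kCGWindowOwnerName as String] as? String,
              owner == ownerName,
              let boundsDict = entry[kCGWindowBounds as String] as? NSDictionary,
              let bounds = CGRect(dictionaryRepresentation: boundsDict as CFDictionary)
        else { continue }
        // Note: these bounds are in CG screen coordinates (origin at the top-left).
        return bounds
    }
    return nil
}
```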
You can also install a global event monitor using +[NSEvent addGlobalMonitorForEventsMatchingMask:handler:] (available in the sandbox) to track mouse-down, drag, and mouse-up events, so you can respond immediately whenever the user starts or releases a drag. This way your response will be snappy.
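A rough Swift equivalent of that monitor (the handler body here is only a placeholder):

```swift
import AppKit

// React as soon as the user presses, drags, or releases the mouse anywhere.
let dragMonitor = NSEvent.addGlobalMonitorForEvents(
    matching: [.leftMouseDown, .leftMouseDragged, .leftMouseUp]
) { _ in
    // Re-query the tracked window's frame (see the snippet above) and
    // reposition the border overlay right away, so it keeps up with the drag.
}
```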
(Drawing a border would be done by creating your own transparent window, slightly larger than, and at the same window layer as, the window you are tracking, and then simply drawing a pretty green box into it. I'm not exactly sure about setting the z-order; the details of that part would be best asked as a separate question.)
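A very rough sketch of such an overlay window, assuming you already have the tracked window's frame from the snippet above (remember that CGWindowListCopyWindowInfo returns top-left-origin screen coordinates, which need converting before being used as an NSWindow frame):

```swift
import AppKit

// Borderless, transparent, click-through window that strokes a green border.
// Resize and move it whenever the tracked window changes.
func makeBorderWindow(over trackedFrame: CGRect) -> NSWindow {
    let borderWindow = NSWindow(
        contentRect: trackedFrame.insetBy(dx: -4, dy: -4),  // a few points larger
        styleMask: [.borderless],
        backing: .buffered,
        defer: false
    )
    borderWindow.isOpaque = false
    borderWindow.backgroundColor = .clear
    borderWindow.ignoresMouseEvents = true   // clicks pass through to the real window
    borderWindow.hasShadow = false
    borderWindow.level = .floating           // adjust to match the target's layer

    // Content view that draws the green box as a layer border.
    let view = NSView()
    view.wantsLayer = true
    view.layer?.borderColor = NSColor.systemGreen.cgColor
    view.layer?.borderWidth = 4
    borderWindow.contentView = view

    borderWindow.orderFrontRegardless()
    return borderWindow
}
```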

A-Frame - ensuring object is in view for different browser window sizes and resolutions

A-Frame newbie here: I am trying to position a menu in the current view of the A-Frame camera (adding the related entities as children of the camera entity, as suggested elsewhere as a way to create a heads-up display).
I want to position it at the top left of what the user currently sees. Specifying fixed x and y coordinates therefore does not quite work, since the camera's field of view depends on, for example, the screen resolution of the client the browser is running on (so on some devices the menu can be seen, while on others one would have to move the camera back or turn to see it).
Is there a way to find out at which x,y coordinates (relative to the camera) the top left of the view is? Or is there another way to place a menu in constant view (one that works in a classic browser window as well as in full-screen and VR mode)?
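One sketch of the geometry involved (shown in Swift only to keep a single language on this page; the math carries over directly to JavaScript/three.js): at a chosen distance in front of the camera, the visible half-height is distance × tan(vertical FOV / 2) and the half-width is that times the viewport aspect ratio, so the top-left corner sits at (-halfWidth, +halfHeight, -distance) in camera space (the camera looks down its local -Z axis). The function name and example values are assumptions for illustration.

```swift
import Foundation

// Top-left corner of the visible area, in camera-local coordinates,
// at a given distance in front of the camera.
func topLeftInCameraSpace(verticalFOVDegrees fov: Double,
                          aspect: Double,      // viewport width / height
                          distance d: Double) -> (x: Double, y: Double, z: Double) {
    let halfHeight = d * tan(fov * .pi / 180 / 2)
    let halfWidth  = halfHeight * aspect
    return (x: -halfWidth, y: halfHeight, z: -d)
}

// Example: A-Frame's default camera FOV is (I believe) 80°; for a 16:9 window
// and a menu placed 1 m in front of the camera:
// topLeftInCameraSpace(verticalFOVDegrees: 80, aspect: 16.0 / 9.0, distance: 1)
```

Because the aspect ratio changes with the window size, you would recompute this on resize (or on enter-VR) and reposition the menu entity accordingly.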

Replicating Preview image viewing with NSPageController

Thanks to Apple's PictureSwiper sample code and the very nice NSPageController tutorial from juniperi here on Stack Overflow, it's pretty easy to get tantalizingly close to the image-viewing capabilities of Preview. Specifically, I want to replicate the ability to swipe forwards/backwards between images/pages, use pinch-to-zoom to resize the images, use a rotation gesture to rotate the images/pages, and support two-page mode.
But there are some hurdles that make me wonder if NSPageController is the right approach or if it is too limiting and a custom view controller is needed.
1) Images of varying sizes are simply displayed stacked, and if the top/upper-layer image is smaller, the underlying image(s) show through. Preview, using the same images, hides the larger "underlying" images/pages and fades the underlying image in/out with the swipe transition. I could hide underlying images by linking the page controller to the view rather than the image cell (like PictureSwiper), but that causes the entire view to scale on pinch-to-zoom and overall looks clunky.
2) Is it possible to use NSPageController with more than one image cell, e.g. two-page mode?
3) Is page/image rotation possible with NSPageController?
4) Is it possible to lock the zoom level for all the images, so they are uniformly displayed as navigated?
My apologies if this is too general a question, but the gist is whether NSPageController is too limited and problematic to extend, which would necessitate building a custom controller from scratch.
Thanks.

How can I scale each and every screen to the browser resolution In a Expression Blend Sketchflow project?

I've been looking all over the place for a solution to this but haven't had success. I have a Sketchflow project, and I want to scale every Screen to the browser resolution at runtime, that is, scale every element of the current layout to fit the screen.
Do you want the objects themselves to get bigger to fill the screen, or to spread out? For objects to get bigger, you can wrap the whole thing in a Viewbox.

Collision Detection with animateWithDuration

I have two UIButtons on the base that, on click of a third UIButton, move to specific positions (these positions are generated by a complex algorithm). To move them, I am using animateWithDuration:delay:options:. In a different application, I was moving UIButtons randomly around the screen using NSTimers, so detecting a collision was easy with a simple CGRectIntersectsRect. I have two options:
1. Is it possible to detect their collision with each other if I'm moving them using animateWithDuration?
2. If I use NSTimers, I would be able to detect collisions, but in that case, how do I move them to a particular position on the screen?
Any help would be appreciated!
Check out the link: detect collision of two moving buttons in iPhone
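A sketch of option 1 (a common approach, not necessarily the one in the linked answer): let the UIView animation run and poll the buttons' presentation layers with a CADisplayLink, since the presentation layer reports each button's in-flight frame, which you can test with CGRect intersection. The CollisionWatcher type and target names below are just for illustration.

```swift
import UIKit

// Polls two buttons' presentation layers every frame and reports when they intersect.
final class CollisionWatcher: NSObject {
    private var displayLink: CADisplayLink?
    private let buttonA: UIButton
    private let buttonB: UIButton
    var onCollision: (() -> Void)?

    init(_ a: UIButton, _ b: UIButton) {
        buttonA = a
        buttonB = b
        super.init()
    }

    func start() {
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func tick() {
        // presentation() is the layer's current on-screen state mid-animation.
        guard let a = buttonA.layer.presentation()?.frame,
              let b = buttonB.layer.presentation()?.frame else { return }
        if a.intersects(b) {
            onCollision?()
            stop()
        }
    }
}

// Usage alongside the animation (targetA/targetB come from your algorithm):
// let watcher = CollisionWatcher(button1, button2)
// watcher.onCollision = { print("collided") }
// watcher.start()
// UIView.animate(withDuration: 1.0, delay: 0, options: [.curveEaseInOut], animations: {
//     button1.center = targetA
//     button2.center = targetB
// }, completion: { _ in watcher.stop() })
```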