I just wanted to ask how to handle the event when, for example, the head comes into the sensor's view. I read the documentation but couldn't find a method for that.
There is no event emitter in the SDK for this purpose. You need to poll the BodyFrame and check whether any new skeleton has arrived. You can check out my repository if you need a reference; in it I built more complicated gestures such as click and swipe.
https://github.com/shanilfernando/VRInteraction
Especially the following places:
https://github.com/shanilfernando/VRInteraction/blob/master/main.cpp#L187
https://github.com/shanilfernando/VRInteraction/blob/master/DepthBasics.cpp#L243
I am learning app development, so it is very important to me that you guide me. I have a main activity and two fragments: one for the list items and another for Now Playing. When a list item is clicked, it passes the URL to the second (Now Playing) fragment and starts playing the music. Everything works fine, but I want to do this in a background/foreground service with notification controls, and I want to use ExoPlayer, or whichever player is better for streaming different types of audio. Please help me achieve this. Thank you.
I have a client that adamantly insists on a solution with an embedded camera, meaning a ContentPage with a camera stream and custom buttons and icons, similar to https://github.com/pierceboggan/Moments, or at least that's how I understand it, seeing as it is a Snapchat clone. My client also wants swipe navigation similar to how Snapchat works.
However, as far as I can tell most of what is utilized in that solution has been deprecated.
I have suggested using the Media Plugin https://github.com/jamesmontemagno/MediaPlugin, but they're not satisfied with the camera being pushed onto the navigation stack.
I've looked into implementing it natively and using dependency injection, but it appears to be an overwhelming amount of work just to implement the most basic functions, particularly with Android's Camera2.
I'm hoping someone can give me good news of an easier alternative, or a modification to Moments, the Media Plugin, or anything similar that will satisfy the requirements, or tell me whether my only option really is this time-consuming and complex.
Starting from the code of Moments, you can achieve what you want. I did this for iOS.
You will have to create a custom renderer to display the camera page. You will be able to add buttons on top of it.
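For reference, the native iOS view such a renderer would host is typically built from an AVCaptureSession plus an AVCaptureVideoPreviewLayer, with the buttons layered on top. Below is a minimal sketch of that native side (shown in Swift purely for illustration; a Xamarin custom renderer would use the bound C# equivalents of the same AVFoundation types, and the class and button names here are placeholders, not taken from the linked example):

```swift
import UIKit
import AVFoundation

// Hypothetical native iOS view: a live camera preview with a capture button
// overlaid on top. A Xamarin.Forms custom renderer would host the equivalent
// of this view inside the ContentPage. Requires NSCameraUsageDescription in
// Info.plist.
final class CameraPreviewView: UIView, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    private let previewLayer = AVCaptureVideoPreviewLayer()
    private let captureButton = UIButton(type: .system)

    override init(frame: CGRect) {
        super.init(frame: frame)

        // Wire the default camera into the capture session.
        session.beginConfiguration()
        if let camera = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: camera),
           session.canAddInput(input) {
            session.addInput(input)
        }
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
        session.commitConfiguration()

        // The preview layer shows the live stream; the button floats above it.
        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill
        layer.addSublayer(previewLayer)

        captureButton.setTitle("Take Photo", for: .normal)
        captureButton.addTarget(self, action: #selector(takePhoto), for: .touchUpInside)
        addSubview(captureButton)

        session.startRunning()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    override func layoutSubviews() {
        super.layoutSubviews()
        previewLayer.frame = bounds
        captureButton.frame = CGRect(x: 0, y: bounds.height - 60, width: bounds.width, height: 44)
    }

    @objc private func takePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // AVCapturePhotoCaptureDelegate: hand the captured image back to shared code.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        print("Captured image of size \(image.size)")
    }
}
```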
You could try this example, which uses a custom renderer to add a take-photo button and a switch-camera button on top of the camera view. It works on both the iOS and Android platforms.
Main Page:
Camera View with custom button page:
Hi all!
Is there any way to build a custom camera component in Codename One?
I googled and found out about PeerComponent, but I don't know how to use it.
I would just like to use it like a Label component with two buttons (one for taking the picture and one for retaking it).
Kindly provide some small sample code for me. Best regards.
[UPDATE]
I need this for the following reason.
First of all, I need a square image taken by the camera, and the user should be able to see how the picture will be framed.
There are two possible solutions here:
First, if I have to use the full-screen camera, it would be better to draw or overlay a square rectangle on the camera view so that the user knows which area will be captured.
Second, if overlaying and drawing is difficult (or requires native code), I need a custom component for the camera area, something like a PeerComponent. I would like to be able to place it anywhere on the screen as a square rectangle. Then users won't need any overlay or drawing on it, because it is already square.
That's all I need.
Regards.
Yes, it's certainly possible to create a component like this, and we do intend to create one at some point in the future. You can beat us to the punch.
First, you should familiarize yourself with how native interfaces work. This video is a good start:
https://www.codenameone.com/how-do-i---access-native-device-functionality-invoke-native-interfaces.html
This series of blog posts demonstrates how to wrap third-party SDKs into Codename One on Android and iOS.
https://www.codenameone.com/blog/integrating-3rd-party-native-sdks-part-1.html
https://www.codenameone.com/blog/integrating-3rd-party-native-sdks-part-2.html
https://www.codenameone.com/blog/integrating-3rd-party-native-sdks-part-3.html
Although it doesn't include an example with PeerComponent, it is helpful for getting a grasp of the process. Adding peer components into the mix is really just a matter of returning the corresponding "View" type from a native interface. On Android that is an android.view.View, on iOS it is a UIView, in JavaScript it is a DOM element, in UWP it is a FrameworkElement, and in the simulator it is a javax.swing.JComponent.
This blog post does include an example of a peer component, but it is targeting UWP:
https://www.codenameone.com/blog/uwp-native-interfaces-mix-c-java.html
Once you have a grasp of the material, you should look at relevant examples. Currently, the most complete example of a cn1lib that implements a native peer that I'm aware of is the Google Maps lib:
https://github.com/codenameone/codenameone-google-maps
You can see the Android native implementation here, and the iOS native implementation here.
You may want to refer to the existing code for image capture in Codename One as well.
Android: https://github.com/codenameone/CodenameOne/blob/master/Ports/Android/src/com/codename1/impl/android/AndroidImplementation.java#L5788-L5811
https://github.com/codenameone/CodenameOne/blob/master/Ports/Android/src/com/codename1/impl/android/AndroidImplementation.java#L5701-L5714
Though it uses an Intent to open the native capture dialog, so it may not be too relevant.
iOS: https://github.com/codenameone/CodenameOne/blob/master/Ports/iOSPort/nativeSources/IOSNative.m#L2879-L2927
Before iOS 7 came out, we noticed an issue:
Music remote control from the earbuds or the springboard can hijack our audio session, even if we set the category to solo-ambient or another exclusive mode.
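For context, "setting the category" refers to the audio session configuration, roughly like this (a minimal modern-Swift sketch; the original code would have been the iOS 6-era Objective-C equivalent):

```swift
import AVFoundation

// Claim the solo-ambient ("exclusive") audio session category and activate it.
func configureSoloAmbientSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.soloAmbient)
    try session.setActive(true)
}
```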
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app code become the first responder to remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get swallowed by the first responder and there is no way to push them back up the responder chain.
We tried to become first responder and block remote-control events altogether while we are in solo-ambient. This worked fine on iOS 6 and still works with earbud control on iOS 7, but it fails with iOS 7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent completely, which is where we put our blocking code.
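For reference, the first-responder setup described in that attempt looks roughly like this (a minimal Swift sketch of the general pattern; the controller name is a placeholder, and the original code would have been Objective-C):

```swift
import UIKit

// Sketch of the "become first responder and swallow remote-control events" attempt.
class PlayerViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Ask iOS to deliver remote-control events to the app, and make this
        // controller the first responder so the events arrive here.
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        resignFirstResponder()
        UIApplication.shared.endReceivingRemoteControlEvents()
    }

    override var canBecomeFirstResponder: Bool { true }

    override func remoteControlReceived(with event: UIEvent?) {
        // Intentionally empty: earbud events stop here while this controller is
        // first responder, but iOS 7's Control Center does not route its commands
        // through this handler at all, which is why the blocking approach fails.
    }
}
```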
I read something elsewhere that:
You can't block the music app. your app can become one though (apple
won't like that) and then the control center would control yours.
But I found no documentation whatsoever about Control Center.
And as said above, Control Center does not go through the normal remote-control hooks, even when an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by
Control Center, the earbuds, etc... it is not so that your app can eat
said controls, preventing control of other apps from said sources. It
only worked in iOS6 because of a bug in iOS, now fixed in iOS7
Is it that what we had been using relied on this bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with remote controls and Control Center?
Where can we find up-to-date documentation on remote-control events and Control Center?
The remote-control issue was mysteriously fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
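A rough sketch of the corresponding handler after the iOS 7 rebuild (Swift shown for brevity; the AppDelegate structure and the AVPlayer instance are illustrative placeholders, not the original code):

```swift
import UIKit
import AVFoundation

// The app delegate (a UIResponder) receives the events once the app registers
// for them. Control Center on iOS 7 sends separate Play and Pause subtypes
// instead of iOS 6's single TogglePlayPause subtype.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    let audioPlayer = AVPlayer()  // placeholder for the app's real playback object

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        application.beginReceivingRemoteControlEvents()
        return true
    }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPlay:              // sent by Control Center on iOS 7
            audioPlayer.play()
        case .remoteControlPause:             // sent by Control Center on iOS 7
            audioPlayer.pause()
        case .remoteControlTogglePlayPause:   // still sent by some sources, e.g. earbuds
            if audioPlayer.rate == 0 { audioPlayer.play() } else { audioPlayer.pause() }
        default:
            break
        }
    }
}
```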
I'm wondering if there's any way to have my application be notified when a drag-and-drop operation starts anywhere on the screen, even if I don't have an active window there.
I've looked into the normal drag-and-drop APIs, but I haven't spotted anything that does this. The NSDraggingDestination protocol, along with the -[NSWindow/NSView registerForDraggedTypes:] method, lets you notice when someone drags something over your window, but I'd like to be notified when any dragging operation starts anywhere on the screen.
Any tips on how to go about this? Is there a standard Cocoa API for it, or is there a private API / some kind of dirty hack to get this information?
Thanks in advance :)
Take a look at NSEvent’s +addGlobalMonitorForEventsMatchingMask:handler:. I’m not sure if you can track mouse dragging but it’s certainly possible to track mouse button up/down events.
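A minimal sketch of that approach, assuming left-button drags are the events of interest:

```swift
import Cocoa

// Globally monitor drag-related mouse events delivered to *other* applications.
// Global monitors can only observe (not modify or consume) these events, and on
// recent macOS versions they require the user to grant Accessibility / input
// monitoring permission. Whether a given drag is actually a drag-and-drop
// operation still has to be inferred by your own code.
let dragMonitor = NSEvent.addGlobalMonitorForEvents(
    matching: [.leftMouseDragged, .leftMouseUp]
) { event in
    print("Global mouse event: \(event.type) at \(NSEvent.mouseLocation)")
}

// Remove the monitor when it is no longer needed:
// if let dragMonitor = dragMonitor { NSEvent.removeMonitor(dragMonitor) }
```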
I haven't done it, but I am assuming you need some kind of external software monitoring all mouse activity on the system and reporting it to your app (or your application doing this itself), as dragging events are usually reported to your app only when there is activity inside your app's window.