Turning off the Kinect IR emitter

I'd like to use two XBOX Kinect cameras simultaneously, but their IR signals interfere.
Therefore, I'd like to be able to toggle their IR emitters in an alternating fashion. I read that the Microsoft SDK has a ForceInfraredEmitterOff flag that can be set, but that it does not work with Xbox Kinects.
Other sources on the web say that the IR emitter stops when you stop the depth buffer stream. However, that does not seem to work.
I'm using the openni2 Python bindings provided by PrimeSense. Is there any way I can switch the IR emitters off with them?
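For reference, here is a minimal sketch of the stop-the-depth-stream approach using the openni2 Python bindings (the primesense package). It assumes OpenNI2 and its redistributable are installed, and, as noted above, stopping the stream may not actually switch the emitter off on Xbox hardware.

    # Sketch: test whether stopping the depth stream idles the IR emitter,
    # using the primesense/openni2 Python bindings.
    import time
    from primesense import openni2

    openni2.initialize()             # finds the OpenNI2 redistributable (OPENNI2_REDIST)
    dev = openni2.Device.open_any()  # first Kinect found

    depth = dev.create_depth_stream()
    depth.start()
    depth.read_frame()               # emitter is on while the stream is running
    time.sleep(2)

    depth.stop()                     # reportedly idles the emitter on some devices
    time.sleep(5)                    # check the projector (e.g. with a phone camera) here

    depth.start()                    # resume depth on this camera
    depth.read_frame()

    openni2.unload()

With two cameras you would open each device separately and alternate which one has its depth stream running.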

Related

Kinect IR Emitter Continuous or Pulsed?

I'm a student intern and have a work project using Kinect cameras. Unfortunately I can't go into project details due to confidentiality, but I need to know whether the IR dot array emitted from the IR blasters within the Kinect is a continuous stream or pulsed. I only care about the emitted IR light, not the reception by the IR camera. The light would shine on other IR sensors in the environment that detect when something passes through their IR field of view, and I have been told it will not interfere as long as the stream is continuous.
I would appreciate any help/ information you guys could give.
The Kinect 360 camera projects a static pattern of unevenly distributed points. As far as I know, the pattern is continuous, not pulsed.

How to disable IR emitter on Kinect2 using iai_kinect2 or libfreenect2?

Is there a way to disable the infrared emitter of the Kinect2 (i.e., KinectOne) using the libfreenect2 library or its ROS binding iai_kinect2?
I need it to avoid interference between the Kinect2 and a PrimeSense sensor.
Alternatively, covering the emitter should do the trick, right? Or do I risk messing up the depth readings?
Currently there is no way to turn off the IR emitter while still using the Kinect2, neither in the official Microsoft SDK nor in libfreenect2.
But it wouldn't matter anyway, since you won't get any depth readings from the Kinect2 without its IR emitter.

Does the Web Audio API support controlling the volume of each stereo channel, left and right?

I'm going to develop a JavaScript game that depends mainly on audio effects.
I searched for whether the Web Audio API supports controlling the volume of each stereo channel, left and right, so that I can turn one of them up or down, but I did not find an answer to my question.
It sounds like you just want a stereo panner node (StereoPannerNode). That would be the simplest way to control the relative volume of the left and right channels.
If for some reason you want more control over the individual channels, you can use a channel splitter node to split the stereo signal into two mono signals, run each through its own gain node, and then use a channel merger node to recombine them into a stereo signal.

Use PsychoPy to activate the MS Kinect (or other IR devices)

Is there a way to activate the MS Kinect out of PsychoPy?
I am using PsychoPy for an experiment, and I am using infrared (IR) cameras to capture participants' movements. I want to automatically send a marker that is visible to the IR cameras from within the PsychoPy environment. The idea is to use the Kinect or another USB IR device (e.g., an Asus/PrimeSense sensor, or a simple USB-mounted IR LED) to send markers for certain events in my experiment (meaning whenever event X happens in PsychoPy, an IR signal should be emitted from the Kinect or another IR device).
(I cannot use the sound jack to trigger an IR LED since I need the audio exit for the experiment.)
Thanks!
PsychoPy can send signals over the serial or parallel ports or connect to specific equipment like LabJack. See the API docs here: http://www.psychopy.org/api/serial.html, http://www.psychopy.org/api/parallel.html, http://www.psychopy.org/api/hardware.html.
If you can find or build a piece of hardware with IR LEDs that can respond to one of the ways in which PsychoPy can communicate, then yes, you could trigger IR pulses as required.
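As a concrete illustration of that suggestion, here is a minimal sketch assuming a hypothetical IR LED driver that listens on a serial port and understands single-byte on/off commands; the port name and command bytes are placeholders that depend entirely on the hardware you build or buy.

    # Sketch: pulse a (hypothetical) serial-controlled IR LED from within a
    # PsychoPy experiment whenever an event of interest occurs.
    import serial                    # PsychoPy uses pyserial for serial I/O
    from psychopy import core

    led = serial.Serial('COM3', baudrate=9600, timeout=1)  # placeholder port name

    def send_ir_marker(duration=0.05):
        """Switch the hypothetical IR LED on for `duration` seconds."""
        led.write(b'1')              # assumed 'on' command for the LED driver
        core.wait(duration)
        led.write(b'0')              # assumed 'off' command

    # Call send_ir_marker() at the point where event X happens in the trial loop.
    send_ir_marker()
    led.close()

The same pattern works with psychopy.parallel (ParallelPort.setData) if your marker hardware hangs off a parallel port instead.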

How can I access the Kinect/device via OpenNI?

I was looking over the documentation, trying to find anything that would allow me to access the Kinect/device. I'm trying to get accelerometer data, but I'm not sure how. So far I've spotted two things in the guide and docs: XnModuleDeviceInterface/xn::ModuleDevice and XnModuleLockAwareInterface/xn::ModuleLockAwareInterface.
I'm wondering if I can use the ModuleDevice Get/Set methods to talk to the device and ask for accelerometer data. If so, how can I get started?
Also, I was wondering whether it would be possible to 'lock' OpenNI functionality temporarily while I get accelerometer data via freenect or something similar, then 'unlock' it after the reading is done.
Has anyone tried this before? Any tips?
I'm currently using the SimpleOpenNI wrapper and Processing, but have used OpenFrameworks and the C++ library, so the language wouldn't be very important.
The standard OpenNI Kinect drivers don't expose or allow access to any accelerometer, motor, or LED controls. All of these controls are done through the "NUI Motor" USB device (protocol reference), which the SensorKinect Kinect driver doesn't communicate with.
One way around this is to use a modified OpenNI SensorKinect driver, such as this one, which does connect to the NUI Motor device and exposes basic accelerometer and motor control via a "CameraAngleVertical" integer property. It appears that you should be able to read/write an arbitrary integer property using SimpleOpenNI and Processing.
If you're willing to use a non-OpenNI-based solution, you can use Daniel Shiffman's Kinect Processing library, which is based on libfreenect. You'll get good accelerometer, motor, and LED support, but you'll lose access to the OpenNI skeleton/gesture support. A similar library for OpenFrameworks is ofxKinect.
Regarding locking of OpenNI nodes, my understanding is that this just prevents properties from updating and does nothing at the USB driver level. Switching between drivers--PrimeSense-based SensorKinect and libusb-based libfreenect--at runtime is not possible. It may be possible (I haven't tried it) to configure OpenNI for the camera device, and to use freenect to communicate with the NUI Motor device. No locking/synchronization between these devices should be required.
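To make that last suggestion concrete, here is a rough sketch that uses the libfreenect Python wrapper to talk to the NUI Motor device (tilt, LED, accelerometer) while leaving the cameras to OpenNI. The function names follow the wrapper's demo scripts; treat them as assumptions and check them against your installed version.

    # Rough sketch: read the accelerometer and drive the motor/LED through
    # the libfreenect Python wrapper, without touching the camera streams.
    import freenect

    ctx = freenect.init()
    dev = freenect.open_device(ctx, 0)       # motor, LED and accelerometer live here

    freenect.set_led(dev, freenect.LED_GREEN)
    freenect.set_tilt_degs(dev, 10)          # tilt the head up 10 degrees

    freenect.update_tilt_state(dev)          # refresh the cached motor/accel state
    state = freenect.get_tilt_state(dev)
    x, y, z = freenect.get_mks_accel(state)  # accelerometer reading in m/s^2
    print("accel:", x, y, z)

    freenect.close_device(dev)
    freenect.shutdown(ctx)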