Kinect 2 shows black screen while capturing Infrared Basics

I am trying to use the Kinect 2 and SDK v2 for capturing infrared images/videos.
The Kinect shows the depth and RGB images properly, but when I try to visualize the Infrared Basics sample in Kinect for Windows, it does not show any image, just a black screen.
What is the reason for this? I reinstalled SDK v2, but the problem remains. In a similar post someone suggested installing a newer version, which I did, but I still have the same problem. Can anyone suggest a solution?
Thanks

It is better to use the "KinectConfigurationVerifierSetup" tool to test the system requirements. I also suggest you use the Infrared Basics-WPF sample in the SDK Browser; you can also take that sample code and install it on your computer. If the infrared data source still does not show, you could test the Kinect on another computer. A quick way to rule out the sample itself is sketched below.
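To separate a stream problem from a rendering problem, here is a minimal C++ check against the Kinect SDK v2 that only verifies whether infrared frames arrive; it is a sketch with error handling trimmed, not a full application:

#include <Windows.h>
#include <Kinect.h>
#include <cstdio>

int main()
{
    // Open the default sensor and its infrared source.
    IKinectSensor* sensor = nullptr;
    if (FAILED(GetDefaultKinectSensor(&sensor)) || !sensor)
        return 1;
    sensor->Open();

    IInfraredFrameSource* source = nullptr;
    sensor->get_InfraredFrameSource(&source);
    IInfraredFrameReader* reader = nullptr;
    source->OpenReader(&reader);

    // The first frames can take a moment; poll for up to ~10 seconds.
    for (int i = 0; i < 300; ++i)
    {
        IInfraredFrame* frame = nullptr;
        if (SUCCEEDED(reader->AcquireLatestFrame(&frame)))
        {
            UINT capacity = 0;
            UINT16* buffer = nullptr;
            frame->AccessUnderlyingBuffer(&capacity, &buffer);
            printf("Got IR frame: %u pixels, first value %u\n",
                   capacity, (unsigned)buffer[0]);
            frame->Release();
            break;
        }
        Sleep(33);
    }

    reader->Release();
    source->Release();
    sensor->Close();
    sensor->Release();
    return 0;
}

If frames arrive here but the viewer still renders black, the fault is on the display side, which matches the GPU driver answer below.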

I fixed my problem by updating the GPU driver. The older version had a conflict/bug, which Nvidia has since removed; after installing the new driver, the Kinect started showing infrared images.
Also pay attention to your graphics card settings: switching your computer to auto or to Intel HD Graphics may work.


A-Frame VR: Eyeoffset Settings

This (see screenshot) is what gets loaded on my OnePlus 6 phone, and it creates a problem with the stereoscopic vision: no one is able to focus with such a high offset between the left and right eyes.
Can anyone help me with setting the eyeoffset parameters? I haven't been able to figure them out from the Aframe.io documentation or from GitHub.
Or is there something I can do from the THREE.js camera?
The parameters for the OnePlus 6 (A6000) are missing from the webvr-polyfill device database. You can fork the webvr-polyfill and its database, add your device (you need to know the dpi and bezel width, as sketched below), and see if the problem is solved in the examples. A-Frame will pick it up when we bump the polyfill version in the next release. In the meantime you can do your own A-Frame build pointing to your webvr-polyfill fork.
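For illustration, an entry in the polyfill's device database (dpdb.json) has roughly this shape; the user-agent rule, dpi values, and bezel width below are placeholder assumptions that you would have to measure or verify for the OnePlus 6, not confirmed numbers:

{
  "type": "android",
  "rules": [ { "ua": "ONEPLUS A6000" } ],
  "dpi": [ 402.0, 402.0 ],
  "bw": 4,
  "ac": 1000
}

In that format, "dpi" holds the panel's horizontal and vertical pixel density and "bw" its bezel width in millimeters; getting those values right is what corrects the rendered eye offset.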

Liveview on Android/QX1 Sony Camera API 2.01 fails

Using the supplied Android demo from
https://developer.sony.com/downloads/all/sony-camera-remote-api-beta-sdk/
I connected to the WiFi connection of a Sony QX1. The sample application finds the camera device and is able to connect to it.
The liveview is not displaying correctly. At most, one frame is shown and the code hits an exception in SimpleLiveViewSlicer.java
if (commonHeader[0] != (byte) 0xFF) {
    throw new IOException("Unexpected data format. (Start byte)");
}
Shooting a photo does not seem to work either. Zooming does work (the lens moves). The camera works fine when using the PlayMemories app directly, so it is not a hardware issue.
Hoping for advice from Sony on this one: standard hardware and the demo application should work.
Can you provide some details of your setup?
What version of Android SDK are you compiling with?
What IDE and OS are you using?
Have you installed the latest firmware? (http://www.sony.co.uk/support/en/product/ILCE-QX1#SoftwareAndDownloads)
Edit:
We tested the sample code using a QX1 lens and the same setup as you, and were able to run it just fine.
One thing to check is whether the liveview is ready to transfer images. To confirm whether the camera is ready to transfer liveview images, the client can check the "liveviewStatus" status of the "getEvent" API (see the API specification for details); a minimal exchange is sketched below. Perhaps there is some timing issue due to connection speed that is causing the crash.
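As a sketch of that check (the endpoint URL and the position of the entry in the result array are assumptions here; take the exact details from the API specification), the client POSTs a getEvent call to the camera's JSON-RPC endpoint and waits until liveviewStatus is true before opening the liveview stream:

POST http://<camera ip>:<port>/sony/camera
{"method": "getEvent", "params": [false], "id": 1, "version": "1.0"}

The result array in the response then contains, among other entries, an object such as:

{"type": "liveviewStatus", "liveviewStatus": true}

Only once liveviewStatus reports true is it safe to call startLiveview and begin slicing JPEG frames from the stream.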

Alignment Kinect OpenNI 2.2

I know device.setImageRegistrationMode(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR) doesn't support the Kinect, so is there any other way to quickly and efficiently align RGB and depth on the Kinect using OpenNI 2.2?
Thanks
The method you have posted does not work with the "Kinect for Windows"; it is only supported by the "Kinect for Xbox 360".
But you can try to do this after creating your devices and your video modes:
device.setImageRegistrationMode(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR);
device.setDepthColorSyncEnabled(true);
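In context, that sequence looks roughly like this (a sketch assuming a standard OpenNI 2.2 installation; error handling is abbreviated):

#include <OpenNI.h>

int main()
{
    using namespace openni;

    if (OpenNI::initialize() != STATUS_OK)
        return 1;

    Device device;
    if (device.open(ANY_DEVICE) != STATUS_OK)
        return 1;

    // Create both streams first; registration applies to the pair.
    VideoStream depth, color;
    depth.create(device, SENSOR_DEPTH);
    color.create(device, SENSOR_COLOR);

    // Not every device/driver exposes hardware registration; check first.
    if (device.isImageRegistrationModeSupported(IMAGE_REGISTRATION_DEPTH_TO_COLOR))
        device.setImageRegistrationMode(IMAGE_REGISTRATION_DEPTH_TO_COLOR);
    device.setDepthColorSyncEnabled(true);

    depth.start();
    color.start();
    // ... read and process frames here ...
    depth.stop();
    color.stop();
    device.close();
    OpenNI::shutdown();
    return 0;
}

If isImageRegistrationModeSupported() returns false for your Kinect, hardware registration is simply unavailable through this driver, and you would have to map depth pixels to color yourself, e.g. with openni::CoordinateConverter.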

Kinect shows only grayscale images instead of colored ones

I have a very weird problem and haven't found any answer yet, so maybe someone here has seen something like this?
When I start the NiViewer demo, it works and shows the depth image on the left, but a grayscale image on the right.
When I start the Sample-NiSimpleViewer demo, it won't work and gives the error
The device image format must be RGB24
The weirdest thing is that about 2 weeks ago everything was fine: no errors, and the image was colored.
I think the change occurred after installing the ROS packages for OpenNI, or maybe I did some damage in the settings (however, I am a very inexperienced user and don't recollect changing anything anywhere).
I thought that maybe the Kinect was broken, but when I start it from under ROS (rviz or image_view) it actually shows a colored image.
Anyone have any suggestions?
Have you tried openni_launch with RViz? RViz can also visualize RGB images / point clouds.
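The "must be RGB24" error suggests the image generator's pixel format is no longer RGB24, which would also explain the grayscale right-hand image in NiViewer. Here is a hedged OpenNI 1.x sketch that checks and forces the format back; forceRgb24 is a hypothetical helper, and it assumes a context already initialized from the sample's XML config, as NiSimpleViewer does:

#include <XnCppWrapper.h>

// Force the image generator back to RGB24 (what NiSimpleViewer expects).
// Sketch only: 'context' must already be initialized, e.g. from
// SamplesConfig.xml as in the OpenNI 1.x samples.
void forceRgb24(xn::Context& context)
{
    xn::ImageGenerator imageGen;
    if (context.FindExistingNode(XN_NODE_TYPE_IMAGE, imageGen) == XN_STATUS_OK &&
        imageGen.GetPixelFormat() != XN_PIXEL_FORMAT_RGB24)
    {
        // Some installs switch the default image format (e.g. to YUV422),
        // which triggers the "must be RGB24" error in the sample.
        imageGen.SetPixelFormat(XN_PIXEL_FORMAT_RGB24);
    }
}

If the colored image comes back after this, the ROS/OpenNI installation had merely changed the default image format rather than broken the device.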

RGBDToolkit calibrate correspondence cannot display the live view from my Canon digital camera

Hi, I am trying to calibrate my depth camera with my digital camera, but I have met some problems using the RGBDKinectCapture:
1. Why can't the calibrate correspondence show the RGB view from my digital camera (Canon EOS 600D)? I have set it to live view mode.
2. It cannot recognize my Kinect (1414) on Win7 in the example application (downloaded from RGBDToolkit.com), and I cannot run the source on Windows; it just gives an 'enable_if' error (the name is ambiguous with std::tr1::enable_if).
3. What is the method for taking a live preview from (all) HD cameras in RGBDToolkit?
Is it done without the camera's SDK?
Which method is it based on?
Which class? I want to debug it...
I'm so sorry... I see now:
1. RGBDToolkit does not display a live preview from the HD camera; it just loads its recorded video.
2. I cannot run the example on Windows because it is built against a different Kinect SDK (the source includes the SDK).