RaZberry: Z-Wave Plus devices vs Z-Wave devices - raspberry-pi2

I have a Sigma Designs 5202 Z-Wave board (a RaZberry for my Raspberry Pi 2 B, 2014), so I would like to know whether I can use both Z-Wave Plus and plain Z-Wave devices with it.
Do you know if the RaZberry can also control plain Z-Wave devices?

The answer is yes. Z-Wave is a protocol, and the new Z-Wave Plus is a protocol that builds on the previous one, so if you were using the older devices you can now use the newer ones alongside them as well. So there's no problem!

Yes, the RaZberry board also integrates a Z-Wave controller. You can connect any Z-Wave component to any Z-Wave controller; that interoperability is required behaviour for Z-Wave.

Yes. Z-Wave Plus is completely backwards compatible with standard Z-Wave. Anything that works with Z-Wave Plus will also work with Z-Wave, and anything that works with Z-Wave will work with Z-Wave Plus, although it may not support all the added features of Z-Wave Plus.
Here are some of the added features of Z-Wave Plus.

Related

Type-C USB-C trigger - Set lowest voltage allowed

Is it possible to set the lowest voltage allowed at the output to 12 or 20 volts? In other words, stop it from outputting 5 V (output 12 V or 20 V only). I'm using the USB-C trigger ZYPDS from here (https://www.aliexpress.com/item/32969919810.html). Other people will be using a project I'm working on that uses this USB-C trigger to supply 12 volts to an LED driver. If they accidentally use a standard (Type-A) USB to USB-C wall adapter, it will damage the LED driver I'm using, which cannot operate at 5 V. Any suggestions? The part number on the chip used on the USB-C trigger is IP2721. I cannot read Chinese, but here is the datasheet (http://www.szjuquan.com/upload_files/qb_sell_/pdf/IP2721.pdf).
I would suggest using a Zener diode. Ignore all those voltage regulator tutorials, since that's not how you would be using it. Instead, by simply putting it in series, without a resistor, you are using it as a gate: the output only conducts once the input exceeds the Zener voltage. I would consider using a 9 V Zener in this application, provided the chip never outputs 9 V.
Leave it floating.
I've just "read" the Chinese manual, the settings are:
* pull down: 5V
* pull up: 20V
* floating: 15V
BTW, from my experience the chip isn't very stable.

Limit USB power output

I work with an embedded device that has a USB host port. I would like to connect an iPhone to it and communicate via USB. I have done development on this and ported the functionality to connect to usbmux on the iPhone, and have successful communication; however, there is another problem.
All development was done with the iPhone connected to a powered USB hub, which was in turn connected to my device. As soon as I connected the iPhone directly, after enumeration it started draining the battery of my embedded device, causing a voltage drop that turns my device off.
I know that after enumeration USB devices can draw up to 500 mA from the USB port, but I was wondering if there was a way to limit that to 100 mA (while still having the iPhone registered).
I found various questions about controlling the voltage on the data pins or VCC of the USB port, and I understand that's not possible. I'm looking for a software solution (although hardware solutions are welcome).
tl;dr: Is there a way to supply the iPhone with less than 500 mA after enumeration? Could I do this in software? Or do I need a hardware solution? I don't want to turn the port on/off, just limit the power draw of the iPhone.
NOTE: I am using Windows CE 6.0, if it is something that can only be done by modifying the drivers, or having direct access, there is no problem.
P.S. also, if there is a way to do this in *nix (or some other open source OS) that I could look at the source code and port it to Windows CE please let me know.
When a device shares its available configurations (see USB chapter 9), it specifies how much power it requires for each configuration. The host should look at all the available configurations and choose which one it wants.
In practice, however, these things don't work so smoothly.
The last time I looked at this, Windows always chose the first configuration. MacOS always chose the lowest power configuration (or highest, I can't remember). I never looked at WinCE or Linux.
If you're writing/modifying the driver, you can set your own rules for which configuration to choose, including looking for one that's 'self powered'. The iPhone, however, might only have one descriptor that always requests 500mA, bus powered. If so, then you're pretty much screwed since there's no way to let the iPhone know it's not OK to draw power.
That being said, I believe all the iPhone accessories are actually USB host (as opposed to USB device), and given that they don't always supply power, the iPhone must be capable of enumerating self powered.
I like the answer by Russ Schultz but I want to add another one:
No.
The descriptor of the peripheral device, the iPhone in this case, contains bMaxPower. If you enumerate the device, you also accept its power demand. It is not possible to supply less, let's say 300 mA, if you already enumerated the device with the 500 mA descriptor, if that is what you wanted.
If the device provides multiple configurations, you are as mentioned by Russ free to write a driver which selects the configuration with less power. Hopefully, the device will then only consume the granted power.
Many peripheral devices just don't care. Most devices only provide one configuration with 500 mA. And there are a lot of devices which just consume more than they say ...
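To make the bMaxPower discussion above concrete, here is a minimal sketch of what a host driver is looking at when it picks a configuration: the 9-byte configuration descriptor header carries bmAttributes (bit 6 = self-powered) and bMaxPower in 2 mA units (per USB 2.0 chapter 9). The descriptor bytes below are made up for illustration of a bus-powered 500 mA configuration like the iPhone case.

```python
import struct

def parse_config_descriptor(data: bytes) -> dict:
    """Parse the 9-byte USB configuration descriptor header.

    Layout (USB 2.0 spec, chapter 9):
      bLength, bDescriptorType, wTotalLength, bNumInterfaces,
      bConfigurationValue, iConfiguration, bmAttributes, bMaxPower
    """
    (b_length, b_type, w_total_length, num_interfaces,
     config_value, i_configuration, bm_attributes,
     b_max_power) = struct.unpack_from("<BBHBBBBB", data)
    assert b_type == 0x02, "not a configuration descriptor"
    return {
        "configuration": config_value,
        "self_powered": bool(bm_attributes & 0x40),  # bit 6 of bmAttributes
        "max_power_mA": b_max_power * 2,             # bMaxPower is in 2 mA units
    }

# Hypothetical descriptor: one bus-powered configuration,
# bMaxPower = 0xFA = 250 -> 500 mA.
example = bytes([0x09, 0x02, 0x27, 0x00, 0x01, 0x01, 0x00, 0x80, 0xFA])
print(parse_config_descriptor(example))
# {'configuration': 1, 'self_powered': False, 'max_power_mA': 500}
```

A driver that wants to enforce a 100 mA budget would enumerate every configuration descriptor this way and refuse (or pick another configuration) if no configuration fits; as the answers above note, a device that only advertises one 500 mA bus-powered configuration leaves you nothing to select.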

Kinect 2.0 for Xbox One to PC USB 3.0 keeps disconnecting?

Does anyone here use USB 3.0 and can tell me why, when I plug my Xbox One Kinect 2.0 USB 3.0 cable into the computer, it keeps sporadically disconnecting and reconnecting, even though I downloaded all the Windows updates, all the graphics card updates, all the firmware updates, etc.? And YES, I tried several different ports. It's not broken; I got it new for Christmas.
After fighting with this for weeks, I finally found the root of my frequent disconnects. At some point, I had disabled the Xbox NUI Sensor Microphone Array to eliminate a feedback loop:
Control Panel > Hardware and Sound > Sound > Recording
After re-enabling the Kinect microphone, the Kinect stopped disconnecting.
To eliminate the feedback loop, I reduced the Kinect microphone's level setting to 0. You can get to the level setting from the Recording tab in the Sound dialog. Select the Xbox NUI Sensor Microphone Array and click the Properties button. From there, select the Levels tab.
I had a similar issue and kept trying everything for days; finally the issue turned out to be the Kinect AC adapter... I tried it with the official Windows SDK and developer toolkit, and when I ran one of the example programs the issue kept appearing, but with a clear message asking to plug the power cord in, even though the adapter is brand new!!
I searched for some information about the AC adapter, and it seems there is a known problem with it. Most importantly, the Kinect manual states that non-original adapters may cause the device to fail. The manual also says the original AC adapter's power output is 12V-1.1A, while the one I have is rated 12V-1.08A (no big deal, but who knows).
I had this problem too. My system uses a PCIe USB 3 Gen 2 card, Windows 10 v1903, and Kinect SDK 2, and everything worked correctly. Then, after a certain date, it repeatedly disconnected and restarted.
At last I found the cause on my system: I had disabled the sound device in the Windows settings.
I re-enabled that item in settings and everything became OK.
Read this if you use a USB 3.0 expansion card in a PCIe slot.
In my case I had connected the adapter to a USB 3.0 card (Transcend PDU3). After some hours of research I discovered that the mainboard (MSI K9a2 Platinum) had Gen2 PCIe for the PCI-E x16 slots, but not for the PCI-E x1 slot I had plugged the USB card into. After switching to a PCI-E x16 slot, the constant disconnecting was over.
Don't confuse PCIe version and USB version here. For a Kinect 2.0 you need USB 3.0 and for USB 3.0 to run at super speed, you need PCIe 2.0 (or Gen2).
Testing PCIe version
You can use GPU-Z to determine which PCIe version the slot has where your graphics card is plugged in: let the mouse pointer hover over the bus interface field, wait for the tooltip, and it will reveal the PCIe version of your graphics card as well as that of your mainboard. If you confirm it is Gen2 (or PCIe 2.0), try to use that slot for a confirmed-as-working-with-the-Kinect-2 USB 3.0 card. (Having onboard graphics or a second PCI-E x16 slot will definitely come in handy here.)
Hope this helps.
I think it has to do with the USB 3.0 version, older machines won't run it. You need USB "3.1" and controllers usually manufactured after 2013 have it. It's often mislabelled as USB 3.0 in marketing material. USB 3.1 is also known as "SuperSpeed" or "SS10" which goes up to 10 Gbit/s. USB 3.0 "only" transports at 5 Gbit/s.
I have two five-year-old big rigs (Z68X-UD3H-B3, i7-3770K, and a 970a-D3, FX-8350) and it constantly disconnects. Both have 2011 board technology.
I also have two laptops, a VAIO and a Lenovo, which were built after 2013 (when they changed to USB 3.1) and it runs fine on both of those machines.
I too suspected the power supply at one point; nope. I thought I had a broken Kinect (bought a second; nope, now I own two).
Other things to check:
- You might be able to use a USB 3.1 PCIe card, as long as your motherboard will carry it.
- Remember to load the Kinect SDK 2.0 and also update the driver to the 2016 driver (SDK 2.0 comes with the 2014 driver).
- Remember USB 3.x is the "Blue" USB plug not the black.
This is not a case of requirements increasing beyond the capabilities of USB 3.0!
It's also not a problem with Win10 1809 or Kinect SDK 1409.
It will disconnect if your apps have no access to either the microphone or the camera.
You can check or reset your settings the easiest with a free program called OOSU10.
Runs fine on my 2012 laptop.
If your problem is that the Kinect Configuration Verifier does not start at all, then this is caused by having disabled the Print Spooler service.

How to Connect Kinect 2 for Xbox One with PC

I have an Xbox One with Kinect 2. I want to know if I can connect it to my PC, and if so, how to do it?
Microsoft finally came up with a sane solution to the Xbox One Kinect problem.
Check this out:
http://www.microsoftstore.com/store/msca/en_CA/pdp/Kinect-Adapter-for-Windows/productID.308878000
You can unofficially connect the XBone Kinect to a PC.
Although you'll invalidate your warranty on the Kinect, you should still be able to use it with the XBone afterwards.
Not sure if it's a great idea for your project, though: you'll still need a Windows 8 PC with the right USB 3.0 controller for it to work, and you are at risk of non-Windows Kinect SKUs being blocked/nerfed in future.
But basically:
- Disconnect the USB lead from the Kinect
- Take the Kinect apart
- Solder a 12 V power supply to the USB 3.0 powered-B side pins where the connector joins the PCB (these are extra pins, in addition to the standard USB 3.0 spec, for "special" device power input/output)
- Connect the Kinect to the PC with a standard USB 3.0 B cable
A picture of where to solder the 0v/12v wires is here.
I connected them to a barrel connector to fit a spare laptop PSU.
This works for me with Windows 8.1 and the MS KinectSDK public preview 1407.
To connect Kinect 2 (Xbox One) to your PC, you need a 12 V power supply and this cable:
(source: diskdoctors.com)
Using information from this picture:
Kinect 2 cables:
Replace the standard Kinect cable with a new USB 3.0 A cable; the other wires (grey and brown) are the 12 V power.
Sorry, but there is no official way to connect the XBOX One Kinect with a PC. A hack might be available one day, but I would not recommend going that way.
Buy a "Kinect for Windows V2 Sensor" - that includes the license and SDK to develop your own applications with the Kinect V2.
I connected the 12 V DC as shown in some photos; I used a Renesas USB 3.0 PCI Express card and a 3 m cable, and the Kinect for Xbox One was not detected by Windows. I cut and remade the long USB 3 cable into a 1 m cable, and again nothing was detected by the PC.
It looks like a POWER ENABLE signal strap (connection) must be made somehow (in the Kinect 2).
The "distinguished" hackers forgot to explain that signal (how to).
I didn't have the time to analyze the good images of the original USB 3 hub with the industrial USB 3 B male connectors uploaded on the web (by the way, some photos have disappeared in the meantime). This industrial USB 3 cable from Microsoft has the 5 standard USB 3 pins and the 4 standard USB 2 pins, plus another 4 additional pins (of course, one is ground and one is 12 V, and at least one is not documented).
Fortunately I have about 4 projects to work on before connecting the sensor, and Microsoft did something interesting: it manufactured and sells the adapter for the sensor separately.
A bit expensive at $50, but we are talking about a power adapter, a USB 3.0 hub and a USB 3.0 cable (the price should have been closer to $30); even so, it is not a killer price.
http://www.microsoft.com/en-us/kinectforwindows/purchase/default.aspx#tab=2
and it looks like it is already available for purchase.
Make sure your hardware matches the standards if you are using a PCIe USB 3 adapter; your motherboard will probably have to support PCIe 2.0 (PCI Express).

Distance between ios devices using bluetooth 4.0 LE and passing message across paired devices

I am working on an app which requires finding the distance between iOS devices using BT 4.0 LE and then passing a message to paired devices if they are in range. Is there any feasible solution for this? Please help me out.
You could use RSSI as an indication of distance (more accurately, of the quality of the current RF conditions with that device) and come up with your own mechanism to compute the distance.
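One such mechanism is the standard log-distance path-loss model, sketched below in Python. The calibration constant (-59 dBm as the RSSI expected at 1 m) and the path-loss exponent of 2.0 are assumptions you would tune per device and environment, and since RSSI is noisy you would normally average over many readings before trusting an estimate.

```python
def estimate_distance(rssi: float, measured_power: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in metres from an RSSI reading (dBm).

    Log-distance path-loss model:
        distance = 10 ** ((measured_power - rssi) / (10 * n))
    where measured_power is the RSSI expected at 1 m and n is the
    path-loss exponent (~2 in free space, higher indoors).
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

# At exactly the 1 m calibration power, the estimate is 1 m.
print(estimate_distance(-59.0))            # 1.0
# A signal 20 dB weaker reads as ten times farther with n = 2.
print(round(estimate_distance(-79.0), 1))  # 10.0
```

Whichever threshold you derive from this, treat it as "probably in range" rather than a precise distance; reflections and body absorption can easily shift RSSI by 10 dB or more.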