GenICam transport layer - camera

I am new to the GenICam standard, and I am having issues understanding the transport layer for a GigE camera. I couldn't find any detailed information or guidelines on writing a GigE camera transport layer, as it is mostly provided directly by the camera vendor. I would appreciate it if anyone could share some information about this.

The most widely used transport layer for GigE cameras is the GigE Vision protocol. The standard is available for free from the AIA website, but its license is not open-source compatible.
Aravis is a reverse-engineered, open-source implementation of this protocol.

DALSA provides the GigE-V Framework on their website. In the source code provided, several functions are compiled as .so files, but it is still manageable to reverse-engineer the functions in the driver. I have worked my way up to the camera register part, so it is a good reference for understanding a GigE Vision driver and GenICam. Below is the result I get:
GigE Vision Library GenICam C Example Program (Aug 29 2017)
Copyright (c) 2015, DALSA.
All rights reserved.
[0][22]: 192.168.34.22 , D0:67:E5:2B:B2:3D
[1][26]: 192.168.34.26 , 0C:C4:7A:4C:96:C1
[2][30]: 192.168.34.30 , 00:01:29:65:93:A5
[0][14]: 192.168.128.14 , 3A:F4:E2:F9:AF:F7
4 camera(s) on the network
Please enter selected camera Index:3
Socket Handle success!
Available Port 8080
Available Port 8081
Available Port 8082
Gev_CreateConnection
[testGev_CreateConnection]: IP 192.168.128.14 Port 8080
Connected!
[GevInitCameraRegisters]: supported camera 19
Found Your Camera Model Nano Nano
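The control side of GigE Vision that the log above exercises (device discovery, connection, register reads) runs over GVCP, a simple UDP protocol on port 3956. As a hedged sketch of what the first steps look like on the wire, the helpers below build a DISCOVERY_CMD and a READREG_CMD packet. The header layout (key byte 0x42, flags, command id, payload length, request id) is taken from my reading of the GigE Vision specification, so verify the values against the spec before relying on them:

```python
import struct

GVCP_PORT = 3956  # fixed UDP control port defined by GigE Vision

def gvcp_discovery_packet(req_id=1):
    # Header: key 0x42, flags 0x11 (ack required + allow broadcast ack),
    # command 0x0002 (DISCOVERY_CMD), payload length 0, 16-bit request id.
    return struct.pack(">BBHHH", 0x42, 0x11, 0x0002, 0, req_id)

def gvcp_readreg_packet(address, req_id=2):
    # READREG_CMD (0x0080): the payload is one 32-bit register address.
    header = struct.pack(">BBHHH", 0x42, 0x01, 0x0080, 4, req_id)
    return header + struct.pack(">I", address)

print(gvcp_discovery_packet().hex())  # 4211000200000001
```

Broadcasting the discovery packet to UDP port 3956 is how implementations like Aravis and the GigE-V Framework produce the device list shown above; each camera answers with a DISCOVERY_ACK carrying its IP, MAC, and model strings.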

Related

testing - not able to create IPv6 network on Mac

For testing purposes I tried to create an IPv6 network on my Mac. I followed this tutorial: https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/NetworkingOverview/UnderstandingandPreparingfortheIPv6Transition/UnderstandingandPreparingfortheIPv6Transition.html
Except for one thing: instead of Thunderbolt Ethernet I used plain Ethernet. The Wi-Fi network was created successfully and I am able to use it.
However, it seems that the created Wi-Fi network is still not IPv6.
I ran the test at http://ipv6-test.com and the results say "Not supported" under IPv6 connectivity.
What is the problem? Why is my network still IPv4? How can I create a proper IPv6 network?
The NAT64 test network that Apple advises you to create does not provide global IPv6 connectivity. It provides only local IPv6 connectivity between your iOS device connected to the WiFi access point and your Mac. The Mac then uses NAT64/DNS64 to send any Internet traffic via IPv4 (which is similar to what some mobile carriers do). This is why an IPv6 testing website shows you that IPv6 is not supported.
The purpose of this setup is to test IPv6 compatibility of your iOS applications on a physical device. You may download an iOS app which will show whether your device is correctly obtaining an IPv6 address from your Mac (because iOS doesn't natively show this info).
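To make the NAT64 part concrete: the Mac rewrites each IPv6 destination into an IPv4 one by embedding the 32-bit IPv4 address in the low bits of an IPv6 /96 prefix. RFC 6052 describes the mapping, with 64:ff9b::/96 as the well-known prefix (Apple's setup may use a locally generated prefix instead). A minimal sketch of that synthesis, using only Python's standard library:

```python
import ipaddress

# Well-known NAT64 prefix from RFC 6052; real deployments may use their own.
WELL_KNOWN_PREFIX = ipaddress.IPv6Network("64:ff9b::/96")

def nat64_synthesize(ipv4_str, prefix=WELL_KNOWN_PREFIX):
    # Embed the 32-bit IPv4 address into the low 32 bits of the /96 prefix.
    v4 = int(ipaddress.IPv4Address(ipv4_str))
    return ipaddress.IPv6Address(int(prefix.network_address) | v4)

print(nat64_synthesize("192.0.2.1"))  # 64:ff9b::c000:201
```

This is also why the testing website reports "Not supported": traffic leaves the Mac as ordinary IPv4, so no global IPv6 connectivity ever exists.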

Is LoRaWAN only accessible with an internet connection?

I'm planning to build an IoT project for an oil palm plantation using an Arduino and an Android mobile application for my final-year project at university. As plantations have low to no communication signal, including Wi-Fi, is it possible to implement LoRaWAN without access to the internet or the use of a web-based application?
The LoRaWAN node does not need any communications channel other than LoRaWAN itself, of course; it would not make any sense otherwise. ;-)
The gateway, however, does need a connection to the server application that acts as the central instance for your use case. Usually this is an existing LoRaWAN cloud service such as The Things Network (TTN) with your application connected behind it, but in theory you could connect the gateway to your very own central server, making your whole network independent. This is possible because LoRa uses licence-free frequency bands (ISM bands), so anyone can become a "network operator". The TTN software is available as open source, for example.
The connection from the gateway to the central server is usually made via existing Ethernet/Wi-Fi infrastructure or mobile internet (3G/4G), whatever suits best.
Besides, the LoRa modules available for Arduinos can be used for a low-level, point-to-point LoRa (not LoRaWAN) connection between two such modules, with no gateway involved. Maybe that is an option for your use case, too.
LoRaWAN uses a gateway connected to some kind of cloud backend, for example the community-based TTN network. If you live in a bigger city, you have a good chance of there being a TTN gateway in your area.
You can, however, connect two LoRa nodes together to get a point-to-point connection. You can send data from Node1, which is connected to some kind of sensor and battery-powered, to Node2, which is stationary and stores all the data to a flash drive, for example. From this flash drive you can import the data into a website, or you could use an application like Node-RED to display the data on a dashboard.
Here you will find instructions on how to send data from one LoRa node to another.
Here you will find instructions on how to use Node-RED to display your LoRa data. You will have to change the input from the TTN cloud to a text file on your Raspberry Pi, or whatever gateway you use. (Optional)
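For the stationary receiver described above, the "store everything to a file" step can be as simple as appending CSV rows that Node-RED (or any dashboard) later reads. A small illustrative sketch in Python, assuming the receiver hands you a node id and a raw payload string (the function name and field names here are made up for the example):

```python
import csv
import time
from pathlib import Path

def log_reading(path, node_id, payload):
    # Append one timestamped row per received LoRa packet; the file can
    # later be copied off the flash drive and fed into Node-RED or a website.
    is_new = not Path(path).exists()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "node", "payload"])
        writer.writerow([time.strftime("%Y-%m-%dT%H:%M:%S"), node_id, payload])
```

On the gateway-less point-to-point setup, this sketch would run on whatever board sits behind Node2 (a Raspberry Pi, for instance).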

Ethernet communication Infineon Aurix TC29x Starter Kit

I am trying to enable Ethernet communication on an Infineon AURIX TriCore, specifically the TC29x. To this end I am using the AURIX SW_framework_tools and the ILLD supplied by Infineon on their AURIX workspace page (available if you ask for access).
The ILLD package and demo include an Ethernet demo with loopback_mode enabled, and this demo works fine on the board. I then wanted to send a packet out to another machine to enable real Ethernet communication. I have tried fiddling with the settings, using Wireshark with a crossover cable, Wireshark with a switch, and both Windows and Linux PCs, with no packets received.
I am therefore unsure whether the demo has the pinmap configured correctly, and/or whether I have missed some settings that should be changed. My question is therefore: has anyone enabled Ethernet communication using the Infineon tools, and can you guide me?
Tools
I have chosen the HighTec toolchain with the UDE debugger; all are from the free tools available from Infineon.
Board
Infineon Aurix Tricore TC299 Starter Kit.

Server as WebRTC data channel peer

Are there currently solutions where your server can act as the peer of a WebRTC connection?
The reason I am interested in WebRTC is not the peer-to-peer part of it, but because it enables you to use UDP. You could let players participate in a fast-paced game like Quake without needing any plugins.
It seems that essentially this same question was asked before, but surely things must now be quite different as 2 years have passed.
Yes, it is possible to deploy your WebRTC peer code on a server. But since you need to run it on a server, it's essentially different from how you run WebRTC code within the browser, i.e. through JavaScript.
For server based WebRTC peer, you would need to use the WebRTC native code available on platforms - Windows, Mac OS X, Linux, Android and iOS. You can get the WebRTC native code from - https://webrtc.org/native-code/development/
Follow the instructions here to download and build the environment. Sample applications are also present in the repository at the locations - src/webrtc/examples and src/talk/examples
In summary, you use the WebRTC source code that is embedded in the browser within your own application code and call the relevant methods/APIs for the WebRTC functionality.
I have answered similar question at: WebRTC Data Channel server to clients UDP communication. Is it currently possible?
We have implemented the exact same thing: a server/client way of using WebRTC. We also implemented data-port multiplexing, so that the server only needs to expose one data port for all RTCDataChannels.
Here is a 2018 update for you. Your off-the-shelf solutions are:
Red 5 Pro, Wowza, Kurento, Unreal Media Server, Flashphoner
Also note that in modern public networks TCP is not much slower than UDP, but UDP may suffer considerable packet loss, so try WebRTC over TCP for your Quake idea.
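For anyone weighing that TCP-vs-UDP trade-off, it helps to see how little a raw UDP exchange involves: no handshake, no retransmission, no head-of-line blocking. The sketch below is plain UDP using Python's standard library, not WebRTC itself, just to illustrate the transport property that makes data channels attractive for game state:

```python
import socket
import threading

def serve_echo(sock):
    # Blocking echo loop: return every datagram to its sender as-is.
    while True:
        data, addr = sock.recvfrom(2048)
        if data == b"quit":
            break
        sock.sendto(data, addr)

# Bind to an ephemeral local port and run the echo loop in a thread.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
threading.Thread(target=serve_echo, args=(server,), daemon=True).start()

# A "game client" sends a datagram and gets it straight back; a lost
# datagram would simply never arrive instead of stalling the stream.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
client.sendto(b"player-position:10,42", server.getsockname())
reply, _ = client.recvfrom(2048)
print(reply)  # b'player-position:10,42'
```

An unreliable, unordered WebRTC data channel gives you essentially this behaviour, but with NAT traversal and DTLS encryption handled for you.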

API updates and Wi-Fi network connection

I've been looking through the API docs and support forum. Based on what I've been reading, there are two big holes in the Sony camera stack:
1) Cameras ONLY support master mode (peer-to-peer) Wi-Fi connections. They do not support infrastructure mode, which would allow the camera to connect to existing Wi-Fi networks.
2) The API does not support moving files off the camera.
My questions: Does Sony plan to add these capabilities? If yes, what is the timeline?
Thanks,
Graham
Thank you for your interest.
(EDITED: Multi Wi-Fi connection is used by the Live View Remote (LVR); the Camera Remote API is officially NOT supported in Multi Wi-Fi mode.)
Sony lens-style cameras QX1 and QX30 and the Action Cams HDR-AZ1 and HDR-AS100V can connect to existing Wi-Fi networks (Multi Connection Wi-Fi mode). Please check whether your Wi-Fi network is compatible with the connection method used. The newer QX1, QX30, and HDR-AZ1 cameras support the content transfer API. Please check the API Reference document for the exact APIs. Best Regards, Prem, Developer World team.