Access Basler camera through IP address

How can I get a live stream from a Basler camera through its IP address?
I am following the Basler documentation:
https://docs.baslerweb.com/overview-of-the-pylon-ip-configurator

If by live stream you mean an RTSP compressed stream or similar, then Basler pylon does not offer such a capability directly.
The pylon SDK is meant for Basler's industrial-grade cameras that operate with uncompressed image buffers via the pylon API (C++, .NET, etc.). So pylon gives you access to the camera and the raw image data as such and does not do much more.
For RTSP streaming, you can generally create a custom worker application with pylon that, on the one hand, connects to and runs the camera and, on the other, creates and maintains an RTSP stream (e.g. using FFmpeg) and feeds this stream with the incoming image buffers.
Some related links:
https://docs.baslerweb.com/pylonapi/
https://trac.ffmpeg.org/wiki/StreamingGuide/
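A minimal sketch of that worker-application idea, assuming the Python binding pypylon is installed and an RTSP-capable server is listening at the URL shown (the URL, resolution, frame rate, and pixel format are all placeholder assumptions). Only the command-line builder runs without hardware; the grab loop requires a connected camera:

```python
import subprocess

def build_ffmpeg_cmd(width, height, fps, rtsp_url, pix_fmt="gray"):
    """Build an FFmpeg command that reads raw frames on stdin and publishes RTSP."""
    return [
        "ffmpeg",
        "-f", "rawvideo",            # uncompressed buffers straight from pylon
        "-pix_fmt", pix_fmt,         # Mono8 camera buffers map to FFmpeg's 'gray'
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "-",                   # frames arrive on stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",      # favour latency over compression
        "-f", "rtsp", rtsp_url,
    ]

if __name__ == "__main__":
    # Hypothetical grab loop; needs pypylon and a connected GigE camera.
    from pypylon import pylon
    camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
    camera.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
    cmd = build_ffmpeg_cmd(640, 480, 30, "rtsp://localhost:8554/cam")
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    while camera.IsGrabbing():
        result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
        if result.GrabSucceeded():
            proc.stdin.write(result.Array.tobytes())  # feed raw buffer to FFmpeg
        result.Release()
```

The split keeps the camera side (pylon) and the streaming side (FFmpeg) decoupled, which is the pattern the answer describes.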
Or, if you just want to initialize the camera and have a live image preview on screen, use the pylon Viewer; it has a fullscreen function.
The IP Configurator is the tool for matching the device IP with your NIC for GigE-based Basler cameras. A tutorial on how to set up GigE cameras and make them visible in the pylon Viewer:
https://docs.baslerweb.com/assigning-an-ip-address-to-a-camera

Related

Is it possible to control/change remote peerconnection's constraints/record footage using webrtc?

I'm trying to build a web app where there's a broadcaster of a camera stream, and viewers who can watch and control the stream. Is it possible for a viewer to control the constraints (exposure, brightness, etc.) of the camera being used to broadcast the stream, and possibly pause, rewind, and record footage, using WebRTC? I wanted to know before I decide to use WebRTC to accomplish this task. Based on my reading of the WebRTC guide and example pages, I think recording is possible, but I wasn't sure about a remote PeerConnection changing the local PeerConnection's settings or vice versa.

How to display an MJPEG stream transmitted via UDP in a Mac OS X application

I have a camera that sends MJPEG frames as UDP packets over Wi-Fi that I would like to display in my Mac OS X application. My application is written in Objective-C, and I am trying to use the AVFoundation classes to display the live stream. The camera is controlled using HTTP GET & POST requests.
I would like the camera to be recognized as an AVCaptureDevice, since I can easily display streams from different AVCaptureDevices. But because the stream comes over Wi-Fi, it isn't recognized as an AVCaptureDevice.
Is there a way I can create my own AVCaptureDevice that I can use to control this camera and display the video stream?
After much research into the packets sent from the camera, I concluded that it does not communicate in any standard protocol such as RTP. What I ended up doing is reverse-engineering the packets to learn more about their contents.
I confirmed it does send JPEG images over UDP, and that it takes multiple UDP packets to send a single JPEG. I listened on the UDP port and assembled the packets into a single image frame. Once I had a frame, I created an NSImage from it and displayed it in an NSImageView. Works quite nicely.
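A minimal sketch of that reassembly step, assuming (as the answer describes) that each JPEG is split across several UDP payloads. The class name and the naive marker scan are mine, not from the original code; it simply collects bytes between the JPEG SOI and EOI markers:

```python
class JpegAssembler:
    """Accumulate UDP payloads and emit complete JPEG frames.

    Caveat: this is a naive scan; the EOI byte pair can in principle
    appear inside entropy-coded data, so real code may need to track
    camera-specific framing headers instead.
    """
    SOI = b"\xff\xd8"  # JPEG start-of-image marker
    EOI = b"\xff\xd9"  # JPEG end-of-image marker

    def __init__(self):
        self.buf = bytearray()

    def feed(self, packet: bytes):
        """Add one UDP payload; return a complete JPEG frame or None."""
        self.buf.extend(packet)
        start = self.buf.find(self.SOI)
        if start < 0:
            return None                      # no frame has started yet
        end = self.buf.find(self.EOI, start + 2)
        if end < 0:
            return None                      # frame still incomplete
        frame = bytes(self.buf[start:end + 2])
        del self.buf[:end + 2]               # keep any bytes of the next frame
        return frame
```

On the Mac side the returned bytes would be handed to NSImage (or, in Python, to any JPEG decoder).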
If anyone is interested, the camera is an Olympus TG-4. I am writing the components to control settings, shutter, etc.

USB packet and data buffer capture

I need software or an application with API support to capture USB packets and data buffers. I would like to analyse the captured data using LabVIEW.
Please suggest applications for USB packet and data capture with API support, so that I can access them from LabVIEW.
Or
Alternate methods to capture and analyse USB data using LabVIEW.
I tried an approach using logman.exe, but it doesn't log all the USB packets. Has anyone used logman to capture USB packets?
You can consider using VISA functions.
A few examples are shipped with LabVIEW (open the Example Finder and search for USB).
Here is a starting point with instructions on how to give VISA access to the device.
Alternatively, you need to find the relevant Windows DLLs and call them from LabVIEW; the shipped examples cover NI USB devices.
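Outside of LabVIEW itself, the same NI-VISA layer can be exercised from Python via pyvisa, which can be handy for prototyping the raw USB access before wiring it into LabVIEW. The helper below only builds a USB resource string in the standard VISA format; the VID/PID/serial values are placeholders:

```python
def build_usb_resource(vid: int, pid: int, serial: str, raw: bool = True) -> str:
    """Build a VISA USB resource string, e.g. USB0::0x1234::0x5678::SN001::RAW.

    RAW is for devices without a USBTMC interface; INSTR is for USBTMC devices.
    """
    suffix = "RAW" if raw else "INSTR"
    return f"USB0::0x{vid:04x}::0x{pid:04x}::{serial}::{suffix}"

if __name__ == "__main__":
    # Hypothetical usage; requires pyvisa, an NI-VISA backend, and a real device
    # that has been bound to the VISA USB driver.
    import pyvisa
    rm = pyvisa.ResourceManager()
    dev = rm.open_resource(build_usb_resource(0x1234, 0x5678, "SN001"))
    dev.write_raw(b"\x01\x02")  # raw bulk write to the device
```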

Samsung SPF-71N UPnP photo frame

I have a Samsung photo frame which is connected to the LAN. It announces itself as a UPnP device like this:
09:48:05.429956 IP 192.168.5.4.1900 > 239.255.255.250.1900: UDP, length 279
NOTIFY * HTTP/1.1
LOCATION: http://192.168.5.4:57959/
HOST: 239.255.255.250:1900
SERVER: POSIX, UPnP/1.0, Intel MicroStack/1.0.1868
NTS: ssdp:alive
USN: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
CACHE-CONTROL: max-age=1800
NT: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
The frame presents itself as urn:schemas-upnp-org:device:MediaPlayer:1, but I cannot find this device type on the UPnP Forum pages. Here is the XML descriptor: https://www.dropbox.com/s/unuarev1ywr8hc5/ramka.xml
I tried to set up the frame by entering my DLNA server's IP address (in the frame configuration), but it didn't work. The frame says there is no server it can play content from.
There is no MediaRenderer service, so I cannot just send pictures. I suspect that the frame is a "kind of client", but I don't know how to use it. The user manual says nothing about media servers or serving content from the network.
Does anybody have an idea how to figure this out?
The device type is MediaPlayer, which suggests that it can discover your UPnP MediaServer using SSDP; you should then be able to browse your photos and play them on the Samsung photo frame. Try switching your MediaServer off and on again to encourage it to send a NOTIFY packet so that the photo frame finds it.
The MediaPlayer device will not be recognised by standard UPnP CP apps (they are looking for MediaRenderer and MediaServer device types).
The device description XML lists AVTransport and RenderingControl services, which suggests that the photo frame can be controlled remotely, e.g. play a photo or set brightness.
The non-standard UPnP device type and the extra services suggest that there must be an app from Samsung for the photo frame. That is your best option for remote control.
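To watch the discovery traffic yourself, an SSDP M-SEARCH can be sent by hand. This is a generic sketch, not Samsung-specific; the search target shown is taken from the device type the frame advertises, and the multicast address/port are the standard SSDP ones:

```python
import socket

SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast group

def build_msearch(search_target="ssdp:all", mx=2) -> bytes:
    """Build a standard SSDP M-SEARCH discovery request."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        "HOST: 239.255.255.250:1900\r\n"
        'MAN: "ssdp:discover"\r\n'
        f"MX: {mx}\r\n"
        f"ST: {search_target}\r\n"
        "\r\n"
    ).encode()

def discover(search_target="urn:schemas-upnp-org:device:MediaPlayer:1", timeout=3.0):
    """Multicast an M-SEARCH and collect unicast responses until timeout."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch(search_target), SSDP_ADDR)
    replies = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            replies.append((addr, data.decode(errors="replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return replies
```

Any reply's LOCATION header points at the responder's device description XML, like the one linked above.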
There is no MediaRenderer service so I cannot just send pictures
That's not the whole story. It's true that the device does not seem to implement the ConnectionManager service, so it couldn't be a compliant MediaRenderer device (and naming apparently proprietary Samsung things in the "upnp-org" namespace seems obnoxious to me), but the device does claim to implement RenderingControl and AVTransport, so it's conceivable that it could be controlled with an (almost) standard control point...
As an example, I'm guessing gupnp-av-cp (a testing AV control point from gupnp-tools, available on most Linux distributions) might work with a one-line change to replace the device type and another to set a fake renderer ProtocolInfo (basically making a guess about what kind of data the renderer accepts).
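If you want to poke at the claimed AVTransport service without gupnp, a raw SOAP call can be assembled by hand. The sketch below only builds the request; the control URL and whether the frame actually accepts a SetAVTransportURI action are assumptions you would have to verify against the device description XML:

```python
def build_set_uri_request(media_uri: str):
    """Build headers and body for a standard UPnP AVTransport SetAVTransportURI call."""
    body = f"""<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
      <CurrentURI>{media_uri}</CurrentURI>
      <CurrentURIMetaData></CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>"""
    headers = {
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"',
    }
    return headers, body

if __name__ == "__main__":
    # Hypothetical usage; the control URL path below is a guess and must be read
    # from the <controlURL> element of the frame's description XML.
    import urllib.request
    control_url = "http://192.168.5.4:57959/AVTransport/control"
    headers, body = build_set_uri_request("http://192.168.5.10/photo.jpg")
    req = urllib.request.Request(control_url, data=body.encode(), headers=headers)
    print(urllib.request.urlopen(req).read())
```

If the frame answers 200 rather than a SOAP fault, a follow-up Play action (same pattern, action name "Play") would be the next thing to try.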

What is most appropriate USB class to handle images and video transfer and streaming?

I am working on a project (a digital camera) that should be able to take still images and short video clips and make those available to the host. As well as being able to stream live video.
Which USB class[es] should I use?
Should I use PTP (for still images and video) and USB Video Class for streaming?
Does PTP support transfer of video?
Does PTP support video streaming?
For static objects, PTP is the better fit since it has a definition of an object. Current Canon DSLRs are also capable of "streaming" live video via the standard PTP transaction mechanism. However, real streaming benefits from isochronous transfers, so UVC is better for streaming.
You can also implement both classes in a single device and have objects transferred via the PTP set of endpoints while streaming via the UVC endpoint.