Samsung SPF-71N UPnP photo frame

I have a Samsung photo frame connected to my LAN. It announces itself as a UPnP device like this:
    09:48:05.429956 IP 192.168.5.4.1900 > 239.255.255.250.1900: UDP, length 279
    NOTIFY * HTTP/1.1
    LOCATION: http://192.168.5.4:57959/
    HOST: 239.255.255.250:1900
    SERVER: POSIX, UPnP/1.0, Intel MicroStack/1.0.1868
    NTS: ssdp:alive
    USN: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
    CACHE-CONTROL: max-age=1800
    NT: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
The frame presents itself as urn:schemas-upnp-org:device:MediaPlayer:1, but I cannot find this device type on the UPnP Forum pages. Here is the XML descriptor: https://www.dropbox.com/s/unuarev1ywr8hc5/ramka.xml
I tried to set the frame up by entering my DLNA server's IP address in the frame's configuration, but that didn't work. The frame says there is no server it can play content from.
There is no MediaRenderer service, so I cannot just push pictures to it. I suspect the frame is a kind of client, but I don't know how to use it. The user manual says nothing about media servers or playing content from the network.
Does anybody have an idea how to figure this out?

The device type is MediaPlayer, which suggests that it can discover your UPnP MediaServer using SSDP, and that you should then be able to browse your photos and play them via the Samsung photo frame. Try switching your MediaServer off and on again to encourage it to send a NOTIFY packet so that the photo frame finds it.
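To check that your MediaServer is actually discoverable on the same subnet as the frame, you can send an SSDP M-SEARCH yourself and see who answers. A minimal sketch in Python (the search target is the standard MediaServer device type; nothing here is specific to the frame):

    import socket

    # SSDP search for UPnP MediaServer devices on the local subnet.
    MSEARCH = (
        "M-SEARCH * HTTP/1.1\r\n"
        "HOST: 239.255.255.250:1900\r\n"
        'MAN: "ssdp:discover"\r\n'
        "MX: 2\r\n"
        "ST: urn:schemas-upnp-org:device:MediaServer:1\r\n"
        "\r\n"
    )

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3)
    sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            print(addr[0], "answered:")
            print(data.decode("ascii", "replace"))
    except socket.timeout:
        pass  # no more responses

If your DLNA server does not show up here, the frame will not find it either, and the problem is on the server or network side (multicast filtering on Wi-Fi access points is a common culprit).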
The MediaPlayer device will not be recognised by standard UPnP control point apps (they look for the MediaRenderer and MediaServer device types).
The device description XML lists AVTransport and RenderingControl services, which suggests that the photo frame can be controlled remotely, e.g. play a photo or set the brightness.
The non-standard UPnP device type and the extra services suggest that there must be an app from Samsung for the photo frame. That is your best option for a remote control.

"There is no MediaRenderer service so I cannot just send pictures"
That's not the whole story. It's true that the device does not seem to implement the ConnectionManager service, so it could not be a compliant MediaRenderer device (and naming apparently proprietary Samsung things in the "upnp-org" namespace seems obnoxious to me), but the device does claim to implement RenderingControl and AVTransport, so it's conceivable that it could be controlled with an (almost) standard control point...
As an example, I'm guessing gupnp-av-cp (a testing AV control point from gupnp-tools, available on most Linux distributions) might work with a one-line change to replace the device type and another to set a fake renderer ProtocolInfo (basically a guess at what kind of data the renderer accepts).
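If you want to poke at the frame without patching a control point, you can hand-craft the SOAP call a control point would make. The sketch below is a guess, not a verified recipe: the control URL must be taken from the <controlURL> element in the device description XML (the one shown here is made up), and whether the frame accepts a plain HTTP JPEG URL is exactly the ProtocolInfo question from above.

    import urllib.request

    # Hypothetical endpoints: read the real control URL from the device
    # description XML; the photo URL must point at something the frame can fetch.
    CONTROL_URL = "http://192.168.5.4:57959/AVTransport/control"
    PHOTO_URL = "http://192.168.5.2:8200/photos/test.jpg"

    BODY = """<?xml version="1.0"?>
    <s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
                s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
      <s:Body>
        <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
          <InstanceID>0</InstanceID>
          <CurrentURI>{uri}</CurrentURI>
          <CurrentURIMetaData></CurrentURIMetaData>
        </u:SetAVTransportURI>
      </s:Body>
    </s:Envelope>""".format(uri=PHOTO_URL)

    req = urllib.request.Request(
        CONTROL_URL,
        data=BODY.encode("utf-8"),
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPACTION": '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"',
        },
    )
    print(urllib.request.urlopen(req, timeout=5).read().decode("utf-8", "replace"))

Even a failure with an HTTP 500 carrying a UPnPError body would tell you the service is alive; the normal AVTransport sequence is SetAVTransportURI followed by a Play action.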

Related

Access Basler camera through IP address

How can I get a live stream from a Basler camera through its IP address?
I am following the Basler documentation:
https://docs.baslerweb.com/overview-of-the-pylon-ip-configurator
If by live stream you mean an RTSP compressed stream or similar, then Basler pylon does not give you that possibility directly.
The pylon SDK is meant for Basler's industrial-grade cameras, which deliver uncompressed image buffers via the pylon API (C++, .NET, etc.). So pylon gives you access to the camera and the image data as such, and does not do much more.
To get RTSP streaming, with pylon you can generally create a custom worker application that connects to and runs the camera on the one hand, creates and maintains an RTSP stream (e.g. with FFmpeg) on the other, and feeds that stream with the incoming image buffers.
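A rough sketch of that worker pattern, using the pypylon Python bindings and piping raw frames into an FFmpeg subprocess. The frame size, Mono8 pixel format, frame rate and RTSP target are all assumptions for illustration, and an RTSP server (e.g. mediamtx) must already be listening at the target URL:

    import subprocess
    from pypylon import pylon

    WIDTH, HEIGHT, FPS = 1280, 1024, 25           # assumed camera settings
    RTSP_TARGET = "rtsp://localhost:8554/basler"  # assumed RTSP server endpoint

    # FFmpeg reads raw Mono8 frames on stdin, encodes H.264, publishes to RTSP.
    ffmpeg = subprocess.Popen(
        ["ffmpeg", "-f", "rawvideo", "-pix_fmt", "gray",
         "-s", "%dx%d" % (WIDTH, HEIGHT), "-r", str(FPS), "-i", "-",
         "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
         "-f", "rtsp", RTSP_TARGET],
        stdin=subprocess.PIPE)

    camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
    camera.Open()
    camera.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
    try:
        while camera.IsGrabbing():
            result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
            if result.GrabSucceeded():
                ffmpeg.stdin.write(result.Array.tobytes())  # one raw frame
            result.Release()
    finally:
        camera.StopGrabbing()
        camera.Close()
        ffmpeg.stdin.close()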
Some related links:
https://docs.baslerweb.com/pylonapi/
https://trac.ffmpeg.org/wiki/StreamingGuide/
Or, if you just want to initialize the camera and get a live image preview on screen, use the pylon Viewer; it has a fullscreen function.
The IP Configurator is the tool for matching the device IP with your NIC for GigE-based Basler cameras. A tutorial on how to set up GigE cameras and make them visible in the pylon Viewer:
https://docs.baslerweb.com/assigning-an-ip-address-to-a-camera
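Once the IP Configurator has put the camera on the same subnet as your NIC, a short pypylon check will tell you whether pylon can see it at all (device enumeration only, no grabbing):

    from pypylon import pylon

    # A GigE camera only shows up here once its IP matches the NIC's subnet.
    for dev in pylon.TlFactory.GetInstance().EnumerateDevices():
        print(dev.GetFriendlyName(), dev.GetDeviceClass())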

How to display an MJPEG stream transmitted via UDP in a Mac OS X application

I have a camera that sends MJPEG frames as UDP packets over Wi-Fi, which I would like to display in my Mac OS X application. My application is written in Objective-C, and I am trying to use the AVFoundation classes to display the live stream. The camera is controlled using HTTP GET & POST requests.
I would like the camera to be recognized as an AVCaptureDevice, as I can easily display streams from different AVCaptureDevices. Since the stream is over Wi-Fi, it isn't recognized as an AVCaptureDevice.
Is there a way I can create my own AVCaptureDevice that I can use to control this camera and display the video stream?
After much research into the packets sent from the camera, I have concluded that it does not communicate using any standard protocol such as RTP. What I ended up doing is reverse-engineering the packets to learn more about their contents.
I confirmed that it does send JPEG images over UDP, and that it takes multiple UDP packets to carry a single JPEG. I listened on the UDP port for packets and assembled them into a single image frame. Once I had a frame, I created an NSImage from it and displayed it in an NSImageView. Works quite nicely.
If anyone is interested, the camera is an Olympus TG-4. I am writing the components to control settings, shutter, etc.
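For illustration, the receive-and-reassemble loop can be sketched in a few lines of Python. The port number is hypothetical, and delimiting frames by the JPEG SOI/EOI markers is my simplification; the TG-4's real packets carry proprietary headers that have to be stripped first:

    import socket

    PORT = 5555                          # hypothetical; sniff the real port
    SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start/end-of-image markers

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))

    buf = bytearray()
    while True:
        packet, _ = sock.recvfrom(65507)
        buf.extend(packet)               # real protocol: strip headers first
        start = buf.find(SOI)
        end = buf.find(EOI, start + 2)
        if start != -1 and end != -1:
            jpeg = bytes(buf[start:end + 2])    # one complete frame
            del buf[:end + 2]
            with open("frame.jpg", "wb") as f:  # decode/display goes here
                f.write(jpeg)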

Media Foundation Capture - how do you detect the true native input format

I've got a few video converter boxes (Marshall VAC-11SU3, Marshall VAC-11HU3, Magewell USB Capture SDI, Blackmagic UltraStudio Express) and no cameras.
They all have an incoming video signal plugged into their respective SDI or HDMI ports.
The issue is that GetNativeMediaType always returns the same format as GetMediaTypeByIndex does for index 0 regardless of the actual video format that is coming into the SDI/HDMI port.
Every Media Foundation example I've seen so far has a UI to pick the "correct" native format. This menu is populated from GetMediaTypeCount and GetMediaTypeByIndex for the device.
My users will not know what to pick!
We've been using Blackmagic's DeckLink APIs and our users see the incoming video signal format in the UI.
We'd like to expand support for multiple device manufacturers but this one has me stumped.
Media Foundation does not have the concept of signal-format detection that you get with recent Blackmagic hardware (earlier Blackmagic products, by the way, did not offer detection either).
A video source driver could indeed enumerate the media type it sees on the wire as the first GetNativeMediaType output, and/or offer a dynamic format change to that format during a streaming session. However, Media Foundation video sources mostly assume webcam-like devices and have a fixed media type enumeration order.
I would not expect the Blackmagic driver to be different, because it mostly mimics a webcam so that, through the WDM driver, Blackmagic device inputs can be consumed using standard APIs. If you need extended functionality, such as signal detection, Blackmagic suggests using their DeckLink SDK (which is good, by the way).

Objective C PTP/IP connection

I'd like to implement a "Picture Transfer Protocol over IP" (PTP/IP) connection in Objective-C between my camera (Nikon D5300) and my Mac to transfer photos and other data wirelessly.
The camera creates a WiFi hotspot, and the Mac connects to it.
I know that the camera uses port 15740.
What is the best point to start with? NSInputStream?
I know that ShutterSnitch (an iOS app) has this. I emailed its author who, while not willing to license me his work, was kind enough to point me to the following resources:
http://www.shuttersnitch.com/forums/viewtopic.php?f=4&t=1580
I use CocoaAsyncSocket for the networking. For the protocol itself, look at the PTP (over USB) spec and http://www.cipa.jp/ptp-ip/index_e.html. The payload packets are pretty much like in the USB spec but have some different header bytes.
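To make those "different header bytes" concrete: per the CIPA PTP/IP spec, a session starts with an Init Command Request on TCP port 15740, a little-endian length/type header followed by a client GUID, a NUL-terminated UTF-16LE friendly name, and the protocol version. A minimal handshake sketch (in Python rather than Objective-C, purely to keep it short; the camera's hotspot address is an assumption):

    import socket
    import struct
    import uuid

    CAMERA_IP = "192.168.0.1"  # assumed address of the camera's Wi-Fi hotspot
    PTPIP_PORT = 15740

    # Init Command Request: u32 length, u32 type (1), 16-byte client GUID,
    # friendly name as NUL-terminated UTF-16LE, u32 protocol version (1.0).
    payload = (struct.pack("<I", 1)
               + uuid.uuid4().bytes
               + "my-mac\x00".encode("utf-16-le")
               + struct.pack("<I", 0x00010000))
    packet = struct.pack("<I", len(payload) + 4) + payload

    sock = socket.create_connection((CAMERA_IP, PTPIP_PORT), timeout=5)
    sock.sendall(packet)
    length, ptype = struct.unpack("<II", sock.recv(8))
    print("reply packet type:", ptype)  # 2 = Init Command Ack, 5 = Init Fail

After a successful Init Command Ack, a second TCP connection carries the event channel, and the command/data packets follow the USB PTP containers with the modified headers mentioned above.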

Access SmartCards from Windows 8 Store App (WinRT)

In a Windows 8 Store App I would like to read data from a smart card. After installing the smart card reader (a USB device), I can read its device path and connection state via the Windows.Devices.Enumeration namespace. Even the device interface ID is retrievable (50dd5230-ba8a-11d1-bf5d-0000f805f530), and I put this as a required capability in the app's manifest file.
For the interaction with the device I use a C++ component calling the CreateDeviceAccessInstance method, but this call always results in an ACCESS_DENIED exception.
Further research taught me that interacting with a custom hardware device (everything that is not a printer, microphone, mouse, ...) requires several adjustments to the device driver published to the Windows 8 driver store.
I'd be glad to do so, but I am no IHV and would like to use the generic driver and the generic interface.
Could anyone give me a hint how to proceed from here and use the generic interface for USB SmartCard devices?
This is not a real answer, but I have the same problem and I have spent quite some time looking for the hard-to-find bits of information on this subject around the internet, and I'd like to share my results.
Windows 8.1 has some specific APIs for (virtual) SmartCards (API reference, sample), but it seems like they can only be used for authentication and there is no way to send APDU commands to a card at the moment (see this comment by Himanshu Soni). I guess one could use the new USB APIs to talk to the reader directly, but then you'd have to implement the whole protocol yourself.