I'd like to implement a "Picture Transfer Protocol over IP" (PTP/IP) connection in Objective-C between my camera (Nikon D5300) and my Mac, to transfer photos and other data wirelessly.
The camera creates a WiFi hotspot, and the Mac connects to it.
I know that the camera uses port 15740.
What is the best starting point? NSInputStream?
I know that ShutterSnitch (an iOS app) has this. I emailed its author, who, while not willing to license his work to me, was kind enough to point me to the following resources:
http://www.shuttersnitch.com/forums/viewtopic.php?f=4&t=1580
I use CocoaAsyncSocket for the communication. For the protocol itself, look at the PTP (over USB) spec and http://www.cipa.jp/ptp-ip/index_e.html. The payload packets are pretty much like those in the USB spec but have some different header bytes.
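For a concrete starting point, here is a minimal sketch (assuming the framing described in the CIPA PTP/IP spec: every packet is a little-endian uint32 length, uint32 type, then the payload) that opens the TCP connection with GCDAsyncSocket from CocoaAsyncSocket and sends the Init Command Request, the first packet of a PTP/IP session. The camera address, GUID, and client name below are placeholders:

    // Sketch only: verify the field layout against the CIPA PTP/IP spec.
    // Init Command Request is packet type 1; its payload is a 16-byte client
    // GUID, a null-terminated UTF-16LE friendly name, and a 4-byte version.
    #import "GCDAsyncSocket.h"

    - (void)connectToCamera {
        self.socket = [[GCDAsyncSocket alloc] initWithDelegate:self
                                                 delegateQueue:dispatch_get_main_queue()];
        NSError *err = nil;
        // Placeholder address; check what IP the camera's hotspot assigns itself
        [self.socket connectToHost:@"192.168.1.1" onPort:15740 error:&err];
    }

    - (void)socket:(GCDAsyncSocket *)sock didConnectToHost:(NSString *)host port:(uint16_t)port {
        NSMutableData *payload = [NSMutableData data];
        uint8_t guid[16] = {0};                               // your client GUID
        [payload appendBytes:guid length:sizeof(guid)];
        NSData *name = [@"MyMac" dataUsingEncoding:NSUTF16LittleEndianStringEncoding];
        [payload appendData:name];
        uint16_t terminator = 0;                              // UTF-16 null terminator
        [payload appendBytes:&terminator length:2];
        uint32_t version = 0x00010000;                        // PTP/IP version 1.0
        [payload appendBytes:&version length:4];

        uint32_t type = 1;                                    // Init Command Request
        uint32_t length = (uint32_t)(8 + payload.length);     // header + payload
        NSMutableData *packet = [NSMutableData data];
        [packet appendBytes:&length length:4];                // Macs are little-endian
        [packet appendBytes:&type length:4];
        [packet appendData:payload];

        [self.socket writeData:packet withTimeout:-1 tag:0];
        [self.socket readDataToLength:8 withTimeout:-1 tag:0]; // Init Command Ack header
    }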
Related
I have a camera that sends MJPEG frames as UDP packets over WiFi that I would like to display in my Mac OS X application. My application is written in Objective-C, and I am trying to use the AVFoundation classes to display the live stream. The camera is controlled using HTTP GET & POST requests.
I would like the camera to be recognized as an AVCaptureDevice, since I can easily display streams from different AVCaptureDevices. Because the stream arrives over WiFi, however, it isn't recognized as an AVCaptureDevice.
Is there a way I can create my own AVCaptureDevice that I can use to control this camera and display the video stream?
After much research into the packets sent from the camera, I have concluded that it does not communicate in any standard protocol such as RTP. What I ended up doing was reverse-engineering the packets to learn more about their contents.
I confirmed that it does send JPEG images over UDP, and that it takes multiple UDP packets to send a single JPEG. I listened on the UDP port for packets and assembled them into a single image frame. Once I had a frame, I created an NSImage from it and displayed it in an NSImageView. Works quite nicely.
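The answer doesn't show code, but the reassembly idea looks roughly like this sketch using GCDAsyncUdpSocket (from CocoaAsyncSocket, mentioned earlier in this thread). The port number and the "frame is complete at the JPEG EOI marker" heuristic are assumptions; the camera's real packets may carry extra headers that have to be stripped first:

    // udpSocket, frameBuffer, and imageView are assumed properties/outlets.
    - (void)startListening {
        self.udpSocket = [[GCDAsyncUdpSocket alloc] initWithDelegate:self
                                                       delegateQueue:dispatch_get_main_queue()];
        NSError *err = nil;
        [self.udpSocket bindToPort:5000 error:&err];          // hypothetical port
        [self.udpSocket beginReceiving:&err];
        self.frameBuffer = [NSMutableData data];
    }

    - (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data
          fromAddress:(NSData *)address withFilterContext:(id)filterContext {
        [self.frameBuffer appendData:data];
        const uint8_t *bytes = self.frameBuffer.bytes;
        NSUInteger len = self.frameBuffer.length;
        // A JPEG ends with the EOI marker FF D9; once seen, the frame is complete
        if (len >= 2 && bytes[len - 2] == 0xFF && bytes[len - 1] == 0xD9) {
            NSImage *frame = [[NSImage alloc] initWithData:self.frameBuffer];
            if (frame) self.imageView.image = frame;
            self.frameBuffer = [NSMutableData data];          // start the next frame
        }
    }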
If anyone is interested, the camera is an Olympus TG-4. I am writing the components to control settings, shutter, etc.
I have a Samsung photo frame connected to my LAN. It announces itself as a UPnP device like this:
    09:48:05.429956 IP 192.168.5.4.1900 > 239.255.255.250.1900: UDP, length 279
    NOTIFY * HTTP/1.1
    LOCATION: http://192.168.5.4:57959/
    HOST: 239.255.255.250:1900
    SERVER: POSIX, UPnP/1.0, Intel MicroStack/1.0.1868
    NTS: ssdp:alive
    USN: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
    CACHE-CONTROL: max-age=1800
    NT: uuid:cc845bff-073b-c7de-1317-6c3e34888fd0
The frame presents itself as urn:schemas-upnp-org:device:MediaPlayer:1, but I cannot find this device type on the UPnP Forum pages. Here is the XML descriptor: https://www.dropbox.com/s/unuarev1ywr8hc5/ramka.xml
I tried entering my DLNA server's IP address in the frame's configuration, but it didn't work; the frame says there is no server it can play content from.
There is no MediaRenderer service, so I cannot just send pictures. I suspect that the frame is a "kind of client", but I don't know how to use it. The user manual says nothing about media servers or serving content from the network.
Does anybody have an idea how to figure this out?
The device type is MediaPlayer, which suggests that it can discover your UPnP MediaServer using SSDP; you should then be able to browse your photos and play them via the Samsung photo frame. Try switching your MediaServer off and on again to encourage it to send a NOTIFY packet so that the photo frame finds it.
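For reference, the ssdp:alive NOTIFY that a MediaServer multicasts on startup looks much like the frame's own announcement above, just with the MediaServer device type. The LOCATION URL and UUID here are placeholders:

    NOTIFY * HTTP/1.1
    HOST: 239.255.255.250:1900
    CACHE-CONTROL: max-age=1800
    LOCATION: http://<server-ip>:<port>/description.xml
    NT: urn:schemas-upnp-org:device:MediaServer:1
    NTS: ssdp:alive
    SERVER: POSIX, UPnP/1.0, MyMediaServer/1.0
    USN: uuid:<server-uuid>::urn:schemas-upnp-org:device:MediaServer:1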
The MediaPlayer device will not be recognised by standard UPnP control point apps (they look for the MediaRenderer and MediaServer device types).
The device description XML lists AVTransport and RenderingControl services, which suggests that the photo frame can be controlled remotely, e.g. play a photo or set the brightness.
The non-standard UPnP device type and the extra services suggest that there must be an app from Samsung for the photo frame. That is your best option for a remote control.
There is no MediaRenderer service, so I cannot just send pictures
That's not the whole story. It's true that the device does not seem to implement the ConnectionManager service, so it couldn't be a compliant MediaRenderer device (and naming apparently proprietary Samsung things in the "upnp-org" namespace seems obnoxious to me), but the device does claim to implement RenderingControl and AVTransport, so it's conceivable that it could be controlled with an (almost) standard control point...
As an example, I'm guessing gupnp-av-cp (a testing AV control point from gupnp-tools on most Linux distributions) might work with a one-line change to replace the device type and another to set a fake renderer protocolInfo (basically making a guess at what kind of data the renderer accepts).
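Driving AVTransport by hand is also conceivable. A standard SetAVTransportURI call is just a SOAP POST to the service's control URL, which you would take from the device description XML; the control path and photo URL below are placeholders:

    POST /upnp/control/AVTransport1 HTTP/1.1
    HOST: 192.168.5.4:57959
    CONTENT-TYPE: text/xml; charset="utf-8"
    SOAPACTION: "urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"

    <?xml version="1.0"?>
    <s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
                s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
      <s:Body>
        <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
          <InstanceID>0</InstanceID>
          <CurrentURI>http://192.168.5.10:8200/photo.jpg</CurrentURI>
          <CurrentURIMetaData></CurrentURIMetaData>
        </u:SetAVTransportURI>
      </s:Body>
    </s:Envelope>

If the frame accepts that, a Play action on the same service should then display the photo; whether Samsung's implementation actually honours it is exactly the open question.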
If I have two iOS devices, both on the same WiFi network and both with Bluetooth turned on, and I use GameKit (specifically GKSession) to manually set up a communications channel between them (without using GKPeerPickerController), I can't tell whether it is using WiFi or Bluetooth.
Does iOS prioritise one over the other? I'm hoping that it uses WiFi before Bluetooth, but I'd like to be sure.
If WiFi is available and Bluetooth isn't, it uses WiFi; if Bluetooth is available and WiFi isn't, it uses Bluetooth. I'm wondering how they talk when both Bluetooth and WiFi are available: which does GameKit choose over the other?
The only way I can see to find this out is by running a packet sniffer on my WiFi and running several tests across different devices. Kinda hoping someone can save me that effort!
Thanks :-)
According to Apple's documentation, if you use a GKPeerPickerController to create your GKSession, you will be able to select Bluetooth or WiFi connectivity (see GKPeerPickerConnectionType).
I'm hoping that it uses WiFi before Bluetooth, but I'd like to be sure.
It seems an internet (WiFi) connection requires a bit of user code, while Bluetooth does not, so I would guess it defaults to Bluetooth to avoid making that requirement mandatory.
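If you want to control the choice explicitly rather than guess, the picker route looks like this sketch (GKPeerPickerController was deprecated in iOS 7; the session ID is a made-up example):

    GKPeerPickerController *picker = [[GKPeerPickerController alloc] init];
    picker.delegate = self;
    picker.connectionTypesMask = GKPeerPickerConnectionTypeNearby    // Bluetooth
                               | GKPeerPickerConnectionTypeOnline;   // WiFi/internet
    [picker show];

    // Including the Online type obliges you to supply the session yourself,
    // which is the "bit of user code" mentioned above:
    - (GKSession *)peerPickerController:(GKPeerPickerController *)picker
               sessionForConnectionType:(GKPeerPickerConnectionType)type {
        return [[GKSession alloc] initWithSessionID:@"com.example.mygame"
                                        displayName:nil
                                        sessionMode:GKSessionModePeer];
    }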
Does anyone know how to create a raw socket to a USB device? Just as you can create raw sockets to Ethernet devices, I would like to send/receive arbitrary data to/from a USB device.
It depends on the platform; you need a low-level USB library.
Either http://sourceforge.net/projects/libusb/ or http://sourceforge.net/projects/libusb-win32/ is a good place to start
PS: it isn't a socket as such; sockets are specific to networking.
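For a flavour of what "raw" USB I/O looks like with libusb-1.0, this sketch opens a device by vendor/product ID and performs a bulk write. The VID/PID (0x1234/0x5678) and the endpoint address (0x01) are placeholders for whatever your device actually uses:

    #include <libusb-1.0/libusb.h>
    #include <stdio.h>

    int main(void) {
        libusb_context *ctx = NULL;
        libusb_init(&ctx);

        // Placeholder vendor/product IDs; find yours with lsusb or similar
        libusb_device_handle *h =
            libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678);
        if (!h) { fprintf(stderr, "device not found\n"); return 1; }

        libusb_claim_interface(h, 0);

        unsigned char data[] = { 0x01, 0x02, 0x03 };   // arbitrary payload
        int transferred = 0;
        // 0x01 = bulk OUT endpoint (device-specific); 1000 ms timeout
        int rc = libusb_bulk_transfer(h, 0x01, data, sizeof(data),
                                      &transferred, 1000);
        printf("rc=%d transferred=%d\n", rc, transferred);

        libusb_release_interface(h, 0);
        libusb_close(h);
        libusb_exit(ctx);
        return 0;
    }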
You can write to the endpoint using a raw socket, but a certain protocol needs to be followed for the device to actually accept and reply to commands.
Depending on how the protocol is written, you may be able to use a raw socket and a USB sniffer to replay the data to the endpoint, but most devices employ a timestamp and a handshake process which needs to be performed dynamically for each connection. This usually involves querying the device state and using that information to complete the handshake, along with other information depending on the protocol of the device in question.
I want to write a better AI for the game Red Alert 2. The game uses the IPX protocol for multiplayer. If I'm right, IPX is on Layer 2, so there is only an Ethernet frame without destination/source IP addresses or ports, and the game runs its own protocol on top of it. Therefore, I could analyse this protocol and program an AI that would simulate a real player. Do you think that this is a realistic idea? Is there any way to "generate" and send an IPX packet (Ethernet frame)?
Thanks for any suggestions.
Is there any way to "generate" and send an IPX packet (Ethernet frame)?
IPX is actually a layer-3 protocol, but it doesn't contain IP addresses, because it isn't IP (the Internet Protocol).
Assuming you're on Windows, the Winsock library is supposed to support IPX: see Winsock IPX/SPX Annex.
IPX is a packet-oriented (not stream-oriented), unreliable (no guaranteed delivery) protocol: like UDP rather than TCP.
I expect that using Winsock for IPX is like using Winsock for UDP, except using SOCKADDR_IPX etc.
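Assuming the IPX/SPX stack (NWLink) is actually installed (recent Windows versions no longer ship it by default), the datagram setup would look something like this sketch; the IPX socket number 0x5555 is a placeholder:

    #include <winsock2.h>
    #include <wsipx.h>
    #pragma comment(lib, "ws2_32.lib")

    int main(void) {
        WSADATA wsa;
        WSAStartup(MAKEWORD(2, 2), &wsa);

        // IPX datagram socket, analogous to a UDP socket
        SOCKET s = socket(AF_IPX, SOCK_DGRAM, NSPROTO_IPX);

        SOCKADDR_IPX addr = {0};
        addr.sa_family = AF_IPX;
        addr.sa_socket = htons(0x5555);   // placeholder IPX socket number
        bind(s, (struct sockaddr *)&addr, sizeof(addr));

        // sendto()/recvfrom() then work much as with UDP, using SOCKADDR_IPX
        // destinations (4-byte network number + 6-byte node address).

        closesocket(s);
        WSACleanup();
        return 0;
    }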
Do you think that this is a realistic idea?
If you'll need to reverse-engineer the contents of the packets using only a packet sniffer, then I expect that will be difficult.