I've been working on a project that involves a network connection between a server on a Mac and a client on an iOS device.
The client side requires a live view of the server's screen.
As of now, my solution has been:
getting snapshots of the Mac's screen using CGDisplayCreateImage(). Timing is done with an NSTimer set to fire every 1/30.0 seconds.
compressing and encoding the snapshots as GIF images using CGImageDestination
sending the GIF images (stored as NSData) over Bonjour to the client
the client receives the data, initializes a new UIImage from it, and passes it into an image view (a rough sketch of the whole loop follows the list).
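In code, the server-side loop is roughly equivalent to this simplified Swift sketch (sendToClient(_:) is just a placeholder for the Bonjour/stream code; the real project is organised differently):

```swift
import Cocoa
import ImageIO
import CoreServices

// Placeholder for the Bonjour/stream code that actually ships the bytes to the client.
func sendToClient(_ data: Data) { /* write to the connected output stream */ }

// Fire roughly 30 times per second, like the NSTimer described above.
let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    // 1. Snapshot the main display.
    guard let screenshot = CGDisplayCreateImage(CGMainDisplayID()) else { return }

    // 2. Encode the CGImage as GIF data with CGImageDestination.
    let gifData = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(gifData as CFMutableData,
                                                             kUTTypeGIF, 1, nil) else { return }
    CGImageDestinationAddImage(destination, screenshot, nil)
    CGImageDestinationFinalize(destination)

    // 3. Hand the encoded frame to the network layer.
    sendToClient(gifData as Data)
}
```

On the client side, the received data is simply turned into an image with UIImage(data:) and assigned to the image view.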
But I've been encountering a few problems with this method.
The latency is horrible. Each image takes many seconds to load, and some seem to never even reach the client.
Each image's NSData object is around 170,000 bytes. At 30 frames per second that works out to roughly 5 MB/s (about 40 Mbit/s), which seems WAY too much to push over the network fast enough. But I have no idea how to compress the images further.
Thank you.
Related
I am using this commit of uvc-gadget together with g_webcam on kernel 4.4.143 for Rockchip. This version of uvc-gadget only transmits a static MJPEG image (and is much better written than earlier sources of uvc-gadget).
I'm observing interesting behavior on the host laptop, which receives the stream from the gadget with guvcview: after a while the frames start to flicker like an old TV (V4L2_CORE: (jpeg decoder) error while decoding frame), and eventually the stream breaks down on the host: V4L2_CORE: Could not grab image (select timeout): Resource temporarily unavailable. Underneath, the host keeps polling ([75290.124695] uvcvideo: uvc_v4l2_poll), and there is no error either in the host's dmesg or in uvc-gadget on the device. In fact, after re-opening guvcview, streaming works again without restarting uvc-gadget, but it soon breaks down in the same way.
I'm using a stock USB 3.0 cable, which handles both streaming and powering the device. AFAIK there is no source of noise that could produce this kind of strange flickering at the physical level.
Additionally, I've noticed that with smaller USB packet sizes, going down from 1024 to 256, the stream survives longer (up to 50,000 frames or so), but it still crashes eventually.
Any idea what's going on here?
UPDATE
After I switched from an MJPEG-compressed stream to an uncompressed one, there is no more flickering, but I still always lose the connection after several seconds: V4L2_CORE: Could not grab image (select timeout): Resource temporarily unavailable
Is there an easy way to play back video data stored in a stream object (http://picamera.readthedocs.io/en/release-1.13/api_streams.html) (e.g. a PiCameraCircularIO with h264 encoding) using one of the PiRenderers?
The mmalobj API (http://picamera.readthedocs.io/en/release-1.13/api_mmalobj.html#id3) seems to support playback of buffers, but it is hard to understand, and everything I tried with an MMALVideoDecoder, setting its input to the data of a PiCameraCircularIO buffer, failed.
I'm using the circular stream advanced recipe (http://picamera.readthedocs.io/en/release-1.13/recipes2.html#splitting-to-from-a-circular-stream), but rather than saving the last n seconds to a file, I want to play them back.
I'm working on an application that transmits low-quality video using WebRTC. Periodically I want to send a single high-resolution frame from the same camera.
When I try to acquire another stream using getUserMedia, I get the same low-quality one, and when I try to pass constraints to force a higher resolution, the operation fails with an OverconstrainedError (even though it works fine when no other stream is open).
Is it even possible to have several streams with different parameters from the same device at the same time? Or is it possible to acquire a high-resolution image without requesting a new stream?
Not sure if this is the right place to post the question.
However, out of curiosity: how can 10 seconds of audio be loaded in 2 seconds? I could understand it if the audio were first uploaded to a file server and the client downloaded it afterwards, but this is a live stream coming over RTSP. I can think of two answers:
Either it is loading content that has already been played,
or the internet live stream is behind the real stream...
Anyway, I would like to hear your answers and guidance on this topic. Thanks
It's the second option. If you streamed audio in "real time" without any delay, you would have serious problems whenever the connection is lost or data is delayed, for example by 100 ms. Then the user wouldn't hear anything for 100 ms, which would be pretty annoying. This happens especially with mobile connections, which have much higher error rates and, while you move, have a hard time keeping a stable connection.
Usually the actual playback is delayed and the next seconds are buffered. When the connection drops and comes back within the buffered time frame, the user doesn't notice that the connection was lost. In your example the connection can be lost for up to 8 seconds without any problems.
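To make the idea concrete, here is a toy Swift sketch of such a playout buffer (not code from any real player): chunks are queued as they arrive from the network, and playback only starts once a few seconds' worth are buffered, so a short outage just drains the queue instead of producing silence.

```swift
import Foundation

// Toy playout buffer: playback starts only after `targetDelayChunks` chunks
// have been queued, so short network outages are absorbed by the queue.
final class PlayoutBuffer {
    private var queue: [Data] = []
    private let targetDelayChunks: Int   // e.g. 10 one-second chunks = 10 s of buffer
    private var started = false

    init(targetDelayChunks: Int) {
        self.targetDelayChunks = targetDelayChunks
    }

    // Called whenever a chunk arrives from the stream.
    func enqueue(_ chunk: Data) {
        queue.append(chunk)
        if !started && queue.count >= targetDelayChunks {
            started = true   // enough is buffered, playback may begin
        }
    }

    // Called by the player once per chunk duration; nil means "still filling"
    // or "buffer ran dry" (the outage lasted longer than the buffered time).
    func dequeue() -> Data? {
        guard started, !queue.isEmpty else { return nil }
        return queue.removeFirst()
    }
}
```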
I have an iOS app where I need to load a lot of stuff (JSON data, images, ...) right after app start (after the app delegate shows the first view). Currently I am using AFNetworking for my requests (async requests).
What I'm experiencing is a big delay before the first request (3G connection, iPhone 4) really starts. I measured the time from the start of the operation to the first bytes loaded and checked the server's response time. It takes between 2.5 and 4 seconds before the app even starts receiving the first bytes, while the server reports a build time of about 0.03 seconds. That seems like a big delay, even if the device might have to establish the 3G connection first.
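For reference, the measurement is essentially the following (a simplified Swift/URLSession sketch rather than the actual AFNetworking code; the URL is only a placeholder):

```swift
import Foundation

// Measures the time from firing the request until the first bytes arrive.
final class FirstByteTimer: NSObject, URLSessionDataDelegate {
    private let start = Date()          // taken just before the request is started
    private var firstByteSeen = false

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive data: Data) {
        if !firstByteSeen {
            firstByteSeen = true
            print("first bytes after \(Date().timeIntervalSince(start)) s")
        }
    }
}

let delegate = FirstByteTimer()
let session = URLSession(configuration: .default, delegate: delegate, delegateQueue: nil)
session.dataTask(with: URL(string: "https://example.com/initial.json")!).resume()
```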
I tested this with other servers and methods (synchronous/asynchronous), with AFNetworking and with a plain NSURLRequest, and it is always the first request that takes a lot of time.
So my question is: has anyone else experienced this, and is there a way to speed it up?