I'm new to Objective-C.
How do I read an RTSP stream in an iOS app?
Apparently the live555 and ffmpeg libraries can do this, but I haven't found anything simple and functional based on them.
Are there other solutions?
All solutions use ffmpeg.
But some are easier than others; here's a tutorial to get you started:
http://sol3.typepad.com/exotic_particles/
Personally I find live555 offers the most options, but it is also quite hard to get started with.
Most of the difficulty is usually in building the ffmpeg libs.
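Once the libs are built, reading an RTSP source from Objective-C is mostly plain libavformat calls. A minimal sketch, assuming ffmpeg is built and linked for iOS (the TCP-transport option and the decode step left as a comment are just examples, not a complete player):

#import <Foundation/Foundation.h>
#include <libavformat/avformat.h>

static void readRTSPStream(NSString *url) {
    avformat_network_init();                         // required for rtsp:// inputs
                                                     // (older ffmpeg versions also need av_register_all())
    AVDictionary *opts = NULL;
    av_dict_set(&opts, "rtsp_transport", "tcp", 0);  // TCP interleaving is often more reliable than UDP

    AVFormatContext *ctx = NULL;
    if (avformat_open_input(&ctx, [url UTF8String], NULL, &opts) < 0) {
        NSLog(@"Could not open %@", url);
        av_dict_free(&opts);
        return;
    }
    avformat_find_stream_info(ctx, NULL);

    AVPacket *pkt = av_packet_alloc();
    while (av_read_frame(ctx, pkt) >= 0) {
        // hand pkt off to a libavcodec decoder here
        av_packet_unref(pkt);
    }

    av_packet_free(&pkt);
    avformat_close_input(&ctx);
    av_dict_free(&opts);
}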
I am wondering if anybody knows if it's possible to use the DJI Windows SDK to decode a video in real-time (render video frames as the video is being retrieved frame-by-frame)?
I don't see anything relevant in the documentation or API reference sections from DJI Windows SDK.
At this point I'll have to dig into the samples and see if there is anything useful there. Otherwise the online documentation seems rather useless.
Here is the DJI Windows SDK documentation.
I agree with you that the DJI documentation sucks. But again, what you are asking is unclear.
You said you want to "use the DJI Windows SDK to decode a video": so you have an online video and you want to decode it. Why not use ffmpeg and ffplay? We use those for the DJI Tello and IP cameras all the time.
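For example (a sketch; the UDP port 11111 is the Tello's default raw H.264 feed, and the rtsp:// URL is whatever your camera exposes):

ffplay -fflags nobuffer -f h264 udp://0.0.0.0:11111
ffplay -rtsp_transport tcp rtsp://<camera-ip>/stream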
If you want to grab the feed from the drone, there is a DJI GitHub sample that shows you how: https://github.com/DJI-Windows-SDK-Tutorials/Windows-FPVDemo/tree/master/DJIWSDKFPVDemo
So I'm not 100% sure what your use case is.
Please guide me on installing WebRTC in FusionPBX. I tried basing it on the FreeSWITCH scripts, but it doesn't work.
WebRTC is not mentioned on their release page,
but they have a demo folder in their GitHub repo.
WebRTC can work in FusionPBX, as documented in the book "Mastering FreeSWITCH". Work is being done to make it easier in FusionPBX.
Tip: What you can do in FreeSWITCH can be done in FusionPBX. So their books and documentation apply to FusionPBX.
That one was actually using Verto and was just a basic example of it from the book, so I renamed it to be more accurate. We are working on a WebRTC app and will put it in the repo when it's more polished.
WebRTC instructions are described in the FreeSWITCH book "Mastering FreeSWITCH". Work is being done to make WebRTC easier in FusionPBX, but for now the instructions in the book should be enough to get you started. The WebRTC app is not mentioned in the release because it is an example and is not production-ready without the polish one would expect, but it is possible to use WebRTC.
So I've been wanting to make a real-time live streaming application. Essentially, the application would send the microphone feed from an Xcode application to a website where it can be viewed in real time. Is FFMPEG the best solution for this? How would I go about doing this? If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?
To directly address your questions:
(1) Is FFMPEG the best solution for this?
It depends. When setting up a live streaming environment you will likely stumble over FFMPEG, VLC and gstreamer, which are the options you have to simply stream video/audio. Therefore, yes, FFMPEG can be used as part of the solution. Please look into the following question: DIY Video Streaming Server
(2) How would I go about doing this?
Your requirement is to make a live streaming application which sends the mic input onto the web. This includes the following steps:
(0) Your Xcode application will need to provide a method to start this process. You don't necessarily need to integrate a framework to achieve this.
(1) Streaming / Restreaming
Use FFMPEG or VLC to grab your device and stream it locally:
ffmpeg -i audioDevice -acodec libfaac -ar 44100 -ab 48k -f rtp rtp://host:port
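On OS X, for instance, the audio device would typically be grabbed through avfoundation; newer ffmpeg builds no longer include libfaac, so the native aac encoder is the usual substitute. An equivalent command might look like this (the ":0" device index is an assumption; check with ffmpeg -f avfoundation -list_devices true -i ""):

ffmpeg -f avfoundation -i ":0" -acodec aac -ar 44100 -b:a 48k -f rtp rtp://host:port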
(2) Segmenting for HTTP Live Streaming*
Use a segmenter such as: mediastreamsegmenter (Apple), livehttp (VLC) or segment (FFMPEG) to prepare your stream for web delivery:
vlc -vvv -I dummy <SOURCEADDRESS> --sout='#transcode{acodec=libfaac,ab=48}:std{access=livehttp{seglen=10,delsegs=false,numsegs=10,index=/path/to/your/index/prog_index.m3u8,index-url=YourUrl/fileSequence######.ts},mux=ts{use-key-frames},dst=/path/to/your/ts/files/fileSequence######.ts}'
*You could also simply use VLC to grab your audio device with qtsound (see this question) and prepare it for streaming with livehttp.
(3) HTML 5 Delivery
Publish your stream
<audio>
<source src="YOUR_PATH/playlist.m3u8" />
</audio>
(3) If that's too broad, then how do I use the FFMPEG framework in an OS X Objective-C application?
Either use an external wrapper framework to access FFMPEG functionality and consult the tutorials for those frameworks, or wrap your command-line arguments with NSTask in Objective-C and simply launch those tasks from your application, as in this question.
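For instance, launching the ffmpeg command from step (1) with NSTask might look roughly like this (a sketch, not production code; the ffmpeg path and the argument list are assumptions to adapt to your setup):

NSTask *task = [[NSTask alloc] init];
task.launchPath = @"/usr/local/bin/ffmpeg";          // assumed install location
task.arguments  = @[ @"-f", @"avfoundation", @"-i", @":0",
                     @"-acodec", @"aac", @"-ar", @"44100", @"-b:a", @"48k",
                     @"-f", @"rtp", @"rtp://127.0.0.1:5004" ];

NSPipe *errPipe = [NSPipe pipe];
task.standardError = errPipe;                        // ffmpeg writes its log to stderr

[task launch];
// ... and when the user stops streaming:
// [task terminate];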
Another way would be to use VLCKit, which offers VLC functionality in a framework for Objective-C (VLCKit wiki). However when tackling streaming challenges I prefer to work with the actual commands instead of pushing another layer of framework in between, which may be missing some options.
I hope this points you in the right directions. There are multiple ways to solve this. It's a broad question, therefore this broad approach to answer your question.
I have developed a video chat app with the WebRTC API. I have followed the steps given by WebRTC. Video is working fine, but there is a lot of noise from my laptop; the sound is not clear.
But the Google-developed demo site https://apprtc.appspot.com/ works without any noise (better compared with ours).
I followed the same procedure they did, but no luck.
With a headset the echo is not audible; it only happens when hearing the sound from the laptop speakers without a headset.
Please give me some suggestions on this.
Thanks in advance. Looking forward to your response.
Take a look at this demo: WebRTC conferencing that supports four callers. This link describes the implementation details and architecture.
I am looking for an example showing me how to play an MP3 stream from a URL.
I am trying to build a command-line client for Apache using Mono with GStreamer.
So far I haven't found any clue how to use gstreamer-sharp. Does anybody have any experience?
Have a look at the Banshee source code; we have a GStreamerSharp backend.