I have a phone that can display HTTP MJPEG streams, and I would like to get this working. The camera I have only sends out an RTSP stream. I could convert this with VLC to an HTTP MJPEG stream, but my phone needs it embedded into a website.
Like this: http://88.53.197.250/axis-cgi/mjpg/video.cgi?resolution=320x240
But the VLC transcoding just sends out the bare HTTP stream.
Is there any way to embed this correctly, so that I can display it on the screen? I've googled a lot but couldn't find a solution.
Thank you very much
I would like to use SUSE Linux to do this.
This is the command I use for converting RTSP to MJPEG with VLC:
vlc -vvv -I dummy hereYourVideoSource --sout "#transcode{vcodec=MJPG,venc=ffmpeg{strict=1}}:standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=:8080/}" --run-time=hereYourTimeOutValue vlc://quit
Replace hereYourVideoSource with your RTSP source, and hereYourTimeOutValue with a timeout in seconds if you want the process to stop automatically. (On Windows the binary is vlc.exe; the options are the same. Note that the --sout chain must be quoted so the shell does not mangle the braces.)
In this sample I use port 8080 on localhost; you can change it to another port. The request to get the MJPEG stream should be:
http://127.0.0.1:8080/
or:
http://localhost:8080/
In HTML you embed the MJPEG stream using an img tag:
<img src="http://localhost:8080/" />
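If your phone needs the stream embedded in a full web page rather than a bare URL, a minimal wrapper page could look like this (a sketch, using the host, port, and 320x240 resolution from the examples above):

<html>
  <body>
    <img src="http://localhost:8080/" width="320" height="240" alt="camera stream" />
  </body>
</html>

Serve that .html file from any web server you already run and point the phone's browser at it.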
Hope it helps. Good luck.
In the GStreamer streaming test example
(https://janus.conf.meetecho.com/streamingtest.html)
a GStreamer pipeline sends to udpsink host=127.0.0.1 port=5004, which Janus then broadcasts via WebRTC.
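A pipeline for that demo looks roughly like this (a sketch for a VP8 mount point; the webcam device, resolution, and payload type are assumptions and must match your streaming plug-in configuration):

gst-launch-1.0 v4l2src ! video/x-raw,width=320,height=240 ! videoconvert ! vp8enc ! rtpvp8pay pt=100 ! udpsink host=127.0.0.1 port=5004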
How is it possible to send a webcam stream from another user, through their browser via getUserMedia(), to Janus Gateway for broadcasting?
Do I have to configure a pipeline for it as well, and what would that look like?
I have installed Janus and I am able to run all the demos.
There is an rtp_forward request available against the videoroom plugin, which forwards the RTP from a publisher in that room to the streaming plug-in or to any other IP.
It was added here:
https://github.com/meetecho/janus-gateway/pull/255
Instead of rtp_listen, though, you should request rtp_forward and also pass in the secret.
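The message you send to your videoroom plugin handle looks roughly like this (a sketch; the room, publisher_id, host, port, and secret values are placeholders for your own setup, and the forwarded RTP should land where a streaming mount point can pick it up):

{
  "request" : "rtp_forward",
  "room" : 1234,
  "publisher_id" : 5678,
  "host" : "127.0.0.1",
  "video_port" : 5004,
  "secret" : "adminpwd"
}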
(This solution needs a browser, but I marked it as the accepted solution since it works for me this way, and scaling to more users is also possible like this.)
I am doing some research on how to do two things: trim and stream H.264 video.
What does it take to trim an MPEG-4 H.264 video to 30 seconds and downsize it to 480p? I am assuming I would need a third-party library that does H.264 encoding; a quick Google search turns up only VideoLAN.org, but I cannot find their commercial license. Are there other options folks know of?
How does streaming H.264 to HTML5 work? I know that with Flash, one file format requires the whole file to be downloaded before it will play, while the other allows streaming but requires a Flash server. I am going to be using Apache to serve the videos on the intranet; how does one go about streaming them with Apache?
1) You can use FFmpeg:
ffmpeg -i in.mp4 -s 720x480 -t 30 out.mp4
-s resizes the video and -t limits the output to the first 30 seconds.
2) For HTTP streaming, if the moov atom (which contains the video headers and seek information) is present at the start of the file, the video will start playing as soon as a few seconds are buffered; it does not wait for the whole file to download. Forward seeking is possible through HTTP byte-range requests. To put the moov atom at the beginning, use qt-faststart, which comes with FFmpeg:
qt-faststart in.mp4 out.mp4
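As an alternative to running qt-faststart as a second step, recent FFmpeg builds can write the moov atom up front themselves, so the trim, resize, and faststart steps above can be combined into one command (a sketch; check that your build supports -movflags):

ffmpeg -i in.mp4 -s 720x480 -t 30 -movflags +faststart out.mp4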
I'm trying to figure out a way of having a server with a camera (or multiple cameras) connected via USB (FireWire, whatever...) that then streams the video to users.
The idea so far is to have a Red5 server that streams the camera feed as an H.264 stream, and to have an HTML5 player like Video.js with Flash fallback play the video. Looking at the browser support chart at http://en.wikipedia.org/wiki/HTML5_video#Browser_support, I can see I would also need WebM and/or Ogg streams.
Any suggestions on how to do this? Is it possible to route the stream through some (preferably .NET) web application and re-encode the video on the fly? Although I'm guessing that would take some powerful hardware :) Is there another media server that supports all three formats?
Thank you for your ideas
You can use an IceCast server. Convert the camera's output to Ogg via ffmpeg2theora and pipe it into IceCast via oggfwd. Then let an HTML5 <video> element play from the IceCast server. This worked for me in Firefox.
E.g.
# Tune DVB-T receiver into channel
(tzap -c channels-4.conf -r "TV Rijnmond" > /dev/null 2>&1 &)
# Convert DVB-T output into Ogg and pipe into IceCast
ffmpeg2theora --no-skeleton -f mpegts -a 0 -v 5 -x 320 -y 240 -o /dev/stdout /dev/dvb/adapter0/dvr0 2>/tmp/dvb-ffmpeg.txt | oggfwd 127.0.0.1 8000 w8woord /cam3.ogg > /tmp/dvb-oggfwd.txt 2>&1
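Playback is then just an HTML5 video tag pointed at the IceCast mount (a sketch, using the host, port, and mount point from the pipeline above):

<video src="http://127.0.0.1:8000/cam3.ogg" controls autoplay></video>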
I'm publishing a stream on Red5 using the microphone in client-side AS3 code, but it does not publish a good stream, whereas the same thing on FMS creates a perfect stream.
I need to understand what the issue is when publishing on Red5.
Read the Red5 documentation for that. And of course there are differences between the performance of the two servers. However, if you want to improve the quality of the stream, you can use FFmpeg or Xuggler with Red5 to encode streams.
Because you are not saying what your encoder is, it is hard to give a clear answer. If you are using Adobe's FMLE to create the stream that goes to your FMS server, it is FMLE that explains why you get good video and audio encoding 'out of the box'.
I have never tried to use FMLE with Red5, so I cannot tell you if it works, but I doubt it does out of the box. It can probably work with a bit of tweaking on both the client and server side.
To use your own encoder, you capture two streams (audio and video) using FFmpeg; a great example of how to do that is on Stack Overflow here.
Once you are capturing, you can use FFmpeg to send the combined audio and video streams to a file, or you can send them directly to your Red5 server. A simplified version of the FFmpeg command, showing how two streams are mapped into a single RTMP output, is shown below:
ffmpeg -i video_stream -i audio_stream -map 0:0 -map 1:0 -f flv rtmp://my.red5.server:1935/live/mystream
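For example, on Linux you could capture a USB webcam plus an ALSA microphone and push both to Red5 in one step (a sketch; the devices /dev/video0 and default, the codec choices, and the stream name are assumptions to adapt to your setup):

ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default -map 0:0 -map 1:0 -c:v libx264 -preset veryfast -c:a aac -f flv rtmp://my.red5.server:1935/live/mystream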
I'm trying to use MediaPlayer to play HTTP or RTSP URIs from a server. When I play an
address like http://**.wma or *.mp3, it works, but when I try an address like
"http://qr.fm.qq.com/qqradio?qqradio", it doesn't work.
I also tried using VideoView to play RTSP URIs from a server. When I play an address like
"rtsp://*.sdp", it works, but when I try an address like "rtsp://vs1.thmz.com/radio31",
it doesn't work.
Can anybody help me and tell me how to play these?
These are live streams, not static files, so while it may play back some .wma and .mp3 content, these live streams are not defined like that.
Are you sure the first stream link is valid? After a quick scan with nmap, it seems you may need to be in China to connect to this feed (qq.com registrant country code: CN). I get 1000 scanned ports, all filtered, which usually means a firewall blocking specific geographic regions.
rtsp://vs1.thmz.com/radio31 -> This is a Windows Media Audio stream, using the WMA2 codec, delivered via RTSP, which according to the Android Supported Media Formats page (http://developer.android.com/guide/appendix/media-formats.html) is NOT supported.
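You can check what a stream actually contains yourself with ffprobe, which prints the container and codec details (any reasonably recent FFmpeg build includes it):

ffprobe rtsp://vs1.thmz.com/radio31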