I have implemented VLC to play live streaming video, and now I want to play it on Apple TV via an AirPlay option on my streaming screen.
Can you please help me out?
Thanks
From the question it's not clear what you mean by implementing VLC. Apple provides a framework for streaming live video (AVPlayer), which supports Picture in Picture and AirPlay.
To make things clearer, please elaborate on what exactly you are trying to do.
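If AVPlayer works for your stream (e.g. an HLS URL), a minimal sketch in Swift might look like this. The stream URL is a placeholder and presenting the view controller is left out; the AirPlay button comes with the standard playback controls.

```swift
import AVKit
import AVFoundation

// Minimal sketch: play a live HLS stream and allow AirPlay output.
let url = URL(string: "https://example.com/live/stream.m3u8")!   // placeholder stream
let player = AVPlayer(url: url)
player.allowsExternalPlayback = true   // let the video route to an Apple TV
player.usesExternalPlaybackWhileExternalScreenIsActive = true

let controller = AVPlayerViewController()
controller.player = player
// Present `controller` from your streaming screen, then:
player.play()
```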
Support for AirPlay will be available in libvlc and VLCKit 4.0 next year. It will allow you to play anything from any source on Apple TV and, if needed, will convert the media on the fly. Thus, AirPlay support will match what's currently possible with Chromecast.
With VLCKit 3, there is no good way to do it. You could do display mirroring, but this would have a bad impact on performance and video quality. Audio-only output via AirPlay / RAOP works just fine and the quality is good; it even supports multichannel now.
Related
I have tried to use AVPlayer to play a video with my Cocoa application. However, it turns out that AVPlayer is capable of playing only a very restricted variety of video types. As an example, .avi and .flv files will not get played by AVPlayer. I suppose that AVPlayer will play only those video types that QuickTime will play (i.e., quite a few!).
So I was looking for alternatives: I need my software to play the largest possible variety of videos, since I cannot know in advance which kinds of videos my users will open.
I have Mac OS X 10.10 (Yosemite). What alternatives are available to me?
Thank you very much indeed
You are correct in thinking AVPlayer only plays QuickTime-compatible media.
Things I can think of off the top of my head (which may or may not be suitable; I don't know what other restrictions you have):
Offer to transcode the video using ffmpeg or similar (see the sketch after this list).
See if "Perian" helps (I'm not sure the AVPlayer machinery will see it). Perian is an older QuickTime (32-bit) plugin that includes many more codecs. It's deprecated and more or less going away, so it's not a long-term solution... if it works at all.
I just bought a Sony A7 and I am blown away with the incredible pictures it takes, but now I would like to interact with and automate the use of this camera using the Sony Remote Camera API. I consider myself a maker and would like to do some fun stuff: add a laser trigger with an Arduino, do some computer-controlled light painting, and some long-term (on the order of weeks) time-lapse photography. One reason I purchased this Sony camera over other models from famous brands such as Canon, Nikon, or Samsung is the ingenious Sony Remote Camera API. However, after reading through the API reference it seems that many of the features cannot be accessed. Is this true? Does anyone know a workaround?
Specifically, I am interested in changing a lot of the manual settings that you can change through the menu system on the camera such as ISO, shutter speed, and aperture. I am also interested in taking HDR images in a time-lapse manner and it would be nice to change this setting through the API as well. If anyone knows, why wasn't the API opened up to the whole menu system in the first place?
Finally, if any employee of Sony is reading this I would like to make this plea: PLEASE PLEASE PLEASE keep supporting the Remote Camera API and improve upon an already amazing idea! I think the more control you offer to makers and developers, the more popular your cameras will become. I think you could create a cult following if you can manage to capture the imagination of makers across the world and get just one cool project to go viral on the internet. Using HTTP and POST commands is super awesome, because it is OS-agnostic and makes communication a breeze. Did I mention that it's awesome?! Sony's cameras will integrate themselves nicely into the Internet of Things.
I think the Remote Camera API strategy is better than the strategies of Sony's competitors. Nikon and Canon have nothing comparable. The closest thing is Samsung gluing Android onto the Galaxy NX, but that is a completely unnecessary cost since most people already own a smartphone; all that needs to exist is a link that allows the camera to talk to the phone, like the Sony API. Sony gets it. Please don't abandon this direction or the Remote Camera API, because I love where it is heading.
Thanks!
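For anyone curious what those HTTP POST calls look like, here is a minimal sketch in Swift of the JSON-RPC style the Camera Remote API uses. The address is a placeholder (the camera advertises its real endpoint via SSDP), and actTakePicture is one of the documented methods; exact support varies by camera model.

```swift
import Foundation

// Minimal sketch of a Camera Remote API call (JSON-RPC over HTTP POST).
// The URL below is a placeholder; discover the real endpoint via SSDP.
let url = URL(string: "http://192.168.122.1:8080/sony/camera")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
let payload: [String: Any] = [
    "method": "actTakePicture",   // trigger the shutter
    "params": [String](),
    "id": 1,
    "version": "1.0"
]
request.httpBody = try! JSONSerialization.data(withJSONObject: payload)

URLSession.shared.dataTask(with: request) { data, _, error in
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body)   // JSON response; on success it includes the image URL
    } else if let error = error {
        print("Request failed: \(error)")
    }
}.resume()
```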
New API features for the Lens-Style Cameras DSC-QX100 and DSC-QX10 will be rolled out during the spring of 2014. Shutter speed functionality, white balance, ISO settings and more will be included! Check out the official announcement here: https://developer.sony.com/2014/02/24/new-cameras-now-support-camera-remote-api-beta-new-api-features-coming-this-spring-to-selected-cameras/
Thanks a lot for your valuable feedback. It's great to hear that the APIs are being used, and we look forward to seeing some nice implementations!
Peter
The Glass Mirror Timeline API document suggests that video can be streamed to a Glass headset (https://developers.google.com/glass/timeline). I'm trying to determine whether this could theoretically work over a WebRTC connection. Documentation on the browser/rendering capabilities of the timeline is limited, so has anyone tried something similar?
Ran across a discussion about WebRTC + Glass in a reported issue - https://code.google.com/p/webrtc/issues/detail?id=2083
From the sound of things, someone got it to work in Chrome Beta at 176×144. There were some audio/frame-rate issues that appear to have been resolved. Note, though, that they talk about streaming video/audio from the Glass to another machine, not streaming video into the Glass display. I believe that at this point it will only work in Chrome Beta, and I doubt you could integrate this into the timeline smoothly, though with how hard Google is pushing WebRTC I would not be surprised to see increased support. I'll be testing this out with my own WebRTC demos soon and will know more.
WebRTC on Google Glass has been reported: http://www.ericsson.com/research-blog/context-aware-communication/field-service-support-google-glass-webrtc/. There were some limitations, e.g. the device overheated after 10 minutes.
Another case - http://tech.pristine.io/pristine-presents-at-the-austin-gdg (thanks to Arin Sime)
New to Mac OS X, familiar with Windows. Windows has DirectShow, a good number of built-in filters, COM programming, and GraphEdit for very fast prototyping and snooping on the graphs you've constructed in code.
I'm now about to move to the Mac to work with cameras, webcams, microphones, color spaces, files, splitting, synchronization, rendering, file reading, file saving, and many of the things I've come to take for granted with DirectShow when putting together applications for live performance. On the Mac side, so far I've found... nothing! Either I don't know where to look, or I'm having the toughest time reconciling the Mac's reputation for ease of handling media with a coherent programmatic ability to get in there and start messin' with media-manipulatin' building blocks.
I've seen some weak suggestions to use gstreamer or some library for QT but I can't bring myself to believe that this is the Apple way to go. And I've come across some QuickTime documentation but I'm not looking to do transitions, sprites, broadcasting, ...
Having a brain trained on DirectShow means I don't even know how Apple thinks about providing DirectShow-like functionality. That means I don't know the right keywords and don't even know where to look. Books? Bought a few. Now I might be able to write some code that can edit your sister's wedding video (if I can't make decent headway on this topic I may next be asking what that'd be worth to you), but for identifying what filters are available and how to string them together ... nothing. Suggestions?
Video handling is going through a huge transition on the Mac at the moment. QuickTime is very old, but also big and powerful, so it's been undergoing an incremental replacement process for the past 5 years or so.
That said, QTKit is the QuickTime subset (capture, playback, format conversion and basic video editing) which is supported going forward. The legacy QuickTime APIs are still there for the moment, and probably will remain at least until its major features are available elsewhere, but are 32-bit only. For some involved video stuff you may end up needing to use it in places.
At the moment, iOS is ahead of the Mac because it could start from scratch with AV Foundation. The future of the Mac media frameworks will probably either be AV Foundation directly (with QTKit being a lightweight shim over the top) or an extension of QTKit that looks very similar.
For audio there's Core Audio which is on Mac and iOS and isn't going away any time soon. It's quite powerful but somewhat obtuse in places. Luckily online support is very good; the mailing list is an essential resource.
For filters and frame-level processing you've got Core Video as someone else mentioned, as well as Core Image. For motion graphics there's Quartz Composer, which includes a graphical editor and a plugin architecture to add your own patches. For programmatic procedural animation and easily mixing rendering models (OpenGL, Quartz, video, etc.) there's Core Animation.
In addition to all of these, of course there's no reason you can't use open source libraries where the built-in stuff doesn't do what you want.
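To give a tiny taste of the frame-level side, here's Core Image applying a single built-in filter to one image in Swift; the file path is a placeholder, and the same CIImage pipeline applies to video frames via Core Video buffers.

```swift
import CoreImage

// Sketch: apply a sepia filter to one frame with Core Image.
let context = CIContext()
let input = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/frame.png"))!
let filter = CIFilter(name: "CISepiaTone")!        // built-in filter, looked up by name
filter.setValue(input, forKey: kCIInputImageKey)
filter.setValue(0.8, forKey: kCIInputIntensityKey)
let output = filter.outputImage!                   // lazily evaluated result
let rendered = context.createCGImage(output, from: output.extent)
```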
To address your comment below:
In QuickTime (and QTKit), individual data types like audio and video are represented as tracks. It may not be immediately clear that QuickTime can open audio as well as video file formats. A common way to combine audio and video would be (see the sketch after these steps):
Create a QTMovie with your video file.
Create a QTMovie with your audio file.
Take the QTTrack object representing the audio and add it to the QTMovie with the video in it.
Flatten the movie, so it doesn't simply contain a reference to the other movie but actually contains the audio data.
Write the movie to disk.
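Those QTKit calls are Objective-C only; as a rough modern equivalent (and since AV Foundation is where things are heading, as noted above), here's the same flow sketched in Swift with AVMutableComposition. The file paths are placeholders.

```swift
import AVFoundation

// Sketch of the same flow with AV Foundation: pull one video track and
// one audio track into a composition, then export ("flatten") to disk.
let videoAsset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/video.mov"))
let audioAsset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/audio.m4a"))
let composition = AVMutableComposition()

do {
    if let srcVideo = videoAsset.tracks(withMediaType: .video).first,
       let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dstVideo.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                                     of: srcVideo, at: .zero)
    }
    if let srcAudio = audioAsset.tracks(withMediaType: .audio).first,
       let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dstAudio.insertTimeRange(CMTimeRange(start: .zero, duration: audioAsset.duration),
                                     of: srcAudio, at: .zero)
    }
} catch {
    print("Failed to build composition: \(error)")
}

// Exporting writes real sample data, so the result is self-contained.
let exporter = AVAssetExportSession(asset: composition,
                                    presetName: AVAssetExportPresetPassthrough)!
exporter.outputURL = URL(fileURLWithPath: "/path/to/combined.mov")
exporter.outputFileType = .mov
exporter.exportAsynchronously {
    print("Export finished: \(exporter.status == .completed)")
}
```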
Here's an example from Blender. You'll see how the A/V muxing is done in the end_qt function. There's also some use of Core Audio in there (AudioConverter*). (There's some classic QuickTime export code in quicktime_export.c but it doesn't seem to do audio.)
I may just be confused about how this all works... in which case, please explain it to me. But what video codecs are supported by the UPnP AV standard? Or is it on a per-device (client) basis? I want to create an app to send video data to a UPnP device (Xbox, PS3, etc.) but am not really sure what video codec I should target... it can be anything, I just want to know if there's a way of knowing that it will work on everything.
Edit: OK, so to clarify: I can choose whatever video format I want, and once I do, that will be all I plan on supporting, so I don't need transcoding. My main point was that I was hoping there was some "standard" format that would be supported on ALL devices so that I could just pick that and be done with it... Obviously this is not the case... but is there any sort of unofficial codec that most devices support? Is there a list of devices and supported codecs anywhere?
Also, how does DLNA fit into this? If I understand correctly it's sort of a subset of UPnP AV (plus some other stuff...), and most UPnP devices I've seen are also DLNA-compliant... so would just using whatever codecs DLNA supports be a way to have common ground?
Doug is right, it depends on the client device.
You could build it so that your server transcodes files on the fly to make them available on the UPnP server, though. It would be easier just to choose a file format that is compatible with all of your devices (if the list is small enough for that to be possible).
For example, you cannot play H.264-encoded video in an MKV container on the Xbox 360 or PS3 right now. However, you can transcode the files to a format that IS supported. There are many guides available online covering how to transcode these files for the PS3/360 and which formats each device supports.
Here are a couple example guides:
Xbox 360 Conversion Guide
PS3 Conversion Guide
From what I understand it really depends on what codecs the device has installed, so it can be anything supported by your device.
I know, for example, that the PS3 supports DivX and the Xbox does not (unless you have Windows 7 to transcode for you).
For DLNA devices, some video formats are mandatory and some are optional.
Home Devices
MUST: MPEG2
Optional: MPEG1, MPEG4, WMV9
Mobile/Handheld Devices
MUST: MPEG4 AVC (AAC LC associated audio)
Optional: VC1, H.263, MPEG4 Part 2, MPEG2, MPEG4 AVC (BSAC or other associated audio)
Any other video codec not mentioned here is optional, in my understanding.
Check it here: