Can DLNA renderer play web URI? - upnp

I'm experimenting with DLNA, and I'm able to play local-network URIs on a DLNA renderer. I have a sample app with a GUI that shows the available actions for a renderer, lets me invoke them, and displays the state variables.
When I feed the renderer a local-network MP3 URI, it plays fine. The URI looks like:
http://192.168.0.132:9000/disk/DLNA-PNMP3-OP01-FLAGS01700000/O0$1$8I528.mp3
But when I feed it with a web link like this one:
http://somehost.com/p28/e8dfsf1bae84.mp3
it doesn't play it (the reported behaviour is implementation dependent: the state can be PLAYING on one renderer and STOPPED on another).
So I wonder: is there a restriction somewhere against playing non-local URIs, or some other problem that prevents me from using it like this?
Thank you

It is RECOMMENDED that fully qualified URLs to resources on the device are not embedded in HTML presentation pages, but that
relative URLs are used instead, so that the host portion of the embedded URLs does not need to be modified to match the
address on which the GET was received.

DLNA is local-network only, so that is not going to work for somehost.com.
https://superuser.com/questions/700720/dlna-server-outside-lan
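For reference, the URI reaches the renderer through the AVTransport service's SetAVTransportURI action, and whatever URI you pass is fetched by the renderer itself, not by the control point. A minimal sketch of the SOAP body that gets POSTed to the renderer's control URL (instance ID 0 and empty metadata are typical defaults, not values from the question):

```typescript
// Build the SOAP envelope for AVTransport's SetAVTransportURI action.
// The renderer fetches CurrentURI itself; if it has no route to the
// internet or rejects non-LAN hosts, playback fails even when the
// action itself returns success.
function buildSetAVTransportURI(uri: string, instanceId = 0): string {
  return `<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>${instanceId}</InstanceID>
      <CurrentURI>${uri}</CurrentURI>
      <CurrentURIMetaData></CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>`;
}

const body = buildSetAVTransportURI("http://somehost.com/p28/e8dfsf1bae84.mp3");
```

This is consistent with the implementation-dependent behaviour observed: a renderer may happily accept the action (state PLAYING) and only then discover it cannot reach the host.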

Related

Is this legacy cloudfront configuration the same as my non-legacy configuration?

At 11:50 in this video the presenter explains how to configure a CloudFront behaviour to whitelist the CloudFront-Viewer-Country header. While I have followed his instructions as closely as I can, this is the only part of the video where I cannot be sure I've matched his setup, because he is using AWS's old console and I'm using the new design, which does not use the same terminology. His example clearly works and mine does not, which leads me to suspect this is the breaking point.
Old console with the CloudFront-Viewer-Country header whitelisted:
My console with what I hope is the same configuration:
How can I know if these show the same setup of behaviour?
When I say that mine does not work: I have 4 different S3 buckets containing different images that are all named the same. These are served via CloudFront with a Lambda@Edge function, derived from the example JS function, that directs requests to the appropriate bucket. What I expect is that viewing the image from the European region (via a VPN change) shows one image, and viewing from the US region shows a different one. It does not.
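The whitelist matters for exactly this kind of routing. A stripped-down sketch of the origin-request logic (bucket domains and country list are placeholders, not the asker's configuration): if the behaviour does not forward CloudFront-Viewer-Country, the header is simply absent when the function runs, so every viewer falls through to the default bucket, which would produce the observed symptom.

```typescript
// CloudFront event headers: lowercase keys mapping to arrays of
// { key, value } pairs.
type CfHeaders = Record<string, { key: string; value: string }[]>;

// Hypothetical region-to-bucket map; substitute your own bucket domains.
const bucketByRegion: Record<string, string> = {
  EU: "images-eu.s3.amazonaws.com",
  US: "images-us.s3.amazonaws.com",
};

const europeanCountries = new Set(["DE", "FR", "ES", "IT", "NL"]); // abridged

// Read CloudFront-Viewer-Country and pick an origin domain. If the
// header was not whitelisted in the cache behaviour, CloudFront never
// adds it, and we fall back to the US bucket for everyone.
function pickBucket(headers: CfHeaders): string {
  const h = headers["cloudfront-viewer-country"];
  const country = h && h.length > 0 ? h[0].value : undefined;
  if (country && europeanCountries.has(country)) return bucketByRegion.EU;
  return bucketByRegion.US;
}
```

A quick way to test the distribution side independently of the function: if responses from both regions are byte-identical, the header is almost certainly not being forwarded.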

SMIL adaptive streaming in Videojs

What is required to use a SMIL file for adaptive streaming in a videojs player? I have created the SMIL file in my Wowza application, and it is creating my 4 separate streams and making them available. However, I cannot get my webpage, which uses videojs, to play the SMIL file correctly. Hints on the coding, or pointers to the correct documentation, would be greatly appreciated.
There aren't many implementations of SMIL players. I'm sure I've seen Wowza URLs that suggest it will output the SMIL as other formats, something like whatever.smil/manifest.m3u8. That's HLS, which can be played natively on mobile and Safari, and with videojs-contrib-hls elsewhere.
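If the Wowza application does expose the SMIL as HLS, the client side reduces to pointing video.js at that rendition. A sketch of building the source descriptor (server, application, and file names are placeholders; the smil:...playlist.m3u8 URL shape is the form Wowza documents for VOD, but verify it against your setup):

```typescript
// video.js plays HLS natively on Safari/mobile and via
// videojs-contrib-hls elsewhere; either way the source is just the
// HLS rendition of the SMIL. All names below are placeholders.
function smilHlsSource(server: string, app: string, smilFile: string) {
  return {
    src: `http://${server}:1935/${app}/smil:${smilFile}/playlist.m3u8`,
    type: "application/x-mpegURL",
  };
}

const source = smilHlsSource("wowza.example.com", "vod", "myVideo.smil");
// In the page: videojs("player").src(source);
```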
I know the question is old, but I've been struggling with this recently, so I want to share my experience in case anyone is interested. My scenario is very similar: I want to deliver adaptive bitrate streaming from Wowza to clients using videojs.
There is a master link that explains how to set up and run the Wowza Transcoder for live streaming, and how to set up your adaptive bitrate streams using a SMIL file. Following the video there, you can get a stream that uses ABS, but the SMIL file is attached to the stream name, so it is not a solution if you have streams that come to Wowza from another media-server origin and need to be transcoded before being served to the clients. The article mentions a few key things (like Stream Name Groups), but somehow things don't seem entirely clear, at least to me. So here is some clarification of what I understood from all the articles I read, and what I did to achieve ABS:
You can achieve ABS in Wowza either with SMIL files or with Stream Name Groups (NGRP). An NGRP refers to a block of streams, defined in the Transcoder template, that can be played back using multi-bitrate streaming dynamically (this is what I used). SMIL files are used to create a "static" list of streams for multi-bitrate VOD streaming. If you are using Wowza origin-edge delivery, you'll need the .smil file, because NGRPs do not get forwarded to the edge. (Source for all this information: here.)
In case you need the SMIL file, you probably need to generate a new one for every stream, and you probably want to do that in an automated way, so the best approach is an HTTP request (the link above explains how to achieve this).
In case you can live with NGRP, things are a bit easier:
You need to enable Wowza Transcoder (this is pretty easy and steps are in the video I mention above).
You should create your own Transcoder Template with the different stream presets you want to deliver; as an example, you can check the default ones that are already there. The more presets you add, the more work Wowza has to do whenever a stream comes in, since it will generate a new stream for every preset you have defined.
Now we generate the NGRPs. In your Transcoder Template, you can define as many NGRPs as you want. (To clarify: these are like groups of streams that you set in your client's video player; each NGRP contains the streams the player can switch between when doing adaptive bitrate streaming.) For instance, these are the default NGRPs:
If you play the "_mobile" NGRP in the client's video player, the ABS algorithm in the player will be able to adapt itself to play either the 240p or the 160p stream, based on the client's capabilities.
So imagine you have these two NGRPs. In order to play them in videojs, you will need to set the source to:
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8
or
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_mobile/playlist.m3u8
... based on how many options you want the client player to have available for ABS. (For instance: if your targets are old mobile devices, you probably just want to offer a couple of low-bitrate streams.)
(This is the case if you're delivering an HLS stream. For other formats the extension changes; for instance, a DASH stream would use "/manifest.mpd" instead of "/playlist.m3u8".)
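The URL construction above can be sketched as a small helper; host, application, and NGRP names are placeholders following the examples in this answer:

```typescript
// Build the playback URL for a Stream Name Group. Only the trailing
// manifest name changes with the delivery format, as noted above.
function ngrpUrl(
  host: string,
  app: string,
  ngrp: string,
  format: "hls" | "dash",
): string {
  const suffix = format === "hls" ? "playlist.m3u8" : "manifest.mpd";
  return `http://${host}:1935/${app}/ngrp:${ngrp}/${suffix}`;
}
```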
That is all. There is also a very helpful link in the video.js documentation explaining how it does the bitrate switching: here.
I hope it helps someone! At least clarifying things! :)

Create a custom desktop YouTube player

I want to create an application capable of playing the audio of YouTube videos and saving the downloaded content in a local cache, so that when the user decides to resume or replay a video, it doesn't have to download the already-fetched part again, only the remaining part. (The user can then decide what to do with the cache and how to organize it.)
This is especially convenient on mobile (my main focus), but I'd like to create a desktop version too, for experimental purposes.
So, my question is: does YouTube provide any API for this? In order to cache the downloaded content, my application needs to download the content itself, not delegate to an embedded player (remember, it is a native application). I have a third-party application on my Android system that plays YouTube videos, so I think it's possible, unless the developers used some sort of hack; again, this is what I don't know.
Don't confuse this with the web gdata info API and the embed API; that is not what I want. What I want is to handle the video transfer myself.
As far as I know, there is no official API for that. However, you could use libquvi to look up the URLs of the real video data, or you could have a look at how they do it and reimplement it yourself (see here).
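Whatever mechanism produces the media URL, the resume part of the question is plain HTTP: send a Range header starting at the byte count already cached, and append the 206 Partial Content response body to the cache file. A sketch of the header logic (the byte-count-based cache layout is an assumption):

```typescript
// Given how many bytes of the file are already in the local cache and
// the file's total size, produce the Range header for the follow-up
// request, or null when the file is complete and no request is needed.
function resumeRange(cachedBytes: number, totalBytes: number): string | null {
  if (cachedBytes >= totalBytes) return null;      // nothing left to fetch
  return `bytes=${cachedBytes}-${totalBytes - 1}`; // inclusive byte range
}
```

This only works when the server advertises Accept-Ranges: bytes and answers with 206 Partial Content; a server that ignores the header returns 200 with the full body, in which case the client should discard the cache prefix rather than append.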

Hidden video URL on video.js

I want to hide the URLs of my videos hosted on Amazon S3 to prevent people from downloading them.
I saw another strategy (Amazon Bucket Policies), but I think it's too complex for this case.
Is it possible to hide them? What do you suggest for this problem?
Instead of the video tag at the beginning, you could have a data tag with just an id. You could then reference this id, compare it in the JavaScript, and inject the appropriate video URL. As previously stated, there's no true way to protect it. With my method, people can still use Fiddler to see where it's being referenced. They can also use the browser's dev tools.
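A minimal sketch of that id-to-URL indirection (the id, bucket, and file names are made up for illustration); note that it only obscures the URL, since the injected src is still visible in the network panel once playback starts:

```typescript
// The server-rendered markup carries only an opaque id, e.g.
//   <div id="player" data-video-id="clip-42"></div>
// and the script maps that id to the real S3 URL at runtime.
const videoById: Record<string, string> = {
  "clip-42": "https://my-bucket.s3.amazonaws.com/clip-42.mp4", // placeholder
};

function resolveVideoUrl(dataVideoId: string): string | undefined {
  return videoById[dataVideoId];
}

// In the page, something like:
//   const el = document.getElementById("player")!;
//   const url = resolveVideoUrl(el.dataset.videoId!);
//   if (url) videoElement.src = url;
```

For actual access control, signed, expiring URLs (CloudFront or S3 pre-signed URLs) are the usual mechanism, at the cost of the bucket-policy complexity the question wants to avoid.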

Upload and share image on Google+

I can't seem to find any documentation or reference on upload and sharing images on Google+.
Is this action currently supported in Google+?
Their moment sharing seems to accept a thumbnail URL, but I don't want to keep the image hosted on my site once it has been created and shared by the visitor.
You have a few different options, but I'm not sure any of them are really what you're looking for.
Google+ doesn't really allow outside apps to upload and share something automatically.
As you've observed, the closest you can get is generating a Moment for them to share. And while there are similarities to Instant Upload, it isn't identical. You could probably use a data url to encode and store the image as part of the moment, but I haven't tested this.
Another alternative is to use the Google Drive API to store the image in their Drive space, permit the image to be read publicly, get a link for it, and use this link as the thumbnail URL. Similarly, you might be able to use the Picasa Web Albums Data API to store the image. Both have good, but different, integration with Google+. The former is more modern, while the latter has more features that are tailored for images.