protect flv video file - flex3

I have an FLV file and I want it to play only from my AIR application.
I need to protect the FLV file so that it can't be opened or played by any other existing FLV player. If someone double-clicks that FLV file, nothing should happen; it should play only in my own player.
How can I solve this?

You can't really solve this.
Anyone can simply record the screen and audio, circumventing any of your tools.
You could host the file on a streaming service which will stream the file via RTMP, which is harder to crack, but a) it is expensive to host and b) point 1 still applies.
You could probably save the file with the extension .zip; it's still an FLV file, and you can just tell AIR to load this FLV by changing the path name. This is security through obscurity, and point 1 still applies. Someone will probably work out that it is an FLV or video file because of the very large file size.
If you make lots of money you can write your own DRM video player for all platforms. But maybe focus on making money from your idea first rather than losing a lot of time, because point 1 always applies.

Apart from hosting the file online, I am not aware of any built-in way of protecting an FLV file. I guess there are some commercial solutions, or you could try to encrypt it yourself, but I think that would over-complicate your application.
The best way I can think of, even if it's a bit "dirty", would be to embed the file in your AIR application.
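For what it's worth, here is a minimal sketch of the "encrypt it yourself" idea (TypeScript/Node for illustration only; an AIR app would do the same in ActionScript). The key and file names are placeholders, and this is obscurity rather than real DRM, so point 1 from the first answer still applies:

```typescript
import { readFileSync, writeFileSync } from "fs";

// XOR the bytes with a repeating key. Apply once before shipping the file,
// and apply again in memory at runtime to recover the original FLV bytes.
function xorBytes(data: Uint8Array, key: Uint8Array): Uint8Array {
  const out = new Uint8Array(data.length);
  for (let i = 0; i < data.length; i++) {
    out[i] = data[i] ^ key[i % key.length];
  }
  return out;
}

const key = new TextEncoder().encode("not-a-real-secret"); // placeholder key
const original = new Uint8Array(readFileSync("movie.flv")); // placeholder file name
writeFileSync("movie.dat", xorBytes(original, key)); // ship movie.dat with the app
```

Since the obfuscated file no longer starts with the FLV signature, double-clicking it does nothing in a stock player, but anyone who inspects your application can reverse the transform.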

Related

How can I play an m3u8 video live instead of playing old segments?

I have an m3u8 video: if I just start with the first segment, playback is behind the live version of the stream. But some players play it live just fine, so my thinking is that there must be some way of knowing how to play the stream live, but I don't see anything in the manifest telling me this.
So what should I be looking for?
Thank you.

Download chunks of MP4 file from iOS

I am developing an iOS app that synchronises with GoPro cameras.
One of the features requires downloading MP4 files from the GoPro (potentially huge).
I basically have a url like: http://10.9.9.5/whatever/video.mp4.
However, I only need parts of the video, let's say between 1:00 and 1:05.
I am thinking of downloading just parts of the MP4 using the HTTP "Range" header. I believe that this is possible and I will get a bunch of bytes.
However, is that a valid file? Will I be able to create an MP4? Do I need the MP4 header with the meta information? Have any of you faced this kind of challenge?
I am using Objective C but I believe that this is a general question.
The MP4 file is a container for video structured around units called boxes. You'll probably have H.264 video in that MP4 file; knowing that, you'll need to understand the structure of the file you are trying to chunk.
Depending on how it was encoded, you'll have to look for the box with metadata, either at the beginning or at the end of the file, that lets you seek to the correct part, and then you'll have to reconstruct a valid MP4 from the data you get from the original file.
You can see a reference of the file format here http://xhelmboyx.tripod.com/formats/mp4-layout.txt.
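As a rough sketch of the HTTP Range part (TypeScript here for brevity; the Objective-C equivalent would use NSURLSession with a Range header), assuming the GoPro's web server honours Range requests. Note that the bytes you get back are just a fragment; they only become a playable MP4 once you also have the metadata ('moov') box mentioned above:

```typescript
// Fetch an arbitrary byte range of the remote MP4. The URL and offsets are
// placeholders; a server that supports Range answers with 206 Partial Content.
async function fetchRange(url: string, start: number, end: number): Promise<Uint8Array> {
  const response = await fetch(url, {
    headers: { Range: `bytes=${start}-${end}` },
  });
  if (response.status !== 206) {
    throw new Error(`Range not honoured (HTTP ${response.status})`);
  }
  return new Uint8Array(await response.arrayBuffer());
}

// Example: grab the first 64 KiB, where the 'moov' box often sits in files
// prepared for progressive playback; if it sits at the end, request the tail instead.
const head = await fetchRange("http://10.9.9.5/whatever/video.mp4", 0, 64 * 1024 - 1);
```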

SMIL adaptive streaming in Videojs

What is required to use a SMIL file for adaptive streaming in a videojs player? I have created the SMIL file in my Wowza application and it is creating my 4 separate streams and making them available. However, I cannot get my webpage, which uses videojs, to correctly play the SMIL file. Hints on that coding, or where to find the correct documentation, would be greatly appreciated.
There aren't many implementations of SMIL players. I'm sure I've seen Wowza URLs suggesting it will output the SMIL as other formats, something like whatever.smil/manifest.m3u8. That's HLS, which can be played natively on mobile and Safari, and with videojs-contrib-hls elsewhere.
I know the question is old, but I've been struggling with this recently, so I want to share my experience in case anyone is interested. My scenario is very similar: I want to deliver adaptive bitrate streaming from Wowza to clients using videojs.
There is a master link that explains how to set up and run Wowza Transcoder for live streaming, and how to set up your Adaptive Bitrate Streams using a SMIL file. Following the video there, you can get a stream that uses ABS, but the SMIL file is attached to the stream name, so it is not a solution if you have streams that come into Wowza from another media-server origin and need to be transcoded before being served to the clients. The article mentions a few key things (like Stream Name Groups), but somehow they don't seem entirely clear, at least to me. So here is some clarification of what I understood from all the articles I read and what I did to achieve ABS:
You can achieve ABS in Wowza either with SMIL files or with Stream Name Groups (NGRP). An NGRP refers to a block of streams defined in the Transcoder template that can be played back using multi-bitrate streaming dynamically (this is what I used). SMIL files are used to create a "static" list of streams for multi-bitrate VOD streaming. If you are using Wowza origin-edge delivery you'll need the .smil file, because NGRPs do not get forwarded to the edge. (Source for all this information: here).
In case you need the SMIL file, you will probably need to generate a new one for every stream, and you will probably want to do that in an automated way, so the best approach is an HTTP request (the link above explains how to achieve this).
In case you can live with NGRP, things are a bit easier:
You need to enable Wowza Transcoder (this is pretty easy and steps are in the video I mention above).
You should create your own Transcoder template with the different stream presets you want to deliver; as an example, you can check the default ones that are already there. The more presets you add, the more work Wowza has to do whenever a stream comes in, since it will need to generate a new stream for every preset you have defined.
Now we generate the NGRPs. In your Transcoder template you can generate as many NGRPs as you want (to clarify: these are like groups of streams that you will be able to set in the client's video player; each NGRP contains the streams the player will be able to use when doing adaptive bitrate streaming). For instance, the default template ships with NGRPs such as _all and _mobile.
If you play the NGRP "_mobile" in the client's video player, the ABS algorithm in the player will be able to adapt itself to play either the 240p or the 160p stream based on the client's capabilities.
So imagine you have these two NGRPs. In order to play them in videoJS, you will need to set the source to:
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8
or
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_mobile/playlist.m3u8
... based on how many options you want to provide to the client player to use for the ABS. (For instance: if your targets are old mobile devices, you probably just want to offer a couple of low bitrate streams).
(This is the case when you're delivering an HLS stream. For other formats the extension changes; for instance, for a DASH stream you would have "/manifest.mpd" instead of "/playlist.m3u8".)
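As an illustrative sketch, the videojs side might look like the following (the element id, host, application and stream names are placeholders; on older video.js versions you would also load videojs-contrib-hls to get HLS playback outside Safari):

```typescript
import videojs from "video.js";

// Point video.js at the HLS playlist that Wowza exposes for the NGRP.
const player = videojs("my-player");

player.src({
  src: "http://wowza.example.com:1935/myApplication/ngrp:myStream_all/playlist.m3u8",
  type: "application/x-mpegURL", // HLS; a DASH manifest would use "application/dash+xml"
});

player.play();
```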
That is all. There is also a very helpful link in the video.js documentation explaining how it does the bitrate switching: here.
I hope it helps someone! At least clarifying things! :)

Video format that can be played while recording?

I want to make a kind of server-client software that supports live webcam video. My approach is to record the webcam to an AVI on hard disk while transferring each fresh byte of that AVI to the other peer, who plays it up to the most recent byte transferred. But of course, the AVI fails to play while the file is still being recorded.
My question is: is there any video/audio format out there that suits my need, i.e. is playable while being recorded?
I've experimented with ffmpeg for a while, and I wonder if "ffplay", which is always included with ffmpeg, can do the trick? (Very little documentation and very few examples about ffplay can be found.)
PS: I'm using the MS built-in function MCISendString to play video; any 3rd-party component would be welcome, as well as any suggestion!
FLV and MPEG-2 transport streams both meet your requirements.
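As a rough sketch of what that can look like in practice (TypeScript/Node spawning ffmpeg; the DirectShow device name and output path are placeholders, and ffmpeg must be on the PATH): record the webcam into an MPEG-2 transport stream, which remains playable while it is still growing:

```typescript
import { spawn } from "child_process";

// Capture a Windows webcam via DirectShow and write an MPEG-2 transport stream.
// The .ts file can be opened by another player while ffmpeg is still appending to it.
const recorder = spawn("ffmpeg", [
  "-f", "dshow",
  "-i", "video=USB Camera", // placeholder device name
  "-c:v", "libx264",
  "-f", "mpegts",
  "capture.ts",
]);

recorder.stderr.on("data", (chunk) => process.stderr.write(chunk));

// While recording, `ffplay capture.ts` will play the file as it grows.
```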

How can I limit HTML5 video bandwidth usage in Video.js?

I'm trying to roll out a video to a client with limited bandwidth. The client is concerned that the video will eat up all the bandwidth at their field office. In testing, I've discovered that even though my video is encoded at 420 kbps, the client still uses about 1.5 Mbps while downloading it. Is there a way to control the maximum bandwidth used by video.js or the video tag?
Unfortunately no. The HTML5 video element doesn't have any throttling options. It's completely up to the browser to decide how to fetch the video data. Some will download the whole thing at once; others will download pieces as you need them. All of them, I think, will use as much of the pipe as you give them.
The Media Source Extensions proposal hopes to add some ability here, but that won't be available for a while.
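For reference, the idea behind Media Source Extensions is that the page, not the browser, fetches the media data, so the page can pace its own downloads. A minimal sketch of that pacing (the codec string and the init/segment file names are placeholders, and the video must be packaged as fragmented MP4):

```typescript
const video = document.querySelector("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const buffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');

  // Fetch a piece and wait for the SourceBuffer to finish appending it.
  const append = async (url: string) => {
    const data = await (await fetch(url)).arrayBuffer();
    buffer.appendBuffer(data);
    await new Promise((r) => buffer.addEventListener("updateend", r, { once: true }));
  };

  await append("init.mp4"); // initialization segment (placeholder name)
  for (const segment of ["seg1.m4s", "seg2.m4s", "seg3.m4s"]) {
    // Throttle: don't fetch the next piece until the forward buffer drops below ~30s.
    while (video.buffered.length && video.buffered.end(0) - video.currentTime > 30) {
      await new Promise((r) => setTimeout(r, 1000));
    }
    await append(segment);
  }
  mediaSource.endOfStream();
});
```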
I would find somewhere else besides the office to host the video, like Amazon S3.