I have a client who wants to display live 24-hour footage on their website, to show off the progress of a number of big refurbishment jobs they are carrying out.
I've looked at IP cameras and, to be honest, this looks like the most logical technology to use, but I'm not sure if I'm missing something. Would it be possible to put the live feed straight from the camera onto their website (via an iframe, maybe)?
The website will be getting quite a few hits but nothing massive, so I think a regular broadband connection at each site should give them enough upload capacity to pull this off.
Am I approaching this from the right direction? Is there anything I should take into consideration before recommending a solution to the client?
Yes, that will work fine. Most IP cameras will provide a motion JPEG stream which is really just a collection of JPEG images.
This allows you to decide what frame rate you want to send up to the site.
Providing an iframe with some JavaScript to refresh the JPEG at your desired frame rate would be an easy solution.
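For illustration, here's a minimal sketch of that refresh script in TypeScript, assuming the camera exposes a plain JPEG snapshot URL (the URL, element id and frame rate below are placeholders):

```typescript
// Refresh a JPEG snapshot from an IP camera at a fixed frame rate.
// SNAPSHOT_URL and the element id are placeholders for this sketch.
const SNAPSHOT_URL = "https://camera.example.com/snapshot.jpg"; // hypothetical camera endpoint
const FPS = 1; // one frame per second keeps the site's upload link comfortable

const img = document.getElementById("live-view") as HTMLImageElement;

function refreshFrame(): void {
  // Cache-busting query string so the browser fetches a fresh frame each time
  img.src = `${SNAPSHOT_URL}?t=${Date.now()}`;
}

setInterval(refreshFrame, 1000 / FPS);
```

The page running this script is what you'd embed in the iframe; keeping the frame rate low (around 1 fps) also keeps each site's upload bandwidth requirement modest.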
We're currently in the process of recording 1 fps time-lapse 4K 360° photos of every island in the Bahamas, with embedded GPS EXIF data. An average hour of filming tends to produce around 600 image frames, which can easily grow to 2,000-10,000 images per day on bigger routes. 2,000 or so are approved on Google Maps already, but we're hitting a larger brick wall.
The Street View app is obviously the best way to upload when you have 50-100 image files, but it struggles once a batch exceeds 500 or so uploads (publishing doesn't start, or the app crashes), so we're left manually submitting collections. Add the standard 4,000/day quota to that, and it's quite a challenge.
Having looked at the Publish API, it's rather tricky to leave a CLI tool running, as it's designed around an OAuth flow with 1-hour access tokens. The service account route seems to be the way to go, but the PHP API client seems to have scant documentation for Street View publishing. Connecting photos is also tricky with that many images.
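For context, here's roughly the Publish API flow we're wrestling with, sketched with a long-lived OAuth refresh token so a CLI can keep running past the 1-hour access-token limit (credentials and the function name are placeholders, and quota back-off, retries and photo connections are omitted):

```typescript
// Rough sketch of the Street View Publish API flow for one photo, using a
// long-lived OAuth refresh token so a CLI can keep running past the 1-hour
// access-token lifetime. Credentials are placeholders.
import { readFile } from "node:fs/promises";
import { OAuth2Client } from "google-auth-library";

const API = "https://streetviewpublish.googleapis.com/v1";

async function publishPhoto(path: string): Promise<void> {
  const oauth = new OAuth2Client(process.env.SV_CLIENT_ID, process.env.SV_CLIENT_SECRET);
  oauth.setCredentials({ refresh_token: process.env.SV_REFRESH_TOKEN }); // obtained once via a normal consent flow
  const { token } = await oauth.getAccessToken(); // refreshed automatically when expired
  const auth = { Authorization: `Bearer ${token}` };

  // 1. Ask the API for an upload URL.
  const startRes = await fetch(`${API}/photo:startUpload`, { method: "POST", headers: auth });
  const { uploadUrl } = await startRes.json();

  // 2. Upload the raw JPEG bytes (the GPS EXIF data in the file supplies the pose).
  await fetch(uploadUrl, {
    method: "POST",
    headers: { ...auth, "Content-Type": "image/jpeg" },
    body: await readFile(path),
  });

  // 3. Create the photo from the upload reference.
  await fetch(`${API}/photo`, {
    method: "POST",
    headers: { ...auth, "Content-Type": "application/json" },
    body: JSON.stringify({ uploadReference: { uploadUrl } }),
  });
}
```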
We ideally need a desktop uploader (like the backup tool), or a way to import directly from folders in Google Drive. The first seems to be discontinued, and there's no information on the second.
Can anyone explain the best practice for this kind of volume upload with the Street View publishing service?
Since you are only capturing 4K images, it will be much simpler to use video mode, depending on whether your camera supports it at the desired frame rate. You may check this documentation for more information.
Additional information:
You may also check out some of the desktop utilities at the bottom of the Street View website which may be able to help.
You may want to consider one of the Street View ready cameras (also listed on the above webpage) that is capable of recording and uploading 360 videos to Street View. Upon publication, the 360 photo frames are extracted from the video and used to create an automatically connected Street View experience on Google Maps.
Check these pages to learn more about this option:
Set up and connect 360 cameras
Capture and publish in Video mode with the Street View app
Tips for capturing 360 videos for Street View
I'm 100% sure that this isn't the right forum to post a question like this, but I hope someone on here has the answer to my question.
Is there an advertising platform that sends a true or false response to the server, depending on whether the user finishes the ad video, so I can add points to the user's account?
(I don't need help adding points, or creating a point system, just an advertising platform that would be easy to integrate with the videos on my site)
I understand if this isn't the right forum for this; if so, could someone point me in the right direction? Before you say anything: yes, I have Googled, I just can't find anything that suits my needs.
You need a video player that supports the VAST protocol:
VAST provides a common protocol that enables ad servers to use a single ad response format across multiple publishers/video players
This format lets you specify so-called tracking elements: URIs that are called once a particular event occurs. You can manually provide a VAST XML response with a tracking element for the video-completion event.
Also consider using DFP to serve your video ads - it's free up to a certain traffic volume and tracks everything you might need.
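If you do end up rolling your own, a hand-written VAST response is small. Below is a hedged sketch of a VAST 3.0 document with a "complete" tracking element, served from a minimal Node/TypeScript endpoint; all URLs are placeholders and the points-crediting logic is left as a stub:

```typescript
// Hand-rolled VAST 3.0 response with a "complete" tracking element, plus the
// endpoint that receives the completion ping. All URLs are placeholders.
import * as http from "node:http";

const vastXml = (userId: string): string => `<?xml version="1.0" encoding="UTF-8"?>
<VAST version="3.0">
  <Ad id="rewarded-1">
    <InLine>
      <AdSystem>MySite</AdSystem>
      <AdTitle>Rewarded video</AdTitle>
      <Impression><![CDATA[https://ads.example.com/track/impression]]></Impression>
      <Creatives>
        <Creative>
          <Linear>
            <Duration>00:00:30</Duration>
            <TrackingEvents>
              <!-- Fired by the player when playback reaches 100% -->
              <Tracking event="complete"><![CDATA[https://ads.example.com/track/complete?user=${userId}]]></Tracking>
            </TrackingEvents>
            <MediaFiles>
              <MediaFile delivery="progressive" type="video/mp4" width="640" height="360"><![CDATA[https://ads.example.com/creatives/ad.mp4]]></MediaFile>
            </MediaFiles>
          </Linear>
        </Creative>
      </Creatives>
    </InLine>
  </Ad>
</VAST>`;

http.createServer((req, res) => {
  const url = new URL(req.url ?? "/", "https://ads.example.com");
  if (url.pathname === "/vast") {
    res.writeHead(200, { "Content-Type": "application/xml" });
    res.end(vastXml(url.searchParams.get("user") ?? "anonymous"));
  } else if (url.pathname === "/track/complete") {
    // This ping is the "true" signal: credit points to the user here.
    res.writeHead(204).end();
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```

Any VAST-capable player loads /vast, plays the MediaFile, and requests the complete URI when playback finishes, which is the point at which you credit the user.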
For the welcome screen of my app, we are trying to serve up a webpage in a webview that consists of a video and some text. (We want to go this route so that we can quickly update the welcome screen and test changes on the fly, versus having to submit and get approval each time.)
The video is only 8.6 MB and is currently played via HTML5, hosted in S3 and served via CloudFront. However, the playback still tends to be a bit choppy at times. Does anyone have any recommendations for an optimal way to host and serve the video so it plays smoothly? Are there any specific S3 or CloudFront settings anyone would recommend that could help?
Thanks in advance for any help anyone can provide.
The most common technique currently is to use ABR in parallel with a CDN to provide smooth playback.
ABR (adaptive bit rate) involves making multiple copies of the video at different bit rates, from low to high, and hosting these on the server.
The client receives an index file for the videos, e.g. an m3u8 manifest file, and then chooses the best bit rate for the current conditions to allow smooth playback without buffering.
If network conditions improve the client will 'step up' bit rates and if it gets worse it will 'step down' bit rates.
Typically a low or medium bit rate is chosen as the first one to allow quick and smooth start up.
You can see this effect on services like Netflix as they start up, and you can also see it on YouTube if you right click the video and select 'Stats for Nerds'.
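On the client side, an ABR-capable player handles this stepping automatically. Here's a minimal sketch using hls.js, assuming you've already produced an HLS master playlist (the URL is a placeholder):

```typescript
// Client-side ABR playback with hls.js against an HLS master playlist that
// references several bit rates. The playlist URL is a placeholder.
import Hls from "hls.js";

const video = document.querySelector("video") as HTMLVideoElement;
const src = "https://cdn.example.com/video/master.m3u8"; // placeholder

if (Hls.isSupported()) {
  const hls = new Hls(); // starts conservatively, then adapts to measured bandwidth
  hls.loadSource(src);
  hls.attachMedia(video);
  hls.on(Hls.Events.LEVEL_SWITCHED, (_event, data) => {
    // Logged each time the player steps up or down a bit rate
    console.log("Now playing quality level", data.level);
  });
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari and iOS play HLS natively
  video.src = src;
}
```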
Some links for ABR with AWS Elastic Transcoder - you can set the bit rates you want; for example, see the note below from their FAQ regarding HLS jobs:
Specify that the transcoding job create a playlist that references the outputs. You should order your bit rates from lowest to highest, with the audio only stream last, since this order will be maintained in the generated playlist file. Once your transcoding job has completed, the output bucket will contain a proper arrangement of your master and individual M3U8 playlists, and MPEG-2 TS media stream fragments.
Take a look at the sample request on this page, which includes two different bit rates (video service providers will generally have more than two, but this gives you a feel for the approach):
http://docs.aws.amazon.com/elastictranscoder/latest/developerguide/create-job.html
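As a rough sketch of what such a job looks like with the AWS SDK for JavaScript (v2): the pipeline ID, object keys and preset IDs below are placeholders, so substitute the HLS system presets (or your own presets) from your account:

```typescript
// Elastic Transcoder HLS job with two video bit rates plus an audio-only
// stream, via the AWS SDK for JavaScript (v2). The pipeline ID, object keys
// and preset IDs are placeholders.
import * as AWS from "aws-sdk";

const transcoder = new AWS.ElasticTranscoder({ region: "us-east-1" });

const HLS_2M_PRESET = "PLACEHOLDER_HLS_2M_PRESET_ID";
const HLS_400K_PRESET = "PLACEHOLDER_HLS_400K_PRESET_ID";
const HLS_AUDIO_PRESET = "PLACEHOLDER_HLS_AUDIO_PRESET_ID";

async function createHlsJob(): Promise<void> {
  await transcoder
    .createJob({
      PipelineId: "PLACEHOLDER_PIPELINE_ID",
      Input: { Key: "welcome/source.mp4" },
      OutputKeyPrefix: "welcome/hls/",
      Outputs: [
        { Key: "video-400k", PresetId: HLS_400K_PRESET, SegmentDuration: "10" },
        { Key: "video-2m", PresetId: HLS_2M_PRESET, SegmentDuration: "10" },
        { Key: "audio", PresetId: HLS_AUDIO_PRESET, SegmentDuration: "10" },
      ],
      Playlists: [
        {
          Name: "master",
          Format: "HLSv3",
          // Lowest to highest bit rate, audio-only last, per the FAQ note above
          OutputKeys: ["video-400k", "video-2m", "audio"],
        },
      ],
    })
    .promise();
}
```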
Azure Media Services has a built-in "Adaptive Streaming" preset that is content-adaptive and adjusts the encoding settings to match your incoming content.
See the following - https://learn.microsoft.com/en-us/azure/media-services/media-services-autogen-bitrate-ladder-with-mes
I have developed a site that hosts user videos. I store the video files in AWS S3, deliver them through AWS CloudFront, and use video.js as the site's player, with HTML5 as the default and Flash as a fallback.
Generally the video streaming seems to work fine, but in some cases I receive complaints from users about slow or choppy video playback. I want to create some tests to measure streaming performance so I can distinguish between problems on the user's side (e.g. a slow connection) and problems with my service.
Are there any best practices or tools for collecting video delivery metrics? I'm interested in open-source solutions or something I can implement myself, since it's just a personal project, but I don't want to reinvent the wheel.
Testing progressive download means checking the transmission bandwidth and its continuity. For example, with a high transmission rate the initial client buffer fills faster and playback starts sooner; losing that transmission capacity later on, however, can cause re-buffering. The total transmission time of your file must be lower than the video duration.
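A quick way to check that rule of thumb from a browser is to time a full download of the file and compare it to the clip's duration. A small sketch (the URL and duration are placeholders):

```typescript
// Time a full download of the test clip and compare it to the clip's duration.
// The URL and duration are placeholders for whatever file you want to test.
const VIDEO_URL = "https://d1234example.cloudfront.net/videos/sample.mp4"; // placeholder
const VIDEO_DURATION_SECONDS = 120; // known duration of the test clip

async function measureDownload(): Promise<void> {
  const start = performance.now();
  const response = await fetch(VIDEO_URL, { cache: "no-store" });
  const bytes = (await response.arrayBuffer()).byteLength;
  const seconds = (performance.now() - start) / 1000;

  const mbps = (bytes * 8) / seconds / 1_000_000;
  console.log(`Downloaded ${bytes} bytes in ${seconds.toFixed(1)} s (${mbps.toFixed(2)} Mbit/s)`);
  console.log(
    seconds < VIDEO_DURATION_SECONDS
      ? "Transfer is faster than real time: playback should not stall."
      : "Transfer is slower than real time: expect re-buffering."
  );
}

measureDownload();
```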
To identify potential issues you can start with the S3 bucket logs and the CloudFront cache statistics and access logs.
There's a load-testing tool written in Java called Apache JMeter. It cannot execute JavaScript, so it must be configured to request the video files directly.
The disadvantage of using a load test tool in a single location is pretty evident. Different geographical areas and carriers have different characteristics and test results will be different.
There are also online, non-open-source tools that can load test from multiple locations, but they are generally paid, though some offer free trials.
Here's another way to look at this.
but in some cases I receive complaints from users for slow or choppy video playback.
If you're using an adaptive HLS stream, and you're using CloudFront, and the video is still choppy for some users, that's probably because of their own internet connection speeds.
In that case, you can encode your video in multiple resolutions (using just one AWS MediaConvert job, btw) - like 1080p, 720p, 360p, 240p, 144p etc.
And then Video.js has a stream-switcher plugin that will 1) automatically start playing the highest resolution that's right for the viewer's connection, and no higher, and 2) give the user a "Settings" (gear) icon in the control bar that they can use to switch resolutions manually.
That way, even those with really poor internet connections should be able to watch your video.
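As a hedged sketch of how that wiring might look: the option names follow the videojs-resolution-switcher examples, but treat them as assumptions and check the plugin's README for your version; URLs are placeholders.

```typescript
// Video.js with one MP4 rendition per resolution and a manual resolution
// switcher in the control bar. Plugin option names are assumptions based on
// videojs-resolution-switcher's published examples.
import videojs from "video.js";
import "videojs-resolution-switcher"; // registers the plugin with Video.js

const options: any = {
  controls: true,
  sources: [
    { src: "https://cdn.example.com/v/1080.mp4", type: "video/mp4", label: "1080p", res: 1080 },
    { src: "https://cdn.example.com/v/720.mp4", type: "video/mp4", label: "720p", res: 720 },
    { src: "https://cdn.example.com/v/360.mp4", type: "video/mp4", label: "360p", res: 360 },
  ],
  plugins: {
    videoJsResolutionSwitcher: {
      default: "high",    // start at the highest listed rendition
      dynamicLabel: true, // show the current resolution on the gear button
    },
  },
};

const player = videojs("my-video", options);
```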
Of course, the other alternative is to use progressive-download videos, where the viewer can click play, then immediately click pause, wait for the video to buffer, and then play it once it's fully downloaded.
Check out the Videojs Resolution Switcher demo here.
-- Ravi Jayagopal
Trying to roll out a video to a client with limited bandwidth. The client is concerned that the video will eat up all the bandwidth at their field office. In testing, I've discovered that even though my video is encoded at 420 kbps, the client still utilizes about 1.5 Mbps when downloading it. Is there a way to control the maximum bandwidth used by video.js or the video tag?
Unfortunately, no. The HTML5 video element doesn't have any throttling options. It's completely up to the browser to decide how to fetch the video data. Some will download the whole thing at once; others will download pieces as you need them. All, I think, will use as much of the pipe as you give them.
The Media Source Extensions proposal hopes to add some capability here, but that won't be available for a while.
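For what it's worth, MSE is now widely supported, and here is a rough sketch of the kind of control it gives you: fetching the file in byte ranges at your own pace instead of letting the browser grab it as fast as it can. It assumes a fragmented MP4, a server that honours Range requests, and placeholder URL, codec and rate values:

```typescript
// Fetch the video in byte ranges at a controlled pace and feed it to the
// <video> element through Media Source Extensions. Assumes a fragmented MP4;
// the URL, codec string, chunk size and target rate are placeholders.
const VIDEO_URL = "https://example.com/video-fragmented.mp4"; // placeholder
const MIME = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';    // must match the actual file
const CHUNK_BYTES = 256 * 1024; // 256 KB per request
const TARGET_KBPS = 500;        // rough cap on average throughput

const video = document.querySelector("video") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const buffer = mediaSource.addSourceBuffer(MIME);
  let offset = 0;

  while (true) {
    const res = await fetch(VIDEO_URL, {
      headers: { Range: `bytes=${offset}-${offset + CHUNK_BYTES - 1}` },
    });
    if (!res.ok) break; // e.g. 416 once we run past the end of the file
    const chunk = await res.arrayBuffer();

    // Append the chunk and wait for the source buffer to finish updating.
    await new Promise<void>((resolve) => {
      buffer.addEventListener("updateend", () => resolve(), { once: true });
      buffer.appendBuffer(chunk);
    });

    offset += chunk.byteLength;
    if (chunk.byteLength < CHUNK_BYTES) break; // last partial chunk

    // Sleep long enough that the average rate stays at or below TARGET_KBPS.
    const pauseSeconds = (chunk.byteLength * 8) / (TARGET_KBPS * 1000);
    await new Promise((resolve) => setTimeout(resolve, pauseSeconds * 1000));
  }
  mediaSource.endOfStream();
});
```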
I would find somewhere else besides the office to host the video, like Amazon S3.