The Pebble and Galaxy Gear devices have web browsers; what are some CSS media queries we can use to address screens of that size? Alternatively, let's start a list of UA strings for smart watches.
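As a starting point, here is a minimal sketch; the breakpoint values are assumptions taken from the published specs (Pebble: 144x168 px, original Galaxy Gear: 320x320 px), not something I've tested on the devices themselves:

```css
/* Sketch: breakpoints assumed from published specs --
   Pebble: 144x168 px, original Galaxy Gear: 320x320 px. */
@media only screen and (max-device-width: 168px) {
  /* Pebble-sized screens */
  body { font-size: 80%; }
}

@media only screen and (min-device-width: 169px) and (max-device-width: 320px) {
  /* Galaxy Gear-sized screens */
  body { font-size: 90%; }
}
```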
I have developed a site that hosts user videos. I store the video files in AWS S3, I deliver them through AWS CloudFront, and I use video.js as the site's player, with HTML5 as the default and Flash as a fallback.
Generally the video streaming seems to work fine, but in some cases I receive complaints from users about slow or choppy video playback. I want to create some tests to measure streaming performance, so I can distinguish problems on the user's side (e.g. a slow connection) from problems with my service.
Are there any best practices or tools to collect video delivery metrics? I'm interested in open source solutions or something I can implement myself, because it's just a personal project, but I don't want to reinvent the wheel.
Testing progressive download means checking the transmission bandwidth and its continuity. For example, with a high transmission rate the initial client buffer fills faster and playback starts sooner; losing that transmission capacity later on can cause re-buffering. The total transmission time of your file must be lower than the video's duration.
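That condition is easy to check programmatically. A minimal sketch (the function name and the numbers are purely illustrative):

```javascript
// Sketch: a file can stream without stalling only if it downloads
// faster than it plays back.
function canPlayWithoutStalling(fileSizeBytes, durationSeconds, bandwidthBytesPerSec) {
  const transmissionTime = fileSizeBytes / bandwidthBytesPerSec;
  return transmissionTime < durationSeconds; // must download faster than real time
}

// Example: a 100 MB, 10-minute video needs > ~167 kB/s on average.
console.log(canPlayWithoutStalling(100e6, 600, 200e3)); // true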
To identify potential issues you can start with the S3 bucket logs and the CloudFront cache statistics and access logs.
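CloudFront access logs are tab-separated, with a "#Fields:" header line naming the columns. As one starting point, here is a minimal sketch that scans a log for slow responses; the filename and the 5-second threshold are arbitrary assumptions:

```javascript
// Sketch: scan a CloudFront access log for slow responses.
// Column names are read from the "#Fields:" header line, so the
// sketch adapts if the column set differs across log versions.
const fs = require('fs');

const lines = fs.readFileSync('cloudfront-access.log', 'utf8').split('\n'); // placeholder filename
const header = lines.find((l) => l.startsWith('#Fields:')) || '';
const fields = header.replace('#Fields:', '').trim().split(/\s+/);

for (const line of lines) {
  if (!line || line.startsWith('#')) continue;
  const row = {};
  line.split('\t').forEach((value, i) => { row[fields[i]] = value; });
  // "time-taken" (seconds) and "sc-status" are standard columns.
  if (parseFloat(row['time-taken']) > 5) {
    console.log(`slow: ${row['cs-uri-stem']} took ${row['time-taken']} s (status ${row['sc-status']})`);
  }
}
```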
There's a load testing tool written in Java called Apache JMeter. It cannot execute JavaScript, so it must be configured to request the video files directly by URL rather than through the player.
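If a full JMeter setup is more than you need, the same basic measurement can be scripted. A minimal Node.js sketch that times a direct download and reports average throughput (the URL is a placeholder, not a real endpoint):

```javascript
// Sketch: time a direct file download and report average throughput.
const https = require('https');

const url = 'https://dxxxxxxxx.cloudfront.net/videos/sample.mp4'; // placeholder
const start = Date.now();
let bytes = 0;

https.get(url, (res) => {
  res.on('data', (chunk) => { bytes += chunk.length; });
  res.on('end', () => {
    const seconds = (Date.now() - start) / 1000;
    console.log(`${bytes} bytes in ${seconds.toFixed(1)} s`);
    console.log(`average throughput: ${(bytes / seconds / 1000).toFixed(1)} kB/s`);
  });
});
```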
The disadvantage of running a load test from a single location is pretty evident: different geographical areas and carriers have different characteristics, so test results will differ.
There are online, non-open-source tools that can load test from multiple locations, but they are generally paid, though some offer free trials.
Here's another way to look at this.
"...but in some cases I receive complaints from users for slow or choppy video playback."
If you're using an adaptive HLS stream, and you're using CloudFront, and the video is still choppy for some users, that's probably because of their own internet connection speeds.
In that case, you can encode your video in multiple resolutions (using just one AWS MediaConvert job, by the way), like 1080p, 720p, 360p, 240p, 144p, etc.
And then video.js has a stream switcher plugin that will 1) automatically start playing the highest resolution that's right for the viewer's connection, and no higher, and 2) give the user the option, via a "Settings" (gear) icon in the control bar, to switch resolutions manually.
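With the videojs-resolution-switcher plugin, the setup is roughly as follows. This is a sketch based on the plugin's documented options; the element ID, source URLs, and labels are placeholders:

```javascript
// Sketch: videojs-resolution-switcher plugin wiring.
var player = videojs('my-video', {
  controls: true,
  plugins: {
    videoJsResolutionSwitcher: {
      default: 'high',    // start from the highest listed resolution
      dynamicLabel: true  // show the current resolution in the control bar
    }
  }
});

// Register one source per encoded rendition (placeholder URLs).
player.updateSrc([
  { src: 'https://example.com/video-1080.mp4', type: 'video/mp4', label: '1080p' },
  { src: 'https://example.com/video-720.mp4',  type: 'video/mp4', label: '720p'  },
  { src: 'https://example.com/video-360.mp4',  type: 'video/mp4', label: '360p'  }
]);
```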
That way, even those with really poor internet connections should be able to watch your video.
Of course, the other alternative is to use progressive-download videos: the viewer can click play, immediately click pause, wait for the video to buffer, and then play it once it's fully downloaded.
Check out the Videojs Resolution Switcher demo here.
-- Ravi Jayagopal
Can we control multiple DSC-QX100 cameras using the Camera Remote API SDK from an iPad running iOS 7?
The objective is to cause multiple cameras to "snap" a picture at exactly the same time. Perhaps each camera has an address (serial number)... can the software communicate with all cameras at the same time using multiple addresses? The need is limited to still photos and fast, burst-style photography; video is not necessary.
If so, how?
Unfortunately you can only control one QX100 lens at a time. This is because the lens connects over WiFi, and you are limited to a single WiFi connection at a time on an iPad. It may be possible using a desktop PC with multiple wireless cards installed, but that would be the only way.
As the Sony rep said, there's no way to do this with "officially supported" techniques.
The reason for this is that the camera acts as a WiFi Access Point (AP) - so while multiple devices can connect to it, most mobile devices can only connect to it and not anything else (since iOS and Android don't support connection to multiple APs simultaneously). This is also why you can't use other network interfaces when connected to the camera. (I don't know about iOS, but Android always prioritizes WiFi over cell network data, for example.)
Android devices have a feature called "WiFi Direct" that provides more flexibility in terms of peer-to-peer interconnection, but iOS does not support WFD. The QX100 DOES respond to WFD invites, and you can accept a pairing request with (if I remember correctly) a long-press of the shutter button. However, the official app only supports normal WiFi AP connections.
I have not yet attempted to see if using Sony's remote API in combination with the (unsupported but apparently present) WiFi Direct capability works.
More info on Wifi Direct and Android can be found at http://developer.android.com/guide/topics/connectivity/wifip2p.html
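For reference, the Camera Remote API itself is just JSON-RPC over HTTP, so if you ever do get simultaneous connections (e.g. from a desktop with multiple wireless cards, as suggested above), triggering every camera at once is a short script. A minimal sketch; the camera addresses are placeholders, and the endpoint path/port is an assumption (the real service URL is discovered via SSDP):

```javascript
// Sketch: fire the Camera Remote API's actTakePicture method at
// several cameras concurrently. Addresses and endpoint are assumed;
// as noted above, a single iPad can't hold multiple WiFi connections.
const cameras = ['10.0.0.1', '10.0.0.2']; // placeholder addresses

async function snapAll() {
  await Promise.all(cameras.map((ip) =>
    fetch(`http://${ip}:10000/sony/camera`, { // assumed endpoint path/port
      method: 'POST',
      body: JSON.stringify({
        method: 'actTakePicture',
        params: [],
        id: 1,
        version: '1.0'
      })
    })
  ));
}

snapAll();
```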
Marlin SONY, I disagree: WiFi is Ethernet-based and by definition can handle multiple devices on the same network. If you run a phone or iPad as a hotspot and connect multiple devices, it works.
Multicam Switcher Basic is an example of a free app that supports cutting together multiple camera angles live. Unfortunately the app is still being developed, so features like third-party camera support aren't included yet, but it does show what is possible and awaits development.
I think this should be possible. Apps like CollabraCam™ (Multicam Social Video Production) or RecoLive MultiCam prove that it is possible to use multiple cameras simultaneously.
I too need someone to develop an app to be able to use two Sony DSC-QX cameras for 3D shoots. Please, if you know how, or know who can do this, contact me: email#3-d.re
I was doing some research to find ways that would allow me to stream music on my website legally. I came across the iTunes partner program, which allows you to stream music on a website through their embedded players. I was wondering: is it possible to stream iTunes music through your own custom player? If that is not possible via iTunes, what other methods are available?
You could do this with server software like Icecast; there are some good tutorials on setting this up here: http://www.icecast.org/docs.php
Depending on how many browsers you want to support, you might want to set up two streams: one in MP3/OGG and a "backup" stream in Flash. Then add some detection of what the browser supports and present the correct stream (i.e., use the HTML5 <audio> tag for playing MP3/OGG in browsers that support it, and use your Flash stream for the rest).
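The detection can be done with the standard canPlayType check. A minimal sketch; the stream URLs are placeholders for your Icecast mount points, and the Flash player file is hypothetical:

```javascript
// Sketch: choose a stream based on what the browser can play.
var audio = document.createElement('audio');
var canMp3 = audio.canPlayType && audio.canPlayType('audio/mpeg') !== '';
var canOgg = audio.canPlayType && audio.canPlayType('audio/ogg; codecs="vorbis"') !== '';

if (canMp3 || canOgg) {
  audio.src = canMp3
    ? 'http://example.com:8000/stream.mp3'  // placeholder MP3 mount
    : 'http://example.com:8000/stream.ogg'; // placeholder OGG mount
  document.body.appendChild(audio);
  audio.play();
} else {
  // No HTML5 audio support: fall back to a Flash-based player.
  document.body.innerHTML +=
    '<object data="flash-player.swf" type="application/x-shockwave-flash"></object>';
}
```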
Their program allowing playback of music from the iTunes Store is likely only for those with the intention to sell music; without providing a commerce business, you'd be breaking their partner program's T&Cs.
I want to develop an app that requires wired communication between a webcam-type video camera and an iPad 2. Basically, I will directly connect the webcam and the iPad 2 using a cable, and when I start the webcam, whatever images (picture/video) it captures should be displayed on the iPad 2.
Based on my research, I found that the iPad 2 cable is only made for the iPod program, so the connector is not a traditional USB port and I can't do direct communication between the webcam and the iPad 2. Am I missing anything?
We are going to use a Vivotek camera, and they have mentioned here that we can use Safari to receive the Motion-JPEG stream. I am wondering if that would also be possible on an iPad 2, and whether it is reliable.
Further, I found Apple's MFi Program for developing electronic accessories that connect to the iPod, iPhone, and iPad. Has anyone out here used this already and knows whether I could go this route?
Thanks.
You can receive a Motion-JPEG stream in mobile Safari or in a UIWebView in a custom app. I have not (yet) been able to successfully receive a Motion-JPEG stream via an AVPlayer, AVPlayerItem, or AVURLAsset.
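In a page (or a UIWebView), this is as simple as pointing an image at the stream. A minimal sketch; the camera address and path are placeholders, since the actual endpoint depends on the camera model:

```javascript
// Sketch: browsers that support Motion-JPEG (including mobile Safari)
// render the stream when it is used as an image source.
var stream = document.createElement('img');
stream.src = 'http://192.168.1.50/video.mjpg'; // placeholder camera URL
document.body.appendChild(stream);
```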
Becoming an MFi-authorized company is non-trivial (I tried once). They want larger, established companies that have demonstrated they have the skills and manufacturing know-how/contacts to produce quality accessories.
I'm curious whether you can step back from your initial requirement and figure out how to do at least the last step to the iPad 2 wirelessly. You could stay wired up to, say, two feet away from the iPad and use a local private WiFi network for those last two feet.
I recommend you add (use an existing, or purchase) a wireless gateway. Connect the camera to the gateway, connect the iPad to the wireless network, and then browse to the camera in Safari to view the image. There is no "hard-wired" way to get this to work.
As for the "hard-wired" portion of the question, I do not believe that is possible without a lot of work and hardware. There is no "video in" on an iPad to make it a monitor for a camera.
I need to develop a Symbian application that is triggered when a photo is taken and uploads it to a web location along with the phone's GPS location. I would like to know which devices can support this, and whether there are any API restrictions or licenses required to do this.
Sorry for the relevant plug but what you are looking for is basically 3 chapters (Networking, Multimedia and Location Based Services) of the Quick Recipes on Symbian OS book.
Since your application will use networking APIs (and will therefore cost the application user money), you will have to go through the Symbian Signed process.
As far as how many Symbian-powered phones contain both a camera and a GPS, I'm afraid you are going to need to look at the individual handset manufacturers' websites to come up with an exhaustive list. Outside Japan, they are Nokia, Sony Ericsson, and Samsung.
Here is a list of Nokia models and their info
Here is a list of capabilities and what signing method they need