How can we test a Roku application?

I'm new to Roku development (in the R&D phase, actually). I read that we can't test a Roku app on a simulator and need a real device. If we develop an application, how will we test it?
I checked the Roku developer site and various links on the internet, but could not find anything that answers my questions.
As per my info, Roku sells 5 devices, so:
Can we do one app that supports all 5 devices?
Do we need assets in multiple resolutions?
Do I need to buy all devices?

Can we do one app that supports all 5 devices?
Yes. Roku is trying hard to keep their platform coherent, though there are performance differences between the OpenGL and non-OpenGL devices. The "legacy" models (<2222) are no longer supported; the firmware is kept current for the others.
Do we need assets in multiple resolutions?
Theoretically yes; practically, not really. You can make do with assets in only one resolution if you RTFM and pre-plan carefully. You'll need 3 sizes of app icon, no sweat. For the real UI though, you can do either HD (720) or FHD (1080) and let it scale accordingly - the thing is, TV is very forgiving of scaled graphics because of the 10 ft viewing distance (a 60" 1080p screen is "Retina" beyond 8 ft). You can largely skip SD.
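To put a rough number on that "Retina beyond 8 ft" remark, here's a back-of-the-envelope check (my own arithmetic, not from Roku's docs) of how big one pixel of a 60" 1080p panel looks from 8 ft; anything around 1 arcminute is at the limit of normal 20/20 vision:

```python
import math

# Assumed numbers: 60" diagonal, 16:9 panel, 1920x1080 pixels, 8 ft viewing distance.
diagonal_in = 60.0
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~52.3 inches of horizontal screen
pixel_pitch_in = width_in / 1920                  # ~0.027 inches per pixel
distance_in = 8 * 12                              # 8 ft in inches

# Angular size of a single pixel, in arcminutes (1 degree = 60 arcmin).
pixel_arcmin = math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60
print(f"{pixel_arcmin:.2f} arcmin per pixel")     # ~0.98, right at the ~1 arcmin
                                                  # resolving limit of 20/20 vision
```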
Do I need to buy all devices?
No. And there are many more than 5 device models in use - see https://forums.roku.com/viewtopic.php?f=34&t=86471&start=15#p536994 for some statistics (RokuCo does not publish statistics, so that's about the best info available). If you buy only 2 devices, I'd say get:
a #42xx (Roku 3 or current Roku 2) as a reference model with OpenGL
a #27xx (Roku 1 or SE) or a #5xxx Roku TV as a reference for the "slower", non-OpenGL ES devices
As a 3rd model, I'd suggest the "new HDMI stick", #3600. You could get that one as your only device - its performance is somewhere between (1) and (2) above - but I don't think developing with only 1 device is a good idea.
One thing you may not have noticed is that there are also these "Roku TV" things under the Hisense/TCL/Sharp/Insignia brands, models #5xxx. These are proper TVs with proper Roku smarts - meaning they can run your Roku app. And one can be had for as little as (skimming the Best Buy site) $130-150 for a 24-32" screen.
And I haven't even mentioned the 4K/HDR craze here, nor the new 37xx/46xx models that will be out for the holiday season (I only expect minor, evolutionary changes there).

Disclosure: I am a Roku employee.
That's correct: you'll need an actual Roku device to test your application. You can buy them used on eBay very cheaply ($20-35), or you can buy a brand-new unit from our website for $50. The latest Roku Streaming Stick (Model #3600X) is my personal favorite option, and a great value.
You don't need to buy all devices, although we do recommend having several models so that you can QA test across devices. However, one popular development approach is to build your channel on a lower-end model, which in theory helps ensure it works on higher-end models as well. This also keeps your hardware spend down.
Download our Precertification Checklist and open the third sheet, which includes a list of all our model numbers and corresponding code names. I'd recommend building on a "Giga" or a "Paolo."
Think of this cost as an R&D expense. Plus, you'll get to enjoy the device in your free time as well!
As for your other questions:
Yes, you only build one app, and it will work on all the different devices. We do recommend taking the time to make sure your app is optimized across all devices, including older devices with less processing power. Our Performance Guide is a great starting point for this.
Another option is to check whether the first digit of the device model number is less than "3" (which indicates a lower-end device) and add conditionals based on that, such as removing animations (there's a sketch of the idea after the examples below).
You can find two examples of this on our RokuDev GitHub page:
Hero-Grid-Channel —> Components —> LoadingIndicator —> LoadingIndicator.brs —> Line 244
Multi-Live-Channel —> Source —> Main.brs —> Line 21
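Purely to illustrate the first-digit heuristic those examples use, here it is reduced to the string check itself, written in Python for brevity. A real channel would do this in BrightScript, reading the model number from roDeviceInfo as the GitHub examples above do; the model strings below are examples, not an exhaustive list:

```python
# Illustration only: the "first digit of the model number" heuristic described above.
def is_lower_end(model: str) -> bool:
    """Treat models whose first digit is below 3 as lower-end devices."""
    return model[:1].isdigit() and int(model[0]) < 3

for model in ["2710X", "3600X", "4200X"]:   # example model numbers
    if is_lower_end(model):
        print(f"{model}: skip heavy animations")
    else:
        print(f"{model}: enable full UI effects")
```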
Yes, you do need different assets based on resolutions. Take a look at this document: https://github.com/rokudev/docs/blob/master/design/channel-artwork.md

Related

Why are maps in Indian apps like Uber, Google Maps so inaccurate?

I've been using popular taxi-hailing apps like Uber and OLA in India, and Uber in the USA. The locations of cars, the direction they're moving, and my own position on the app's map are always off - so much so that I need to call the driver to tell them where I am. From this Quora thread I was able to narrow the problem down to either the use of the Maps API or the GPS signal.
The Quora post: https://www.quora.com/Why-is-GPS-in-India-so-inaccurate
The parody video: https://www.youtube.com/watch?v=hjBM-zSq3NU
It is possible that the problem is caused by your device or by the GPS conditions in your area. Newer phones can see dozens of GPS satellites and supplement them with network assistance data to get a fix faster - this is what's called A-GPS. Older phones fall back to triangulating from three cell towers (not GPS); while that method was workable, it could be off by far more than a couple of feet. Even older phones may use only two cell towers, and because the timing measurements involved travel at the speed of light, the margin of error becomes very large - which could be your problem. Phones without A-GPS that can only lock onto a few satellites will also struggle, and that may be complicating things for you.

Data usage from any application

I want to read how much 3G data every app uses. Is this possible in iOS 5.x? And in iOS 4.x? My goal is, for example:
Maps consumed 3 MB from your data plan
Mail consumed 420 kB from your data plan
etc, etc. Is this possible?
EDIT:
I just found an app doing that: Data Man Pro
EDIT 2:
I'm starting a bounty. Extra points go to the answer that makes this clear. I know it is possible (screenshot from Data Man Pro), and I'm sure the solution is limited. But what is the solution, and how do I implement it?
These are just hints, not a solution. I've thought about this many times but never really started implementing the whole thing.
First of all, you can calculate transferred bytes by querying the network interfaces; take a look at this SO answer for code and a nice explanation of network interfaces on iOS.
Use sysctl or similar system functions to detect which apps are currently running (by "running" I mean the process state is set to RUNNING, as the ps or top commands show on OS X; I've never tried it, I just assume this is possible on iOS and hope there are no problems for an app running as an unprivileged user), so you can deduce which apps are active and save the traffic stats for those apps. Obviously, given that applications can run in the background, it is hard to determine which app is actually transferring data.
It might also be possible to retrieve information about network activity per process/app, like nettop does on OS X Lion; unfortunately, nettop uses the private NetworkStatistics.framework, so you can't dig anything out of its implementation.
Take time into account - sample the counters periodically and work with the deltas, as in the sketch below.
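To make hints 1 and 4 concrete: the basic loop is to snapshot per-interface byte counters and look at the deltas between snapshots. On iOS you'd read the counters via the C calls in the linked SO answer; the sketch below shows the same idea with Python's psutil on a desktop, purely to illustrate the approach (psutil is a third-party package, not something available inside an iOS sandbox):

```python
import time
import psutil  # desktop stand-in for the per-interface counters the SO answer reads on iOS

def snapshot():
    """Total bytes sent/received per network interface since boot."""
    return {nic: (c.bytes_sent, c.bytes_recv)
            for nic, c in psutil.net_io_counters(pernic=True).items()}

before = snapshot()
time.sleep(10)                      # sampling interval; pick whatever granularity you need
after = snapshot()

for nic, (sent, recv) in after.items():
    d_sent = sent - before.get(nic, (0, 0))[0]
    d_recv = recv - before.get(nic, (0, 0))[1]
    if d_sent or d_recv:
        print(f"{nic}: +{d_sent} B sent, +{d_recv} B received in the last 10 s")
```

Attributing those deltas to individual apps is the hard part, for the reasons in hints 2 and 3.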
My 2 cents
No. All applications on iOS are sandboxed, meaning you cannot access anything outside of your own application, so I do not believe this is possible. Nor do I believe data traffic is recorded at this level on the device; otherwise Apple would have exposed it on either the network page or the usage page in Settings.app.
Besides that, not everybody has a "data plan". In Sweden, for example, it's common for data traffic to be free of charge, with no limit on either volume or speed.

How do augmented reality browsers track your location accurately enough to overlay relevant information?

AR browsers include the likes of Wikitude, Layar, etc., which are available for iPhone and Android smartphones.
When you point your camera at a landmark, they automatically overlay previously available information for that location on top of it (e.g. the name of a restaurant over its door).
If the accuracy of GPS, as stated by some, is ~10 m, how can this be done accurately?
I mean, if they track the location of your phone to display geographically accurate information, a difference of even 2-3 m can cause havoc.
You can check for yourself in the code of mixare, an augmented reality browser released under a free software license (GPLv3). The source code is available on GitHub for both Android and iPhone.
To answer your question: there are errors, mostly because of the compass readings (digital compasses are unreliable because they pick up every kind of noise). What helps is that you are usually looking at objects that are quite big (buildings, etc.), so the error is not THAT visible to the end user - but it's still there, trust me :)
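To put some numbers on that: the overlay direction comes from the bearing between your GPS fix and the POI, and a compass error of a few degrees shifts the overlay sideways by roughly distance x tan(error). A quick illustration (my own sketch, not mixare code; the coordinates are made up):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the phone to the POI, in degrees from north."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

# Example coordinates: phone vs. a restaurant roughly 100 m away.
print(f"bearing to POI: {bearing_deg(48.2082, 16.3738, 48.2089, 16.3748):.1f} deg")

# How far sideways a compass error moves the overlay at a given distance:
for err_deg in (2, 5, 10):
    offset_m = 100 * math.tan(math.radians(err_deg))
    print(f"{err_deg} deg compass error at 100 m -> overlay off by ~{offset_m:.1f} m")
```

At 100 m, even a 10-degree compass wobble only shifts the overlay by about 18 m, which is tolerable when the target is a whole building but painful for anything small.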
HTH
Daniele
Disclosure: I am the project leader of mixare and main developer of the android version

What is the difference between testing functionality in a browser and the same on mobile?

Today a basic question came to my mind: what is the difference between testing functionality in a browser and the same on a mobile device? Say, testing m.gmail.com in a desktop browser versus directly on a mobile device using Selenium. Is there any difference other than fitting the screen, etc.?
If you have a different server handling mobile requests, you might have a different implementation of the features; thus, you need to test both platforms.
The content might also be delivered differently depending on the mobile browser used (Safari on the iPhone is much more feature-rich than the default N95 browser, for example).
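Since the question mentions Selenium: one way to see those differences is to drive the same URL with a desktop profile and with Chrome's mobile emulation and compare what comes back. A minimal sketch in Python (the device name and URL are just examples; emulation only changes the viewport and user agent, so it does not replace testing on a real device):

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def page_title(url, mobile=False):
    opts = Options()
    if mobile:
        # Chrome's built-in device emulation: mobile viewport plus mobile user agent.
        opts.add_experimental_option("mobileEmulation", {"deviceName": "Pixel 2"})
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)
        return driver.title
    finally:
        driver.quit()

# The server may return different markup for each profile, which is exactly
# why both need their own tests.
print("desktop:", page_title("http://m.gmail.com"))
print("mobile: ", page_title("http://m.gmail.com", mobile=True))
```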
Let’s look at some of the major differences.
Limited Real Estate
The most obvious difference is screen size. Responsive design is relatively easy to code for desktop and laptop browsers – most of which come with predefined ratios anyway.
By contrast, mobile devices are much smaller. Aligning images and text becomes a real challenge – especially when you factor in features like portrait orientation (i.e. the ability to rotate a mobile device and have the image flip accordingly).
Worse still, there is so much more variation – even when dealing with the same manufacturer.
For example, the iPhone 5 has a 4” display, whereas the iPhone 6 is 4.7” on the diagonal. When you add in the iPhone 6 Plus (5.5”), iPad Mini (7.9”), and iPad standard (9.7”), it becomes harder and harder to code and test mobile applications that look “good” on all screens.
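If you do test layouts from the desktop side, you can at least sweep a few common viewport sizes in one Selenium session and check that key elements survive. A rough sketch (the sizes and the "Sign in" check are arbitrary examples, not a recommended matrix):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Example CSS-pixel sizes roughly matching small/large phones, a tablet, and a laptop.
VIEWPORTS = [(320, 568), (375, 667), (414, 736), (768, 1024), (1366, 768)]

driver = webdriver.Chrome()
driver.get("http://m.gmail.com")          # page under test; placeholder URL
for width, height in VIEWPORTS:
    # Resize the browser window (roughly the viewport, minus browser chrome).
    driver.set_window_size(width, height)
    # Hypothetical check: a sign-in link should still be present at every size.
    signin = driver.find_elements(By.PARTIAL_LINK_TEXT, "Sign in")
    print(f"{width}x{height}: sign-in link present = {bool(signin)}")
driver.quit()
```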
Storage and RAM
Screens are not the only spatial constraint mobile software testers face. You also must factor in the limited storage and processing power of today’s mobile devices. Even high capacity phones can quickly fill up as users download apps and multimedia.
In the browser world, such constraints are moot. Desktop storage is essentially unlimited (measured in terabytes). And cloud-based storage is easy to increase, even if this requires charging higher prices to end-users.
Internet Access
With the exception of a few web applications that offer offline modes (e.g. Gmail), web-based software always requires an Internet connection.
Mobile apps may or may not need online access (although those that don’t often take up more space – see point #2 above). When Internet is needed, however, mobile software testers must factor in 3G and 4G – in addition to normal Wi-Fi.
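On the connectivity point: Chrome driven through Selenium can emulate slower network conditions, which lets a browser-side test approximate 3G/4G behaviour before moving to a real device on a real cellular network. A rough sketch (the throttling numbers are arbitrary, not a calibrated 3G profile):

```python
from selenium import webdriver

driver = webdriver.Chrome()
# Chrome-only: emulate a slow, high-latency connection for this session.
driver.set_network_conditions(
    offline=False,
    latency=300,                      # extra round-trip latency in ms
    download_throughput=750 * 1024,   # bytes per second
    upload_throughput=250 * 1024,
)
driver.get("http://m.gmail.com")      # placeholder page under test
print("loaded under throttling:", driver.title)
driver.quit()
```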

How to demo examples of embedded systems?

It seems that a lot of small business people have a need for some customized embedded systems, but don't really know too much about the possibilities and cannot quite envisage them.
I had the same problem when trying to explain what Android could do; I was generally met with glazed eyes - and then I made a few demos. Somehow, when people can see something - touch it and play around with it - they have that cartoon lightbulb moment.
Even if it is not directly applicable to them, a demo starts them thinking about what could be useful to them.
The sort of person I am talking about may or may not be technical, but is certainly intelligent, having built from scratch a business which turns over millions.
Their needs are varied, from RFID or GPS asset & people tracking, to simple stock control systems, displays, communications, sometimes satellite, sometimes VPN or LAN (Wi-Fi or RJ45). A lot of it needs a good back-end database with a website to display, query, and data-mine.
So, to get to the question: I am looking for a simple project, or projects, which will cause that cartoon lightbulb moment. It need not be too complicated, as those who need complicated solutions are generally tech-savvy - just something straightforward that shows what could be done to streamline their business and make it more profitable.
It would be nice if it could include some Wi-Fi/RJ45 comms and communicate across the internet (i.e. not just a microcontroller attached to a single PC - it should also communicate with a server/website); an RFID reader would be nice, something actually happening (LEDs, sounds, etc.), plus some database work and database analysis/data mining - something end-to-end, preferably in both directions.
A friend suggested a Rube Goldberg-like contraption with a Lego Mindstorms set attached to a local PC, but also controllable from a remote PC (representing head office) or a website. That would show remote control of devices. Maybe it could pick up some RFID tags and move them around (at random, or on command), representing stock control (or maybe employee/asset movement within a factory or warehouse (Location Based Services/GIS)), which could then be shown on the website, with some nice charts & graphs, etc.
Any other ideas?
How best to implement it? One of those microcontroller starter kits like http://www.nerdkits.com/ ? Maybe some Lego or a similar robot kit, a few cheap RFID readers … anything else?
And – the $409,600 question – what's a good, representative demo that demonstrates as many functionalities as possible, as impressively as possible, with the least effort? (Keeping it modular and allowing for easy addition of features, since there is such a wide area to cover.)
P.S. A tie-in with an Android slate PC would be welcome too.
Your customers might respond better to a solid-looking R/C truck that seeks out RFID tags than to a Lego robot. Lego is cool, but it has a bit of a slapped-together 'kiddie' feel.
What if you:
scatter some RFID tags across the conference room.
add a GPS & wifi transmitter to your truck.
drive the truck to the tag
(manually - unless you want to invest a lot of time in steering algorithms).
have a PC drawing a real-time track of the truck's path.
every time the truck gets within range of a tag, add it to an inventory list on the screen, showing item ID, location, time recorded, and total units so far.
indicate the position of the item on the map.
I'd be impressed.
Is it 'least effort'? I don't know, but I'd hope that if this is the type of solution you are pitching, you already have a good handle on how to read GPS and RFID devices, how to establish a TCP or UDP connection over Wi-Fi, and how to send and decode packets. Add some simple graphics and a database lookup, and you are set.
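To make the "send and decode packets" part concrete, the PC end of that demo can be tiny. Here's a sketch of a receiver that listens for UDP reports from the truck and keeps the running inventory list; the packet format (comma-separated tag, latitude, longitude) and the port are made up for illustration:

```python
import socket
import time

# Hypothetical packet format from the truck: "TAG_ID,lat,lon" per RFID read.
# The truck-side firmware (RFID + GPS + Wi-Fi module) is out of scope here.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))           # listen port is arbitrary

inventory = {}                          # tag id -> (lat, lon, timestamp)
print("waiting for tag reports...")
while True:
    data, addr = sock.recvfrom(1024)
    try:
        tag, lat, lon = data.decode().strip().split(",")
    except ValueError:
        continue                        # ignore malformed packets
    inventory[tag] = (float(lat), float(lon), time.time())
    print(f"{len(inventory)} items so far; latest: {tag} at ({lat}, {lon}) from {addr[0]}")
```

Feeding the same dictionary into a map widget and a database table gets you the on-screen track and the charts with very little extra code.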
Regarding hardware, I don't have any first-hand experience with any of these, but the GadgetPC Wi-Fi G Kit + a USB RFID reader + a USB GPS receiver looks like a nice platform for experimenting with this.
Many chip manufacturers have off-the-shelf demo boards. Microchip has some great demo boards for TCP/IP communications on an embedded system. I haven't seen one yet for RFID. Showing potential customers some of these demos could get them thinking about what is possible.