What is the maximum data size that can be transferred via GPRS? - mobile-application

What is the maximum data size that can be transferred via GPRS?
I want to upload images from a device (mobile phones, tablets, iPads, etc.) to a web server.
I'm developing a mobile application in J2ME, Android, etc. In that application I need to upload some images to the server.
So I need to know the maximum data size that can be uploaded.
All suggestions are welcome.

If you can store an image on the device (J2ME/Android), then you can upload it to the web; there is no hard limit on the size. However, the upload takes longer as the file gets bigger, so it's better to decide based on the device's capabilities, especially for low-end J2ME devices.
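For Android (or any Java SE environment) a minimal upload sketch could look like the following; the endpoint URL is a placeholder, and on J2ME you would use javax.microedition.io.HttpConnection instead of HttpURLConnection:

```java
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class ImageUploader {

    // Placeholder endpoint: replace with your server's actual upload URL.
    private static final String UPLOAD_URL = "https://example.com/upload";

    /** Streams the image to the server and returns the HTTP status code. */
    public static int upload(File image) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(UPLOAD_URL).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "image/jpeg");
        // Stream in fixed-size chunks so a large image never sits fully in
        // memory, which matters on constrained devices.
        conn.setFixedLengthStreamingMode((int) image.length());
        try (OutputStream out = conn.getOutputStream();
             InputStream in = new FileInputStream(image)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return conn.getResponseCode();
    }
}
```

Streaming rather than buffering the whole file is the main point here: the practical ceiling is the device's memory and the user's patience, not the bearer protocol.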

Related

Best streaming service for mp4 into webview

For the welcome screen of my app, we are trying to serve up a webpage in a webview that consists of a video and some text. (We want to go this route so that we can quickly update the welcome screen and test changes on the fly, rather than having to submit and get approval each time.)
The video is only 8.6 MB and is currently played via HTML5, hosted in an S3 bucket and served via CloudFront. However, playback still tends to be a bit choppy at times. Does anyone have any recommendations for an optimal way to host and serve up the video to make it play smoothly? Are there any specific S3 or CloudFront settings anyone would recommend that could help?
Thanks in advance for any help anyone can provide.
The most common technique currently is to use ABR (Adaptive Bit Rate) together with a CDN to provide smooth playback.
ABR involves making multiple copies of the video at different bit rates, from low to high, and hosting them on the server.
The client receives an index file for the videos, e.g. an m3u8 manifest file, and then chooses the best bit rate for the current conditions to allow smooth playback without buffering.
If network conditions improve, the client will 'step up' bit rates, and if they get worse, it will 'step down' bit rates.
Typically a low or medium bit rate is chosen as the first one to allow quick and smooth start up.
You can see this effect on services like Netflix as they start up, and you can also see it on YouTube if you right click the video and select 'Stats for Nerds'.
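To make the step-up/step-down logic concrete, here is a deliberately simplified Java sketch of the selection step; real players use smarter throughput estimators and buffer-based heuristics, so treat this as an illustration only:

```java
import java.util.List;

/** Toy illustration of ABR rendition selection (not a real player's algorithm). */
public class RateSelector {
    private final List<Integer> bitrates;      // renditions from the manifest, sorted low to high
    private static final double SAFETY = 0.8;  // keep 20% headroom below measured throughput

    public RateSelector(List<Integer> bitratesLowToHigh) {
        this.bitrates = bitratesLowToHigh;     // must be non-empty
    }

    /** Picks the highest rendition that fits the measured throughput (bits/sec). */
    public int choose(double measuredThroughputBps) {
        int chosen = bitrates.get(0);          // worst case: fall back to the lowest rendition
        for (int b : bitrates) {
            if (b <= measuredThroughputBps * SAFETY) {
                chosen = b;                    // "step up" while the rendition still fits
            }
        }
        return chosen;                         // falling throughput naturally "steps down"
    }
}
```

Feeding it a fresh throughput estimate after each segment download reproduces the step-up/step-down behaviour described above.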
Some links for ABR with AWS Elastic Transcoder: you can set the bit rates you want; for example, see the note below from their FAQ regarding HLS jobs:
Specify that the transcoding job create a playlist that references the outputs. You should order your bit rates from lowest to highest, with the audio only stream last, since this order will be maintained in the generated playlist file. Once your transcoding job has completed, the output bucket will contain a proper arrangement of your master and individual M3U8 playlists, and MPEG-2 TS media stream fragments.
Take a look at the sample request on the page linked here, which includes two different bit rates (video service providers will generally have more than two, but this gives you a feel for the approach):
http://docs.aws.amazon.com/elastictranscoder/latest/developerguide/create-job.html
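For reference, a rough sketch of such a job with the AWS SDK for Java v1 might look like this; the pipeline ID, output keys, and preset IDs below are placeholders, so substitute the real HLS system preset IDs from your account:

```java
import com.amazonaws.services.elastictranscoder.AmazonElasticTranscoder;
import com.amazonaws.services.elastictranscoder.AmazonElasticTranscoderClientBuilder;
import com.amazonaws.services.elastictranscoder.model.*;

public class CreateHlsJob {
    public static void main(String[] args) {
        AmazonElasticTranscoder transcoder = AmazonElasticTranscoderClientBuilder.defaultClient();

        // Two video renditions, ordered lowest bit rate first,
        // as the FAQ note above recommends.
        CreateJobOutput low = new CreateJobOutput()
                .withKey("hls-400k/video")
                .withPresetId("HLS_400K_PRESET_ID")   // placeholder preset ID
                .withSegmentDuration("10");
        CreateJobOutput high = new CreateJobOutput()
                .withKey("hls-2m/video")
                .withPresetId("HLS_2M_PRESET_ID")     // placeholder preset ID
                .withSegmentDuration("10");

        // The master playlist references the renditions in the order given.
        CreateJobPlaylist master = new CreateJobPlaylist()
                .withName("master")
                .withFormat("HLSv3")
                .withOutputKeys("hls-400k/video", "hls-2m/video");

        CreateJobResult result = transcoder.createJob(new CreateJobRequest()
                .withPipelineId("YOUR_PIPELINE_ID")   // placeholder
                .withInput(new JobInput().withKey("input.mp4"))
                .withOutputs(low, high)
                .withPlaylists(master));

        System.out.println("Created job: " + result.getJob().getId());
    }
}
```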
Azure Media Services has a built-in "Adaptive Streaming" preset that is content-adaptive and adjusts the encoding settings to match the incoming content.
See the following: https://learn.microsoft.com/en-us/azure/media-services/media-services-autogen-bitrate-ladder-with-mes

Tizen wearable sensor API stops giving data after some time

I am trying to make a Tizen Gear application that records accelerometer and gyroscope data of user activity for about an hour and provides analytics based on the raw sensor data saved in a CSV file.
I have a Tizen wearable hybrid application. I am using a native service application to retrieve the sensor data and then sending it to the web application, which writes it to the CSV file.
While testing, I have observed that a few minutes after starting the data recording (after about 3-8 minutes), the sensor data is no longer received (as seen from the logs in the sensor_callback() function). Moreover, during the test scenarios, I have observed that the SHealth application opens up and starts recording the user's dynamic workout activity. Is it possible that SHealth is preventing my application from receiving sensor data? If so, how could one resolve this?
Thanks in advance.
Actually, nothing can be said without seeing any code.
SHealth could be an issue, but that is just a guess, as you have not provided any logs showing that SHealth is the problem.
I suggest you follow the sensor app workflow from the guides below. The guides also provide sample apps.
Accessing Heart Rate Monitor (HRM) Sensor data for Native applications
How to use Light Sensor in Tizen

red5 performance: number of users

I need to build infrastructure for a video streaming service that will be able to handle more than 100 live streams with an average of 50 viewers each, where the top stream can have up to 5,000 viewers. All streams will be served as multicast; no extra transcoding will be required (input and output are both H.264), and no recording will be made. I'm curious how many streams a simple Red5 server can handle and how to calculate the maximum number of users.
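As a back-of-envelope illustration of that calculation (the 2 Mbps per-viewer bit rate is an assumption, not from the post, and it assumes each viewer receives its own unicast copy, as with RTMP):

```java
/** Rough egress sizing for the scenario described above. */
public class CapacityEstimate {
    public static void main(String[] args) {
        int streams = 100;          // live streams
        int avgViewers = 50;        // average viewers per stream
        double bitrateMbps = 2.0;   // assumed H.264 bit rate per viewer (not from the post)

        double egressMbps = streams * avgViewers * bitrateMbps;
        System.out.printf("Required egress: %.0f Mbps (~%.1f Gbps)%n",
                egressMbps, egressMbps / 1000.0);
        // 100 * 50 * 2 Mbps = 10,000 Mbps, i.e. ~10 Gbps, so network capacity
        // typically becomes the bottleneck before the Red5 process itself does.
    }
}
```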

How to measure the performance of my site's video streaming and playback?

I have developed a site that hosts user videos. I store the video files in AWS S3, deliver them through AWS CloudFront, and use video.js as the site's player, with HTML5 as the default and Flash as a fallback.
Generally the video streaming seems to work fine, but in some cases I receive complaints from users about slow or choppy video playback. I want to create some tests that measure streaming performance so I can distinguish problems on the user's side (e.g., a slow connection) from problems with my service.
Are there any best practices or tools for collecting video delivery metrics? I'm interested in open-source solutions or something I can implement myself, since it's just a personal project, but I don't want to reinvent the wheel.
Testing progressive download means checking the transmission bandwidth and its continuity. For example, with a high transmission rate the initial client buffer fills faster and playback starts sooner, while losing that transmission capacity later can cause re-buffering. The key rule is that the total transmission time of your file must be lower than the video's duration.
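That last rule translates directly into a check; here is a minimal Java sketch, with purely hypothetical numbers:

```java
/** Progressive download rule of thumb: sustained throughput must be at
 *  least fileSize / duration, or playback will outrun the download. */
public class ProgressiveDownloadCheck {

    public static boolean canPlaySmoothly(long fileSizeBytes,
                                          double videoSeconds,
                                          double throughputBytesPerSec) {
        double requiredRate = fileSizeBytes / videoSeconds; // average bytes/sec needed
        return throughputBytesPerSec >= requiredRate;
    }

    public static void main(String[] args) {
        // Hypothetical example: an 8.6 MB, 60-second video on a
        // 1 Mbps (~125 kB/s) connection.
        System.out.println(canPlaySmoothly(8_600_000L, 60.0, 125_000.0));
        // Prints false: ~143 kB/s is needed, so expect re-buffering.
    }
}
```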
To identify potential issues you can start with the S3 bucket logs and the CloudFront cache statistics and access logs.
There's a load-testing tool written in Java called Apache JMeter. It cannot execute JavaScript, so it must be configured to request the video files directly.
The disadvantage of using a load-testing tool from a single location is pretty evident: different geographical areas and carriers have different characteristics, so test results will differ.
There are online, closed-source tools that can load test from multiple locations, but they are generally paid, though some offer free trials.
Here's another way to look at this.
but in some cases I receive complaints from users about slow or choppy video playback.
If you're using an adaptive HLS stream, and you're using CloudFront, and the video is still choppy for some users, that's probably because of their own internet connection speeds.
In that case, you can encode your video in multiple resolutions (using just one AWS MediaConvert job, by the way), such as 1080p, 720p, 360p, 240p, 144p, etc.
And then Videojs has a stream-switcher plugin that will 1) automatically start playing the highest possible resolution that's right for the viewer's connection, and no higher, and 2) give the user the option, via a "Settings" (gear) icon in the control bar, to switch resolutions manually.
That way, even those with really poor internet connections should be able to watch your video.
Of course, the other alternative is to use progressive-download videos: the viewer simply clicks play, immediately clicks pause, waits for the video to buffer, and then plays it once it has fully downloaded.
Check out the Videojs Resolution Switcher demo here.
-- Ravi Jayagopal

What is the fastest way to send a small video from one iOS device to another?

I'd like to make an application in which one person takes a short video, maybe five to twenty seconds in length, and sends it to another user as quickly as possible. An example would be an instant replay at a sporting event. What would be the fastest and most reliable way to transfer a video of that size? I am considering the following two options, but am open to other suggestions.
uploading the video to my own server and performing some kind of push operation
performing a direct transfer over a shared wifi network (what about long distance?)
I'd take your first option:
Record your video and compress it to a reasonable size on the source device.
Upload the video to an external server.
From the server, send push notifications to the particular users who should be able to view the video.
If the recipients/consumers of the video could stream the uploaded video from the server, it would be a pretty reasonable user experience.
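A minimal sketch of the server side of that flow, using the JDK's built-in HttpServer; PushSender is a hypothetical stand-in for whatever push library (e.g. for APNs) you would actually use:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.file.*;

/** Accepts an uploaded clip, stores it, then triggers a push notification. */
public class ReplayServer {

    // Hypothetical interface: replace with a real APNs client library.
    interface PushSender { void notifyRecipients(String videoUrl); }

    public static void main(String[] args) throws Exception {
        PushSender push = url -> System.out.println("push sent: " + url); // stub

        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/upload", exchange -> {
            // Stream the request body straight to disk.
            Path dest = Files.createTempFile("clip", ".mp4");
            try (InputStream in = exchange.getRequestBody()) {
                Files.copy(in, dest, StandardCopyOption.REPLACE_EXISTING);
            }
            // Notify recipients with a URL they can stream from immediately.
            push.notifyRecipients("https://example.com/videos/" + dest.getFileName());
            exchange.sendResponseHeaders(201, -1); // 201 Created, empty body
            exchange.close();
        });
        server.start();
    }
}
```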