Is there any way to adjust the quality level at which Red5's SOSample records a webcam stream? I just installed it on a remote server, and it's recording at awful quality.
The FLV is recorded at the quality level set within your client application; see these docs specifically (via Adobe):
http://www.adobe.com/livedocs/fms/2/docs/wwhelp/wwhimpl/common/html/wwhelp.htm?context=LiveDocs_Parts&file=00000548.html
http://livedocs.adobe.com/flex/3/langref/flash/media/Camera.html#setQuality()
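For example, a minimal ActionScript 3 sketch of the client side (the NetConnection "nc" and the stream name are placeholders; setQuality's first argument of 0 removes the bandwidth cap, and the second is the picture quality from 1 to 100):

    // ActionScript 3 sketch of the recording client; "nc" is assumed to be
    // a NetConnection already connected to your Red5 application.
    var cam:Camera = Camera.getCamera();
    cam.setMode(640, 480, 30);   // capture resolution and frame rate
    cam.setQuality(0, 90);       // bandwidth 0 = no cap; picture quality 90 of 100
    var ns:NetStream = new NetStream(nc);
    ns.attachCamera(cam);
    ns.publish("mystream", "record");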
Another factor is your bandwidth: if data is dropped or lost in transit to the server, it will not be recorded.
I am currently using WebRTC to stream a game; it's a custom WebRTC implementation inside the game engine.
Both the client and the server easily support 100+ Mbps upload speeds. Currently I have locked the max bitrate to 80 Mbps, which the link supports.
The issue happens when WebRTC probes for the connection speed: the bitrate drops to 7-8 Mbps, then slowly climbs back up to 80 Mbps, only to drop again at the next probe.
I have linked a video of the issue below.
https://drive.google.com/file/d/1coI3rrGVf4OAFnt2oeSCx0zFJznvfyQv/view?usp=sharing
What could the issue be, and is there a way to fix it?
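For context, this is how an 80 Mbps cap is typically pinned in a standard browser client (a custom in-engine WebRTC build will expose its own equivalent; "videoSender" is a placeholder):

    // TypeScript sketch: pinning the outgoing bitrate on a standard
    // RTCRtpSender. Note this only sets the ceiling; the drops described
    // above come from bandwidth estimation, which probes below the cap.
    async function lockMaxBitrate(sender: RTCRtpSender, bps: number): Promise<void> {
      const params = sender.getParameters();
      for (const encoding of params.encodings) {
        encoding.maxBitrate = bps; // cap in bits per second
      }
      await sender.setParameters(params);
    }
    // e.g. await lockMaxBitrate(videoSender, 80_000_000); // the 80 Mbps cap above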
I am trying to do something new, something I have never done before, and I am looking for advice, or a pointer in the right direction, on how to choose the technology. I am trying to build a race-simulation app that will have thousands of IoT devices streaming data into a central platform. I understand that I can use some sort of IoT hub from a cloud provider, but what technology do I choose for storing the data?
An example is an online indoor-biking app: there are apps where you can connect your indoor bike online and ride a simulated race. For my project I am trying to build something similar. Do I use a NoSQL database in this scenario? What technology will let an application like this scale, since it could be millions of devices around the world in a "simulated" race? I am not worried about the front end and things like that, but about the backend: the IoT hub, storing the data, and presenting it in real time.
At this point it is important to understand what kind of data your IoT devices will stream, and at what rate; that will have a significant impact on the answer.
That is, if it's just location information and some other small data sent, say, once a second, then even with tens of thousands of devices this is not a big load of information, and any standard database, like MySQL, will be able to deal with it. You will of course need a multi-threaded server (or several) capable of handling many requests in parallel.
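To make that concrete, here is a minimal sketch of such an ingestion path, assuming the Node.js "mysql2" driver and a hypothetical samples table: telemetry is buffered in memory and flushed once a second as a single multi-row INSERT.

    // TypeScript sketch (driver, table name, and columns are assumptions;
    // adapt to your schema).
    import mysql from "mysql2/promise";

    interface Sample {
      deviceId: string;
      ts: number; // unix millis
      lat: number;
      lon: number;
      speed: number;
    }

    const pool = mysql.createPool({ host: "localhost", user: "app", database: "race" });
    const buffer: Sample[] = [];

    export function enqueue(s: Sample): void {
      buffer.push(s);
    }

    setInterval(async () => {
      if (buffer.length === 0) return;
      const rows = buffer.splice(0).map(s => [s.deviceId, s.ts, s.lat, s.lon, s.speed]);
      // One bulk INSERT per second instead of one query per device.
      await pool.query("INSERT INTO samples (device_id, ts, lat, lon, speed) VALUES ?", [rows]);
    }, 1000);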
If your IoT devices will stream HD video, then you're looking at a completely different solution: a much stronger server capable of handling a lot of streams in parallel, significant bandwidth requirements from your hosting company, and storage space for all the videos. In this case you would store the streams as files (if you need them later on), and you wouldn't need any special database either.
In any case, once you reach millions of users you'll be able to scale most modern databases and servers, e.g. via MySQL's replication capability. For example, take a look at how Wikipedia relies on MySQL: https://www.mysql.com/why-mysql/case-studies/mysql-cs-wikipedia.html
So I wouldn't worry about the database at this stage; instead, make sure the design of your system matches the type of data and the rate at which it is streamed.
Hope this gives you a pointer.
I am developing a sensor-based mobile application for iOS and Android. The data produced by the smartphone's sensors will be stored in the cloud. At this point, I am wondering what I should test about the data transfer and storage. For example, I should test the scenario where the connection drops while a GPS data transfer is unfinished. I am not looking for techniques or testing styles; I am trying to find possible failure points or test scenarios. I hope that explains my point.
Below are some of the things worth considering for your app:
Incomplete transfers when the connection drops (as you mentioned; a test sketch follows this list)
Cloud server capacity: how many requests can it handle at once?
If you are considering cloud solutions, also think about where your users will be accessing the app from; the distance between users and the data center affects response time.
The format in which the data is stored; a format that is fast for I/O will also help optimize the app's speed.
Asynchronous vs. synchronous data transfer
Security measures in the cloud, e.g. using services like VPC if you are considering AWS
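For the first point, a minimal sketch of forcing a mid-transfer drop (the endpoint URL and payload are hypothetical stand-ins for your app's uploader):

    // TypeScript sketch: abort an upload partway through to mimic a dropped
    // connection, then assert on your app's recovery behavior.
    async function testInterruptedUpload(): Promise<void> {
      const controller = new AbortController();
      setTimeout(() => controller.abort(), 100); // cut the connection mid-transfer
      const payload = JSON.stringify({ gpsPoints: [{ lat: 52.52, lon: 13.405, ts: Date.now() }] });
      try {
        await fetch("https://api.example.test/sensor-data", {
          method: "POST",
          body: payload,
          signal: controller.signal,
        });
      } catch {
        // Expected path: verify the server stored no partial batch and the
        // client queues the data for a clean retry.
      }
    }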
These are some things worth considering.
Thanks :)
For my school project I have to stream a screen grab from one station (the server) to another (the client) in real time, both running Linux (Ubuntu).
I'm using libav-tools (avconv as the encoder on the server side and avplay as the player on the client side).
avconv uses the x11grab input format to capture the screen.
My problem: avconv needs a few seconds before it outputs any encoded video, and that wait is too long for real time.
I've tried streaming to localhost to rule out network influence on speed; it still seems that avconv is responsible for the long wait.
Also, streaming an existing video file starts much faster, almost immediately.
The project is implemented in C++ and executes avconv in a fork.
Any suggestions on how to shorten the delay?
This is most likely due to internal buffering. There is often a buffer that is far too big by default, because zero delay is not the primary concern of most software; it is more concerned with bad connections and that sort of problem, which is what buffers are for.
See https://libav.org/avconv.html and search for "nobuffer", "-analyzeduration", "-rtbufsize", "-max_delay", "-fpsprobesize", "rtmp_buffer" (if you use RTMP), or others, and try your luck.
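For example, an untested starting point (the flag values are guesses to tune, and CLIENT_IP is a placeholder):

    # server: grab the screen, encode with low-latency x264, send MPEG-TS over UDP
    avconv -f x11grab -r 30 -s 1280x720 -i :0.0 \
           -c:v libx264 -preset ultrafast -tune zerolatency \
           -f mpegts udp://CLIENT_IP:5000

    # client: disable input buffering and shorten stream analysis
    avplay -fflags nobuffer -analyzeduration 500000 -probesize 32 udp://0.0.0.0:5000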
There will always be a noticeable delay, especially if you use an encoding like H.264 for transfer, but in a controlled environment it does not need to be a few seconds; you should be able to bring it down to fractions of a second.
To conserve iPhone power while still allowing data transfer over TCP/IP, what do I do?
I need to receive a constant stream of data all the time, but I don't want to kill the battery in 4 hours by disabling the sleep feature.
thx
In one word: you cannot do that; you cannot transfer a constant stream of data over TCP/IP. Once the user closes your app, Apple restricts your app's access to resources. This is Apple's way of conserving power, so you need not worry about power yourself.
I think this old question of mine would help you - iOS Background downloads when the app is not active
You might be able to reduce power a bit by sending or requesting data in the largest chunks consistent with smooth operation of your particular application, as larger data bursts may allow the radios to idle for longer periods between transfers; allowing the Wi-Fi and cellular radios to turn off greatly reduces power consumption.