My mobile is heating up because of continuous usage of my application.
To detect how well the driver is driving, my application uses these services:
Accelerometer, for detecting the start of the trip.
GPS, to find the location and speed.
Syncing all the data to the server and writing that data to a file locally.
After a few minutes my mobile heats up.
I want to work out which of these services is heating up my mobile, prioritize it, and fix the heating issue.
Among the three services you mentioned, GPS is by far the biggest power hog - it can easily draw 50 times more power than the accelerometer. With GPS you are acquiring a signal on multiple frequencies, and the receiver does a lot of number crunching at every epoch, which also consumes a lot of power. The accelerometer's data, by contrast, is generated inside the device itself, so acquiring it costs very little power. I wouldn't worry about the third item either, since plenty of apps sync and write data locally with no noticeable heating.
So start by working on the GPS usage first.
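If continuous tracking isn't essential, duty-cycling the receiver is the usual first step. Here's a minimal Python-style sketch of the pattern; the `gps` object and its `get_fix()`/`power_off()` calls are hypothetical stand-ins for whatever location API your platform provides:

```python
import time

FIX_INTERVAL_S = 10  # assumed: one fix every 10 s instead of a continuous fix

def track_trip(gps, log, trip_active):
    """Duty-cycle the GPS receiver to cut power draw.

    gps, log, and trip_active are hypothetical stand-ins for your platform's
    location API, a local buffer, and a trip-state callback.
    """
    while trip_active():
        fix = gps.get_fix()            # hypothetical platform call
        log.append((fix.lat, fix.lon, fix.speed))
        gps.power_off()                # release the receiver between samples
        time.sleep(FIX_INTERVAL_S)     # the accelerometer can keep running cheaply
```

The trade-off is coarser speed and position data, so pick the longest interval that still lets you score the driving.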
I want to ask: is it possible to create a solar-powered automatic bell system with smoke detection using a Raspberry Pi? I'm also worried about whether the Raspberry Pi has a real-time clock, because I need an RTC to ring the bell at the exact alarm times I set.
This is the statement I found on the internet with more details, but the author used an Arduino.
The system uses the real-time clock to determine the time, and the bell rings based on the set-up time. The LCD in this system displays the current time and displays "fire" if the smoke detector detects smoke. For different sessions, the bell will ring a different number of times. The system is expected to continuously display the time using the real-time clock and to monitor the situation of the school during the day and night with power generated by solar energy [9]. By using solar energy as a power source, the system is uninterrupted during power supply failures from the main energy provider. In addition, the energy can be used efficiently during the daytime, and energy stored in the battery can be utilized during the night. This bell system, integrated with a smoke detector, is expected to safeguard the institution from damage and losses, particularly during an outbreak of fire.
Your project is indeed feasible. The RPi checks the smoke detector value at a certain interval; if the value is higher than a certain threshold, it rings the bell.
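Here's a minimal sketch of that polling loop in Python with the RPi.GPIO library, assuming a digital-output smoke sensor on GPIO 17 and a bell relay on GPIO 27 (pin numbers, polarity, and ring duration are assumptions; adjust for your wiring):

```python
import time
import RPi.GPIO as GPIO

SMOKE_PIN = 17        # assumed: digital output of the smoke sensor
BELL_PIN = 27         # assumed: relay or transistor driving the bell
POLL_INTERVAL_S = 1.0

GPIO.setmode(GPIO.BCM)
GPIO.setup(SMOKE_PIN, GPIO.IN)
GPIO.setup(BELL_PIN, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        if GPIO.input(SMOKE_PIN):              # sensor asserts its pin on smoke
            GPIO.output(BELL_PIN, GPIO.HIGH)   # ring the bell
            time.sleep(5)                      # assumed: ring for 5 seconds
            GPIO.output(BELL_PIN, GPIO.LOW)
        time.sleep(POLL_INTERVAL_S)
finally:
    GPIO.cleanup()
```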
The Raspberry Pi does not include an RTC. You'll have to buy one online (a quick search on Google should lead you to shops like Adafruit). But I don't understand why you would need a real-time clock: if you want to use this system (which I don't recommend; use certified equipment for your personal safety), you should poll the smoke/fire detector quite regularly, but that polling doesn't have to be synchronized to a real-time clock.
Alternatively, you could use the RPi's WiFi/Ethernet to fetch the current time from the internet (e.g. via NTP).
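For example, a minimal NTP query in Python using the third-party ntplib package (assuming the Pi has network access; pool.ntp.org is just one public server pool):

```python
import ntplib
from datetime import datetime, timezone

# Ask a public NTP pool server for the current time.
client = ntplib.NTPClient()
response = client.request("pool.ntp.org", version=3)
now = datetime.fromtimestamp(response.tx_time, tz=timezone.utc)
print("Network time (UTC):", now)
```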
If you are doing this project just for fun, use a smoke detector and a buzzer. Your RPi should be able to provide enough current to power both of these components. You could power your RPi with a small battery or a charger (check the correct voltage and current needed by the RPi).
You should find a lot of information/tutorials/code online for this type of little project.
I'm trying to write code in which, every 1 ms, a number is incremented by one and replaces the old number (something like a chronometer!).
The problem is that whenever CPU usage increases because of other programs running on the PC, this 1 millisecond interval also stretches and the timing in my program drifts.
Is there any way to prevent CPU load changes from affecting the timing in my program?
It sounds as though you are trying to generate an analogue output waveform with a digital-to-analogue converter card using software timing, where your software is responsible for determining what value should be output at any given time and updating the output accordingly.
This is OK for stationary or low-speed signals but you are trying to do it at 1 ms intervals, in other words to output 1000 samples per second or 1 ks/s. You cannot do this reliably on a desktop operating system - there are too many other processes going on which can use CPU time and block your program from running for many milliseconds (or even seconds, e.g. for network access).
Here are a few ways you could solve this:
Use buffered, hardware-clocked output if your analogue output device supports it. Instead of writing one sample at a time, you send the device a waveform or array of samples and it outputs them at regular intervals using a timing signal generated in hardware. Unfortunately, low-end DAQ devices often don't support hardware-clocked output.
Instead of expecting the loop that writes your samples to the AO to run every millisecond, read LabVIEW's Tick Count (ms) value in the loop and use that as an index into your array of samples: rather than trying to output every sample, your code now asks 'what time is it now, and therefore what should the output be?' That won't give you a perfect output signal, but at least it will keep the correct frequency rather than being 'slowed down'; instead you will see glitches imposed on the signal whenever the loop can't keep up. This is easy to test, and it may be adequate for your needs (see the sketch after this list of options).
Use a real-time operating system instead of a desktop OS. In the case of LabVIEW this would mean using the Real-Time software module and either a National Instruments hardware device that supports RT, such as the CompactRIO series, or installing the RT OS on a dedicated PC if the hardware is compatible. This is not a cheap option, obviously (unless it's strictly for personal, home use). In any case you would need to have an RT-compatible driver for your output device.
Use your computer's sound output as the output device. LabVIEW has functions for buffered sound output and you should be able to get reliable results. You'll need to upsample your signal to one of the sound output's available sample rates, probably 44.1 ks/s. The drawbacks are that the output level is limited in range and is not calibrated, and will probably be AC-coupled so you can't output a DC or very low-frequency signal. However if the level is OK for what you want to connect it to, or you can add suitable signal conditioning, this could be a neat solution. If you need the output level to be calibrated you could simultaneously measure it with your DAQ card and scale the sound waveform you're outputting to keep it correct.
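As a language-neutral illustration of the time-indexing idea from the second option above, here is a minimal Python sketch; `write_output` is a hypothetical placeholder for whatever call updates your analogue output:

```python
import time

samples = [i / 999.0 for i in range(1000)]     # example waveform: a 1 s ramp at 1 kS/s

def write_output(value):
    """Hypothetical placeholder: replace with the call that drives your AO channel."""
    pass

start = time.monotonic()
while time.monotonic() - start < 5.0:          # run for 5 seconds
    elapsed_ms = int((time.monotonic() - start) * 1000)
    index = elapsed_ms % len(samples)          # what time is it, so what should the output be?
    write_output(samples[index])
    # Even if the loop stalls, the next pass recomputes the index from the
    # clock, so the waveform keeps the right frequency; you just get glitches.
```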
The answer to your question is "not on a desktop computer." This is why products like LabVIEW Real-Time and dedicated deterministic hardware exist: you need a computer dedicated to a particular process in order to serve that process consistently. Every application on a regular Windows/Mac/Linux desktop system has the problem you are seeing of potentially being interrupted by other system processes, particularly in its UI layer.
There is no way to prevent CPU load changes from affecting the timing of your program unless the system provides real-time scheduling guarantees (i.e. a real-time OS).
Without that, there is no reason to expect it to behave deterministically. Does your program really need to run at exactly that pace?
As the title says, I want to monitor the power (in watts) that some components consume, especially the CPU, memory, and disk.
When I use AIDA64, I see some data about power consumption under Computer/Sensor. I want to know how it gets this data.
I already have some ideas, but I'm not sure which is the best way to solve this:
There are sensors on the motherboard; we could use their values to calculate real-time power.
Each OS has APIs that report CPU utilization, memory throughput, and disk I/O rates. Using this data we could build a model of the PC's power consumption. If such APIs exist, where can I find them?
Maybe hardware manufacturers like Intel already record the power value in real time and store it in special hardware registers; we could read the value by mapping a special memory location.
In my opinion, the second way may be the approach most monitoring software uses, but I just don't know where to get those APIs.
What's more, our aim is to design OS-independent real-time power monitoring software, so if there are better solutions I would appreciate your help.
Hmmm. I wasn't sure if I should post this as a comment or an answer. It is an answer but in the negative.
At this time, you can't create an OS-independent, software-based, non-intrusive power monitor. By non-intrusive, I mean that you are not putting special instrumentation on the motherboard and other hardware. This is because the power technology used by modern processors is in rapid flux, with each new generation making significant advances. Additionally, the amount of power-related information available to software from the hardware (via PMU events and the like) is continually increasing as more silicon real estate becomes available. For example, I believe that in the most current processors, you can get direct thermal information for key parts of the processor silicon, and temperature, power, and current readings from various parts of the core and uncore.
The best you can do is to abstract the top layer of your monitor from the lower layers. Then the top becomes OS / HW independent while the lower levels need to be platform dependent.
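To illustrate that layering, here's a minimal Python sketch. The Linux RAPL sysfs path is real on recent Intel systems with the powercap driver, but treat the rest (class names, the single package domain, ignoring counter wraparound) as assumptions:

```python
import time
from abc import ABC, abstractmethod

class PowerBackend(ABC):
    """Platform-dependent lower layer: knows how to read an energy counter."""
    @abstractmethod
    def energy_uj(self) -> int:
        """Return cumulative energy consumed, in microjoules."""

class LinuxRaplBackend(PowerBackend):
    """Reads the Intel RAPL counter exposed by the Linux powercap driver."""
    RAPL_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # CPU package domain

    def energy_uj(self) -> int:
        with open(self.RAPL_FILE) as f:
            return int(f.read())    # note: this counter wraps around eventually

def average_power_watts(backend: PowerBackend, interval_s: float = 1.0) -> float:
    """Platform-independent top layer: power = delta energy / delta time."""
    e0 = backend.energy_uj()
    time.sleep(interval_s)
    e1 = backend.energy_uj()
    return (e1 - e0) / 1e6 / interval_s

if __name__ == "__main__":
    print(f"{average_power_watts(LinuxRaplBackend()):.2f} W (CPU package)")
```

A Windows or macOS backend would implement the same `energy_uj()` interface over that platform's own counters.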
Check out the PAPI APIs. Note that the APIs appear to give you the world, but are really just an API set. Someone still has to implement what's on the other side of the API.
Now if you can do your own special instrumentation, many (most?) motherboards and other hardware have measurement points (some undocumented) that provide thermal, current (and so power) information. This information is important for debugging devices and platforms.
How would you test a ‘Mobile’ version of IM? What are some key differences between testing this version versus a desktop application?
First of all you need to account for unstable connectivity - the program should be ready to deal with temporary and permanent network coverage losses.
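As a concrete example, one such test might simulate a coverage drop and assert that the client queues the message and retries rather than losing it. A minimal sketch using Python's unittest; the ImClient class here is a hypothetical stand-in for the app under test:

```python
import unittest
from unittest import mock

class ImClient:
    """Hypothetical IM client: queues messages while offline, flushes on reconnect."""
    def __init__(self, transport):
        self.transport = transport
        self.outbox = []

    def send(self, msg):
        try:
            self.transport.send(msg)
        except ConnectionError:
            self.outbox.append(msg)        # keep the message for a later retry

    def on_reconnect(self):
        while self.outbox:
            self.transport.send(self.outbox.pop(0))

class OfflineTest(unittest.TestCase):
    def test_message_survives_network_loss(self):
        transport = mock.Mock()
        transport.send.side_effect = ConnectionError   # simulate no coverage
        client = ImClient(transport)
        client.send("hello")
        self.assertEqual(client.outbox, ["hello"])     # queued, not lost

        transport.send.side_effect = None              # coverage restored
        client.on_reconnect()
        self.assertEqual(client.outbox, [])            # flushed after reconnect

if __name__ == "__main__":
    unittest.main()
```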
Resource consumption (memory footprint, network bandwidth, and CPU time) also has to be considered. No one wants a program that drains the battery in an hour, occupies all the memory, or runs up a giant bill for the owner.
Both memory requirements and network bandwidth have to be kept as low as possible...
Adding to sharptooth's answer: notifications on sending/receiving a message, and support for voice/video chat via the available hardware. :)
I assume it doesn't connect to anything (other than the satellites, I guess); is this right? Or does it connect and incur some kind of charge?
GPS, the Global Positioning System run by the United States Military, is free for civilian use, though the reality is that we're paying for it with tax dollars.
However, GPS on cell phones is a bit more murky. In general, it won't cost you anything to turn on the GPS in your cell phone, but when you get a location it usually involves the cell phone company in order to get it quickly with little signal, as well as get a location when the satellites aren't visible (since the gov't requires a fix even if the satellites aren't visible for emergency 911 purposes). It uses up some cellular bandwidth. This also means that for phones without a regular GPS receiver, you cannot use the GPS at all if you don't have cell phone service.
For this reason most cell phone companies have the GPS in the phone turned off except for emergency calls and for services they sell you (such as directions).
This particular kind of GPS is called assisted GPS (AGPS), and there are several levels of assistance used.
GPS
A normal GPS receiver listens to a particular frequency for radio signals. Satellites send time coded messages at this frequency. Each satellite has an atomic clock, and sends the current exact time as well.
The GPS receiver figures out which satellites it can hear, and then starts gathering those messages. The messages include time, current satellite positions, and a few other bits of information. The message stream is slow - this is to save power, and also because all the satellites transmit on the same frequency and they're easier to pick out if they go slow. Because of this, and the amount of information needed to operate well, it can take 30-60 seconds to get a location on a regular GPS.
When it knows the position and time code of at least 3 satellites, a GPS receiver can assume it's on the earth's surface and get a good reading. 4 satellites are needed if you aren't on the ground and you want altitude as well.
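To make the arithmetic concrete, here's a toy Python sketch of the underlying idea: each timestamp gives a pseudorange (travel time times the speed of light), and the receiver solves for the position that best fits them. The satellite coordinates are made up, and a real receiver must also solve for its own clock bias (one more unknown, hence the 4th satellite):

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light, m/s

# Made-up satellite positions (metres) and the travel times a receiver would measure.
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,  6_100e3, 18_390e3],
])
true_pos = np.array([1_111e3, 2_222e3, 3_333e3])
travel_times = np.linalg.norm(sats - true_pos, axis=1) / C

def residuals(pos):
    # Mismatch between predicted and measured pseudoranges, one entry per satellite.
    return np.linalg.norm(sats - pos, axis=1) - travel_times * C

fix = least_squares(residuals, x0=np.zeros(3)).x
print(np.round(fix))  # recovers approximately [1111000, 2222000, 3333000]
```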
AGPS
As you saw above, it can take a long time to get a position fix with a normal GPS. There are ways to speed this up, but unless you're carrying an atomic clock with you all the time, or leave the GPS on all the time, then there's always going to be a delay of between 5-60 seconds before you get a location.
In order to save cost, most cell phones share the GPS receiver components with the cellular components, and you can't get a fix and talk at the same time. People don't like that (especially when there's an emergency) so the lowest form of GPS does the following:
Get some information from the cell phone company to feed to the GPS receiver - some of this is coarse positioning information based on which cellular towers can 'hear' your phone, so by this point they already know your location to within a city block or so.
Switch from cellular to the GPS receiver for 0.1 seconds (or some small, practically unnoticeable period of time) and collect the raw GPS data (no processing on the phone).
Switch back to phone mode, and send the raw data to the phone company.
The phone company processes that data (acting as an offline GPS receiver) and sends the location back to your phone.
This saves a lot of money on the phone design, but it puts a heavy load on cellular bandwidth, and with a lot of requests coming in it requires a lot of fast servers. Still, overall it can be cheaper and faster to implement. Carriers are reluctant, however, to release GPS-based features on these phones due to this load - so you won't see turn-by-turn navigation here.
More recent designs include a full GPS chip. They still get data from the phone company - such as a rough location based on tower positioning, and the current satellite locations - which provides sub-1-second fix times. This information is only needed once, and the GPS can keep track of everything after that with very little power. If the cellular network is unavailable, they can still get a fix after a while. If the GPS satellites aren't visible to the receiver, they can still get a rough fix from the cellular towers.
But to completely answer your question - it's as free as the phone company lets it be, and so far they do not charge for it at all. I doubt that's going to change in the future. In the higher end phones with a full GPS receiver you may even be able to load your own software and access it, such as with mologogo on a motorola iDen phone - the J2ME development kit is free, and the phone is only $40 (prepaid phone with $5 credit). Unlimited internet is about $10 a month, so for $40 to start and $10 a month you can get an internet tracking system. (Prices circa August 2008)
It's only going to get cheaper and more full featured from here on out...
Re: Google maps and such
Yes, Google maps and all other cell phone mapping systems require a data connection of some sort at varying times during usage. When you move far enough in one direction, for instance, it'll request new tiles from its server. Your average phone doesn't have enough storage to hold a map of the US, nor the processor power to render it nicely. iPhone would be able to if you wanted to use the storage space up with maps, but given that most iPhones have a full time unlimited data plan most users would rather use that space for other things.
You must be able to receive from at least 3 of the 24-32 satellites out there, and each broadcasts the time from a synchronized atomic clock. The differences between the times you receive at any one moment tell you how long each broadcast took to reach you, and thus where you are in relation to the satellites. So it sort of reads from something, but it doesn't connect to that thing. Note that this doesn't tell you your orientation; many GPSes fake that (and speed) by interpolating data points.
If you don't count the cost of the receiver, it's a free service. Apparently there are higher-resolution services out there that are restricted to military use. Those likely involve a fixed license cost to decrypt the signals, along with a confidentiality agreement.
Your device may also support GPS tracking, in which case it might communicate, say via GPRS, with a database that stores the locations the device reports, so that multiple devices can be tracked. That would require some kind of connection.
Maps are either stored on the device or received over a connection, and navigation is computed from those maps' databases. These are likely licensed items with a cost associated, though if you use a service like Google Maps, they hold the licenses with NAVTEQ and others.