Does QueryPerformanceCounter go forward or backward on Windows 10?

To make a delay in my program, I want to use QPC. For example, the delay might be exactly 5000.100306 seconds.
So, my question is: does QueryPerformanceCounter go forward or backward on Windows 10?
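According to Microsoft's documentation, QueryPerformanceCounter is monotonic (it never runs backward) on Windows XP and later. A quick sketch of checking this from Python, which on Windows backs `time.perf_counter()` with QueryPerformanceCounter:

```python
import time

# time.perf_counter() is backed by QueryPerformanceCounter on Windows.
# A monotonic clock must never step backward, so successive readings
# should be non-decreasing.
readings = [time.perf_counter() for _ in range(100_000)]
assert all(b >= a for a, b in zip(readings, readings[1:]))
print("no backward step observed")
```

This only samples the counter, of course; the monotonicity guarantee itself comes from the OS documentation, not from any finite test.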

Related

Will my Google Colab program end after I close the webpage?

I am running a deep learning training program in my Colab notebook which will take about 10 hours. If I close my browser, will it be shut down by Google before it ends as expected? Or will the last output be saved correctly in my Drive?
I suggest you look here and here. Basically, the code should keep running, but after some time (around 90 minutes) of idle activity the notebook will be cut off, so I assume that what you suggest is not viable. Maybe you could launch the script in the morning and interact with it every 20-30 minutes to prevent it from going idle. Also, consider using Google Colab Pro (faster GPUs and longer runtimes, but never longer than 24 hours).
The simple answer to that question is a solid no. Your session will go ahead and continue executing, or will stay idle; as stated in SilentCloud's answer above, it will go for about:
90 Minutes [With CPU]
30 Minutes [With GPU]
The reason I say 30 minutes with GPU is that I have personally tested it and it appears to be this number, as I use Colab on a rather regular basis.
You can make a simple bot on your machine using pyautogui to go ahead and do some random activity, if that makes more economical sense for you or you are not interested in a Google Colab Pro subscription.
Run with Browser Closed
If you want a seamless experience, with the browser window effectively closed and access to GPUs that are much better and faster, I would recommend the Colab Pro+ subscription.
But the Scripting Idea is there, and your mileage may vary.

Objective C / Arduino communication slow down

I have written a G-code interpreter/control app for a CNC machine in Objective-C. Everything runs fine for the first 20-30 seconds, but after that the whole thing stops for another 20 seconds and resumes super slow. I made a video so you can see for yourself: video (about a minute). As far as I can tell it doesn't skip steps or anything like that; it just goes really slow.
In my Xcode console I can see that code is interpreted at the normal speed (using an NSLog every time a byte is written).
I use IOKit from the Arduino Cocoa reference to communicate. I have tried a lot of different baud rates, and sometimes that will prolong the time it keeps working correctly, but eventually it always slows down.
I suspect something after this line needs to clean up the serial buffer or something:
// send a byte to the serial port
void writeByte(char val) {
    write(serialFileDescriptor, [[NSString stringWithFormat:@"%c", val] cStringUsingEncoding:NSUTF8StringEncoding], 1);
}
Update: I am developing the app on my 17" MBP running OS X 10.9. I tried this on another 13" MBP running 10.9.1 and the same thing happens, but when I use yet another 13" MBP running 10.6.8 it works fine!
Any ideas on what is happening here?
You are probably writing faster than the baud rate allows, but you only slow down once the output buffer is full, because then each write has to wait. This can be solved or worked around in many ways.
As it turned out, there was a rogue Serial.write() somewhere in the Arduino code (I placed it there to find another bug and just forgot about it). The statement slowly filled the serial buffer.

More Accurate StopWatch (VB.NET)

Is there any sort of "sleep" method that is more accurate than the Stopwatch? Or is there a way to make the Stopwatch class more accurate? It doesn't have to be in .NET; it can be in C++, but whatever language it is in has to have exactly 1 ms accuracy; I don't need more than that. Say I want my program to "sleep" for 300 ms: I would like it to sleep for 300 ms at least most of the time.
Currently I use:
Dim StopWatch As New Stopwatch
StopWatch.Start()
Do
Loop Until StopWatch.ElapsedMilliseconds >= 300
StopWatch.Stop()
My results running it 5 times were: 306, 305, 315, 327, 304.
It stayed like that if I ran it more.
I put my thread and process priority on "Realtime" / "High".
The Stopwatch class has a property IsHighResolution. If it returns `true`, you are using the high-resolution performance counter (backed by hardware such as the HPET, depending on your hardware and OS). Using this, you can measure times very accurately. BUT! Windows (like most Linux systems) is NOT a real-time OS; it uses preemptive multitasking. Whenever the OS decides it needs to, it will put your current thread on hold to do other work, and after some time it will return to your thread and let it continue. If this switch happens somewhere inside your loop, you still measure the correct time, but it includes an amount of inactivity time.
Since a time slice under Windows is somewhere between 15 and 30 ms, your thread might be suspended after 299 ms, and 15-30 ms later you get it back. And that's the effect you see. The Stopwatch IS accurate; it just measures stuff you didn't expect.
How to overcome this: you can't. As said, Windows is NOT a real-time OS, even if you assign "realtime" priority to your process.
What you are seeing is completely normal. The delay will never be exactly 300 ms; it will always be more than that. Sleep itself is accurate, but the actual delay depends on your operating system and on other processes running in parallel with yours.
If you want a more accurate timer, you need to use the current date and time as a reference. Here is a simple equation that you can run every millisecond:
currentTime - startTime = elapsedTime
...where currentTime is System.DateTime.Now, startTime is the time that the timer was started, and elapsedTime is a System.TimeSpan.
For more details on how to do this, check out the source of a program I made in VB.Net, E-Tech Timer: http://etechtimer.codeplex.com
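A common workaround for the overshoot described above, sketched here in Python rather than VB.NET, is a hybrid sleep: hand most of the wait to the OS, then busy-wait the final stretch against a high-resolution clock. The `spin_threshold` value is an assumption, tuned to the roughly 15 ms Windows scheduler tick:

```python
import time

def precise_sleep(duration, spin_threshold=0.02):
    """Sleep for `duration` seconds: a coarse OS sleep first, then a
    short busy-wait on time.perf_counter() for the final stretch."""
    deadline = time.perf_counter() + duration
    # Coarse sleep, leaving `spin_threshold` seconds of slack to absorb
    # the scheduler's wake-up latency (~15 ms on stock Windows).
    remaining = deadline - time.perf_counter()
    if remaining > spin_threshold:
        time.sleep(remaining - spin_threshold)
    # Busy-wait the rest for sub-millisecond precision.
    while time.perf_counter() < deadline:
        pass

start = time.perf_counter()
precise_sleep(0.3)
elapsed = time.perf_counter() - start
print(f"slept {elapsed * 1000:.1f} ms")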

How does the function time() tell the current time, even when the computer has been powered off earlier?

How can we work with timers that deal in milliseconds (0.001 s)? How can we divide a second as we want? And how do we deal with the second itself?
http://computer.howstuffworks.com/question319.htm
In your computer (as well as other gadgets), the battery powers a chip called the Real Time Clock (RTC) chip. The RTC is essentially a quartz watch that runs all the time, whether or not the computer has power. The battery powers this clock. When the computer boots up, part of the process is to query the RTC to get the correct time and date. A little quartz clock like this might run for five to seven years off of a small battery. Then it is time to replace the battery.
Your PC will have a hardware clock, powered by a battery so that it keeps ticking even while the computer is switched off. The PC knows how fast its clock runs, so it can determine when a second goes by.
Initially, the PC doesn't know what time it is (it just starts counting from zero), so it must be told the current time - this can be set in the BIOS settings and stored in the CMOS, or it can be obtained via the Internet (e.g. by synchronizing with the clocks at NIST).
Some recap, and some more info:
1) The computer reads the Real-Time Clock during boot-up and uses that to set its internal clock.
2) From then on, the computer uses its CPU clock only - it does not normally re-read the RTC.
3) The computer's internal clock is subject to drift - due to thermal instability, power fluctuations, inaccuracies in finding an exact divisor for seconds, interrupt latency, cosmic rays, and the phase of the moon.
4) The magnitude of the clock drift could be on the order of seconds per day (tens or hundreds of seconds per month).
5) Most computers are capable of connecting to a time server (over the internet) to periodically reset their clock.
6) Using a time server can increase the accuracy to within tens of milliseconds (normally). My computer updates every 15 minutes.
Computers know the time because, like you, they have a digital watch they look at from time to time.
When you get a new computer or move to a new country you can set that watch, or your computer can ask the Internet what the time is, which helps to stop it from running slow, or fast.
As a user of the computer, you can ask the current time, or you can ask the computer to act as an alarm clock. Some computers can even turn themselves on at a particular time, to back themselves up, or wake you up with a favourite tune.
Internally, the computer is able to tell the time in milliseconds, microseconds or sometimes even nanoseconds. However, this is not entirely accurate, and two computers next to each other would have different ideas about the time in nanoseconds. But it can still be useful.
The computer can set an alarm for a few milliseconds in the future, and commonly does this so it knows when to stop thinking about your e-mail program and spend some time thinking about your web browser. Then it sets another alarm so it knows to go back to your e-mail a few milliseconds later.
As a programmer you can use this facility too; for example, you could set a time limit on a level in a game using a 'timer'. Or you could use a timer to tell when you should put the next frame of the animation on the display - perhaps 25 times a second (i.e. every 40 milliseconds).
To answer the main question, the BIOS clock has a battery on your motherboard, like Jian's answer says. That keeps time when the machine is off.
To answer what I think your second question is, you can get the second from the millisecond value by doing an integer division by 1000, like so:
second = (int) (milliseconds / 1000);
If you're asking how we're able to get the time with that accuracy, look at Esteban's answer... the quartz crystal vibrates with a certain period, say 0.00001 seconds. We just make a circuit that counts the vibrations. When we have counted 100000 vibrations, we declare that a second has passed and update the clock.
We can get any accuracy by counting the vibrations this way... any accuracy whose granularity is no finer than the period of vibration of the crystal we're using.
The motherboard has a clock that ticks. Every tick represents a unit of time.
To be more precise, the clock is usually a quartz crystal that oscillates at a given frequency; some common CPU clock frequencies are 33.33 and 40 MHz.
Absolute time is archaically measured using a 32-bit counter of seconds since 1970. This can cause the "2038 problem," where it simply overflows. Hence the 64-bit time APIs used on modern Windows and Unix platforms (including BSD-based macOS).
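The 32-bit overflow mentioned above can be illustrated with a short sketch: pack an epoch timestamp into a signed 32-bit integer and watch it run out of range one second past the limit.

```python
import struct
from datetime import datetime, timezone

# A signed 32-bit time_t overflows at 2**31 - 1 seconds after the
# 1970 epoch: 03:14:07 UTC on 19 January 2038.
limit = 2**31 - 1
print(datetime.fromtimestamp(limit, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# One second later no longer fits in a signed 32-bit integer:
try:
    struct.pack("<i", limit + 1)
except struct.error:
    print("overflow: 2**31 does not fit in a signed 32-bit int")
```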
Quite often a PC user is interested in time intervals rather than the absolute time since some profound event took place. A common computer implementation has things called timers that allow just that to happen. These timers might even run when the PC is powered down, for the purpose of polling hardware for wake-up status, switching sleep modes, or coming out of sleep. Intel's processor docs go into incredible detail about these.

Accurate Windows timer? System.Timers.Timer() is limited to 15 msec

I need an accurate timer to interface a Windows application to a piece of lab equipment.
I used System.Timers.Timer() to create a timer that ticks every 10 msec, but this clock runs slow. For example, 1000 ticks with an interval of 10 msec should take 10 wall-clock seconds, but it actually takes more like 20 wall-clock seconds (on my PC). I am guessing this is because System.Timers.Timer() is an interval timer that is reset every time it elapses. Since it will always take some time between when the timer elapses and when it is reset (to another 10 msec), the clock will run slow. This is probably fine if the interval is large (seconds or minutes) but unacceptable for very short intervals.
Is there a function on Windows that will trigger a procedure every time the system clock crosses a 10 msec (or whatever) boundary?
This is a simple console application.
Thanks
Norm
UPDATE: System.Timers.Timer() is extremely inaccurate for small intervals.
I wrote a simple program that counted 10 seconds several ways:
Interval=1, Count=10000, Run time = 160 sec, msec per interval=16
Interval=10, Count=1000, Run time = 16 sec, msec per interval=15
Interval=100, Count=100, Run time = 11 sec, msec per interval=110
Interval=1000, Count=10, Run time = 10 sec, msec per interval=1000
It seems like System.Timers.Timer() cannot tick faster than about 15 msec, regardless of the interval setting.
Note that none of these tests seemed to use any measurable CPU time, so the limit is not the CPU, just a .NET limitation (bug?).
For now I think I can live with an inaccurate timer that triggers a routine every 15 msec or so and the routine gets an accurate system time. Kinda strange, but...
I also found a shareware product ZylTimer.NET that claims to be a much more accurate .net timer (resolution of 1-2 msec). This may be what I need. If there is one product there are likely others.
Thanks again.
You need to use a high-resolution timer such as QueryPerformanceCounter.
On the surface of it, the answer is something like "use a high-resolution timer"; however, this is incorrect. The problem requires regular tick generation, and the Windows high-resolution performance counter API does not generate such a tick.
I know this is not an answer in itself, but the popular answer to this question so far is wrong enough that I feel a simple comment on it is not sufficient.
The limitation is given by the system's heartbeat. This typically defaults to 64 beats/s, which is 15.625 ms. However, there are ways to modify these system-wide settings to achieve timer resolutions down to 1 ms, or even to 0.5 ms on newer platforms:
Going for 1 ms resolution by means of the multimedia timer interface (timeBeginPeriod()):
See Obtaining and Setting Timer Resolution.
Going to 0.5 ms resolution by means of NtSetTimerResolution():
See Inside Windows NT High Resolution Timers.
You may obtain 0.5 ms resolution by means of the hidden API NtSetTimerResolution().
I've given all the details in this SO answer.
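The heartbeat described above is easy to observe without touching any Windows API: measure how long a nominal 1 ms sleep actually takes. This portable sketch (exact numbers vary by OS, timer settings, and load) typically lands near the 15.625 ms tick on stock Windows, and near 1-2 ms once the system timer has been raised with timeBeginPeriod():

```python
import time

# Measure the actual duration of a nominal 1 ms sleep, twenty times.
# The observed minimum reveals the effective timer granularity.
samples = []
for _ in range(20):
    start = time.perf_counter()
    time.sleep(0.001)
    samples.append(time.perf_counter() - start)

print(f"min {min(samples) * 1000:.2f} ms, max {max(samples) * 1000:.2f} ms")
```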
In System.Diagnostics, you can use the Stopwatch class.
Off the top of my head, I could suggest running a thread that mostly sleeps, but when it wakes, it checks a running QueryPerformanceCounter and occasionally triggers your procedure.
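That suggestion, a thread that mostly sleeps and checks a high-resolution counter, can be sketched as follows (in Python, with `time.perf_counter()` standing in for QueryPerformanceCounter). Scheduling each tick against an absolute deadline, rather than restarting an interval every time, avoids the cumulative drift the question describes:

```python
import time
import threading

def run_ticker(callback, interval, ticks):
    """Call `callback` every `interval` seconds, `ticks` times.
    Deadlines are computed from the start time, so per-tick latency
    does not accumulate into drift."""
    def loop():
        start = time.perf_counter()
        for n in range(1, ticks + 1):
            deadline = start + n * interval
            while True:
                remaining = deadline - time.perf_counter()
                if remaining <= 0:
                    break
                # Sleep halfway to the deadline; yield-spin the last 2 ms.
                time.sleep(remaining / 2 if remaining > 0.002 else 0)
            callback(n)
    t = threading.Thread(target=loop)
    t.start()
    return t

stamps = []
t = run_ticker(lambda n: stamps.append(time.perf_counter()), 0.01, 10)
t.join()
print(f"10 ticks in {stamps[-1] - stamps[0]:.3f} s")
```

The yield-spin at the end trades a little CPU for precision, the same trade-off the multimedia-timer approaches above make at the OS level.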
There's a nice write-up on MSDN: Implement a Continuously Updating, High-Resolution Time Provider for Windows.
Here's the sample source code for the article (C++).