How to get the elapsed time after using the reset of the "Elapsed Time" component? - LabVIEW

I am calculating the capacitance of a circuit using LabVIEW. I am trying to get the time it takes for the voltage across the capacitor to reach 2.5 V; the supply voltage is 5 V. I used a logic operator and wired it to the reset input of the Elapsed Time component, but I get zero because the time gets reset. I want to get the actual elapsed time.
The block diagram of the circuit:

Try this: do not wire anything to the reset input, and place a False constant on the Auto Reset input.
Block Diagram
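Once the elapsed time is read out correctly, the capacitance follows from the RC charging equation: reaching half the supply voltage (2.5 V of 5 V) takes t = RC·ln(2). A minimal sketch of that final calculation in Python, assuming a simple series RC circuit with a known resistance (the values below are illustrative, not from the question):

import math

# RC charging: V(t) = V_supply * (1 - exp(-t / (R * C))).
# At half the supply voltage (2.5 V of a 5 V supply), t = R * C * ln(2),
# so C can be computed directly from the measured elapsed time.
R = 10_000.0      # known series resistance in ohms (assumed)
t_half = 0.0069   # measured time to reach 2.5 V, in seconds (example)

C = t_half / (R * math.log(2))
print(f"Estimated capacitance: {C * 1e6:.2f} uF")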

Related

How can I get the total time in PsychoPy

I made an experiment in PsychoPy. I have Instructions, 10 different routines, and then an end message.
I am able to get the time each routine took, but I would like to have the total time of my 10 routines without having to calculate it myself in my CSV file at the end.
This is the code for recording the duration of each trial; I put this line in each routine:
thisExp.addData('trial_duration1', t)
I tried to create a total variable and add up all the trial_duration values, but the column was empty in the CSV file at the end.
Thanks!!
PsychoPy has an internal clock which starts when the experiment starts. You can read the time using core.monotonicClock.getTime(). This clock starts almost immediately as you hit "run", i.e. before the dialogue box, so it doesn't read the time since the first routine started. However, you can get that duration by first recording the time of the clock in a code component at the moment you want time zero to be defined:
time_zero = core.monotonicClock.getTime()
... and then recording the elapsed time with
thisExp.addData('cumulative_duration', core.monotonicClock.getTime() - time_zero)
Note that if you want to do this only for particular iterations (e.g. define time_zero in the first iteration of a loop and record cumulative_duration in the last), you need a condition to be satisfied:
# If this is the first iteration of the loop (no matter the name of the loop)
if currentLoop.thisN == 0:
    time_zero = core.monotonicClock.getTime()
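For completeness, here is a sketch of the matching code for the last iteration, assuming the same currentLoop object and that time_zero was set as above (nTotal is the total number of iterations; PsychoPy's TrialHandler exposes it, but check your loop type):

# If this is the last iteration of the loop, store the cumulative duration
if currentLoop.thisN == currentLoop.nTotal - 1:
    thisExp.addData('cumulative_duration',
                    core.monotonicClock.getTime() - time_zero)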

Measuring the time between two rising edges on a BeagleBone

I am reading a sensor output as a square wave (0-5 V) on an oscilloscope. Now I want to measure the frequency of one period with a BeagleBone, so I need to measure the time between two rising edges. However, I don't have any experience working with the BeagleBone. Can you give some advice or sample code about measuring the time between rising edges?
How deterministic do you need this to be? If you can tolerate some inaccuracy, you can probably do it on the main Linux OS; if you want to be fancy, this seems like a potential use case for the BBB's PRUs (which I unfortunately haven't used, so take this with a substantial amount of salt). I would expect you could write PRU code that sits in an infinite outer loop; inside that loop, it first spins until the pin reads 0, then spins until the pin reads 1 (this is the first rising edge), then counts until either the pin reads 0 again (the falling edge) or, with another loop, until the next rising edge. Either way, you can take the counter value and convert it directly into time: the PRU is specified as having a fixed cost per instruction and runs at 200 MHz (5 ns/instruction). Assuming your loop is something like
# starting with the pin low
inner loop 1:
    registerX = loadPin
    increment counter
    jump to inner loop 1 if registerX is zero
# pin is now high (the first rising edge); reset the counter here if you
# only want to count from this edge onward
inner loop 2:
    registerX = loadPin
    increment counter
    jump to inner loop 2 if registerX is one
# pin is now low again (the falling edge)
That should take 3 instructions per counter increment, so you can get the time as 3 * counter * 5 ns.
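A small sketch of that conversion in Python, assuming the 3-instruction loop and the 200 MHz PRU clock described above (the example counter value is made up):

PRU_CLOCK_HZ = 200_000_000   # 200 MHz -> 5 ns per instruction
INSTRUCTIONS_PER_TICK = 3    # load, increment, jump per counter tick

def counter_to_seconds(counter):
    # Convert the PRU loop counter into elapsed seconds.
    return counter * INSTRUCTIONS_PER_TICK / PRU_CLOCK_HZ

def counter_to_frequency(counter):
    # If the counter spans one full period, its reciprocal is the frequency.
    return 1.0 / counter_to_seconds(counter)

print(counter_to_frequency(666_667))   # ~100 Hz for a ~10 ms period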
As suggested by Foon in his answer, the PRUs are a good fit for this task (although depending on your requirements it may be fine to use the ARM processor and standard GPIO). Please note that (as far as I know) both the regular GPIOs and the PRU inputs are based on 3.3V logic, and connecting a 5V signal might fry your board! You will need an additional component or circuit to convert from 5V to 3.3V.
I've written a basic example that measures the timing between rising edges on header pin P8.15, for my own purpose of measuring an engine's RPM. If you decide to use it, you should check the timing results against a known reference; it's about right, but I haven't checked it carefully at all. It is implemented in PRU assembly and uses the pypruss Python module to simplify interfacing.

Unstable NSTimer causes fluctuations in counting

I use NSTimer to count from a certain moment.
int totalSeconds;
int totalMinutes;
int totalHours;
If totalSeconds reaches 60, totalMinutes is incremented. It's very simple and should work.
For example, I started the NSTimer together with the clock of my Mac (running in the simulator).
When I compare the timer with the clock of my Mac, the counting is perfectly synchronous for the first 10-20 seconds. After that it fluctuates or runs ahead by 5 seconds or more.
So I logged my timer output and found this:
2012-10-24 14:45:44.002 xxApp driveTime: 0:0:44
2012-10-24 14:45:45.002 xxApp driveTime: 0:0:45
2012-10-24 14:45:45.813 xxApp driveTime: 0:0:46
2012-10-24 14:45:46.002 xxApp driveTime: 0:0:47
The milliseconds are timed at 002, as you see, but in the third row it's 813. This happens very randomly and causes the fluctuations.
Is there a more stable way to count?
From the NSTimer documentation:
A timer is not a real-time mechanism; it fires only when one of the run loop modes to which the timer has been added is running and able to check if the timer’s firing time has passed. Because of the various input sources a typical run loop manages, the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds.
If your goal is to compute the total time that has passed since your program started running, this is quite easy. As soon as you want to begin keeping track of time, store -[NSDate date] into a variable. Whenever you want to compute how much time has passed, call -[NSDate date] again and do the following, assuming originalDate is a property where you stored the result of the first call to -[NSDate date]:
NSDate *presentDate = [NSDate date];
NSTimeInterval runningTime = [presentDate timeIntervalSinceDate:originalDate];
runningTime will be the total number of seconds that have elapsed since you started keeping track of time. In order to get the number of hours, minutes, seconds, and so on, an NSDateComponents object should be used.
This mechanism will allow you to use a timer to update your total running time "just about once a second" without losing accuracy.
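The idea, language aside, is "store one start timestamp, recompute the elapsed time on every tick instead of accumulating counters". A minimal sketch in Python, with time.monotonic() standing in for -[NSDate date] and divmod standing in for NSDateComponents (illustrative only):

import time

start = time.monotonic()   # store the start timestamp exactly once

def elapsed_hms():
    # Recompute H:M:S from the start time; nothing accumulates, so a late
    # timer tick cannot make the displayed time drift.
    total = int(time.monotonic() - start)
    hours, rem = divmod(total, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours}:{minutes:02d}:{seconds:02d}"

for _ in range(5):         # a timer firing "about once a second"
    print(elapsed_hms())
    time.sleep(1.0)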

Cancelling a setInterval delay while it is running in AS2

Is this possible?
I have a file in which a movie clip is launched when the user rolls over another element. To make the user experience more pleasant, this happens after a 3-second delay using setInterval. Is there a way of stopping and resetting this timer if the user rolls off the element before the 3 seconds are up?
var xTimer = setInterval(wait, 3000);
function wait(){
    show('all');
    play('all');
    clearInterval(xTimer);
}
Above is the code I have used to set the delay, and below is the code I had assumed would interrupt and reset the timer.
invisBtn.onRollOut = function(){
    rollover_mc.gotoAndStop(1);
    stop();
    clearInterval(xTimer());
    trace('off');
}
Any help on this would be massively appreciated.
First, the setInterval and clearInterval functions work with a Number variable: setInterval() returns a Number, and clearInterval() takes that Number as a parameter to remove the previously started interval. Here you are treating the interval ID as a function instead of a Number.
Thus, clearInterval(xTimer()); should in reality be clearInterval(xTimer); (without the parentheses after xTimer).
Secondly, so that you can use it in the invisBtn.onRollOut function, make sure the xTimer variable is scoped correctly: not inside a function that invisBtn.onRollOut can't see, and not on a different keyframe of the timeline (keyframe code in Flash is forgotten as soon as the playhead passes to a new keyframe on the layer holding the code).
Feel free to ask for more details if you need!
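For readers outside Flash, the same start-then-cancel pattern can be sketched in Python with threading.Timer (all names here are illustrative):

import threading

def show_and_play():
    # stands in for show('all'); play('all') in the AS2 snippet
    print("showing movie clip")

timer = None

def on_roll_over():
    global timer
    timer = threading.Timer(3.0, show_and_play)   # 3-second delay
    timer.start()

def on_roll_out():
    global timer
    if timer is not None:
        timer.cancel()    # the equivalent of clearInterval(xTimer)
        timer = None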

How can I (reasonably) precisely perform an action every N milliseconds?

I have a machine which uses an NTP client to sync up to internet time, so its system clock should be fairly accurate.
I've got an application which logs data in real time, processes it, and then passes it on. What I'd like to do now is output that data every N milliseconds, aligned with the system clock. So for example, if I wanted 20 ms intervals, my outputs ought to be something like this:
13:15:05:000
13:15:05:020
13:15:05:040
13:15:05:060
I've seen suggestions for using the Stopwatch class, but that only measures time spans, as opposed to looking for specific timestamps. The code doing this runs in its own thread, so it shouldn't be a problem if I need to make some relatively blocking calls.
Any suggestions on how to achieve this with reasonable precision (close to or better than 1 ms would be nice) would be gratefully received.
I don't know how well it plays with C++/CLI, but you probably want to look at multimedia timers. Windows isn't really real-time, but this is as close as it gets.
You can get a pretty accurate timestamp out of timeGetTime() when you reduce the timer period with timeBeginPeriod(). You'll just need some work to convert its return value to a clock time. This sample C# code shows the approach:
using System;
using System.Runtime.InteropServices;

class Program {
    static void Main(string[] args) {
        timeBeginPeriod(1);
        uint tick0 = timeGetTime();
        var startDate = DateTime.Now;
        uint tick1 = tick0;
        for (int ix = 0; ix < 20; ++ix) {
            uint tick2 = 0;
            do { // Burn 20 msec
                tick2 = timeGetTime();
            } while (tick2 - tick1 < 20);
            var currDate = startDate.Add(new TimeSpan((tick2 - tick0) * 10000));
            Console.WriteLine(currDate.ToString("HH:mm:ss:ffff"));
            tick1 = tick2;
        }
        timeEndPeriod(1);
        Console.ReadLine();
    }
    [DllImport("winmm.dll")]
    private static extern int timeBeginPeriod(int period);
    [DllImport("winmm.dll")]
    private static extern int timeEndPeriod(int period);
    [DllImport("winmm.dll")]
    private static extern uint timeGetTime();
}
On second thought, this is just measurement. To get an action performed periodically, you'll have to use timeSetEvent(). As long as you use timeBeginPeriod(), you can get the callback period pretty close to 1 msec. One nicety is that it will automatically compensate when the previous callback was late for any reason.
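The compensation described here (a late callback does not delay later ones) comes from scheduling against absolute deadlines rather than "now + period". A language-neutral sketch of that scheduling idea in Python (not the timeSetEvent() API itself):

import time

def run_periodic(callback, period_s, iterations):
    # Advance an absolute deadline by exactly one period each time, so a
    # late iteration is followed by a shorter wait that catches back up.
    deadline = time.monotonic() + period_s
    for _ in range(iterations):
        time.sleep(max(0.0, deadline - time.monotonic()))
        callback()
        deadline += period_s

run_periodic(lambda: print(time.monotonic()), period_s=0.02, iterations=5)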
Your best bet is using inline assembly and writing this chunk of code as a device driver.
That way:
You have control over instruction count
Your application will have execution priority
Ultimately you can't guarantee what you want, because the operating system has to honour requests from other processes to run, meaning that something else can always be busy at exactly the moment you want your process to be running. But you can improve matters by using timeBeginPeriod to make it more likely that your process is switched to in a timely manner, and by being cunning about how you wait between iterations - e.g. sleeping for most but not all of the time and then using a busy-loop for the remainder, as sketched below.
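A sketch of that hybrid wait in Python; the 2 ms spin margin is a tunable guess, not a measured value:

import time

def wait_until(deadline, spin_margin_s=0.002):
    # Sleep coarsely while far from the deadline, since OS wake-ups are
    # imprecise, then burn the last spin_margin_s in a busy loop.
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            return
        if remaining > spin_margin_s:
            time.sleep(remaining - spin_margin_s)
        # otherwise fall through and keep spinning until the deadline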
Try doing this in two threads. In one thread, query a high-precision timer in a loop; when you detect a timestamp that aligns to (or is reasonably close to) a 20 ms boundary, send a signal to your log output thread along with the timestamp to use. Your log output thread simply waits for a signal, then grabs the passed-in timestamp and outputs whatever is needed. Keeping the two in separate threads makes sure the log output doesn't interfere with the timing (this essentially emulates a hardware timer interrupt, which is how I would do it on an embedded platform). A sketch of the structure follows below.
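A minimal sketch of that two-thread structure in Python; the queue plays the role of the signal, and the boundary detection is simplified (illustrative only, not tuned for millisecond precision):

import queue
import threading
import time

PERIOD_S = 0.020                 # 20 ms boundaries
signals = queue.Queue()

def timer_thread():
    # Poll the clock; hand the logger a timestamp at each 20 ms boundary.
    next_boundary = (time.monotonic() // PERIOD_S + 1) * PERIOD_S
    while True:
        if time.monotonic() >= next_boundary:
            signals.put(next_boundary)
            next_boundary += PERIOD_S

def log_thread():
    # Wait for a signal, then do the (possibly slow) output work here,
    # away from the timing loop.
    while True:
        stamp = signals.get()
        print(f"log output for t = {stamp:.3f}")

threading.Thread(target=timer_thread, daemon=True).start()
threading.Thread(target=log_thread, daemon=True).start()
time.sleep(0.2)                  # let the demo run briefly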
CreateWaitableTimer/SetWaitableTimer and a high-priority thread should be accurate to about 1ms. I don't know why the millisecond field in your example output has four digits, the max value is 999 (since 1000 ms = 1 second).
Since, as you said, this doesn't have to be perfect, there are some things that can be done.
As far as I know, there is no timer that syncs to a specific wall-clock time, so you will have to compute your next target time and schedule the timer for it. If your timer only supports deltas, the delta is easily computed, but this adds more error, since you could be kicked off the CPU between the time you compute the delta and the time the timer is registered with the kernel.
As already pointed out, Windows is not a real-time OS, so you must assume that even if you schedule a timer to go off at ":0010", your code might not execute until well after that time (for example, ":0540"). As long as you properly handle those issues, things will be "ok".
20 ms is approximately the length of a time slice on Windows. There is no way to hit 1 ms timings in Windows reliably without some sort of real-time add-on like INtime. In Windows proper, I think your options are WaitForSingleObject, SleepEx, and a busy loop.