Standard Time Measurement Unit - JProfiler

In JProfiler, is there a way to force the UI to always show time measurements in one unit, e.g. milliseconds? The current behaviour is that it auto-adjusts the number to a higher or lower unit, i.e. seconds or nanoseconds, which is a bit annoying to me.
Thank you.

In the "View settings" of each view that shows times or memory sizes, you can switch the time scale from "Automatic" to a fixed unit.

Related

Rendering labels taking a long time

I have >1000 nodes, and label rendering for them takes about 35 seconds. How can I reduce this time? When I look at the Chrome developer tools, I see that Cytoscape's label projection calculation is what takes this time. Could you please help with how to reduce it, or tell me whether it is fixed in the latest versions? Displaying labels is a must; I cannot just show them in a tooltip.
There isn't much you can do about the performance. It is generally poor when the size of the graph exceeds a thousand nodes/edges.
You can use some kind of semantic zooming (i.e. revealing more detail about the graph based on the zoom level) to improve performance. In your case, this means showing or hiding labels based on the zoom level; there is an option for that, min-zoomed-font-size.
Also check the cytoscape.js performance documentation for more tuning options.
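As a rough illustration of the semantic-zoom idea above, here is a minimal stylesheet sketch, written as a Python dict that mirrors the cytoscape.js stylesheet JSON (for example in the form a wrapper such as dash-cytoscape accepts). The selector, the data(label) field and the size values are assumptions, not taken from the question.

    # Hypothetical stylesheet sketch: skip drawing any label whose effective
    # on-screen font size would fall below 8 px, so zoomed-out views avoid
    # most of the label layout/projection work.
    stylesheet = [
        {
            "selector": "node",
            "style": {
                "label": "data(label)",        # label text taken from node data
                "font-size": 12,               # size used when zoomed in
                "min-zoomed-font-size": 8,     # below this effective size, no label is drawn
            },
        }
    ]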

SUMO sim time and real time difference

I used traci.simulation.getTime to get the current sim time of SUMO.
However, this time runs faster than real time.
For example, while the sim time grows from 0 to 100, real time only grows from 0 to 20.
How can I make the SUMO simulation time match real time?
I tried --step-length 1, but this didn't work.
The --step-length property is a value in seconds describing the length of one simulation step. If you put a higher number here, vehicles have less time to react, but your simulation will probably run faster.
For the real-time issue, you might have a look at the sumo-user mailing list. I think the following mail gives a pretty good answer to your issue:
The current limit to the real time factor is the speed of your computer. If you want to slow the GUI down you can change the delay value (which is measured in milliseconds), so a value of 100 would add 100 ms to every simulation step (if your simulation is small and runs with the default step length of 1 s this means factor 10). If you want to speed it up, run without GUI or buy a faster computer ;-).
To check how close your simulation is to wall-clock time, you can check the output generated by SUMO; the value you're looking for is called the real time factor.
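If you are driving the run through TraCI anyway, another option is to pace the step loop against the wall clock yourself. Below is a minimal sketch, assuming a hypothetical config file my_scenario.sumocfg and the default 1 s step length; it can only slow the simulation down, so steps that take longer than real time to compute will still lag behind.

    import time
    import traci

    # "my_scenario.sumocfg" is a placeholder; use your own configuration file.
    traci.start(["sumo", "-c", "my_scenario.sumocfg"])

    wall_start = time.time()

    while traci.simulation.getMinExpectedNumber() > 0:
        traci.simulationStep()                          # advance the simulation by one step
        sim_time = traci.simulation.getTime()           # simulated seconds so far
        ahead = sim_time - (time.time() - wall_start)   # how far the sim is ahead of real time
        if ahead > 0:
            time.sleep(ahead)                           # wait for real time to catch up

    traci.close()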

Average time per step is constant despite a variable delay

I've made a cellular automaton (Langton's ant, FYI) in VBA. At each step there is a Sleep(delay) call, where delay is a variable. I've also added DoEvents at the end of a display function to ensure that each step is shown on screen.
With a Timer I can monitor how long one step requires on average. The result is plotted on the graph below (Y-axis: time per step in ms; X-axis: delay in ms).
Could you explain to me why it looks like that, and especially why it remains steady? In my opinion, I should get (more or less) a straight line.
I got these results without doing anything else on my computer during the whole process.
Thank you in advance for your help,
That would be because the Sleep API is based on the system clock. If the resolution of your clock is coarser than the time you are asking to sleep, the sleep will be rounded up to the next multiple of the system clock's resolution. You may be able to call timeGetDevCaps to see the minimum timer resolution of your system.
Think about it this way: you have a normal watch with only the usual hour/minute/second hands (no hand for milliseconds, etc.). You want to time half a second, but your watch only moves in 1-second intervals, so its resolution is 1 tick per second. You would not know that half a second has actually passed until the full second ticks over, so the measurement is rounded to the next tick.
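To make the timeGetDevCaps suggestion concrete, here is a small Windows-only sketch that queries the multimedia timer's supported periods. It is written in Python with ctypes rather than VBA, but a Declare-based call from VBA follows the same structure.

    import ctypes

    # TIMECAPS mirrors the Win32 structure that timeGetDevCaps fills in:
    # the finest and coarsest timer periods the system supports, in milliseconds.
    class TIMECAPS(ctypes.Structure):
        _fields_ = [("wPeriodMin", ctypes.c_uint),
                    ("wPeriodMax", ctypes.c_uint)]

    caps = TIMECAPS()
    ctypes.windll.winmm.timeGetDevCaps(ctypes.byref(caps), ctypes.sizeof(caps))
    print("Minimum timer period:", caps.wPeriodMin, "ms")
    print("Maximum timer period:", caps.wPeriodMax, "ms")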

OxyPlot: IsValidPoint on a realtime LineSeries

I've been using OxyPlot for a month now and I'm pretty happy with what it delivers. I'm getting data from an oscilloscope and, after some fast processing, plotting it in real time on a graph.
However, if I compare my application's CPU usage to that of the application provided by the oscilloscope manufacturer, mine loads the CPU a lot more. Maybe they're using some GPU-based plotter, but I think I can reduce my CPU usage with some modifications.
I'm capturing 10,000 samples per second and adding them to a LineSeries. I'm not plotting all that data; I'm decimating it to a constant number of points, say 80 points for a 20 s measurement, so I have 4 points/sec while fully zoomed out and a bit more detail if I zoom in to a specific range.
With the aid of ReSharper, I've noticed that the application (I have 6 different plots) calls the IsValidPoint method a huge number of times (something like 400,000,000), which takes a lot of time.
I think the problem is that, when I add new points to the series, it checks every point for validity instead of only the newly added values.
Also, it spends a lot of time in the MeasureText/DrawText methods.
My question is: is there a way to override those methods and adapt them to my needs? I'm adding 10,000 new values each second, but the earlier ones remain the same, so there's no need to re-validate them. Also, the text shown doesn't change.
Thank you in advance for any advice you can give me. Have a good day!

What does 'resolution' mean in embedded systems?

I'm taking an embedded systems class and I keep seeing this word used as if it were the clock period. But sometimes it behaves differently from the clock period. It's definitely not a graphics screen resolution. What's the exact definition in terms of embedded systems?
The clock resolution is the smallest unit of time the clock can measure. For example, a clock with a rate of 1 Hz has a resolution of 1 s; one with a rate of 60 Hz has a resolution of 16.6 repeating ms.
resolution = (1 second, expressed in the desired unit) / (clock rate in Hz)
1000 ms / 60 Hz = 16.6 repeating ms
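As a quick check of that arithmetic, a throwaway helper:

    # Resolution is the reciprocal of the clock rate, converted to the
    # desired unit (milliseconds here).
    def resolution_ms(clock_hz: float) -> float:
        return 1000.0 / clock_hz

    print(resolution_ms(1))    # 1000.0 ms -> a 1 Hz clock resolves 1 s
    print(resolution_ms(60))   # ~16.67 ms -> a 60 Hz clock resolves 16.6 repeating ms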
The term has a general meaning relating to granularity or minimum increment that applies to time as well as images and displays.
In the case of a clock signal, there are a number of terms relating to resolution that are probably less ambiguous, such as period, frequency, or granularity.