On Tue, Jun 21, 2022, at 4:42 PM, William Linkmeyer via Boost wrote:
This is interesting and helpful — yes!
Although it sounds like it’s more pertinent to software timers, right?
I’ve done some work with hardware RTCs, many of which drift by 4ms/day, others by a few ms/year. (For physical quartz clocks, drift is a function of temperature and crystal frequency.)
I had assumed that these clocks were more-or-less ubiquitous on computers today.
The drift you're noticing is software, all right. The perceived inaccuracy isn't in the hardware clocks employed — on the contrary. The clocks are the ~accurate reference point that allows you to observe the latency/jitter involved in scheduling threads on cores in your operating system. This should have been relatively obvious even before you asked the question: if you thought the actual clock were drifting, it would be pointless to use it to measure latencies.
Anyway, best regards and thank you for your detailed response.
WL
Seth