Hello All,

I have a function that reads data from a device and returns an array.

I want to execute this function at accurate intervals of 50ms for a specific duration, say 10 seconds.

However, when using a timer in C# I am facing data loss.

Is there any way in C# to log data accurately at intervals as low as 50ms?

What I have tried:

I have tried both System.Timers.Timer and System.Threading.Timer, and I face the same issue with each.
Posted
Updated 21-Feb-23 3:24am

You can't rely on timers for accuracy at all - Windows is not a real-time operating system, and timers are only guaranteed to tick at some point on or after the specified interval. They are not guaranteed to fire the moment the interval elapses. If there is a free core in the system, and the thread the timer is running on is otherwise idle, the tick will occur as close as possible to the interval, but that is in the lap of the gods.

Add to that that it will depend heavily on what your function needs to do, what your logging involves, and how long those two take to execute each time: if they do slow things (serial communications, for example) then they may exceed 50ms all on their own, even on an otherwise idle computer!
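If plain timers prove too jittery, one common workaround is to drive the sampling from a dedicated loop and schedule each tick against the absolute elapsed time, so individual delays do not accumulate into drift. A minimal sketch, assuming this pattern fits your device read (the `Run` signature and the empty sample delegate are illustrative, not from the question; OS scheduling jitter still applies to each individual tick):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class SampledLogger
{
    // Calls 'sample' every 'intervalMs' for 'durationMs', scheduling each
    // tick against the absolute start time so small delays don't accumulate.
    public static int Run(Action sample, int intervalMs, int durationMs)
    {
        int ticks = 0;
        var sw = Stopwatch.StartNew();
        long next = intervalMs;
        while (sw.ElapsedMilliseconds < durationMs)
        {
            sample();                     // the device read goes here
            ticks++;
            long wait = next - sw.ElapsedMilliseconds;
            if (wait > 0)
                Thread.Sleep((int)wait);  // still subject to OS scheduling jitter
            next += intervalMs;
        }
        return ticks;
    }

    static void Main()
    {
        // ~20 ticks over one second at a 50ms interval
        int n = Run(() => { /* read device */ }, 50, 1000);
        Console.WriteLine(n);
    }
}
```

Note this compensates for *drift* (the average rate stays correct) but cannot make any single tick land exactly on time - that limitation is the OS's, not the code's.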
 
Typically, something like this is done using dedicated data acquisition hardware and not by the PC itself. That hardware would obtain samples at the specified rate, buffer that data, and transfer it to the PC in larger bundles whenever the PC is available to grab the data.
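That same buffering idea can be approximated in software even without acquisition hardware: sample on one thread, queue the raw data, and let a separate consumer write it out in bulk, so slow logging (disk, UI) never delays the sampling loop. A rough sketch using `BlockingCollection<T>` (the names and parameters are illustrative, not from the answer above):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class BufferedAcquisition
{
    // Producer samples at a fixed interval into a queue; the consumer drains
    // it independently, keeping slow work off the sampling path.
    public static int Run(Func<int[]> readDevice, int intervalMs, int sampleCount)
    {
        var queue = new BlockingCollection<int[]>();
        int logged = 0;

        var producer = Task.Run(() =>
        {
            for (int i = 0; i < sampleCount; i++)
            {
                queue.Add(readDevice());   // fast: just enqueue the sample
                Thread.Sleep(intervalMs);
            }
            queue.CompleteAdding();        // signal the consumer to finish
        });

        foreach (var sample in queue.GetConsumingEnumerable())
        {
            // slow work (file I/O, formatting) happens here, off the sampling path
            logged++;
        }
        producer.Wait();
        return logged;
    }

    static void Main()
    {
        int n = Run(() => new int[] { 1, 2, 3 }, 10, 5);
        Console.WriteLine(n);  // 5
    }
}
```

This decouples the two rates: no sample is dropped because the logger was busy, although the sampling thread itself is still subject to the timing caveats described in the first answer.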
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
