I'm currently designing a system wherein my requirements are as follows:

I have a USB-UART chip. I'm developing an application which shall initially set the baud rate of the UART chip (an API is provided with the driver for that).

My problem is that in the application I have a scenario where I have to send bits of data: say, 0 for 25 ±1 ms, then 1 for another 50 ±1 ms, so that it sends data in a high and low pattern.

A consideration to make: once we set a baud rate of, say, 100 bps, it would mean 1 bit is sent in 10 ms.

In C#, is it possible to handle such timing precision? If yes, how can I design it in a better way?

Otherwise, what are the ways to do this in other languages?
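
To make that consideration concrete, here is a rough C# sketch of the arithmetic only (25 ms, 50 ms and 100 bps are the values above; the 1000 bps figure is just a hypothetical alternative I picked for comparison, not something from the driver):

using System;

class BitTimingSketch
{
    static void Main()
    {
        // 25 ms and 50 ms are the intervals from the question;
        // 100 bps is the baud rate mentioned above, 1000 bps is only for comparison.
        double[] intervalsMs = { 25.0, 50.0 };
        int[] baudRates = { 100, 1000 };

        foreach (int baud in baudRates)
        {
            double bitTimeMs = 1000.0 / baud;        // one bit occupies 1000/baud ms on the line
            Console.WriteLine("Baud {0}: one bit takes {1} ms", baud, bitTimeMs);
            foreach (double ms in intervalsMs)
            {
                double bits = ms / bitTimeMs;        // how many bit times the interval spans
                Console.WriteLine("  {0} ms interval = {1} bit times", ms, bits);
            }
        }
        // At 100 bps the 25 ms interval is 2.5 bit times (not a whole number of bits),
        // while at 1000 bps it is exactly 25 bit times.
    }
}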

.NET is nondeterministic: the GC might suspend your application to clean up at the very moment your timer should fire. So using .NET isn't good in your case.

No matter what language you use, you'll be affected by the fact that Windows isn't accurate below 50 ms, as it isn't real-time, so you'll be presented with a lot of nondeterministic jitter.
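
As a quick illustration of that jitter (a minimal sketch only; the 25 ms target is taken from your question, and Thread.Sleep stands in for whatever timer mechanism you'd actually use), measure how far a requested delay drifts from what you really get:

using System;
using System.Diagnostics;
using System.Threading;

class SleepJitterDemo
{
    static void Main()
    {
        const int targetMs = 25;              // the 25 ms interval from the question
        Stopwatch sw = new Stopwatch();

        for (int i = 0; i < 10; i++)
        {
            sw.Restart();
            Thread.Sleep(targetMs);           // ask the OS for a ~25 ms delay
            sw.Stop();

            double actualMs = sw.Elapsed.TotalMilliseconds;
            Console.WriteLine("requested {0} ms, got {1:F2} ms (error {2:F2} ms)",
                              targetMs, actualMs, actualMs - targetMs);
        }
        // With the default Windows timer resolution the error is typically several
        // milliseconds and varies from run to run - that is the jitter described above.
    }
}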
 
 
Please read Luc's article: "Timer surprises, and how to avoid them"[^].
 
 
Comments
Simon Bang Terkildsen 25-Aug-11 3:59am    
good link +5
glued-to-code 25-Aug-11 4:54am    
Thanks for this, but can you tell me if it's possible to get accurate timings in C++? If yes, how?
CPallini 25-Aug-11 4:58am    
As Simon Bang Terkildsen (and Luc, in his article) already pointed out, you cannot achieve that even using C++ (at least in user code); it is an OS limitation, as Windows is not a real-time OS.
