I am currently doing serial communication between a PC and a Mitsubishi PLC at 19,200 bps, reading 1000 I/O points from different PLC registers using the C# SerialPort class. I poll using the periodic Tick event of a Windows.Forms.Timer: each tick sends a READ command and waits some time for the PLC to return the data, then dispatches the data to the different parts of the GUI using threading. This polling methodology has turned out to be very inefficient, and the performance of the GUI, coupled as it is to the communication, is poor. Some delay is expected between the continuous, repeated READ commands sent by the PC, and some more before the full data package is returned from the PLC. Is there any other way to effectively optimize the communication performance, such as multi-threading?
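For reference, here is a minimal sketch of the polling pattern I described above. The port settings, the READ frame, and the class name are placeholders, not the actual Mitsubishi protocol:

```csharp
using System;
using System.IO.Ports;
using System.Windows.Forms;

public class PlcPoller
{
    // 7E1 settings and "COM1" are placeholders - use whatever your PLC requires.
    private readonly SerialPort _port =
        new SerialPort("COM1", 19200, Parity.Even, 7, StopBits.One);
    private readonly Timer _pollTimer = new Timer();  // Windows.Forms.Timer: Tick fires on the UI thread

    public PlcPoller()
    {
        _port.ReadTimeout = 500;            // ms to wait for the PLC before giving up
        _port.Open();
        _pollTimer.Interval = 200;          // poll period in ms (a guess; tune for your PLC)
        _pollTimer.Tick += (s, e) => Poll();
        _pollTimer.Start();
    }

    private void Poll()
    {
        _port.WriteLine("READ D0 1000");    // hypothetical READ frame, not real MC-protocol syntax
        try
        {
            string reply = _port.ReadLine(); // blocks the UI thread until data or timeout
            // ...parse 'reply' and dispatch it to the GUI here...
        }
        catch (TimeoutException)
        {
            // PLC did not answer in time - skip this cycle
        }
    }
}
```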

I'm afraid this is not really an answer, more a general sympathetic moan!


I write applications that communicate with the 'outside world' via RS232, and I often have difficulties with timing. I am sure you are aware that exact timings from the PC are virtually impossible because you cannot control the serial port directly; you are relying on Windows to do it when it has time. We use 38,400 baud and yet I find collecting even moderate amounts of data takes some time due to the build-up of delays. I have to add a delay into my code which is the minimum the hardware needs, but sometimes the PC delays for far longer and the whole thing becomes very slow.

I have implemented an additional thread, in one project, that performs data collection at 'exact' time intervals, and it works moderately well, but I'm not sure it would improve efficiency, since only one 'set' of data can be sent or received at any one time.

I did set my 'communication' thread to a higher priority than normal and that did increase the speed to some extent; have you tried that?
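For illustration, a minimal sketch of that idea in C#: a dedicated comms thread running above normal priority so Windows schedules it ahead of the UI. The class name and read loop below are illustrative, not the code from my project:

```csharp
using System;
using System.IO.Ports;
using System.Threading;

public class CommsWorker
{
    private readonly SerialPort _port;
    private volatile bool _running = true;

    public CommsWorker(SerialPort port)
    {
        _port = port;
        var worker = new Thread(ReadLoop)
        {
            IsBackground = true,                   // don't keep the process alive on exit
            Priority = ThreadPriority.AboveNormal  // ask Windows to favour this thread
        };
        worker.Start();
    }

    public void Stop() => _running = false;

    private void ReadLoop()
    {
        while (_running)
        {
            try
            {
                int b = _port.ReadByte();  // blocks until a byte arrives or ReadTimeout elapses
                // ...accumulate bytes into a frame; hand completed frames to the
                //    GUI via Control.BeginInvoke so the UI thread never blocks...
            }
            catch (TimeoutException)
            {
                // nothing arrived within ReadTimeout - just loop again
            }
        }
    }
}
```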

Best of Luck :)
 
 
Comments
CPallini 10-Jun-10 7:33am    
Cannot this 'slow' behavior be fixed using a microcontroller between the PC and the 'outside world'?
:-)
LittleYellowBird 10-Jun-10 8:25am    
My 'outside world' is a microcontroller. It receives the command/data/whatever, decides what to do, and then responds. The PC has to wait for it, whereas a microcontroller has a handy interrupt to tell me that data has come in; on the PC the program has to poll the comms. The serial port library normally has a 'timeout' built in, and if you issue too many commands to check for data the whole thing grinds to a halt, but if you don't check often enough you waste precious time.
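Incidentally, one way to reduce the polling on the PC side is the DataReceived event that the .NET SerialPort class raises on a worker thread when bytes arrive. A rough sketch (the class name and frame handling are made up):

```csharp
using System.IO.Ports;

static class EventDrivenComms
{
    public static SerialPort Open(string portName)
    {
        var port = new SerialPort(portName, 19200);
        port.DataReceived += (sender, e) =>
        {
            // Fires on a ThreadPool thread, not the UI thread.
            string chunk = port.ReadExisting();  // drain whatever has arrived so far
            // ...append 'chunk' to a buffer, parse complete frames from it,
            //    and marshal results to the GUI with Control.BeginInvoke...
        };
        port.Open();
        return port;
    }
}
```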
zhofft 10-Jun-10 21:41pm    
So, you've been using a high-priority thread to drive a Timer for the SerialPort scanning? I'm trying to do the same thing in order to separate the UI and the SerialPort scanning as much as possible, since there is an obvious delay in sending data via the Timer/Thread while the UI is updating. Actually, I'm quite interested to hear more about the 'exact' time interval you've managed to achieve using the .NET Framework, as I've found that a good portion of the communication delay comes from the poor time handling.
This is my answer to your comment on your original question (I've put it here so that you get notification that I have added a message :) )


Hmmm, I am afraid my 'exact' timing is a fudge and may not help you :-O. Here's the story...

Years ago I calculated the correct timer setting to give me the sampling rate that I needed on a project that records a signal level at a set rate, but I found the timer was always slower than I expected. I could 'tweak' the timer setting and get it just right on that PC, on that day, but on another machine, or a few days later, it would be wrong! (I guess it depends on what processes are running, etc.) The error could be 10% or 20%. My customer might well run a 2-hour experiment, which meant the experiment would either take an extra 10 or 20 minutes OR they would have 10 or 20% fewer sample points (depending on how I decided it was finished); neither situation was acceptable. My solution was to dynamically adjust my timer setting, using the Real Time Clock to check my timer. The Real Time Clock has its own crystal and is not affected by the PC's activities.

I set the timer to the calculated value, then at intervals of a few seconds I check the number of samples that I have against the number I should have, calculated from the Real Time Clock. I then tweak the timer setting until I get the correct number of samples in a given time. Obviously if the user starts a new application that uses lots of resources it takes a few seconds to adjust, but most of the time they just set the PC off doing an experiment and leave it.
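For illustration only, a rough sketch of that self-correcting scheme in C#. I've used Stopwatch as the independent reference clock (playing the role of the Real Time Clock), and the class name, correction interval, and sample handling are made up:

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;

public class CalibratedSampler
{
    private readonly Timer _sampleTimer = new Timer();            // Windows.Forms.Timer
    private readonly Stopwatch _wallClock = Stopwatch.StartNew(); // independent reference clock
    private readonly double _targetRateHz;
    private long _samplesTaken;

    public CalibratedSampler(double targetRateHz)
    {
        _targetRateHz = targetRateHz;
        _sampleTimer.Interval = (int)(1000.0 / targetRateHz);  // first guess at the period
        _sampleTimer.Tick += (s, e) => TakeSample();
        _sampleTimer.Start();
    }

    private void TakeSample()
    {
        _samplesTaken++;
        // ...read and store one sample here...

        // Every 100 samples, compare what we actually collected against what
        // the reference clock says we should have, and nudge the period.
        if (_samplesTaken % 100 == 0)
        {
            double expected = _wallClock.Elapsed.TotalSeconds * _targetRateHz;
            double ratio = _samplesTaken / expected;  // >1 means we're sampling too fast
            int corrected = (int)(_sampleTimer.Interval * ratio);
            _sampleTimer.Interval = Math.Max(1, corrected);
        }
    }
}
```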

It is sufficiently accurate for my customers' needs, as their signal does not change rapidly, and I think it is a reasonable compromise for my system. This is all done for hardware that was originally designed before the company I worked for realised that PCs could not do accurate timings (long before I joined); all our new hardware has its own timers and buffers, which solves my problem.

I can imagine that in many situations this would not be an acceptable solution, but I am happy to share it with you in case you can use it. Comms is always a headache (we've started to change over to USB and I still have timing issues with that too! :rolleyes:). Hope I've helped a little,

Good Luck with your project :)
 
 
