
User-specified DAQ interruptions, instrument control through serial communication

I'm working on an instrument control program, and I've run into a structural problem that I cannot figure out.

The instrument in question is effectively a thermostat.


The program has two functions:

1.)  Sample in the background, recording the temperature in a log over time.

2.)  Adjust the temperature according to user input.

 

The issue is that the instrument uses EIA-232 serial communication to talk to the PC.

This prohibits simultaneous access: attempting to send a command while the program is taking a sample results in serial blockage errors.

 

So the program must interrupt background sampling until the specified command has been completed.
I can't figure out how to do this.

My best idea was to create a manual pause control.  If the user wants to adjust the temperature, they hit a switch to pause the sampling, send the appropriate command, then hit the switch again to restart sampling.  This method would suffice, but it is not ideal.
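
For reference, here is a minimal text-form sketch of that pause-switch idea (Python with pyserial standing in for the LabVIEW serial VIs; the port settings and command strings are made up):

```python
import threading
import time

import serial  # pyserial

port = serial.Serial("COM3", 9600, timeout=1)  # made-up port settings
sampling_enabled = threading.Event()
sampling_enabled.set()  # sampling runs until the user "hits the switch"

def background_sampler(log):
    while True:
        sampling_enabled.wait()       # blocks here while sampling is paused
        port.write(b"READ?\r\n")      # hypothetical temperature query
        log.append(port.readline())
        time.sleep(10)                # programmable refresh delay

def user_setpoint_change(value):
    sampling_enabled.clear()                 # pause sampling
    port.write(f"SET {value}\r\n".encode())  # hypothetical setpoint command
    sampling_enabled.set()                   # resume sampling
```

Note the gap this leaves open: if a sample is already in flight when the switch is flipped, the two transactions can still collide.
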
Beyond that, I really have no idea how to prevent the two functions from running into each other.


Help structuring this program would be greatly appreciated,
Thank you

Message 1 of 3

Look into Semaphores (the VIs whose icons carry traffic-light glyphs).

 

Set up a Semaphore for the COM port and wire its reference to two parallel structures.

One structure does the background polling; the other handles setting changes.

Before each background poll, acquire (lock) the semaphore, then release (unlock) it after the poll completes; do likewise around the setting-change command.

 

When a resource is locked, the other process cannot access it until it is unlocked. Be sure to dispose of the Semaphore when ending the program.
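
In text form, the pattern looks roughly like this (Python's threading.Lock standing in for the LabVIEW Semaphore VIs; the port settings and command strings are made up):

```python
import threading
import time

import serial  # pyserial

port = serial.Serial("COM3", 9600, timeout=1)  # made-up port settings
port_lock = threading.Lock()                   # the "semaphore" guarding the port

def background_poll(log):
    while True:
        with port_lock:                  # acquire (lock) before the poll
            port.write(b"READ?\r\n")     # hypothetical temperature query
            log.append(port.readline())
        # lock is released here, so a setting change can slip in between polls
        time.sleep(10)                   # programmable refresh delay

def change_setting(value):
    with port_lock:                      # waits out any poll already in progress
        port.write(f"SET {value}\r\n".encode())  # hypothetical setpoint command
```

The `with` blocks release the lock on every path, including errors, which mirrors always releasing the Semaphore before destroying it at shutdown.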

 

-AK2DM

~~~~~~~~~~~~~~~~~~~~~~~~~~
"It’s the questions that drive us.”
~~~~~~~~~~~~~~~~~~~~~~~~~~
Message 2 of 3

This solution ensures that serial blockage errors cannot occur.
However, it gives poor response time for setting changes.

Currently, samples are separated by a programmable delay (the refresh rate).

If the refresh rate is, for example, ten seconds, then a user command can wait up to ten seconds before the sampling process cedes the resource.

Even decreasing the delay to the minimum the instrument allows probably wouldn't make much of a difference; the underlying problem would remain.
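
That said, the full refresh-period wait should only occur if the delay executes while the semaphore is still held; if the lock covers just the serial transaction, a pending command waits at most for the one sample already in flight. A minimal sketch of the two placements (Python stand-in, placeholder names):

```python
import threading
import time

port_lock = threading.Lock()
refresh_rate = 10  # seconds; the programmable delay

def take_sample():
    ...  # placeholder for the serial poll transaction

def poll_delay_inside_lock():
    # Delay held inside the lock: a pending setting change
    # can wait out the entire refresh period.
    while True:
        with port_lock:
            take_sample()
            time.sleep(refresh_rate)

def poll_delay_outside_lock():
    # Delay outside the lock: the worst a pending setting
    # change waits is the one sample already in flight.
    while True:
        with port_lock:
            take_sample()
        time.sleep(refresh_rate)
```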

 

I'll implement this solution and run a few trials to determine if the response time is sufficient.
Thank you for your help!

Message 3 of 3