04-24-2020 10:08 AM - edited 04-24-2020 10:18 AM
thanks
How can I set a delay between sending every two bytes?
Edited: you answered someone else before:
"All you have to do is put your VISA Write in a loop with a 50 ms delay, and write the message out one character or byte at a time. If your bytes are in an array, you can autoindex the array. If they are in a string, pick off one character at a time and put the remainder of the string in a shift register for the next iteration."
How about for a cluster?
thanks
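Since LabVIEW code is graphical and can't be pasted here, here is a minimal textual sketch of the same idea in Python: loop over the message one byte at a time with a pause between writes. The `write_byte` callable is a stand-in for whatever actually transmits (e.g. a wrapper around VISA Write or `serial.Serial.write`); the 50 ms default mirrors the delay suggested above. For a cluster, you would first flatten it to a string/byte stream and then feed those bytes through the same loop.

```python
import time

def send_with_delay(message: bytes, write_byte, delay_s: float = 0.05) -> None:
    """Send `message` one byte at a time, pausing `delay_s` between bytes.

    `write_byte` is whatever actually transmits a single byte
    (a hypothetical stand-in for VISA Write / serial.Serial.write).
    """
    for i in range(len(message)):
        write_byte(message[i:i + 1])      # one byte per write
        if i < len(message) - 1:          # no trailing delay after the last byte
            time.sleep(delay_s)

# Example with a dummy writer that just records what was "sent":
sent = []
send_with_delay(b"\x01\x02\x03", sent.append, delay_s=0.001)
```

This is the byte-array equivalent of autoindexing; the shift-register version from the quote above is the same loop expressed over a shrinking string.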
04-24-2020 10:22 AM - edited 04-24-2020 11:18 AM
Finally I did it :))))
Thanks
04-24-2020 10:52 AM
That seems to be in LabVIEW 2019 format. Please go through the small trouble of using "Save for Previous Version" and saving the VI as a 2018 or earlier version.
04-24-2020 11:20 AM - edited 04-24-2020 11:24 AM
Thanks RolfK
I did it successfully
Just one remaining issue: after clicking the "Abort Execution" button, I always need to close the VI and run it again because the indicators stay on, even though I have checked "Clear indicators when called". Why?
04-24-2020 12:13 PM
You should never be using the Abort Execution button unless you are debugging an application that has gotten stuck somewhere.
As for the indicators remaining on, the option is called "Clear indicators when called", not "clear indicators when ended". Aborting the VI is not a call, so nothing gets cleared until the next run starts.
05-07-2020 04:04 AM - edited 05-07-2020 04:07 AM
Hi all
I have a problem. I am reading 566-byte frames with the start frame 0xCCCC in LabVIEW. Sometimes LabVIEW shows three 0xCC bytes, i.e. 0xCCCCCC (on the data1 display). What is the problem?
I have checked my data in a terminal (with other software); there is no extra 0xCC byte, and the micro works correctly on the other side.
I have attached my VI
Thanks
Regards
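Without seeing the VI, one plausible cause (an assumption, not a diagnosis) is that the read boundary does not line up with the frame boundary: if a payload byte near the end of one frame happens to be 0xCC, it can sit right next to the 0xCC 0xCC header of the next frame and look like 0xCCCCCC in the display. A parser that resynchronizes on the header and then takes exactly 566 bytes handles this; here is a minimal Python sketch of that idea (the 566-byte length is assumed to include the 2 header bytes):

```python
FRAME_LEN = 566          # total frame length, header included (assumption)
HEADER = b"\xCC\xCC"

def extract_frames(buffer: bytearray) -> list:
    """Pull complete frames out of `buffer`, resynchronizing on HEADER.

    Complete frames are returned; leftover bytes stay in `buffer`
    so they can be combined with the next read.
    """
    frames = []
    while True:
        start = buffer.find(HEADER)
        if start < 0:
            # keep at most one trailing byte: it could be the first
            # half of a header split across two reads
            del buffer[:max(0, len(buffer) - 1)]
            return frames
        del buffer[:start]             # drop noise before the header
        if len(buffer) < FRAME_LEN:
            return frames              # incomplete frame: wait for more bytes
        frames.append(bytes(buffer[:FRAME_LEN]))
        del buffer[:FRAME_LEN]
```

With this approach a stray 0xCC payload byte touching the next header is consumed as part of its own frame, so the display never shows it glued onto the start marker.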