Instrument Control (GPIB, Serial, VISA, IVI)


ADC communication over UART to LabVIEW


Okay, got it, thanks. I attached a screenshot of the data in hex. As I said, the voltage I am reading is an AC voltage of 0.2 V, and the ADC readings are sent one byte at a time. For example, if I am reading 50 and 60, the format would be !50$ followed by !60$, marking the start and end of each number. So each value has ! for start and $ for end. I don't know exactly how it is coming in and displaying in LabVIEW, but as you said, maybe the bytes are broken up. I am comparing for the ! mark but it is not working, so I am not sure if I can actually use a literal ! to compare, or if I have to format for "!" and "$" some other way.

Message 31 of 47
There's no screenshot of the data attached. Even if there were, your description is totally confusing to me. If the '50' is a single byte in hex and you prepend a single ! to each byte, you are back to not knowing which byte is which in terms of the order for converting three separate bytes back to a floating-point value.

At this point, you would be better off sending the entire measured value as ASCII text. You should perhaps debug the communication with a terminal emulator such as PuTTY before writing your own program.
Message 32 of 47

@Dennis_Knutson wrote:
At this point, you would be better off sending the entire measured value as ASCII text. You should perhaps debug the communication with a terminal emulator such as PuTTY before writing your own program.

Since it sounds like the OP is in full control of the microcontroller code, I second this.  Changing things to be ASCII makes things A LOT simpler.  I see two ways to do this:

1. Have the microcontroller do a straight conversion to the decimal value in ASCII.

2. Change your bytes into hexadecimal ASCII characters (this would double the number of bytes being sent).

 

To make it more universal, I would go with option 1.

 

Regardless, I recommend ending the message with a Line Feed (0x0A) so that you can just use the termination character to stop the VISA Read and you know you got the whole message.
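A minimal C sketch of option 1, under the assumption that the microcontroller firmware is written in C (`format_adc_reading` and the `snprintf`-based formatting are illustrative, not the OP's actual code): the raw count is rendered as ASCII decimal digits with a trailing Line Feed, so the LabVIEW side can rely on the termination character.

```c
#include <stdio.h>

/* Illustrative sketch only: render a raw ADC count as ASCII decimal,
 * terminated with a Line Feed (0x0A). The resulting bytes would then be
 * pushed out the UART by whatever transmit routine the micro uses.
 * Returns the number of characters written (excluding the NUL). */
static int format_adc_reading(unsigned int adc_count, char *buf, size_t len)
{
    /* A count of 512 becomes the 4 bytes '5' '1' '2' '\n'. */
    return snprintf(buf, len, "%u\n", adc_count);
}
```

With this framing, a VISA Read configured with 0x0A as the termination character returns exactly one complete reading per call.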


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 33 of 47

As I said before, I have the ADC set up to take one reading at a time, so when a value comes in it is placed in a buffer from the most significant nibble to the least, and there is a ! for start of data and a $ for end of data on every value so I can separate the numbers. So when LabVIEW gets the values, some kind of conversion is happening, and I think it has to be on the LabVIEW side.

Message 34 of 47
As already said, neither LabVIEW nor VISA does any conversion, and putting the same thing around each byte does not accomplish anything. Typically, you have a start character, byte one, byte two, byte three, etc., then an end character. Start character, byte one, end character, start character, byte two, end character, etc., is just bad design and will never work. And as can be plainly seen, you aren't even sending the start and end characters that you think you are.
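To illustrate the framing described above (one start character, all the data, one end character), here is a hedged C sketch; `frame_reading` and the choice of hex ASCII for the payload are assumptions for illustration, not the poster's code:

```c
#include <stdio.h>

/* Illustrative sketch: frame one three-byte reading as a single message
 * "!XXXXXX$" -- one '!' before all the data and one '$' after it, rather
 * than a !...$ pair around every individual byte. The payload bytes are
 * sent as hex ASCII, so 0x12 0x34 0x56 becomes "!123456$". */
static int frame_reading(unsigned char b0, unsigned char b1, unsigned char b2,
                         char *buf, size_t len)
{
    return snprintf(buf, len, "!%02X%02X%02X$", b0, b1, b2);
}
```

Because the receiver sees exactly one '!' and one '$' per reading, the byte order inside the frame is unambiguous.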
Message 35 of 47

So if you don't mind, how should I have the microcontroller send the data, and how should it be formatted in the micro code, to make it easier on the LabVIEW side?

Message 36 of 47
You either do a correct write of the binary/hex values or transmit as ASCII text. You aren't sending the ! or $ as ASCII text right now, and my C programming is rusty enough that I don't understand why, so you should post that question on a different forum.
Message 37 of 47

http://www.codeproject.com/Articles/473828/Arduino-Csharp-and-Serial-Interface

This article might help you understand what is required. Though it is about Arduino and C#, the concepts are quite relevant.

Message 38 of 47

I finally got it to work. The baud rate from the datasheet was not correct, so I hooked it up to an oscilloscope and figured it out there. Now my microcontroller is sending straight ASCII values and the right data. My code works as follows: I am sending sets of 512 numbers, with a $ that indicates the beginning of each set of 512 numbers; between numbers there is a comma, and there are 8 digits between each comma. So now I need a way to find each comma and $ sign. I used Match Pattern to find the $ and the commas, and it does, but only the first $ sign and comma; I need every one. Then I need to join the numbers. I also found the spreadsheet string functions kind of useful, but I'm not sure. I have attached my code and images of the data in the buffer.

Message 39 of 47

At the end of the 512 numbers, is there a Carriage Return or Line Feed to signal the end of the data?  If not, then I would make the '$' the termination character.

 

Now for your number of bytes to read: 512 numbers × 9 bytes/number + 1 start byte = 4609 bytes. I would just set the number of bytes to read for the VISA Read to 4650 and let the termination character tell the VISA Read when to stop.

 

And finally, your parsing: use Spreadsheet String To Array.
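For readers curious what that parsing step amounts to, here is a rough text-language equivalent in C (the frame layout is taken from the posts above; `parse_frame` itself is illustrative): skip the leading '$', split on the commas, and convert each 8-digit field to an integer.

```c
#include <stdlib.h>
#include <string.h>

/* Illustrative sketch: parse a frame like "$00000050,00000060,00000070"
 * into integers, similar to what Spreadsheet String To Array does in
 * LabVIEW with ',' as the delimiter. Note that strtok modifies its
 * input, so msg must be a writable buffer starting with the '$' marker.
 * Returns the number of values parsed, at most max. */
static size_t parse_frame(char *msg, long *out, size_t max)
{
    size_t n = 0;
    char *tok = strtok(msg + 1, ",");   /* msg[0] is the '$' start marker */
    while (tok != NULL && n < max)
    {
        out[n++] = strtol(tok, NULL, 10);
        tok = strtok(NULL, ",");
    }
    return n;
}
```

In the real application, max would be 512 to match one full set of numbers.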

 


Message 40 of 47