
NI9401 TTL Serial Communication data packet reliability issue

Solved!

Device used:

NI-9401 high speed digital module

NI-cDAQ-9174 USB chassis

Software:

LabVIEW 2020

 

Hello,

This is my first attempt at writing a LabVIEW program to interface with a third-party device that talks serial 5 V TTL at a baud rate of 4800. The device (an inclinometer) works fine when connected to a PC or an Arduino, and its protocol is well defined (attached as an image).

However, when I wrote a LabVIEW program and connected the device to the NI-9401 module in a cDAQ-9174 chassis, the data packets could be received but had reliability issues. My program is able to grab a packet, and the first two bytes seem very reliable (the data header is defined as 0x55 + 0x51), but reliability drops further into the 11-byte packet; the last (11th) byte is a checksum, and it never matches the actual sum calculated by the program (indicated in the screenshot).

I have talked to the support team of the serial device and I am quite certain the checksum was calculated correctly.

I suspect there may be a clock problem: even though I set the channel clock rate to 4800, there is a slight shift in the later bytes. I do not know how to resolve this issue.

The VI (written by a true novice) is attached. It parses the data packet by taking a large sample, using a "search" function to find the header within the waveform chunk, and then parsing the trimmed array into bytes so the protocol can be applied.
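In pseudo-code (Python used here only as a sketch; the actual implementation is the attached VI), the search-and-check idea looks roughly like this. The checksum rule used, the low byte of the sum of the first ten bytes, is an assumption based on common inclinometer protocols, so check it against the attached protocol sheet.

```python
# Rough sketch of what the VI does, operating on already-decoded bytes.
# Assumption: checksum = low byte of the sum of the first ten bytes.

HEADER = (0x55, 0x51)
PACKET_LEN = 11

def find_packets(data):
    """Return all 11-byte packets whose checksum matches."""
    packets = []
    i = 0
    while i <= len(data) - PACKET_LEN:
        if (data[i], data[i + 1]) == HEADER:
            packet = data[i:i + PACKET_LEN]
            if sum(packet[:-1]) & 0xFF == packet[-1]:  # checksum over bytes 0..9
                packets.append(packet)
                i += PACKET_LEN                        # skip past the valid packet
                continue
        i += 1                                         # keep scanning byte by byte
    return packets
```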

Thank you for reading this.

 

JT

Message 1 of 9

Hi Jie,

 

Have you tried using a USB-to-serial converter with a TTL-compatible output?

It might be easier than using the NI-9401 with DAQmx…

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 9

Hi GerdW,

Thank you very much for your reply,

I tried a serial-to-USB module and an Arduino; both worked fine.

Meanwhile, the 9401 is still my preferred way because it will enable me to talk to more than one device with only one USB port connected to my PC. The chassis also contains other modules (NI-9219s) with some other transducers, so I (with all the naivete) considered it to be a better-integrated system.

Message 3 of 9

Baud rate and bit rate aren't really the same thing (although they can be). That might be where you are having issues. Baud rate is really the number of symbols per second. Read up on baud rate vs. bit rate; it's interesting reading, and it will probably make you choose a more "traditional" method of serial communication.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 4 of 9

@Jie_T wrote:

Meanwhile, the 9401 is still my preferred way because it will enable me to talk to more than one device with only one USB port connected to my PC


Then get a good USB hub. I like the industrial-grade hubs from StarTech, such as the ST4200USBM. I also encourage using those with a separate power supply.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 9

Hi BillKo,

I checked the clock rate. I'm certain that baud rate = bit rate in my case, because if I use a clock rate of 19200 (4×), the header 0x55 (0b01010101) shows up as 0b00001111000011110000111100001111.
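As a quick sanity check of that pattern (Python used here only to illustrate; on the wire a UART sends the bits LSB first, but the 4× repetition is the point):

```python
bits = [int(b) for b in f"{0x55:08b}"]             # 0b01010101 as written above
oversampled = [b for b in bits for _ in range(4)]  # each bit captured 4 times at 19200 S/s
print("".join(map(str, oversampled)))              # 00001111000011110000111100001111
```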

Thank you very much for replying, it was an interesting read indeed.

 

J

Message 6 of 9

@crossrulz wrote:

@Jie_T wrote:

because it will enable me to talk to more than one device with only one USB port connected to my PC


Then get a good USB hub. 


This is the correct answer.

 

Don't reinvent the wheel!

========================
=== Engineer Ambiguously ===
========================
Message 7 of 9
Solution
Accepted by topic author Jie_T

@Jie_T wrote:

 

I checked the clock rate. I'm certain that baud rate = bit rate in my case, because if I use a clock rate of 19200 (4×), the header 0x55 (0b01010101) shows up as 0b00001111000011110000111100001111.

Thank you very much for replying, it was an interesting read indeed.


You may think so, but that doesn't mean it is so. While the bit interval within a single "character" (usually 8 bits nowadays) does indeed pretty much follow the serial baud rate, that timing can wander off over the course of multiple "characters". That is also why there is at least a start bit, and often one or two stop bits, in serial communication. The start bit allows the serial port receiver to synchronize to the sender's clock rate before every transmitted "character". If your digital receiver simply treats the incoming stream as an equally spaced bit stream over multiple characters, it is not surprising that the interpretation of your data stream slowly drifts across the actual bits.
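To put rough numbers on that (assuming 8N1 framing, so 10 bit times per byte, and, purely as an example, a 1% clock mismatch between sender and receiver):

```python
baud_error = 0.01      # hypothetical 1% mismatch between the two clocks
bits_per_byte = 10     # start bit + 8 data bits + stop bit (8N1 framing)
packet_bytes = 11

drift = baud_error * bits_per_byte * packet_bytes
print(drift)           # 1.1 bit times of accumulated drift by the end of the packet
```

So the first couple of bytes still decode cleanly while the later ones, including the checksum, land on the wrong bit boundaries if the receiver never re-synchronizes.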

 

And you already discovered by accident a way to adjust for that in software, if the hardware can't synchronize to the serial bit stream itself, which every UART chip implementation nowadays does automatically.

It is called oversampling! Read the data at 19200 samples per second (4× oversampling, but it could be even more depending on the accuracy you require) and analyze the incoming bit stream accordingly, adjusting the bit synchronization by one or at most two sample intervals every time you detect a logic change. That way you can recover the digital data stream even if the "characters" slowly drift away, because of small delays in generating the individual "character" sequences at the sender as well as different timing bases on receiver and sender.

4800 baud is seldom 4800.000 bits per second: depending on the hardware resources available on an embedded device, the actual clock generation could simply be based on an RC oscillator with as much as 20 to 30% tolerance! A crystal and the necessary clock driver for it can easily add 50 cents or more to the hardware cost of an embedded device, and that is a lot of money when you consider that some fairly complex embedded devices are sold in bulk for not much more than that. Crystals are also fairly sensitive to shock and other physical stress such as vibration, so there may be a practical reason to avoid them in certain designs, especially if the accuracy of a crystal is not really needed.
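A minimal sketch of that idea (Python only to show the logic; it assumes samples is the 0/1 array read from the 9401 at 4× the baud rate, an idle-high line, and 8N1 framing, and it re-synchronizes on every start-bit falling edge, which at 4× oversampling is usually enough):

```python
def decode_uart(samples, oversample=4):
    """Decode an oversampled, idle-high TTL capture assuming 8N1 framing."""
    decoded = []
    i = 0
    while i < len(samples) - 10 * oversample:
        # A falling edge marks a start bit and re-synchronizes us to the sender's clock.
        if not (samples[i] == 1 and samples[i + 1] == 0):
            i += 1
            continue
        start = i + 1                                  # first sample of the start bit
        byte = 0
        for bit in range(8):
            # Sample each data bit near its center, one bit time per bit, LSB first.
            center = start + (bit + 1) * oversample + oversample // 2
            byte |= (samples[center] & 1) << bit
        stop_center = start + 9 * oversample + oversample // 2
        if samples[stop_center] == 1:                  # valid stop bit -> accept the byte
            decoded.append(byte)
        i = start + 10 * oversample                    # jump past the stop bit
    return decoded
```

The decoded byte list can then be fed into the header search and checksum check from the first post.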

Rolf Kalbermatter
My Blog
Message 8 of 9

Hi Rolf,

Yes! This explains the significant drift within the 11-byte packet perfectly. I think there should be a proper oversampling multiplier I can use to make the received data reasonably reliable, a typical engineering trade-off problem.

Thank you so much for your answer, huge respect.

 

J

Message 9 of 9