I noticed that DTbl Append Digital Samples.vi gives incorrect results when appending a one-row or empty set of compressed digital data to an existing set whose last row matches. This does not occur when both sets of digital data are uncompressed. See Digital Data Append Flaw.vi in the attachment.
This issue seems to manifest as follows:
If the second set is empty and the last two transitions of the first set have the same value, the data's endpoint is lost.
If the second set has one row that matches the last row of the first set, the endpoint of the combined set is lost.
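To make the two failure modes concrete, here is a minimal Python model of the behavior I'd expect (my own sketch, not the LabVIEW implementation): compressed digital data is treated as a list of `(sample_index, row_value)` entries, where the final entry doubles as the data's endpoint. A correct append can drop interior rows that repeat their predecessor, but it must always keep the last entry, even when its value matches the row before it; the function name `append_compressed` is hypothetical.

```python
def append_compressed(first, second):
    """Append two compressed sets of (sample_index, row_value) entries.

    Interior rows equal to their predecessor are redundant and removed,
    but the final row is always kept: it marks the data's endpoint even
    when its value matches the previous row (the two cases above).
    """
    if not first:
        return list(second)
    if not second:
        offset = 0  # nothing to shift
        combined = list(first)
    else:
        # The second set's indices are assumed to start at 0 and are
        # shifted to follow the first set's endpoint.
        offset = first[-1][0] + 1
        combined = list(first) + [(i + offset, row) for i, row in second]
    result = [combined[0]]
    for k in range(1, len(combined)):
        idx, row = combined[k]
        is_last = (k == len(combined) - 1)
        if row != result[-1][1] or is_last:
            result.append((idx, row))
    return result
```

Under this model, appending an empty set leaves the first set (endpoint included) unchanged, and appending a single matching row extends the endpoint rather than losing it.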
My application only needs to track individual Boolean transitions (and I was hitting crashes when running 72 individual digital waveforms in parallel with other code), so I started writing my own scaled-down version. I found this issue while verifying that my code gives the same output as the built-in waveform functions. I've attached some of that code (My Version of Digital Data Append.vi in the attachment); it could easily be converted to handle 2-D arrays of U8s instead of 1-D arrays of Booleans. Note that I am not handling subsets, so I don't subtract an offset for the case where the first transition of the second array has an index other than 0. I also don't have any code to deal with uncompressed data, as I only care about transitions.
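The transition-tracking logic I described is roughly the following (a Python paraphrase of my VI, not the VI itself; `append_transitions` and the `second_offset` parameter are names I'm making up here). Each record is a list of `(sample_index, bool)` transitions, and a leading entry of the second record that carries the same value as the end of the first record is not a real transition in the merged record, so it is dropped:

```python
def append_transitions(first, second, second_offset):
    """Merge two lists of (sample_index, bool) Boolean transitions.

    `second_offset` is the absolute sample index at which the second
    record begins. In a valid transition list values alternate, so only
    the second record's first entry can match the first record's last
    value; such an entry is not a transition and is skipped.
    """
    merged = list(first)
    for idx, value in second:
        if merged and value == merged[-1][1]:
            continue  # no change in value: not a transition
        merged.append((idx + second_offset, value))
    return merged
```

Since I only keep transitions, the endpoint question from the bug above doesn't arise in quite the same way here, but the boundary comparison is where the built-in VI and my code disagreed.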
I forgot to mention: the code is saved in LabVIEW 2015. I had a coworker try it in LabVIEW 2017, with the same results.
Thanks for bringing this issue to our attention. I was able to replicate the behavior in a separate program and am reviewing it with R&D to see if it requires a corrective action report. I will keep you updated on the status!
Hey oort, I wanted to let you know that I've filed a Corrective Action Report for this issue. Thanks again for bringing it to our attention and catching this bug! If you'd like to check in and see if the bug has been fixed in our newest releases of software, you can keep an eye on this page and look for the ID number 681163: