10-03-2020 12:32 AM - edited 10-03-2020 12:52 AM
Hello Community,
I would like to simulate a signal based on data points in Excel.
I have an Excel sheet, attached, whose first column is time and whose second column is the data.
My goal is to simulate a real-time signal from this data.
Attached is the LabVIEW code I made, but I'm really confused about how it works.
As shown in the figure below, amplitudes of 2 and 4 appear in my signal, but those values are not even in my Excel sheet.
I'm not sure I'm plotting the correct data, and I would like to know where these values come from.
Also, how do I change the timing of the loop so it matches the time values in my first column?
Finally, after I rerun the program, how do I make the plot restart at 0 seconds? Thank you!
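(For readers outside LabVIEW: the replay/timing idea being asked about can be sketched in a text language. This is a minimal Python illustration, not the poster's VI; the comma-delimited "time,value" layout and function names are assumptions.)

```python
import csv
import io
import time

def load_samples(text):
    """Parse comma-delimited 'time,value' lines into (float, float) pairs."""
    return [(float(t), float(v)) for t, v in csv.reader(io.StringIO(text))]

def replay(samples, emit=print):
    """Emit each sample when its timestamp (first column) comes due.

    Time is measured from the moment replay() starts, so every rerun
    restarts at 0 seconds, and the loop delay follows the time column
    instead of a fixed loop rate.
    """
    start = time.monotonic()
    for t, v in samples:
        delay = t - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        emit(t, v)
```

In LabVIEW terms, the equivalent is waiting inside the While Loop for the difference between the next timestamp and the elapsed time since the run began, rather than using a constant Wait (ms) value.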
10-03-2020 11:39 AM - edited 10-03-2020 11:48 AM
Hi jack,
@jackhsu66 wrote:
The following attached is the LabVIEW code that I made, but I'm really confused about how it works.
You are confusing the term "spreadsheet file" with the proprietary Excel file format, as many others have before!
The Read Delimited Spreadsheet function reads only text files containing delimited values, such as CSV files. Your XLSM file can only be read by Excel, or by using the Report Generation Toolkit (RGT) functions…
I exported the worksheet to a CSV file, using the semicolon as the delimiter between values, timestamps formatted as HH:MM:SS, and, unfortunately, the German decimal separator for the numeric values:
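For reference, a file exported this way can be parsed outside LabVIEW as well. A minimal Python sketch, assuming exactly the format described above (semicolon delimiter, HH:MM:SS timestamps, decimal comma); the function names are illustrative:

```python
import csv
import io

def parse_row(timestamp, value):
    """Convert 'HH:MM:SS' to seconds and a German-format
    decimal string such as '2,5' to a float."""
    h, m, s = (int(p) for p in timestamp.split(":"))
    return h * 3600 + m * 60 + s, float(value.replace(",", "."))

def load_csv(text):
    # Semicolon-delimited rows, as produced by the export above.
    return [parse_row(t, v)
            for t, v in csv.reader(io.StringIO(text), delimiter=";")]
```

In LabVIEW, the same two conversions (time string to seconds, comma to dot before Fract/Exp String To Number) are what make such an export readable.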