08-06-2009 08:37 AM
Whatever you are trying to do, I can tell you with my eyes closed that you are either doing it wrongly (or unnecessarily) or doing it the hard way.
08-06-2009 08:47 AM
Muks,
I'm writing a program which will read data (baud rate of 9600) via the serial port. As well as plotting a couple of graphs, I want to be able to store the data as a spreadsheet. The data will be running for a few months. How do I need to change my program (attached) so that it can perform this task undisturbed without losing data?
Regards,
Sam
08-06-2009 09:03 AM
How often are you collecting data: once per second, once per minute? How many bytes per reading? That information will tell you how fast the file will grow. The file itself can grow as large as the file system can handle. Note, though, that if you try to import that text file into Excel, the older versions of Excel are limited to 65,536 rows of data; Excel will raise an error and not import anything after that. The newest version can handle just over a million rows.
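To make that concrete, here is a rough back-of-the-envelope estimate in Python. The 20 bytes per reading and the 1 Hz rate are assumptions (you haven't told us yours yet), but they show the arithmetic:

```python
# File-growth estimate -- the record size and rate below are assumptions,
# since the original post doesn't state them.
bytes_per_reading = 20      # e.g. a timestamp plus one value, as text
readings_per_second = 1     # assumed 1 Hz logging rate

per_day = bytes_per_reading * readings_per_second * 60 * 60 * 24
per_month = per_day * 30

print(f"~{per_day / 1e6:.1f} MB per day, ~{per_month / 1e6:.0f} MB per month")
# -> ~1.7 MB per day, ~52 MB per month
```

At that rate the file size is no problem for the disk, but one row per second blows past the old Excel row limit in well under a day.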
If you are running this for months, you don't want to wait until the loop is done to save the data. Imagine that two months into your test the PC crashes: you wouldn't have any data, since it was only stored in RAM and had never been saved to a file. You especially don't want to grow a very large array to hold the data. (Your screenshot still shows you are only sending out the very last point.) Move the Write to Spreadsheet File into the loop and turn on the Append to File option.
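LabVIEW is graphical, so I can't paste a block diagram here, but the pattern is the same as this Python/pyserial sketch (the port name "COM1" and the file name are placeholders): open the file in append mode and write every reading the moment it arrives, so a crash costs you at most one point.

```python
import serial  # pyserial; the port name below is a placeholder

# Same idea as moving Write to Spreadsheet File inside the loop:
# write each reading as it arrives instead of growing an array in memory.
port = serial.Serial("COM1", baudrate=9600, timeout=1.0)

with open("readings.csv", "a") as f:      # "a" = append, like Append to File
    while True:                           # stop with Ctrl+C in this sketch
        line = port.readline().decode("ascii", errors="replace").strip()
        if not line:
            continue                      # timeout with no data; try again
        f.write(line + "\n")              # data hits the file every iteration
        f.flush()                         # so a crash loses at most one reading
```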
If you are doing a lot of file writing (hundreds to thousands of points a second), you would want to use a producer/consumer architecture to move the data out of the acquisition loop and into a file management loop. There you could build small arrays of data and periodically flush them out to the file. That is more efficient and saves wear and tear on the hard drive.
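In LabVIEW that would be two parallel loops connected by a queue. For anyone following along in a text language, here is a minimal sketch of the same producer/consumer idea in Python with threads; the port name, file name, and batch size of 100 are all assumptions:

```python
import queue
import threading
import serial  # pyserial; "COM1" below is a placeholder

data_q = queue.Queue()

def producer():
    """Acquisition loop: read from the serial port and hand off immediately."""
    port = serial.Serial("COM1", baudrate=9600, timeout=1.0)
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if line:
            data_q.put(line)

def consumer(batch_size=100):
    """File-management loop: collect a small batch, then flush it in one write."""
    batch = []
    with open("readings.csv", "a") as f:
        while True:
            batch.append(data_q.get())
            if len(batch) >= batch_size:
                f.write("\n".join(batch) + "\n")
                f.flush()                 # one disk hit per batch, not per point
                batch.clear()

threading.Thread(target=producer, daemon=True).start()
consumer()
```

The acquisition loop never waits on the disk, and the consumer only touches the disk once per batch, which is where the efficiency and the reduced wear and tear come from. A real version would also flush the batch on a timer so slow data doesn't sit in memory too long.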