But I would suggest you to go for the .csv format and then generate the report using the report generation toolkit. Whatever you are doing with the Excel file you can do the same in .csv also and then converting to report should be good. I tried to give you explanation since you wanted to understand.
Yes, I understood. Thank you so much for your help and suggestions.
Please see the attached document.
These are two different sample documents for my report.
Here, I want to read particular column values based on the requirement (in offline mode).
As we know, CSV is a comma-delimited format. If any comma is present in the file name or path name, the array positions shift.
I think we have to do some more string manipulation to get the correct data.
Do you have any idea?
It really depends on how you write the data to the .csv file. Why do you have that many commas in your data? Can you show the code with which you are generating the data?
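To illustrate the problem outside LabVIEW: the array positions shift only when a comma-containing field is written and split back naively. A standards-compliant CSV writer quotes any field that contains the delimiter, so the field survives a round trip intact. A small Python sketch (the path and values below are hypothetical examples, not the poster's actual data):

```python
import csv
import io

# A row where one field (a hypothetical file path) contains commas.
row = ["2024-01-15", "C:\\Data\\run,1,final.txt", "PASS"]

# Naive approach: joining with commas and splitting back shifts positions.
naive = ",".join(row)
print(naive.split(","))          # 5 fields instead of the original 3

# A proper CSV writer quotes fields that contain the delimiter.
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(buf.getvalue().strip())    # the path field is wrapped in quotes

# Reading back with a CSV parser restores the original 3 fields.
buf.seek(0)
print(next(csv.reader(buf)))
```

LabVIEW's Array To Spreadsheet String does not add this quoting, which is why embedded commas break the columns there.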
Please find the attached VI (V1.3).
The input file could be anything (with many commas), but it should still read correctly.
Please see the Version 1.4.
Maybe this is one possible way.
Based on this, let me know which one is better (CSV/RGT).
The data should be written in a proper format so that we can manipulate it. If you need commas for other uses, then you can use a tab as the delimiter instead. So replace the comma with a Tab constant in Array To Spreadsheet String, and you can use .txt itself as the file extension (just to avoid confusion).
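The tab-delimiter idea above can be sketched in Python for clarity (the rows below are hypothetical sample data): with a tab as the delimiter, commas inside fields are just ordinary characters and need no special handling.

```python
import csv
import io

# Hypothetical report rows whose fields contain commas.
rows = [["File Name", "Status"],
        ["C:\\Reports\\jan,feb.txt", "OK"]]

# Write tab-delimited (the equivalent of wiring a Tab constant instead
# of a comma into Array To Spreadsheet String in LabVIEW).
buf = io.StringIO()
csv.writer(buf, delimiter="\t").writerows(rows)

# Read back with the same delimiter: every field stays intact,
# including the one with embedded commas.
buf.seek(0)
recovered = list(csv.reader(buf, delimiter="\t"))
print(recovered)
```

The trade-off, as noted later in the thread, is that a tab-delimited file opened as .csv will not line up in columns the way a comma-delimited one does.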
Could you please give me a sample VI?
With a tab delimiter, if the data is in the 1st column, how will we get the particular sub-array data from the file?
I can read it, but when we view the data in CSV the format is a bit different (because it's tab-delimited).
I also want the CSV file in the same format as my 1st sample report data.
As I understand your problem, most of the time you are not writing to Excel. If you are "occasionally" adding data to an existing Excel file, simply treat it as though you are dealing with an ordinary text or binary file. You would create a little sub-VI, "Append data to file", and pass in the data (and maybe the file name). Inside the VI, you would open the (existing) file, position the file at its end, add the new data, then close the file and exit the sub-VI. Result -- outside the sub-VI, the file exists, but is closed so its integrity is guaranteed.
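The "Append data to file" sub-VI pattern described above can be sketched in a few lines of Python (the file name and data here are hypothetical, purely for illustration): open the existing file, position at its end, write the new data, and close it before returning.

```python
# Minimal sketch of the Open / seek-to-end / Append / Close pattern.
def append_data_to_file(path, line):
    # Mode "a" opens the existing file (or creates it) with the write
    # position at the end of the file.
    with open(path, "a", encoding="utf-8") as f:
        f.write(line + "\n")
    # Leaving the "with" block closes the file, so outside this function
    # the file exists but is closed and its integrity is guaranteed.

append_data_to_file("log.txt", "12:00:01\t23.4")
append_data_to_file("log.txt", "12:01:01\t23.6")
```

Each call is self-contained, so the file is only open for the brief moment of the write, exactly as in the sub-VI description.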
How to do this in Excel? Simple, use the Report Generation Toolkit. The New Report function takes a parameter that NI calls a "Template", but it really is just the name of an existing Excel file. To go to the end of the file, or the last row of the Spreadsheet, use the Get Last Row function found on the Excel Specific, Excel General palette. Once you have added the new data, use Save Report to File using the same name that you used to open the Report. Note -- do NOT manually close Excel, only close it programmatically. There's a bug that I reported to NI whereby closing a Workbook "by hand" and then closing it using Close File will result in the file being deleted!
So assuming that you are updating the Workbook at, say, once a minute, you should be able to do this Open/Append/Close sequence in a second or less, meaning most of your program's time can be devoted to other things, and most of the time your file, being closed, is "safe".
Hope this solves your problem.