11-17-2023 08:16 AM
Hello,
I need to read a .txt file (log.txt), extract the data from each line, and put it in a CSV file. The parser output example.csv is what I want to end up with.
I succeeded in reading the file, but how do I extract the data from each line?
Thanks in advance
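(For readers following along: the actual solution in this thread is a LabVIEW VI, which can't be shown as text. As a rough sketch of the parsing logic only, here is the equivalent in Python. The field separator is an assumption, since the format of log.txt is not shown in the thread; the thread later notes the example output is semicolon-delimited, so that is used for the CSV.)

```python
import csv

def parse_log(log_path, csv_path):
    """Read a plain-text log line by line, split each line into fields,
    and write the fields as one CSV row per line.

    Assumption: fields in log.txt are whitespace-separated; adjust the
    split() call if the real log uses a different delimiter.
    """
    with open(log_path) as log, open(csv_path, "w", newline="") as out:
        # The example.csv in the thread appears semicolon-delimited.
        writer = csv.writer(out, delimiter=";")
        for line in log:
            fields = line.split()
            if fields:  # skip blank lines
                writer.writerow(fields)
```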
Solved! Go to Solution.
11-17-2023 09:12 AM
11-17-2023 09:26 AM - edited 11-17-2023 09:31 AM
We are missing subVIs. Is the included CSV file exactly what you want as desired output for the given input file? Looks like "semicolon delimited", not "comma delimited" as csv typically would indicate. Do you really need all these blank fields?
Your code contains completely silly Rube Goldberg constructs such as the following:
(If you are autoindexing, you don't need to get the size and wire it to N! Building an array with one element, then getting that element, is a NOOP!)
11-18-2023 12:28 AM
@altenbach wrote:
(If you are autoindexing, you don't need to get the size and wire it to N! Building an array with one element, then getting that element, is a NOOP!)
It may be a kind of "orphan" refactoring: probably some iterations ago there was completely different code here, which was removed. I don't believe it was created "from scratch" in this form. Sometimes I find similar funny artifacts in my own old code, even after 20+ years of experience. Just rough refactoring, that's all.
11-18-2023 06:13 AM
@Andrey_Dmitriev wrote:
@altenbach wrote:
(If you are autoindexing, you don't need to get the size and wire it to N! Building an array with one element, then getting that element, is a NOOP!)
It may be a kind of "orphan" refactoring: probably some iterations ago there was completely different code here, which was removed. I don't believe it was created "from scratch" in this form. Sometimes I find similar funny artifacts in my own old code, even after 20+ years of experience. Just rough refactoring, that's all.
From what I saw in the original code, this was an artifact of trying anything to get something to work, instead of actually figuring out what each item does and constructing the solution from that.
11-18-2023 09:45 AM - edited 11-18-2023 10:04 AM
@crossrulz wrote:
From what I saw in the original code, this was an artifact of trying anything to get something to work, instead of actually figuring out what each item does and constructing the solution from that.
Yes, following the adage that even a blind chicken finds a grain of corn once in a while, we can just randomly throw more and more code at it and as long as there are no broken wires there is a finite chance that it will run and do "something". Wash. Rinse. Repeat.
The same applies to the file reading (it is not clear on which planet "open or create" would make sense for reading an existing file. 😄 Fortunately, the file dialog is configured to pick an existing file, so this is moot.)
(... and no, I would not do it this way either... )
11-20-2023 03:54 AM
Hi @raphschru, hi @all
thank you for your answers.
When I posted my first question I had a lot of things to do and didn't have time, so I modified an old VI, hence some of the duplicates that people have noticed.
I thought about my problem and I noticed that my log.txt file had this structure (see Archetecture_of_file_log.txt).
So I thought of making a cluster.
My last, probably simple question is:
How do I save Array data in the output.csv file?
Thanks in advance
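(Side note for readers: in LabVIEW the usual answer to "save an array to a .csv file" is the Write Delimited Spreadsheet VI, or Array To Spreadsheet String followed by Write to Text File. As a text-language sketch of the same step, here is the Python equivalent; the `rows` data and `output.csv` name are placeholders, not taken from the thread.)

```python
import csv

# Placeholder 2D data standing in for the LabVIEW "Array" indicator.
rows = [[1, 2, 3], [4, 5, 6]]

# Write each inner list as one semicolon-delimited line of output.csv,
# matching the delimiter used by the example file in the thread.
with open("output.csv", "w", newline="") as f:
    csv.writer(f, delimiter=";").writerows(rows)
```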
11-20-2023 03:59 AM
Hi @Andrey_Dmitriev,
It was exactly that!
11-20-2023 10:31 AM
Sorry, I don't have your toolkit. Can you run your VI until the "Array" indicator contains data, then select that indicator and choose "Edit > Make Selected Values Default"? Now copy it to a new VI and attach that here.
I doubt that the current example output is really what you want (here's how a small part looks in notepad++):
11-20-2023 10:49 AM - edited 11-20-2023 12:15 PM
Here's code that would blindly graph all data as %d, then just write it as 1D array using a linefeed as delimiter.
You need to figure out how the date/time is encoded in some of these integers and translate it into a proper time format (not shown). (Yes, there are more efficient ways, but start with that.)
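(Again for readers following in text form: the two steps described above can be sketched in Python. The sample values and the assumption that the timestamp integer is Unix epoch seconds are hypothetical; the thread does not reveal how the date/time is actually encoded, which is exactly the part left as an exercise.)

```python
from datetime import datetime, timezone

# Placeholder integers standing in for the parsed log data.
values = [1700000000, 42, 7]

# Step 1: write the data as a 1D array using a linefeed as delimiter,
# formatting every value as %d.
with open("out.txt", "w") as f:
    f.write("\n".join("%d" % v for v in values))

# Step 2 (hypothetical): if one of the integers turned out to be a Unix
# epoch-seconds timestamp, it could be decoded like this.
ts = datetime.fromtimestamp(values[0], tz=timezone.utc)
```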