11-27-2015 02:49 AM
Hello all,
We have a tester that produces reports of UUTs in txt files (file size is about 6 KB-8 KB).
We are saving the reports to the same folder (each UUT has a report). The tester can produce a lot of files in 24 h (5000 files in one day). We have the ability to save all UUT reports into one file. After that we need to parse the report and insert the results into a DB.
So my questions are:
1. What is the best way to save the UUT reports? All reports in one txt file, or each report in a different txt file?
2. When opening and reading lots of files, does the system work more slowly than when opening and reading one large file?
Note: I know we could save the results directly into the DB (we still need the txt files).
Thanks,
Eyal.
11-27-2015 03:36 AM
My recommendation would be to have one file per day/hour. That way you can treat the file like a 'daily report': you will have multiple UUT results in each file, which saves the overhead of opening/closing thousands of files, but your file sizes will still be manageable. It's also easier on your file system if you don't have thousands of files in a single folder.
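The daily-file approach above can be sketched as follows. This is a minimal illustration, not the poster's actual implementation: the folder name, the file-naming scheme, and the `=== UUT ... ===` delimiter are all assumptions.

```python
import datetime
from pathlib import Path

# Hypothetical sketch of the "one file per day" approach: append each
# UUT report to a single daily file instead of creating one file per UUT.
# Folder name, file naming, and the delimiter are assumptions.
REPORT_DIR = Path("reports")

def append_report(uut_id: str, report_text: str) -> Path:
    """Append one UUT report to today's combined report file."""
    REPORT_DIR.mkdir(exist_ok=True)
    daily_file = REPORT_DIR / f"report_{datetime.date.today():%Y-%m-%d}.txt"
    with daily_file.open("a", encoding="utf-8") as f:
        # A delimiter line per report keeps the combined file easy to parse later.
        f.write(f"=== UUT {uut_id} ===\n{report_text}\n")
    return daily_file
```

With one file per day the tester only ever appends, so there is no open/close cost per UUT beyond a single append, and the folder holds ~365 files per year instead of millions.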
11-27-2015 03:47 AM
Eyal,
please note that you have to make this decision yourself. There is no single golden rule which applies to every use case.
That being said, I want you to consider the following:
a) Having thousands of files on disk, especially in a single folder, hogs the system. You can already see this when browsing such folders with the file explorer; programmatic access can be very slow.
b) Both thousands of files and a single, gigantic file are a big challenge for processing/searchability and the associated performance.
c) The bigger the file, the greater the risk of file issues (e.g. a corrupt file).
d) Overall amount of data: how is searchability provided? Why store data which will never be reviewed? (Read: try to reduce the overall amount of data which is logged to file.)
I think these are the most important ones, but the list is probably not complete.
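On point b): a combined file stays searchable and easy to parse for the DB insertion step if each report starts with a recognizable header line. A minimal sketch, assuming a `=== UUT <id> ===` delimiter between reports (an assumed convention, not part of the tester's output):

```python
import re
from pathlib import Path

# Minimal sketch: split a combined daily report file back into per-UUT
# records. Assumes each report starts with a "=== UUT <id> ===" header
# line -- an assumed convention, not something the tester produces itself.
HEADER = re.compile(r"^=== UUT (\S+) ===$", re.MULTILINE)

def split_reports(daily_file: Path) -> dict:
    """Return a mapping of UUT id -> report body from one combined file."""
    text = daily_file.read_text(encoding="utf-8")
    # re.split keeps the capture-group matches, so the result alternates:
    # [prefix, id1, body1, id2, body2, ...]
    parts = HEADER.split(text)
    return {parts[i]: parts[i + 1].strip() for i in range(1, len(parts), 2)}
```

From such a dict, inserting the parsed results into the DB becomes a plain bulk insert rather than 5000 individual file reads per day.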
Norbert