I'm just wondering how to speed up copying a big (26 GB) file from one internal HDD to a few USB HDDs. My idea is to read the file in 1 MB chunks in a loop, store each chunk in a variable, and then write that data from the variable to the file on every USB HDD. In general: read from the file, save to a variable, then give the green light to copy from the variable to each USB HDD's file.
What do you think of this? Can LabVIEW handle it?
Sure, LabVIEW can handle it. And you could write a particle accelerator program in HTML and PHP, but I wouldn't recommend it. Why reinvent the wheel? Under the File I/O palette there is a Copy file function. Give it a source, give it a destination, and it copies the file.
"Under the File I/O palette there is a Copy file function. Give it a source, give it a destination, and it copies the file."
But then you get at most the HDD's speed divided by 7. So if the HDD's read speed is 120 MB/s, you can write to the 7 HDDs at no more than about 17.1 MB/s each. I want to use RAM as a buffer so all 7 HDDs get the full 120 MB/s.
So use one Read Binary File and seven Write Binary Files. Somewhere in the palettes there is an efficient read that reads into a preallocated data array; use that. Proceed in a loop, one consecutive block of data at a time. You may have to experiment, or get some ideas from the web, as to what block size gives good speed.
I'm not sure where you are claiming the bottleneck will occur, but there are many other ways to copy a file. What about using the command-line copy command? XCopy? XXCopy? Robocopy? I would put writing your own copy function at the bottom of the list of solutions, and only use it if there are no other options.
I now have a working system using the copy option, copying to 7 destinations at the same time, but when you need to copy over 90 GB of files, the stoppage of the production line gets dramatically long. I need to find a way to stream to 7 HDDs simultaneously at maximum speed. I tried reading the binary file and writing to 7 destinations in the same loop, but it came out only about 30% faster than the original solution. I will try to build a message queue and use some buffering in the app.