I believe your reasoning about the error is correct. USB devices rely on interrupts to transfer their data, so if no processor time is available to service the request, these types of errors can occur.
Instead of redimensioning the array, which forces LabVIEW to copy the data to a new memory location, you could initialize the array to a predefined size at the beginning of your application. For example, if you wanted to record data for 1 hour at 2 samples per second, you would initialize the array to 7200 elements. In each iteration of the loop you would then write into a specific element, which lets you keep all of the data without reallocating. Another alternative is to stream the data to a file and keep only a smaller history for your graph. For example, you could record 7+ hours of data to a file but keep only the last 30 minutes in the graph, then post-process the file once the application has stopped and time is no longer critical.
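The preallocation idea can be sketched in text form. This is Python rather than LabVIEW (block diagrams are graphical), and the names are made up for illustration, but the rate and durations match the example above:

```python
# Minimal sketch of preallocating a fixed-size history instead of growing an
# array every iteration. Hypothetical names; 2 S/s for 1 hour = 7200 elements.
SAMPLE_RATE = 2                     # samples per second
DURATION_S = 3600                   # 1 hour of acquisition
history = [0.0] * (SAMPLE_RATE * DURATION_S)   # allocated once, up front

def record_sample(index, value):
    """Replace a single element in place -- no reallocation, no copy."""
    history[index] = value

# Stand-in for the acquisition loop; a real application would read from the DAQ.
for i in range(len(history)):
    record_sample(i, float(i))

# Only the most recent 30 minutes would actually be wired to the graph.
GRAPH_WINDOW = SAMPLE_RATE * 30 * 60
graph_data = history[-GRAPH_WINDOW:]
```

The full `history` buffer plays the role of the file in the second alternative: everything is retained, but the display only ever touches the small trailing window.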
It is more efficient to allocate the entire array before your while loop and use the Replace Array Subset VI to fill it in. You can find information on this here:
under the heading "Avoid Constantly Resizing Data". If you need to add a data point at element 120001, you have to use Insert Into Array, which makes LabVIEW reallocate all of the memory. As the document demonstrates with Build Array, this is very time consuming.
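The cost difference can be demonstrated outside LabVIEW as well. The Python sketch below (illustrative only, not the LabVIEW mechanism itself) inserts at the front of a list, which forces a shift of every existing element on each insertion, roughly analogous to the reallocate-and-copy behavior of Insert Into Array, and compares it with filling a preallocated buffer:

```python
# Contrast growth-by-insertion (every insert shifts the whole list, similar in
# spirit to LabVIEW reallocating on Insert Into Array) with filling a
# preallocated buffer in place (the Replace Array Subset approach).
import time

N = 20_000

t0 = time.perf_counter()
grown = []
for i in range(N):
    grown.insert(0, i)          # front insertion: O(n) shift every time
t_grow = time.perf_counter() - t0

t0 = time.perf_counter()
filled = [0] * N                # allocate the whole buffer once
for i in range(N):
    filled[i] = i               # in-place replacement: O(1) per element
t_fill = time.perf_counter() - t0

print(f"insert-to-grow: {t_grow:.4f} s, preallocate-and-fill: {t_fill:.4f} s")
```

Note that Python lists amortize appends at the end, so front insertion is used here deliberately to make the per-operation copy visible; the same data ends up in both lists either way.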
I do not think that moving to USB 2.0 will get you past this problem, because the issue appears to be with the processor. It sounds like the processor is not fast enough to keep up with the memory allocation and copying of data while also handling the interrupts.
To find out how many samples remain in the buffer, you can use a DAQmx Read property node and select Status >> Available Samples Per Channel.