
LabVIEW


error 200279 - buffering

Hello all,
 
I am relatively new to LabVIEW, so any help you can offer is greatly appreciated.
 
I am hitting the same problem as a few other people, but the solutions that have been posted previously aren't working for me.
 
I am trying to acquire, graph, and write to file three voltage analog inputs. I want to sample at 10 kHz, but the moment I take the sampling rate above about 5 kHz I get the dreaded -200279 error after a period of time.
 
I am using a Pentium 4 2.5 GHz computer with 512 MB of RAM, a 6025E card, and LabVIEW 7.1, so I'm quite sure my system is capable.
 
I started with the Express VIs but found I had no control over the input buffer size. I then broke the DAQ Assistant VI down so I could include the DAQmx Configure Input Buffer VI, but I am still hitting the same problem when I approach the sampling rate I require.
 
I am relatively unfamiliar with the technicalities of LabVIEW, so maybe I am missing something simple.
 
Andy
Message 1 of 9

Hi Andy,

If you can post your code I will happily take a look at it.

Emma Rogulska

NIUK & Ireland

Message 2 of 9

Hi Emma,

Thanks for the response; I have attached my VI for you to look at.

I have included my original VI, creatively entitled Andy's3.vi. I have also included one called Andy's4.vi, which incorporates some DAQmx Read property code that a poster called Zador shared; it seemed relevant, so I have attempted to include its functionality.

I have tinkered with Andy's3.vi in line with other posts and found that increasing the input buffer only seems to delay the onset of the -200279 error, and it also slows the computer considerably. Andy's4.vi seems to hold the error off indefinitely, but with the same slowing of the PC.
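The "bigger buffer only delays the error" observation follows from simple arithmetic: if samples arrive faster than the loop reads them on average, the backlog grows linearly and any finite buffer eventually fills. A minimal back-of-envelope sketch (the numbers are illustrative, not measured from the VIs in this thread):

```python
def time_to_overflow(buffer_size, sample_rate, read_rate):
    """Seconds until a continuous acquisition overflows a fixed buffer,
    assuming the reader drains read_rate samples/s on average."""
    deficit = sample_rate - read_rate  # samples/s the backlog grows by
    if deficit <= 0:
        return float("inf")            # reader keeps up: no overflow, ever
    return buffer_size / deficit

# Doubling the buffer merely doubles the time to failure:
print(time_to_overflow(10_000, sample_rate=10_000, read_rate=9_000))  # 10.0
print(time_to_overflow(20_000, sample_rate=10_000, read_rate=9_000))  # 20.0
```

This is why enlarging the buffer postpones -200279 rather than curing it: only raising the average read throughput to match the sample rate makes the overflow time infinite.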

To reiterate my issue: I wish to collect data from three sources continuously at 10 kHz, for at least a 60-second period. I do, however, need the graphs to be always on so I can monitor the experiment, but I only want to collect and record data when I press the button on the front panel. And to correct a typo from my previous post: I have an M-Series 6250 card.
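One common pattern for "graphs always on, record only on demand" is to read and display every block unconditionally, and gate only the file write on the record button. A minimal text-mode sketch of that gating logic (the function name and the fake data blocks are illustrative, not taken from the attached VIs):

```python
def process_blocks(blocks, record_flags):
    """Display every block; log only the blocks whose flag is True.
    blocks: list of sample lists from the DAQ read loop;
    record_flags: parallel booleans standing in for the record button."""
    displayed, logged = [], []
    for block, recording in zip(blocks, record_flags):
        displayed.append(block)   # the chart update happens every iteration
        if recording:             # the file write is gated on the button
            logged.extend(block)
    return displayed, logged

blocks = [[1, 2], [3, 4], [5, 6]]
flags = [False, True, True]       # button pressed from the second block on
shown, saved = process_blocks(blocks, flags)
```

The key point is that the acquisition loop never stops reading, so the buffer keeps draining whether or not logging is active; only the write is conditional.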

Any advice you have regarding this issue or my vi in general will be very helpful.

Thanks,

Andy

Message 3 of 9

Hey Andy,

Have you tried reading fewer samples, more frequently?

If not, I would suggest that instead of waiting for 1000 samples to be available, you read 100 samples at a time, but read more often. Hopefully this will directly address the buffer overflow.
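The effect of chunk size relative to the buffer can be sketched with a toy simulation (the tick counts and sizes are illustrative only). With the same average throughput, large infrequent reads let the backlog climb past a small buffer between reads, while small frequent reads keep it bounded:

```python
def simulate(buffer_size, samples_per_tick, read_size, read_every, ticks):
    """Peak backlog in a fixed-size device buffer, or None on overflow
    (the situation DAQmx reports as error -200279)."""
    backlog = peak = 0
    for t in range(1, ticks + 1):
        backlog += samples_per_tick        # hardware never stops acquiring
        peak = max(peak, backlog)
        if backlog > buffer_size:
            return None                    # unread samples overwritten
        if t % read_every == 0:            # the software read loop fires
            backlog = max(0, backlog - read_size)
    return peak

# Same average throughput (100 samples/tick), very different outcomes:
print(simulate(500, 100, read_size=100, read_every=1, ticks=100))   # bounded
print(simulate(500, 100, read_size=1000, read_every=10, ticks=100)) # overflow
```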

I cannot run your code, as I do not have the same hardware setup that you do.

However, the following KnowledgeBase article describes a method you can use to monitor buffer status during acquisition. I thought this might help you with your development, as it should let you see the effect of the various changes you are making.

KnowledgeBase: "(DAQmx) Error -200279 During a Continuous, Buffered Acquisition"

http://digital.ni.com/public.nsf/websearch/7AD4854943BF344186256D6500807935?OpenDocument
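The monitoring idea in that article amounts to watching how many acquired-but-unread samples sit in the buffer after each loop iteration (in DAQmx this is exposed as a Read property; in LabVIEW you would wire it to an indicator). A rough Python sketch of the same bookkeeping, with made-up per-loop numbers:

```python
def backlog_trend(acquired_per_loop, read_per_loop):
    """Backlog after each loop iteration: samples acquired so far minus
    samples read so far, floored at zero. A steadily climbing trace means
    the loop cannot keep up and will eventually hit -200279."""
    backlog, history = 0, []
    for acquired, read in zip(acquired_per_loop, read_per_loop):
        backlog = max(0, backlog + acquired - read)
        history.append(backlog)
    return history

# Reading 900 samples per loop while 1000 arrive: the backlog climbs.
print(backlog_trend([1000] * 5, [900] * 5))   # [100, 200, 300, 400, 500]
# Reading exactly as fast as samples arrive: the backlog stays flat.
print(backlog_trend([1000] * 3, [1000] * 3))  # [0, 0, 0]
```

Charting this value while tweaking the rate and read size shows immediately whether a given combination is sustainable.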

I hope this information helps you.

Also, if you could let me know what you have already tried, it will help me avoid repeating the same information.

Let me know how you get on with the above,

Emma Rogulska

NIUK & Ireland

Message 4 of 9

Hi Emma,

Thanks for the link; it explains it quite nicely and I think I've got my head around it. I have fiddled about with the sampling rate and the read values and have found a happy medium which keeps the buffer read difference constant enough for what I'm doing.

However, I now seem to have a problem with writing to file. I have included the output file below. I seem to be losing large chunks of data; you can see from the file that I have lost data between the time period 0.099 to 0.2949. This is obviously a bit of a problem.

Is this because of varying the offset in the input buffer?

Regards,

Andy

Message 5 of 9

Hi Andy,

I think that the solution to your problem is in this forum:

http://forums.ni.com/ni/board/message?board.id=170&message.id=92236&requireLogin=False

As stated by "chilly charly", when writing multiple channels to a single file it is easy to confuse empty strings with zero values when reading the file back.

He has posted a really good alternative piece of code which separates the channels and decimates the data channel by channel, before saving it to SEPARATE files.
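For reference, the split-and-decimate idea can be sketched in a few lines of Python (the function name and the tab-separated file layout are mine; the original is a LabVIEW VI, so treat this only as an illustration of the approach):

```python
import os
import tempfile

def save_channels_separately(rows, decimate=1, directory="."):
    """rows: list of [t, ch0, ch1, ...] samples from a multi-channel read.
    Keeps every `decimate`-th row and writes one two-column text file per
    channel, so no channel's values can be confused with another's.
    Returns the paths of the files written."""
    n_channels = len(rows[0]) - 1
    paths = []
    for ch in range(n_channels):
        path = os.path.join(directory, f"channel_{ch}.txt")
        with open(path, "w") as f:
            for row in rows[::decimate]:
                f.write(f"{row[0]}\t{row[ch + 1]}\n")
        paths.append(path)
    return paths

# Example: three timestamps, two channels, written to a scratch directory.
with tempfile.TemporaryDirectory() as d:
    rows = [[0.0, 1, 10], [0.1, 2, 20], [0.2, 3, 30]]
    print(save_channels_separately(rows, decimate=1, directory=d))
```

Because each file holds exactly one channel against time, a gap in the data shows up as a genuinely missing timestamp rather than as an empty column that a reader might interpret as zero.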

I hope that this helps you,

Emma Rogulska

NIUK & Ireland

Message 6 of 9
Hi Emma,
 
OK, I'm just having a look at that code now.
 
I assumed that, with the offset being moved, data was being lost or overwritten in the buffer; is this not the case?
 
Andy
Message 7 of 9

Hi Andy,

Sorry for the delay in replying.

I cannot confirm for you whether or not this is the case, as I cannot run your code without the hardware setup you have.

If you implement the code I referred to, previously posted by "chilly charly", which saves the data to separate files, and you still find data missing, that would suggest the buffer really is losing data.

If, however, no data is missing once this saving method is in place, that shows the data was never lost during acquisition; it was only being misread when the file was read back.

Emma Rogulska

NIUK & Ireland

Message 8 of 9

Hi Emma,

Thanks for the reply. I have implemented part of the code from chilly charly and it appears to work quite well; all my data is still there.

I'm still having the problem with the computer slowing when the buffer fills, but it's a minor inconvenience rather than a major issue.

Thanks for the assistance.

Regards,

Andy

Message 9 of 9