

Simulated instruments do not act like real instruments

 

I am having an issue with development for an NI-9263 in an NI-9172 chassis. I started developing my software using simulated instruments. I have some very long waveforms that I will need to use, so I tried to break things up into manageable blocks of 1 minute each. I used properties from the task (SpaceAvail) to decide when to write the next block.
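Since I can't paste a block diagram here, the scheme in rough C (DAQmx C API) form looks like the sketch below. The channel name, rate, and block size are made up for illustration, and error checking is omitted:

/* Sketch of the block-feeding scheme described above, against the
   NI-DAQmx C API. Channel name, rate, and block size are illustrative;
   error checking and real waveform generation are omitted. */
#include <NIDAQmx.h>

#define RATE        1000.0   /* S/s, illustrative */
#define BLOCK_SAMPS 60000    /* one minute of samples at RATE */

static float64 block[BLOCK_SAMPS];   /* holds the next 1-minute chunk */

int main(void)
{
    TaskHandle task = 0;
    int32  written  = 0;
    uInt32 space    = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "cDAQ1Mod1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, BLOCK_SAMPS);
    DAQmxCfgOutputBuffer(task, 2 * BLOCK_SAMPS);  /* room for two blocks */

    /* ...fill block[] with the first minute of data... */
    DAQmxWriteAnalogF64(task, BLOCK_SAMPS, 0, 10.0,
                        DAQmx_Val_GroupByChannel, block, &written, NULL);
    DAQmxStartTask(task);

    for (int b = 1; b < 10; ) {                   /* ten 1-minute blocks */
        DAQmxGetWriteSpaceAvail(task, &space);    /* poll SpaceAvail     */
        if (space >= (uInt32)BLOCK_SAMPS) {       /* whole block fits?   */
            /* ...fill block[] with the next minute of data... */
            DAQmxWriteAnalogF64(task, BLOCK_SAMPS, 0, 10.0,
                                DAQmx_Val_GroupByChannel, block, &written, NULL);
            b++;
        }
        /* a short sleep here keeps the loop from spinning */
    }

    DAQmxClearTask(task);
    return 0;
}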

 

When I received my hardware, I found that the properties that I was relying on do not return the same data. Now my application does not know when to do the next write because it always thinks that the buffer is full. I will have to come up with another method for feeding data into the task.

 

Is this a known shortcoming of simulated devices? I would think that the simulated device driver would return approximately the same data as the real thing.

 

Attached is a little test widget I wrote that you can use to see the difference between a simulated and a real device (if you have one). It should work for LabVIEW 8.0 and up.
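For anyone who can't open the attachment, the VI boils down to roughly the following (expressed against the DAQmx C API; device name, rate, and buffer size are placeholders):

/* Rough C equivalent of the attached VI: start a continuous AO task
   and print SpaceAvail as the buffer drains. Placeholders throughout;
   error checking omitted. */
#include <NIDAQmx.h>
#include <stdio.h>

static float64 data[20000];   /* one buffer of zeros */

int main(void)
{
    TaskHandle task = 0;
    int32  written  = 0;
    uInt32 space    = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "cDAQ1Mod1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 20000);
    DAQmxWriteAnalogF64(task, 20000, 0, 10.0, DAQmx_Val_GroupByChannel,
                        data, &written, NULL);
    DAQmxStartTask(task);

    for (int i = 0; i < 50; i++) {
        DAQmxGetWriteSpaceAvail(task, &space);
        printf("SpaceAvail = %u\n", (unsigned)space);
        /* a short sleep here spaces out the readings */
    }

    DAQmxClearTask(task);
    return 0;
}

Run it against a simulated 9263 and a real one and compare how SpaceAvail moves.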

 

 

--

Brian Rose
Message 1 of 6

Hi Mister Rose,

 

You are seeing different data because the buffer is simulated differently from the way it is actually implemented on the 9263. The simulation has a finite resolution with which it can model the filling of the buffer. On a real device, the DMA transfer used to move samples is a hardware operation and requires no CPU power. As a result, the buffer does not fill in exactly the same way on a simulated device as on a real one.
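To make the "finite resolution" point concrete, here is a toy model (an illustration only, not our actual simulation code) of how a simulated buffer that only updates every 100 ms would report drained samples compared to hardware that drains sample by sample:

/* Toy model only -- not NI's actual simulation code. A simulation
   that updates its buffer model every `tick` seconds reports progress
   in coarse steps, while DMA hardware drains sample by sample. */
#include <stdio.h>

int main(void)
{
    const double rate = 10000.0;   /* S/s */
    const double tick = 0.100;     /* simulation update period, s */

    for (int step = 0; step <= 10; step++) {
        double t    = step * 0.05;                          /* elapsed time, s     */
        int    real = (int)(t * rate);                      /* drained, hardware   */
        int    sim  = (int)(t / tick) * (int)(tick * rate); /* drained, simulation */
        printf("t=%.2f s  real=%5d  simulated=%5d\n", t, real, sim);
    }
    return 0;
}

The real count climbs smoothly, while the simulated count jumps in 1000-sample steps; properties derived from it, such as SpaceAvail, jump the same way.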

David
Applications Engineer
National Instruments


Digital Multimeters
Message 2 of 6

The impression that I had was that DAQmx provided a layer of abstraction so that the application does not have to manage the minutiae of each device. You could write code for a simple USB-based device and easily* port it to a PXI card with deeper memory. But if I have to account for the buffering scheme of this particular card, that is going to limit where the code can run without a redesign.

 

Furthermore, I would think that the simulator would at least behave closely to the real device, instead of presenting a false model. As it stands now, I have to redesign a significant chunk of code because the simulated device is very different from the real one. Instead of pretending to be a 9263, they should just say "simulated widget" and not get anybody's hopes up that it actually behaves like a real device. Why have the user select a particular simulated device if it behaves nothing like the real thing?

 

Sorry for the rant, but the value of simulated instruments is greatly diminished with this.

 

*Note: Various values of "easy" may apply. 

--

Brian Rose
Message 3 of 6

Hi Brian,

 

Regarding your comment about the DAQmx abstraction between devices: the issue is not between two real devices, it is between a simulated device and a real one. You are not accounting for the buffer scheme of the card but rather that of the simulated card. The simulation of data transfer to the buffer differs from the actual data transfer on a real device because the simulation involves the CPU. In a real device, the DMA transfer is a hardware-only operation. If the simulated device were to fill the buffer continuously, the CPU would have to run continuously to fill it with simulated data (a sine wave), taking processing power away from the rest of the program and making it run more slowly than it would with a real device.

 

The simulated device behaves very much like the actual device. However, as you have found, when it comes to certain operations (e.g., hardware events), the functionality of the simulated device can differ from that of the real device. I apologize for the inconvenience this has caused you.

David
Applications Engineer
National Instruments


Digital Multimeters
Message 4 of 6

If the simulated device is managing a buffer in software and updating the properties, then that takes some software time to run. Wouldn't the hardware reduce (not increase) the amount of processing power the drivers need to update the property fields? I would think that the real hardware would give you more performance than a simulated device, not less.

 

At the very least, they should make sure that the data provided by the simulated device is the same as what the real device provides. If the real device does not update the SpaceAvail property until the last sample in the buffer is played, then the simulated device should do the same.
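My likely fallback is to stop consulting SpaceAvail altogether: disable regeneration and let DAQmx Write block until the buffer has room. A sketch of that plan (untested on my end; channel, rate, and sizes are placeholders):

/* Fallback sketch: with regeneration disabled, DAQmxWriteAnalogF64
   blocks (up to its timeout) until the buffer has room, so no
   SpaceAvail polling is needed. Placeholders throughout; error
   checking and real waveform generation are omitted. */
#include <NIDAQmx.h>

#define RATE        1000.0   /* S/s, illustrative */
#define BLOCK_SAMPS 60000    /* one minute of samples at RATE */

static float64 block[BLOCK_SAMPS];

int main(void)
{
    TaskHandle task = 0;
    int32 written   = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "cDAQ1Mod1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, BLOCK_SAMPS);
    DAQmxSetWriteRegenMode(task, DAQmx_Val_DoNotAllowRegen);

    /* Prime the buffer, then start the task. */
    DAQmxWriteAnalogF64(task, BLOCK_SAMPS, 0, 10.0,
                        DAQmx_Val_GroupByChannel, block, &written, NULL);
    DAQmxStartTask(task);

    for (int b = 1; b < 10; b++) {
        /* ...fill block[] with the next minute of data... */
        /* Blocks here until BLOCK_SAMPS of space frees up (or 120 s). */
        DAQmxWriteAnalogF64(task, BLOCK_SAMPS, 0, 120.0,
                            DAQmx_Val_GroupByChannel, block, &written, NULL);
    }

    DAQmxClearTask(task);
    return 0;
}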

--

Brian Rose
Message 5 of 6

Hi Brian,

 

As I stated previously, the DMA transfer is a hardware-only operation. As such, the hardware does reduce the amount of CPU the process requires, in line with your question. The simulated device fills the buffer differently than the real device does, in a manner that reduces the amount of CPU processing required. As a result, properties such as SpaceAvail will differ between simulated and real device applications.

David
Applications Engineer
National Instruments


Digital Multimeters
Message 6 of 6