
Update from version 7.1 to 2019

Solved!

Hi, 

we have an old XP computer with LabVIEW 7.1 and some old, obsolete HW. The VI that runs on that PC generates a signal and sends it to an actuator, which moves according to this control signal. It makes use of <vilib>\Daq\1easyio.llb and the functions "AO Generate Waveform.vi", "AO Write.vi", "AI Read.vi", etc.

 

I copied the vi.lib library from the old 7.1 installation, but I guess that is not the way to go (I still get a lot of weird error messages that I cannot solve).

I would like to upgrade the system to run on a Win10 platform and be able to use some newer HW, e.g. a USB device that has both AO and AI. As it is now, the HW is hard-coded into the code in several places, and that is probably the reason why it has not been upgraded along the way.

Is there an easy fix for this?

 

The main VI is "WaveLab 4.vi".


Olav 

Message 1 of 9

@OlavGundersen wrote:

Is there an easy fix for this?


Unfortunately not.  I have had to do a few of these, two of which I changed to using cRIO instead of a DAQ due to the industrial setting I was working in.  At the very least, you need to change from using Traditional DAQ to DAQmx.  DAQmx is a MUCH better API, but it is completely different.  You need to understand the Traditional DAQ calls and do the equivalent in DAQmx.  This can be a long process, depending on how complicated the system is.
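To give you a feel for the difference: the DAQmx model is always "create a task, add channels, configure timing, start, read/write, clear".  Here is a minimal sketch of that flow using NI's nidaqmx Python wrapper (not G, but the call sequence is the same one you wire up; "Dev1" and the channel name are placeholders):

```python
# Minimal DAQmx task flow: create -> add channel -> configure clock ->
# start -> read -> clear. "Dev1/ai0" is a placeholder device/channel.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:                        # task is cleared on exit
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(
        rate=100.0,                                 # 100 Hz sample clock
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=1000,
    )
    task.start()
    data = task.read(number_of_samples_per_channel=1000)  # blocks until done
```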


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 9

I see... So is it a matter of replacing those DAQ VIs with equivalent DAQmx VIs for every occurrence in the code, or is it even much worse? I assume that for a VI like "AO Generate Waveform.vi" there is a corresponding DAQmx VI that will do something similar but with a (slightly?) different interface?
If so, how do I find the "equivalents"? I found this manual that gives a nice overview, but is there a cross-reference or something that tells you what to change to when going from DAQ to DAQmx?
Thanks!

Message 3 of 9

It's not as simple as 1-to-1 replacements.  You need to figure out what is happening for a task in Traditional DAQ and recreate it using DAQmx.  While going through that effort, I ended up taking the Traditional DAQ VIs from an old install and removing all of the DLL calls.  But I still had the most important part of the VIs: the context help.  Using the context help, I was able to piece together what the code was doing with the DAQ and then rewrite it with DAQmx.  You can find this library here (along with a more detailed story): https://forums.ni.com/t5/LabVIEW/Modifying-old-VI-for-LabVIEW-2015/m-p/3370067#M992248
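As a concrete example of what "recreate the task" means: Traditional DAQ's one-shot "AO Generate Waveform.vi" becomes a handful of explicit DAQmx steps.  A sketch, again with the nidaqmx Python wrapper and placeholder names:

```python
# Rough DAQmx equivalent of Traditional DAQ's "AO Generate Waveform.vi":
# create an AO channel, configure a finite sample clock, preload the
# buffer, start, and wait. Device/channel and waveform are placeholders.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

waveform = np.sin(np.linspace(0, 2 * np.pi, 1000))   # one sine period

with nidaqmx.Task() as ao_task:
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao_task.timing.cfg_samp_clk_timing(
        rate=1000.0,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=len(waveform),
    )
    ao_task.write(waveform, auto_start=False)        # preload output buffer
    ao_task.start()                                  # generation begins here
    ao_task.wait_until_done()                        # block until finished
```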


Message 4 of 9

I read that link, but can you please explain in more detail what you meant by "...so I went through and took out all of the calls to the NI DAQ DLL.  So that left me with just the API for the Traditional DAQ."?

 

Is that what I did when I copied the "vi.lib" library directory from the old 7.1 installation into my 2019 project directory, just to see what the "?-blocks" looked like and what the inputs are?

Message 5 of 9

@OlavGundersen wrote:

Is that what I did when I copied the "vi.lib" library directory from the old 7.1 installation into my 2019 project directory, just to see what the "?-blocks" looked like and what the inputs are?


Yep, that was exactly the point.  I just went through and removed the DLL calls inside the VIs so I wouldn't have a giant list of broken VIs.


Message 6 of 9
Solution
Accepted by OlavGundersen

I have a similar story to @crossrulz's, though I'm certain I started with less experience than he had at that time.

 

About a decade ago, I "inherited" a LabVIEW 7.0 LabVIEW RT project that used PXI and PCI A/D cards (the PXI also had a Timer, D/A, and Digital I/O card, in addition to the Multi-function Card).  It had hundreds of VIs, all without any Documentation, and one enormous VI that did the PXI-based DAQ functions and that, when the block diagram was printed at half size, took about 20 pieces of paper taped together into a 2 x 10 Landscape "scroll".  Needless to say, maintaining this, let alone modifying or expanding it, was quite a challenge (especially since I was relatively "new" to LabVIEW at that time, though I had a lot of experience in Real-Time interrupt-driven programming).

 

I had been using LabVIEW 2009 and later versions by that time, and appreciated the improvements over LabVIEW 7, especially how much easier it was to do DAQ using DAQmx (Traditional DAQ, in contrast, was extremely "twiddly" and complicated).  Once I got started, I found it quite simple to write small routines that each did one particular DAQ task, such as "gather A/D data from 16 single-ended Channels at 1 kHz and stream them using Network Streams to the PC Host, where they can be streamed to Disk and also plotted (decimated by a factor of 50, so the averaged or decimated values scroll across the PC screen at 20 points/second)".  The slow parts were figuring out Network Streams (which might have taken until LabVIEW 2011) and "playing" with DAQmx (I strongly recommend the article "Learn 10 Functions in NI-DAQmx and Handle 80 Percent of your Data Acquisition Applications").
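For what it's worth, the decimation step itself is nothing fancy, just block-averaging: 1000 samples/second averaged in groups of 50 gives 20 points/second.  A sketch in Python rather than G, with made-up data:

```python
# Block-average decimation: reduce a 1 kHz stream by a factor of 50,
# yielding 20 averaged points per second for the host display.
import numpy as np

def decimate_by_averaging(samples: np.ndarray, factor: int = 50) -> np.ndarray:
    """Average consecutive blocks of `factor` samples (remainder dropped)."""
    n = (len(samples) // factor) * factor
    return samples[:n].reshape(-1, factor).mean(axis=1)

one_second = np.random.randn(1000)                 # 1 kHz for one second
print(decimate_by_averaging(one_second).shape)     # -> (20,)
```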

 

I'm a strong believer in the "Clean Break".  Here are some other "tricks" that helped me greatly:

  • I'd never practiced this before, but there is real wisdom in "Write the Documentation First".  Or, at least, "Write Some Documentation during All Stages of Development".  Get the "Broad Outlines" down on paper (or Text, or LaTeX).  Concentrate on "What" you want to do, without worrying so much about "How" you are going to do it.
  • Think "Hierarchically".  Try to break things up into broad "Tasks" (Initialization, Configuration, Acquisition -- Storage -- Display, Error Handling, Finalization -- Graceful Shutdown).  Choose an architecture that handles the "messy details" for you (JKI State Machine, QMH, DQMH, Your Favorite Paradigm).
  • To keep your sanity, organize your Project into Folders (virtual, physical, or combined virtual/physical).  I have Folders for the Host, the Target, and Common (VIs that run on both platforms), with sub-folders in each for Sub-VIs and TypeDefs.
  • TypeDefs are your Friends.  Every Enum that I create has a TypeDef, as does every Cluster.  I usually document Cluster TypeDefs, particularly where "units" (like msec or kHz) are involved (a rough text-language analogue follows this list).
  • Design an Icon for every VI so that you can recognize them when placed on a Block Diagram (it's fine to use a Colored Square (with Border) with 2-3 lines of short Text).
  • Try to keep all VI Block Diagrams "Laptop Screen Size".  This isn't that hard to do, particularly by "encapsulating" multiple functions into a sub-VI that "hides the messy details".  Note you can test such encapsulated VIs with made-up data and know that they work.
  • Testing as you develop is good.  
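As a rough text-language analogue of the "documented Cluster TypeDef" advice above, sketched in Python (the type and field names are made up):

```python
# One shared, documented definition of a configuration record, with units
# recorded where a bare number would be ambiguous -- the same idea as a
# documented Cluster TypeDef.
from dataclasses import dataclass

@dataclass
class AcqConfig:
    sample_rate_khz: float    # kHz, not Hz: the unit lives in the name
    timeout_msec: int         # msec
    channels: str             # e.g. "Dev1/ai0:3"
```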

Bob Schor

Message 7 of 9

Thank you all, folks! An update: I located the DAQ VIs and made new ones. The other VIs, with no or few errors, I have fixed, and the program now runs correctly "most of the time". You see, I have been programming ASM since the '80s (hex code), moved to Turbo Pascal and then C and C++. Later I went to VHDL, back to C, and then ASM for a period of uC work. Since 2008 I have (occasionally!) done some simple LabVIEW programming. I have also been through MATLAB and DAQ for some applications. The thing is, I have never mastered LabVIEW coding, and since I do it so seldom, I probably never will.
Now I am stuck on a DAQ read/write problem that I believe has an easy fix; I just have trouble seeing the solution:

[Attached block diagram: MinAIOWave6.jpg]

What I want to do in this sub-VI:
1. Write a 2D array to the two analog outputs; one is a position-control signal and the other is a trigger signal to external devices such as a camera, level probes, etc. The sample rate would typically be around 100 Hz.

2. Read a few sensor inputs at the same rate, time, and duration as the output is written.
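In text form, the pattern I am after looks roughly like this (sketched with NI's nidaqmx Python wrapper instead of G; device and channel names are placeholders):

```python
# Synchronized AO-write / AI-read: the AI task borrows the AO sample clock
# and is armed *before* AO starts, so both tasks run sample-for-sample.
# Device/channel names and the output data are placeholders.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 100.0                          # Hz
N = 3000                              # samples per channel
data = np.zeros((2, N))               # row 0: position control, row 1: trigger

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0:1")
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)

    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
    ai.timing.cfg_samp_clk_timing(RATE, source="/Dev1/ao/SampleClock",
                                  sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)

    ao.write(data, auto_start=False)  # preload the output buffer
    ai.start()                        # arm AI first: it waits for AO's clock
    ao.start()                        # AO starts -> both run in lockstep
    readings = ai.read(number_of_samples_per_channel=N,
                       timeout=N / RATE + 5.0)
    ao.wait_until_done(timeout=N / RATE + 5.0)
```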


However, I encountered errors where the read tried to read beyond the acquired samples, or did not start reading at all. I found that if I stepped through the program (running with the "light bulb", i.e. execution highlighting), it collected the correct amount of data, while it collected 2x the amount if I did not. A timing issue, I thought, so I included a delay between the two starts. With this it works correctly "most of the time"... (other times it reads 2x).

I believe it is related to the "start acquisition" order and the sample clock configuration, but I cannot see the mistake. Can anybody help me here?

The next issue is the hard-coding: the source clock on the read VI is hard-coded to Dev1/ao/SampleClock inside a 3rd- or 4th-level sub-VI. This is not good...
Is there an easy way to extract this "configuration label" from the output DAQ branch below? Since I want to sync the input with the output, I want to keep it tied to this output sample clock; then, if I choose a different input/output card, the sample clock would correspond to the new one without me diving through the code to change it. I could pull this input up to the top VI and configure it from there, but I guess there is an easier, more "automatic" way to retrieve this configuration locally (I have been through most of the property nodes without finding anything related to this).
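Something like the following is what I have in mind, sketched once more with the nidaqmx wrapper (the helper name is made up): derive the terminal string from whatever device the AO task was actually created on, instead of baking in "Dev1":

```python
# Hypothetical helper: build the AO sample-clock terminal name from the
# device the AO task actually uses, rather than hard-coding "Dev1".
import nidaqmx

def ao_sample_clock_terminal(ao_task: nidaqmx.Task) -> str:
    device_name = ao_task.devices[0].name          # e.g. "Dev1", "USB6363"
    return f"/{device_name}/ao/SampleClock"

# usage:
#   ai.timing.cfg_samp_clk_timing(RATE, source=ao_sample_clock_terminal(ao), ...)
```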

 

The error message I get when there is a mismatch between read and write data is: 
"Error -200278 occurred at Min AIO Wave 6.vi

Possible reason(s):

Attempted to read a sample beyond the final sample acquired. The acquisition has stopped, therefore the sample specified by the combination of position and offset will never be available.

Specify a position and offset which selects a sample up to, but not beyond, the final sample acquired. The final sample acquired can be determined by querying the total samples acquired after an acquisition has stopped.

Attempted to Read Sample: 3000
Property: RelativeTo
Corresponding Value: Current Read Position
Property: Offset
Corresponding Value: 0

Task Name: _unnamedTask<DE>"

 

 

Message 8 of 9

Good news -- you are now working with "modern LabVIEW" and DAQmx, and your Task is one that has been reported and discussed here on the Forum.  You can probably find "code that works" by searching this Forum, or doing a Web search.

 

Failing that, you'll probably get responses here, now that you have code to critique.

 

Bob Schor

 

P.S. -- Pascal was my favorite programming Language until I started using LabVIEW (and it was getting difficult to find a quality Pascal compiler for the PC or mini-computer) ...

Message 9 of 9