

Migrating large project from DSC 7.1 to LabVIEW 2009 Shared Variables ... What's the next step after recreating my variables?

I am in the process of migrating a large distributed (multi-workstation) automation system from the LabVIEW 7.1.1 DSC Engine on Windows XP to the LabVIEW 2009 Shared Variable Engine on Windows 7.

I have about 600 tags which represent data or IO states in a series of Opto22 instruments, accessible via their OptoOPCServer. There are another 150 memory tags which are used so the multiple workstations can trade requests and status information to coordinate motion and process sequencing.  Only one workstation may be allowed to run the Opto22 server, because otherwise the Opto22 instruments are overwhelmed by the multiple communications requests; for simplicity, I'll refer to that workstation as the Opto22 gateway.


The LabVIEW 2009 migration tool was unable to properly migrate the Opto22 tags, but with some help from NI support (thank you, Jared!) and many days of pointing and clicking, I have successfully created a bound shared-variable library connecting to all the necessary data and IO.  I've also created shared variables corresponding to the memory tags. All the variables have been deployed.

 

So far, so good. After much fighting with Windows 7 network location settings,  I can open the Distributed System Manager on a second W7/LV2009 machine (I'll refer to it as the "remote" machine henceforth) and see the processes and all those variables on the Opto22 gateway workstation. I've also created a few variables on the remote workstation and confirmed that I can see them from the gateway workstation.

 

Now I need to be able to use (both read and write) the variables in VIs running on the remote workstation machine. (And by extension, on more remote workstations as I do the upgrade/migration).

 

I have succeeded in reading and writing them by creating a tag reader pointed at the URL for the process on the Opto22 gateway. I can see a way I could replace the old DSC tag reads and writes in my applications using this technique, but is this the right way to do this? Is this actually using the Shared Variable Engine, or is it actually using the DataSocket? I know for a fact that attempting to manipulate ~800 items via Datasocket will bog down the systems.
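For what it's worth, the difference usually shows up in the URL scheme the tag/DataSocket layer is handed: dstp:// goes through a DataSocket Server, while psp:// is the Shared Variable Engine's publish-subscribe protocol. A tiny sketch of the two forms ("gateway", "MyProcess", and "Tag1" are hypothetical names, not anything from my project):

```python
# Sketch: the two URL styles a DataSocket-style read can be pointed at.
# All machine/process/variable names here are hypothetical placeholders.

def dstp_url(machine, variable):
    # DataSocket Transfer Protocol: routed through a DataSocket Server
    return "dstp://%s/%s" % (machine, variable)

def psp_url(machine, process, variable):
    # NI Publish-Subscribe Protocol: served by the Shared Variable Engine,
    # mirroring the process/variable layout the Distributed System Manager shows
    return "psp://%s/%s/%s" % (machine, process, variable)

print(dstp_url("gateway", "Tag1"))              # dstp://gateway/Tag1
print(psp_url("gateway", "MyProcess", "Tag1"))  # psp://gateway/MyProcess/Tag1
```

So a tag reader pointed at a psp:// process URL is at least talking to the Shared Variable Engine's protocol rather than a DataSocket Server, though I'd welcome confirmation of what it does under the hood.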

 

I had the impression that I should be able to create shared variables in my project on the remote workstation that link to those on the Opto22 gateway workstation. When, however, I try to browse to find the processes on that workstation, I get an error saying that isn't possible.

 

Am I on the right track with the tag reader? If not, is there some basic step I'm missing in trying to access the shared variables I created on the gateway workstation?

 

Any advice will be greatly appreciated.

 

Kevin

Kevin Roche
Advisory Engineer/Scientist
Spintronics and Magnetoelectronics group
IBM Research Almaden
Message 1 of 13

I have found the answer to part of my question -- a relatively easy way to create a "remote" library of shared variables that connect to the master library on my gateway workstation.

 

  1. Export the variables from the master library as a csv file and copy that to the remote machine.
  2. Open the file on the remote machine (in excel or the spreadsheet app of your choice) and (for safety's sake) immediately save it with a name marking it as the remote version.
  3. Find the network path column (it was "U" in my file).
  4. Replace the path for each variable (which will be either a long file path or a blank, depending on the kind of variable) with \\machine\'process name'\variable name
    where machine is the name or ip address of the master (gateway) workstation (I used the ip address to make sure it uses my dedicated automation ethernet network rather than our building-wide network)
    and process name is the name of the process with the deployed variables visible in the Distributed System Manager on the gateway machine.
    NOTE the single quotes around the process name; they are required.

    The variable name is in the first ("A") column, so in Excel, I could do this for line 2 with the formula =CONCATENATE("\\machine\'process name'\",A2)
    Once the formula worked on line 2, I could copy it into all the other lines.
  5. Save the CSV file.
  6. Import the CSV into the remote library to create the variables.
    Note: at this point, if you attempt to deploy the variables, it will fail. The aliases are not quite set properly yet.
  7. Open the properties for the first imported variable.
    There is probably an error message at the bottom saying the alias is invalid.
    In the alias section, you'll see it is set to "Project Variable" with the network path from step 4.
    Change the setting to "PSP URL" with the same path and the error message should disappear.
  8. Close the properties box, save the library, and then export the variables to a new CSV file.
  9. Open the new CSV file in Excel, and scroll sideways to the Network:ProjectBound field.
    You'll notice it is FALSE for the first variable and TRUE for the rest. Set the field to FALSE for all lines in the spreadsheet.
  10. Scroll sideways... you'll notice there are two new columns between Network:ProjectPath and Network:UseBinding
    The first one is Network:SingleWriter; it should already be FALSE for all lines.
    The second one is Network:URL. That needs to be set equal to the value for each line of Network:ProjectPath.
    You can accomplish this with a formula as in step 4. In Excel it was =U2 for line 2; copy it into all the lines below it.
  11. There is a third new field, Path, which should already be set to the location of the variable library. You don't need to do anything with it.
  12. Save the edited CSV file.
  13. Go back to the remote library, and import variables from the just-edited remote library CSV file.
    Once you have imported them and the Multiple Variable Editor opens, click on done.
  14. You should now be able to deploy the remote variable library without error. (Make sure to open the Distributed System Manager and start the local variable engine. It took me a few failures before I realized I had to do that before attempting a deployment).
  15. Voila! You now have a "remote" library of shared variables that references all the shared variables on the master machine, and which should be deployable on other machines with very little difficulty.
It actually took longer to write out the process here than to perform these steps once I figured it out.
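If you have to repeat this for several remote machines, the spreadsheet edits in steps 3-4 and 9-10 can be scripted. A rough Python sketch, not a polished tool: the column names `Network:ProjectPath`, `Network:ProjectBound`, and `Network:URL` match my export but should be checked against your own file, and the gateway address and process name below are placeholders.

```python
import csv

def rewrite_variable_csv(infile, outfile, machine, process):
    """Re-point an exported shared-variable CSV at the gateway machine.

    Mirrors the hand edits above: the network path column is rewritten to
    \\machine\'process'\variable (steps 3-4), Network:ProjectBound is forced
    to FALSE, and Network:URL is copied from the path (steps 9-10).
    Column names are from my export -- verify them against your own file.
    """
    reader = csv.DictReader(infile)
    rows = list(reader)
    name_col = reader.fieldnames[0]   # variable name lives in column "A"
    for row in rows:
        # Note the required single quotes around the process name.
        path = "\\\\%s\\'%s'\\%s" % (machine, process, row[name_col])
        row["Network:ProjectPath"] = path
        if "Network:ProjectBound" in row:
            row["Network:ProjectBound"] = "FALSE"
        if "Network:URL" in row:
            row["Network:URL"] = path
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
```

You would run this against the second export (the one that already has the Network:URL column, after the step-7 alias fix) and then import the result as in step 13.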
Kevin Roche
Advisory Engineer/Scientist
Spintronics and Magnetoelectronics group
IBM Research Almaden
Message 2 of 13

Kevin R wrote:

I am in the process of migrating a large distributed (multi-workstation) automation system from the LabVIEW 7.1.1 DSC Engine on Windows XP to the LabVIEW 2009 Shared Variable Engine on Windows 7.

...

I have succeeded in reading and writing them by creating a tag reader pointed at the URL for the process on the Opto22 gateway. I can see a way I could replace the old DSC tag reads and writes in my applications using this technique, but is this the right way to do this? Is this actually using the Shared Variable Engine, or is it actually using the DataSocket? I know for a fact that attempting to manipulate ~800 items via Datasocket will bog down the systems.

...

 

Any advice will be greatly appreciated.

 

Kevin


I had to resort to DataSocket reads and had similar fears about loading the CPU. The new machines handled the DataSockets with CPU to spare.

 

NI reserves the right to re-arrange the comms for Shared Variables any way and any time they want, but when last I heard, Shared Variables were using DataSockets under the hood. Yes, this may have changed.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation
LabVIEW Champion, Knight of NI, and Prepper
Message 3 of 13
oops... all those Smiley Tongue icons? They are actually supposed to be a colon followed by a capital "P".
Kevin Roche
Advisory Engineer/Scientist
Spintronics and Magnetoelectronics group
IBM Research Almaden
Message 4 of 13

I realize this is an older thread, but I think my issue is similar.

 

What I'm doing is migrating from a 7.x LabVIEW install that has a library of tags set up through the DSC to an 8.5 LabVIEW install (which now uses projects and a library of shared variables). This is being done to phase out the 7.x installation and bring all of our machines to 8.5.1 (and retire the XP machines). I realize 8.5 is aging, but we aren't ready to update our installed LabVIEW base just yet.

 

I believe that I have successfully created a project in LabVIEW 8.5 with the existing 7.x code and imported the scf file (which created a library with the network shared variables that I added to the project), but I'm worried about "deploying" the variables when I run the main program. I want to make sure the 7.x code isn't somehow corrupted by the variable deployment.

 

Can anyone advise me what the ramifications are in deploying? The control program is state based, so on first startup it just "watches" the system (monitors valve states and some analog signals). I'm hoping to just fire it up and make sure I can read the variables in the 8.5 version before completely decommissioning the 7.x install.

 

Thanks in advance.

 

TR

><><><><><><
Tommy R.
><><><><><><
Message 5 of 13

Depending on your available system resources, you could save the 8.5 variables in a differently-named library. Then the two projects are parallel rather than overlaid.

 

Much depends on how the variables themselves are being made available to LabVIEW. The Opto22 variables in my big project were delivered by an OPC server layer from Opto22. I can access the variables in that server with more than one alias without increasing the load on the actual instruments (the server only asks the instrument for the data once; LabVIEW can then request it under multiple names).

 

Kevin

 

 

Kevin Roche
Advisory Engineer/Scientist
Spintronics and Magnetoelectronics group
IBM Research Almaden
Message 6 of 13

Thanks.

 

I guess what I was wondering is where the "deployment" actually happens. I've figured out that the library of variables gets published to the shared variable engine on the PC running the code. I wanted to make sure that the deployment wasn't updating/replacing/etc. the descriptions/details of variables on the actual target. If that were the case, I'd be worried that other programs accessing the channels would get screwed up.

 

Since there are multiple (disparate) processes that are controlled by several separate programs, I'm hoping to use your idea of having one master server for channel names, scaling, etc. and have all the workstations reference from that library.

 

TR

><><><><><><
Tommy R.
><><><><><><
Message 7 of 13

Yeah, my problem lies in the MAJOR differences between 7.x and 8.x, since my 7.x install is not (as I see it) a master gateway. I've successfully exported and imported using your technique, but the project variables don't map back to the original signals out on the network. The PSP option in 2009 is not available in 8.5, so maybe that's the issue? I can target the controllers directly, but there are several variables that don't enumerate (mainly the "ALL" variables, which I use extensively to shut down major processes).

 

:Frustrated:

><><><><><><
Tommy R.
><><><><><><
Message 8 of 13

Is it possible for you to move to 8.6.1?

As I recall, that's where the current model really first shows up. When my group upgrades from 7.1.1 or lower, I move them straight to 8.6.1.

 

Kevin

Kevin Roche
Advisory Engineer/Scientist
Spintronics and Magnetoelectronics group
IBM Research Almaden
Message 9 of 13

Ugh, not an option....

 

I could map the FieldPoint config (I would have to manually enter the conversions, or at least change them in a spreadsheet to match), but when I target the controllers directly, I can't see the "ALL" channels. At this point, going back and rewriting code to replace all the locations where an ALL variable is used (for instance, every DO module) would double down on the first part of this issue (recreating the scaling in all 400+ channels).

><><><><><><
Tommy R.
><><><><><><
Message 10 of 13