04-07-2010 05:33 PM
I am in the process of migrating a large distributed (multi-workstation) automation system from the LabVIEW 7.1.1 DSC Engine on Windows XP to the LabVIEW 2009 Shared Variable Engine on Windows 7.
I have about 600 tags which represent data or IO states in a series of Opto22 instruments, accessible via their OptoOPCServer. There are another 150 memory tags which are used so the multiple workstations can trade requests and status information to coordinate motion and process sequencing. Only one workstation may be allowed to run the Opto22 server, because otherwise the Opto22 instruments are overwhelmed by the multiple communications requests; for simplicity, I'll refer to that workstation as the Opto22 gateway.
The LabVIEW 2009 migration tool was unable to properly migrate the Opto22 tags, but with some help from NI support (thank you, Jared!) and many days of pointing and clicking, I have successfully created a bound shared-variable library connecting to all the necessary data and IO. I've also created shared variables corresponding to the memory tags. All the variables have been deployed.
So far, so good. After much fighting with Windows 7 network location settings, I can open the Distributed System Manager on a second W7/LV2009 machine (I'll refer to it as the "remote" machine henceforth) and see the processes and all those variables on the Opto22 gateway workstation. I've also created a few variables on the remote workstation and confirmed that I can see them from the gateway workstation.
Now I need to be able to use (both read and write) the variables in VIs running on the remote workstation machine. (And by extension, on more remote workstations as I do the upgrade/migration).
I have succeeded in reading and writing them by creating a tag reader pointed at the URL for the process on the Opto22 gateway. I can see a way I could replace the old DSC tag reads and writes in my applications using this technique, but is this the right way to do this? Is this actually using the Shared Variable Engine, or is it actually using the DataSocket? I know for a fact that attempting to manipulate ~800 items via Datasocket will bog down the systems.
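As a point of reference (based on the usual NI URL schemes; verify against your own configuration), the access path generally indicates which transport is in use:

```
dstp://gateway/myprocess/myvariable    DataSocket Transfer Protocol (legacy DataSocket server)
psp://gateway/myprocess/myvariable     NI Publish-Subscribe Protocol (Shared Variable Engine)
\\gateway\MyLibrary\MyVariable         Shared-variable path, resolved over NI-PSP
```

So a tag reader pointed at a psp:// URL (or a \\machine\library\variable path) should be going through the Shared Variable Engine rather than the old DataSocket server.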
I had the impression that I should be able to create shared variables in my project on the remote workstation that link to those on the Opto22 gateway workstation. When, however, I try to browse to find the processes on that workstation, I get an error saying that isn't possible.
Am I on the right track with the tag reader? If not, is there some basic step I'm missing in trying to access the shared variables I created on the gateway workstation?
Any advice will be greatly appreciated.
Kevin
04-08-2010 01:52 PM
I have found the answer to part of my question -- a relatively easy way to create a "remote" library of shared variables that connect to the master library on my gateway workstation.
04-08-2010 02:11 PM
Kevin R wrote:I am in the process of migrating a large distributed (multi-workstation) automation system from the LabVIEW 7.1.1 DSC Engine on Windows XP to the LabVIEW 2009 Shared Variable Engine on Windows 7.
...I have succeeded in reading and writing them by creating a tag reader pointed at the URL for the process on the Opto22 gateway. I can see a way I could replace the old DSC tag reads and writes in my applications using this technique, but is this the right way to do this? Is this actually using the Shared Variable Engine, or is it actually using the DataSocket? I know for a fact that attempting to manipulate ~800 items via Datasocket will bog down the systems.
...
Any advice will be greatly appreciated.
Kevin
I had to resort to DataSocket reads and had similar fears about loading the CPU. The new machines handled the DataSockets with CPU to spare.
NI reserves the right to re-arrange the comm for Shared Variables any way and any time they want, but when I last heard, Shared Variables were using DataSockets under the hood. Yes, this may have changed.
Ben
07-30-2014 10:22 AM
I realize this is an older thread, but I think my issue is similar.
What I'm doing is migrating from a 7.x LabVIEW install that has a library of tags set up through the DSC to an 8.5 LabVIEW install (which now uses projects and a library of shared variables). This is being done to phase out the 7.x installation and bring all of our machines to 8.5.1 (and retire the XP machines). I realize 8.5 is aging, but we aren't ready to update our installed LabVIEW base just yet.
I believe that I have successfully created a project in LabVIEW 8.5 with the existing 7.x code and imported the .scf file (which created a library with the network shared variables that I added to the project), but I'm worried about "deploying" the variables when I run the main program. I want to make sure the 7.x code isn't somehow corrupted by the variable deployment.
Can anyone advise me what the ramifications are in deploying? The control program is state based, so on first startup it just "watches" the system (monitors valve states and some analog signals). I'm hoping to just fire it up and make sure I can read the variables in the 8.5 version before completely decommissioning the 7.x install.
Thanks in advance.
TR
07-30-2014 11:36 AM
Depending on your available system resources, you could save the 8.5 variables in a differently-named library. Then the two projects are parallel rather than overlaid.
Much depends on how the variables themselves are being made available to LabVIEW. The Opto22 variables in my big project were delivered by an OPC server layer from Opto22. I can access the variables in that server with more than one alias without increasing the load on the actual instruments (the server only asks the instrument for the data once; LabVIEW can then read it under multiple names).
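To illustrate the idea (this is a generic sketch in Python, not the Opto22 or OPC API -- the class and channel names are made up): an alias layer in front of a device lets a server expose one physical item under several names while polling the instrument only once per scan.

```python
class InstrumentStub:
    """Stands in for the physical instrument (e.g. an Opto22 brain)."""
    def __init__(self):
        self.reads = 0          # how many times the hardware was actually queried

    def query(self, channel):
        self.reads += 1
        return 42               # dummy process value

class AliasingServer:
    """Caches one hardware read per scan; all aliases share the cached value."""
    def __init__(self, instrument):
        self.instrument = instrument
        self.aliases = {}       # alias name -> physical channel
        self.cache = {}         # physical channel -> last value read

    def add_alias(self, alias, channel):
        self.aliases[alias] = channel

    def scan(self):
        # Poll each distinct physical channel exactly once per scan,
        # no matter how many aliases point at it.
        for channel in set(self.aliases.values()):
            self.cache[channel] = self.instrument.query(channel)

    def read(self, alias):
        return self.cache[self.aliases[alias]]

dev = InstrumentStub()
server = AliasingServer(dev)
server.add_alias("Valve1_Status", "DO_03")
server.add_alias("ALL_DO_03", "DO_03")   # second name, same physical channel
server.scan()
print(server.read("Valve1_Status"), server.read("ALL_DO_03"), dev.reads)
```

Both aliases return the same value, yet the instrument was queried only once -- which is why adding a second, differently-named library on top of the same OPC server shouldn't increase the load on the hardware.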
Kevin
07-31-2014 09:46 AM
Thanks.
I guess that's what I was wondering: where the "deployment" actually happens. I've figured out that the library of variables gets published to the shared variable engine on the PC running the code. I wanted to make sure that the deployment wasn't updating/replacing/etc. the descriptions/details of variables on the actual target. If that were the case, I'd be worried other programs accessing the channels would get screwed up.
Since there are multiple (disparate) processes that are controlled by several separate programs, I'm hoping to use your idea of having one master server for channel names, scaling, etc. and have all the workstations reference from that library.
TR
08-04-2014 01:50 PM
Yeah, my problem lies in the MAJOR differences between 7.x and 8.x, since my 7.x install is not (as I see it) a master gateway.... I've successfully exported and imported using your technique, but the project variables don't map back to the original signals out on the network. The PSP option in 2009 is not available in 8.5, so maybe that's part of it? I can target the controllers directly, but there are several variables that don't enumerate (mainly the "ALL" variables, which I use extensively to shut down major processes).
:Frustrated:
08-04-2014 02:03 PM
Is it possible for you to move to 8.6.1?
As I recall, that's where the current model really first shows up. When my group is upgrading from 7.1.1 or lower, I move them straight to 8.6.1.
Kevin
08-04-2014 02:44 PM
Ugh, not an option....
I could map the FieldPoint config (I would have to manually enter the conversions, or at least change them in a spreadsheet to match), but when I target the controllers directly, I can't see the "ALL" channels. At this point, going back and rewriting code to replace all the locations where an ALL variable is used (for instance, every DO module) would double down on the first part of this issue (recreating the scaling in all 400+ channels).