I'm wondering whether there are any LabVIEW implementations of an efficient optimization algorithm for "real-world" signals (i.e. a control problem rather than a pure math problem). The goal is to automatically adjust a set of physical parameters (e.g. voltages, motor positions, or the like) so as to maximize a measured "objective" signal. An intuitive example would be aligning a laser beam through a pinhole using a motorized mirror, with the feedback provided by a photodiode or power meter behind the pinhole.
I thought about using one of the optimization VIs, but I suspect they would have an inherent problem with an objective function derived from a "real-world" measurement because of noise and/or drift. Maybe it's possible to tune the step size and tolerance to get a reasonable result, but I haven't been able to come up with a robust procedure so far.
I'd rather treat it as a control problem with an approach similar to a PID controller, but instead of locking the "process variable" to a setpoint on a monotonic curve, it should lock onto an extremum (which can drift over time). I'm not very familiar with control algorithms, so I don't even know whether this is possible at all. Does anyone know of a suitable solution?
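Since LabVIEW code can't be posted as text, here's a minimal Python sketch of the kind of extremum tracker being described: a "perturb and observe" hill climber that averages readings to cope with noise and never stops stepping, so it can follow a slowly drifting peak. The Gaussian "beam profile", noise level, step size, and averaging count are all made-up stand-ins for the real measurement.

```python
import math
import random

# Toy stand-in for the real measurement: a Gaussian "beam profile"
# with its peak at OPTIMUM, plus Gaussian read noise. All numbers
# here are illustrative assumptions, not real hardware values.
OPTIMUM = 0.7

def measure(position):
    signal = math.exp(-((position - OPTIMUM) ** 2) / 0.5)
    return signal + random.gauss(0.0, 0.01)

def averaged_measure(position, n=20):
    """Average several readings so noise doesn't dominate the comparison."""
    return sum(measure(position) for _ in range(n)) / n

def track_peak(position, step=0.1, iterations=100):
    """'Perturb and observe' hill climbing: compare the averaged signal
    at the current and a trial position, move if the trial is better,
    and reverse direction on failure. Because it never stops stepping,
    it keeps tracking an extremum that drifts slowly over time."""
    for _ in range(iterations):
        here = averaged_measure(position)
        there = averaged_measure(position + step)
        if there > here:
            position += step
        else:
            step = -step
    return position

random.seed(0)
final = track_peak(position=-0.3)
# final ends up oscillating within about one step of the peak
```

The price of this simplicity is a permanent dither of about one step around the optimum; shrinking the step reduces that ripple but slows the response to drift.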
There is a PID toolkit that should come with LabVIEW Full or Professional.
That's essentially a limited PID controller (using only P or PI), and the signal isn't periodic, so I don't see how I could extract a phase from it to lock onto...
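For reference, the standard way around the "no periodicity" objection is extremum-seeking control: a small sinusoidal dither is injected onto the parameter, so the periodic reference is known by construction rather than extracted from the signal. Demodulating the measured objective with that same sine recovers (on average) the local gradient, and an integrator drives the parameter toward the extremum. A hedged Python sketch, where the quadratic objective, gains, dither amplitude, and frequency are illustrative assumptions:

```python
import math
import random

# Illustrative only: a noisy objective with a single maximum at x = 0.4.
def objective(x):
    return -(x - 0.4) ** 2 + random.gauss(0.0, 0.001)

# Dither-based extremum seeking: superimpose a small known sine on the
# parameter and demodulate the measured objective with that same sine.
# The low-frequency part of the product approximates the local gradient,
# so integrating it drives the parameter toward the extremum. The phase
# reference comes from the dither we inject, not from the signal itself.
x = -0.5                     # initial parameter (e.g. a mirror position)
a = 0.05                     # dither amplitude
gain = 2.0                   # integrator gain
dt = 0.01                    # loop period, s
omega = 2 * math.pi * 5.0    # dither frequency, rad/s

random.seed(1)
for k in range(20000):
    ref = math.sin(omega * k * dt)
    y = objective(x + a * ref)    # apply dithered parameter, measure
    x += gain * y * ref * dt      # demodulate and integrate

# x converges to the neighborhood of the maximum at 0.4
```

The same structure should map onto a LabVIEW timed loop: generate the dither, add it to the output, multiply the measured signal by the dither, and accumulate.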