Evaluating daq.io for some of our customers who want to get alarms and trends on their cell phones from field installations that currently report through a cell-phone link sending e-mail (not continuously connected).
If I have a time series of one or more tags (i.e., channels in daq.io terminology) and I want to upload it to daq.io, is there a way to do that in one operation, instead of calling the Write Single / Write Multiple functions repeatedly for each point in time? At first I thought this could be done with the write waveform function (at least where the sampling rate is constant...), but it seems waveforms are not added to the channel history, only stored as waveform files(?).
It just seems a bit inefficient not to be able to send an array of XY pairs (just like you would to an XY graph) in one write operation; perhaps I have overlooked a function?
I also tried the alarm function on a channel and sent it a waveform containing an alarm condition... and did not see an alarm, but I assume alarms are only applied to data that goes into the channel history?
Sorry for the late reply. The general idea is to have a continuous monitoring application with regular updates, hence the current behavior.
The "Write Multiple Values" VI can support that. Basically, the JSON document expects a "time" entry for each data point, so if you supply lists of channels and data points you can upload a whole series in one call (you have to repeat the channel name for each new entry). There are more elegant ways to do this, but they are not specifically broken out in the LabVIEW API.
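To make the shape of that payload concrete, here is a minimal sketch in Python of building a batch document along the lines described above: one entry per data point, each carrying its own "time" field, with the channel name repeated for every entry. The exact field names ("channel", "time", "value") and overall structure are assumptions for illustration, not the documented daq.io schema.

```python
import json
from datetime import datetime, timedelta, timezone

# Hypothetical historical series for two channels, sampled once per second.
start = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
samples = {
    "Temperature": [21.5, 21.7, 21.9],
    "Pressure": [1.01, 1.02, 1.00],
}

# Build one entry per (channel, timestamp) pair. Note the channel name
# is repeated for every entry, as the reply above describes.
points = []
for i in range(3):
    timestamp = (start + timedelta(seconds=i)).isoformat()
    for channel, values in samples.items():
        points.append({
            "channel": channel,      # assumed field name
            "time": timestamp,       # per-point timestamp
            "value": values[i],      # assumed field name
        })

# This JSON string would be the body handed to the batch-write call.
payload = json.dumps(points, indent=2)
```

Six points (3 timestamps x 2 channels) go out in a single write instead of six separate calls, which is the efficiency gain the original question was after.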