LabVIEW


Save each iteration of data using VISA read to file

Solved!

Eventually, I went with Write >> Read + Open File + Set File Location >> Write text to file. Works perfectly. Set the file location to the end of the file, and then write at that position. New line, new text, nothing gets replaced (depending on your Open File settings).
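In text form, the idea is roughly the following (a Python/pyvisa sketch rather than my actual VI, since block diagrams can't be pasted as text; the resource name, the query and the file path are just placeholders):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")      # placeholder VISA resource
inst.read_termination = "\n"

# Opening the log in append mode positions every write at the end of the
# file, so each iteration adds a new line instead of replacing the old one.
with open("log.txt", "a") as log:
    for _ in range(100):                     # one write/read per iteration
        inst.write("MEAS?")                  # placeholder query
        reply = inst.read()
        log.write(reply + "\n")

inst.close()                                 # close the VISA session at the end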

Message 11 of 20
(1,018 Views)

Hi frakf,

 


@frakf wrote:

Works perfectly.


Nope.

It requires additional work (SetFileLocation) and you FORGOT to close the file…

 


@frakf wrote:

Set File Location to the end of the file, and then write at that position. New line, new text, does not get replaced (depending on your Open File settings).


It would be so much easier to copy/use the already accepted solution instead of mocking up "non-ideal" code, with additional (and other missing) functions, to circumvent the problems that arise from not copying the accepted solution…

Best regards,
GerdW


Message 12 of 20
(1,007 Views)

GerdW, 

 

1. Works perfectly for me, even without closing the file. You could’ve made that correction and moved on.

 

2. Did I or did I not state that I copied the solution in this post, where I also stated I was a beginner? I attached my VI, screenshots, and the error I received as text (implementing the accepted solution).

 

3. Additional work (that I was trying to avoid with Read to File) is necessary because you refused to provide your supposed “Read to File” solution for whatever reason, and neither you nor crossrulz replied to my post where I IMPLEMENTED THE ACCEPTED SOLUTION. Plus, it is as much work as the accepted solution.

 

4. I am a beginner who had a problem, the solution I received didn’t work, and so I fixed the problem myself. My “non-ideal” code was a beginner’s last resort after 48 hours of scouring the internet. 

5. Did I or did I not use the accepted solution GerdW? Did I or did I not post that I did and that it had an error? Did you or crossrulz reply that post? 

6. Don’t help if you won’t help. But attacking me after cooperating (AND COPYING THE ACCEPTED SOLUTION THAT DID NOT WORK) is completely and utterly uncalled for.

Keep your regards,

frakf.

Message 13 of 20
(989 Views)

Hi Crossrulz,

May I ask when we should use "bytes at port"? Or when is it suitable to use this VI? I have had a no-response issue with it, but I don't understand why it happens.

Message 14 of 20
(965 Views)

Hi frakf,

 


@frakf wrote:

1. Works perfectly for me, even without closing the file. You could’ve made that correction and moved on.

 

2. Did I or did I not state that I copied the solution in this post, where I also stated I was a beginner? I attached my VI, screenshots, and the error I received as text (implementing the accepted solution).

 

3. Additional work (that I was trying to avoid with Read to File) is necessary because you refused to provide your supposed “Read to File” solution for whatever reason, and neither you nor crossrulz replied to my post where I IMPLEMENTED THE ACCEPTED SOLUTION. Plus, it is as much work as the accepted solution.

 

4. I am a beginner who had a problem, the solution I received didn’t work, and so I fixed the problem myself. My “non-ideal” code was a beginner’s last resort after 48 hours of scouring the internet. 

5. Did I or did I not use the accepted solution GerdW? Did I or did I not post that I did and that it had an error? Did you or crossrulz reply that post? 

6. Don’t help if you won’t help. But attacking me after cooperating (AND COPYING THE ACCEPTED SOLUTION THAT DID NOT WORK) is completely and utterly uncalled for.


2. You did not copy the accepted solution, as you added several feedback nodes which are unneeded (or even produce the erroneous behaviour of your code version)...

 

3. All you needed to do was use the accepted solution. When I answered previously I was on my mobile; now I'm on a computer with LabVIEW installed:

(This is basically the same as the accepted solution, just with changed wiring of the error wires.)

 

4. This solution is working.

 

5. According to the VIs attached in this or that message you were not using the accepted solution.

 

6. I only write messages to help. Sometimes it's just help to find other help, like links/suggestions for manuals…

Best regards,
GerdW


Message 15 of 20
(949 Views)

@yeah wrote:

May I ask when we should use "bytes at port"?

For example, you need BytesAtPort if you want to write your own reading routine because you have a device that uses a delimiter of more than one byte, e.g. 0x0D0A. Then you read the bytes currently available, concatenate them onto the message and check whether the delimiter is present.
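In rough pyvisa terms (just a sketch of that buffering idea, not LabVIEW code; the resource name and the 10 ms poll interval are made up):

import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")   # placeholder serial resource

DELIM = b"\r\n"                           # two-byte delimiter, 0x0D0A
buffer = b""

while True:
    n = inst.bytes_in_buffer              # the "Bytes at Port" equivalent
    if n:
        buffer += inst.read_bytes(n)      # read whatever has arrived so far
        if DELIM in buffer:
            message, _, buffer = buffer.partition(DELIM)
            print(message.decode())       # one complete message extracted
            break
    time.sleep(0.01)                      # wait instead of busy-polling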

Message 16 of 20
(912 Views)

@ThomasHenkel wrote:

@yeah wrote:

May I ask when we should use "bytes at port"?

For example, you need BytesAtPort if you want to write your own reading routine because you have a device that uses a delimiter of more than one byte, e.g. 0x0D0A. Then you read the bytes currently available, concatenate them onto the message and check whether the delimiter is present.


I've been down that path.  It SUCKS.  If you have a proper protocol, you do not need the Bytes At Port to determine the number of bytes to read.  In your exact example, just use the Line Feed (0x0A) as your termination character.

 

The one place that I consistently use Bytes At Port is for devices that do not constantly send out data.  I use the Bytes At Port to see if a message has at least started to come in.  After that, I depend on the protocol.  For ASCII messages, just do a read with a large number of bytes to read and rely on the termination character.
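In pyvisa-flavoured pseudocode the pattern looks roughly like this (a sketch, not a LabVIEW snippet; the resource name and poll interval are invented):

import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")   # placeholder serial resource
inst.read_termination = "\n"              # Line Feed (0x0A) terminates a read

# Use "bytes at port" only to detect that a message has started to come in...
while inst.bytes_in_buffer == 0:
    time.sleep(0.01)

# ...then let the termination character end the read; the requested byte
# count just has to be large enough to hold a whole message.
message = inst.read()
print(message)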


Message 17 of 20
(895 Views)

I have to disagree. We've been using this class for years on a variety of protocols and it works great.

 

@crossrulz wrote:

In your exact example, just use the Line Feed (0x0A) as your termination character.

If part of the delimiter can appear as a character in your message, there is no other way.

Message 18 of 20
(864 Views)

@ThomasHenkel wrote:

If part of the delimiter can appear as a character in your message, there is no other way.

That is why I say you have to use the protocol to determine how much to read. A good ASCII protocol will not allow the termination character to appear in the data. If you use a raw/binary/hex protocol, then you need framing to know how many bytes to read, and the termination character cannot be used. I've been down all of those paths.
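As a sketch of what framing can look like for a binary protocol (the header layout used here, a 2-byte big-endian payload length, is purely illustrative):

import struct
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")        # placeholder resource

# Assumed frame for illustration: 2-byte big-endian length, then exactly
# that many payload bytes. No termination character is involved.
header = inst.read_bytes(2)
(length,) = struct.unpack(">H", header)
payload = inst.read_bytes(length)              # read exactly the framed size
print(f"received {length} payload bytes")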


Message 19 of 20
(858 Views)

Dear Sir, 

Can you please send me the VI of this solution saved for LV2017? I am not able to open the file attached to your proposed solution.

Thanks in advance

 

Message 20 of 20
(820 Views)