How to implement LIN using LabVIEW VISA?

Hello,

 

I am working on implementing the LIN 2.1 protocol on existing RS-232 hardware using LabVIEW VISA. Does anyone have experience with this?

 

I am only implementing unconditional frames, so it should be pretty straightforward once I have the timings right, but that is the part where I am getting stuck. The protocol signals the start of the frame header with a 13-bit break followed by a break delimiter. I am currently generating this with VISA Serial Break.vi set to a break time of 1 ms. One doubt I have is that my application runs at 19.2 kbaud, so a 13-bit break only translates to roughly 2/3 of a millisecond (13 / 19200 ≈ 677 µs), while the closest timing I can get is the 1 ms floor of VISA Serial Break.vi.
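
For reference, here is the arithmetic I am working from, as a quick Python sketch rather than LabVIEW, just to show the numbers:

# Bit time and 13-bit break length at 19.2 kbaud (plain arithmetic, no hardware needed).
baud = 19200
bit_time_us = 1e6 / baud             # ~52.1 us per bit
break_13_bits_us = 13 * bit_time_us  # ~677 us, i.e. roughly 2/3 of a millisecond

print(f"bit time: {bit_time_us:.1f} us, 13-bit break: {break_13_bits_us:.0f} us")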

 

After the break I send the sync byte 0x55 through a VISA Write and then write the frame ID. The program gives me an I/O error every time I try to run this.
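
To make the sequence concrete, this is roughly what I am trying to reproduce, sketched in Python with pyserial instead of VISA (the port name and frame ID below are placeholders, not my actual values):

import serial  # pyserial

def lin_pid(frame_id: int) -> int:
    """Protected identifier: the 6-bit frame ID plus the two LIN parity bits P0/P1."""
    b = [(frame_id >> i) & 1 for i in range(6)]
    p0 = b[0] ^ b[1] ^ b[2] ^ b[4]
    p1 = (~(b[1] ^ b[3] ^ b[4] ^ b[5])) & 1
    return (frame_id & 0x3F) | (p0 << 6) | (p1 << 7)

PORT = "COM3"   # placeholder: whatever the RS-232 adapter enumerates as
BAUD = 19200

ser = serial.Serial(PORT, BAUD, bytesize=serial.EIGHTBITS,
                    parity=serial.PARITY_NONE, stopbits=serial.STOPBITS_ONE,
                    timeout=1)

ser.send_break(duration=0.001)        # break field: 1 ms covers 13 bit times at 19.2 kbaud
ser.write(bytes([0x55]))              # sync byte
ser.write(bytes([lin_pid(0x10)]))     # protected ID (0x10 is an arbitrary example)
ser.close()

In my LabVIEW code the equivalent is VISA Serial Break.vi followed by VISA Write, as described above.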

 

How can I fix this? Does anyone here have experience implementing LIN using VISA?

 

Any help would be much appreciated!!

Thanks very much

Message 1 of 3

Hello js7041,

 

I believe you will get better traction if you post on our Automotive and Embedded Networks discussion forum, as that is where most LIN questions are asked.

 

Here is a link to help you get started.

https://forums.ni.com/t5/Automotive-and-Embedded-Networks/bd-p/30?profile.language=en

 

Malkolm A.
Applications Engineer
National Instruments
Message 2 of 3

Done, thanks.

Message 3 of 3