Reading messages from CAN bus using NI 9862

Solved!

I am remotely accessing a cRIO 9025 with a 9112 chassis. On the chassis are five NI 9236 modules, one 9219, a 9211, and a 9862. So far, the 9236, 9219, and 9211 are working as intended. The last step is to configure the 9862 to read messages from a CAN bus.

 

Here's what I have done so far:

In terms of hardware, I have connected the 9862 to a 12 V source referenced to one of the COM pins, and my CAN bus shares this voltage supply. The 9862 CAN_H and CAN_L pins are also wired to the bus. Next, within the LabVIEW project, I created an XNET session, selected an interface, loaded the corresponding .dbc file, and added all of the signals to the session. After that, I compiled an FPGA VI to load the XNET support and deployed everything to the target. To test the system, I wrote an RT VI on the target, dragged the XNET session from the project onto the block diagram, and wired it to a read function (session mode: Signal Input Single-Point). This ended up outputting a constant value of about -1.7 for each signal. Oddly, even when I disconnected the CAN bus from the NI module, the reading showed no change.

 

Suspecting that I was not receiving messages from the bus at all, I went to the interface in MAX and selected Bus Monitor. When I tried to run it, it returned Error 63, which corresponds to either a buffer overflow or a connection refused by the server. I am not sure how to fix this problem and have not been able to find a solution that works on other forums, so any help would be much appreciated.

Message 1 of 16

Hello,

 

I have recently been teaching the Embedded Control and Monitoring course, where we used removal of a module as an example of I/O node errors. Can I confirm that you are using the Scan Engine to read CAN frames?

 

Have you set up termination? The module has software-selectable termination, but you will need to enable it with an API call to XNET. Here's some documentation on the property node you will need to write to:

 

http://zone.ni.com/reference/en-XX/help/372841M-01/nixnet/propertysessioninterfacecantermination/

 

You mentioned wiring 12 volts to the COM pin? That pin is in fact ground and should be connected to the CAN bus reference ground (CAN_V-). You need to connect the 12 volt supply to the Vsup pin instead.

 

These are generally the two main reasons for receiving incorrect data.

 

Best regards,

 

Ed

Message 2 of 16

Sorry, I misworded that. The voltage supply is connected to Vsup, and COM is connected to ground. As for termination, I included the termination resistors in the CAN bus external to the 9862. Does it make a difference to use the software-selectable termination instead?

Message 3 of 16

Hi daqrookie15,

I would recommend focusing on Bus Monitor to start with. Once you have used that to verify your overall wiring and setup, then move on to your code. 

To test cleanly with Bus Monitor, you will need to ensure the correct software set (including NI-XNET) is installed on the target. This should match the versions installed on your host computer. You will also need to ensure you are using NI-XNET 14.0 or later, as Bus Monitor support for Real-Time targets was added in that release. 
http://digital.ni.com/public.nsf/allkb/92F10284E8061FB68625793600484D79

Once you have the correct software, create a new project with only the XNET module in it. This lets us eliminate any outside interference from your code just to be safe. In that project, you will need to add an FPGA target and compile a blank FPGA VI, as documented in the following KnowledgeBase. Once you have compiled that blank VI, make sure to download it to the cRIO FPGA. 
http://digital.ni.com/public.nsf/allkb/333860B4B66BA4BD86257E6100452593

At that point, try to connect to the XNET module using Bus Monitor and see if you receive the same error. If not, do you see the expected traffic on the bus?


Charlie J.
National Instruments
Message 4 of 16

Charlie,

 

I double-checked the software like you said and it is all up to date. I then created a project with just the XNET module, compiled a blank VI, and tried to connect via Bus Monitor.

 

Now I seem to be getting extremely inconsistent results. Sometimes I am able to see the traffic in Bus Monitor without any problems. Sometimes, however, when I close the window and try to reconnect (without changing anything), I get a variety of error messages. Some are timeout errors; others say "no heartbeat message detected". I'm really confused as to why I'm getting such varying results, but I am fairly inexperienced with all of this. Are there any settings or details that I could be overlooking?

 

Thanks,

Brandon

Message 5 of 16

Do you have any third party hardware to confirm whether what is being transmitted on the CAN network by your ECUs is actually correct?

 

The XNET Bus Monitor works at the driver level, so this rules out the application level, leaving the driver, the OS, the hardware, or the actual traffic on the network.

Message 6 of 16

Yes, another engineer I was working with was able to verify with third-party equipment that the ECUs are transmitting data correctly. However, I am just realizing that when using the NI Bus Monitor, the arbitration values were not what I expected.

Message 7 of 16

Ah, there are settings called bit timing registers; possibly your ECU's timing does not match our standard defaults.

 

I'd take a look at the article below:

 

http://digital.ni.com/public.nsf/allkb/CBA1FC627FFA0F8F862564DC0071B3A7
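To give a feel for what those registers control, here is a rough Python sketch of the arithmetic (generic CAN, not NI-specific; the 8 MHz clock and register values below are invented for illustration). Both ends of the bus must arrive at the same baud rate, and ideally a similar sample point, from their own clock and register values:

```python
# Generic CAN bit-timing arithmetic (illustrative, not NI defaults).
# A bit is 1 sync quantum + TSEG1 + TSEG2 time quanta (tq),
# where tq = prescaler / f_can_clock.

def can_bit_timing(f_clock_hz, prescaler, tseg1, tseg2):
    """Return (baud_rate, sample_point_percent) for the given register values."""
    tq_per_bit = 1 + tseg1 + tseg2              # sync segment is always 1 tq
    baud = f_clock_hz / (prescaler * tq_per_bit)
    sample_point = 100.0 * (1 + tseg1) / tq_per_bit
    return baud, sample_point

# Hypothetical 8 MHz CAN clock, prescaler 1, TSEG1 = 13, TSEG2 = 2
baud, sp = can_bit_timing(8_000_000, 1, 13, 2)
print(baud, sp)   # 500000.0 baud, 87.5 % sample point
```

If your ECU uses different segment lengths than the NI defaults for the same nominal baud rate, the two can still disagree enough to corrupt arbitration, which would fit what you're seeing.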

Message 8 of 16

So I am still unable to get Bus Monitor to work consistently, but I have managed to put together a VI using the XNET API that can monitor the bus traffic. The messages appear to be correct, and now I need to extract the signal values from the frames.

 

Each signal is 10 bits, and each frame contains six signals (8-byte payload). All of the signal information is contained in the .dbc file referenced by the XNET session. When I set the session to a frame input mode, I am able to read the frames as expected. However, if I attempt to use any of the signal input modes, the read function only returns a constant (and, I'm assuming, default) value. In an attempt to troubleshoot, I physically disconnected my 9862 from the CAN bus, and the reading did not change. Evidently there is something about the signal input mode that I'm missing.

 

Since I can read the raw frames, I could write a VI to manually separate the payload into signals, but I'd imagine this is a fairly tedious task. Is there any way I can make XNET do the heavy lifting for me?
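For reference, this is roughly the manual split I'm trying to avoid. A minimal Python sketch, assuming the six 10-bit signals are packed back-to-back starting at bit 0 in little-endian (Intel) byte order; the real start bits and byte order would have to come from the .dbc:

```python
def unpack_signals(payload, num_signals=6, bits_per_signal=10):
    """Extract consecutive unsigned bit fields from a CAN payload (Intel byte order)."""
    raw = int.from_bytes(payload, byteorder="little")
    mask = (1 << bits_per_signal) - 1
    return [(raw >> (i * bits_per_signal)) & mask for i in range(num_signals)]

# Example: signal 0 set to 0x3FF (all ones), the other five signals zero
frame = (0x3FF).to_bytes(8, byteorder="little")
print(unpack_signals(frame))  # [1023, 0, 0, 0, 0, 0]
```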

 

 

Message 9 of 16

What session types are you using, and how are you configuring them? For Signal Session types, XNET uses the database to identify specific frames and determine how to split them into their constituent signals. If the database or items in the database are not correctly formatted, then XNET will not be able to parse out the individual signals.

It does sound like you are receiving default values, but you could confirm this is the case by changing the defaults in your database and comparing the behavior to what you are currently seeing. 

Another option would be using XNET Conversion Mode. This would involve logging the Frames to a file during your application and then opening a conversion session to convert them to signals. If you aren't sure your database is resulting in signals being pulled properly, I would recommend playing around with this some to see how XNET is interpreting Frames.
http://zone.ni.com/reference/en-XX/help/372841M-01/nixnet/modeconversion/
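As a side note, whichever route you take, the last step is the linear scaling your database defines for each signal; this is the piece XNET applies for you when the database entry is parsed correctly. A quick sketch (the factor/offset values here are invented for illustration, not from your .dbc):

```python
def to_physical(raw, factor, offset):
    """Apply the standard .dbc linear scaling: physical = raw * factor + offset."""
    return raw * factor + offset

# Hypothetical .dbc entries: a temperature with factor 0.1 and offset -40,
# and an engine speed with factor 2 and offset 0
print(to_physical(650, 0.1, -40))   # 25.0  (degC)
print(to_physical(1200, 2, 0))      # 2400  (rpm)
```

Comparing values converted this way against what your signal session returns can also help confirm whether you are only seeing database defaults.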

Charlie J.
National Instruments
Message 10 of 16