I have been playing with the 2020 LabVIEW LINX toolkit recently and wanted to use a Raspberry Pi to revive an old FRC robot from the cRIO days, so it can be used for demonstrations without having to connect to a base PC. I just wanted to plug a generic HID joystick like the Logitech F710 into the Raspberry Pi's USB port and run the LabVIEW real-time code when the battery is plugged in. When searching for how to do this, I couldn't find any real results, so when I had a small breakthrough recently I decided to share it here so anyone else trying something similar can see my result and possibly improve upon it.
The main breakthrough came when I was researching how Linux treats human interface devices and came across this link showing that Linux exposes HID devices as device files that can be read like binary files. These files live in /dev/input/ on the Linux file system and are named jsX for joysticks, mouseX for mice, and eventX for keyboards and other input devices, where X is whatever index Linux has assigned.
I then started the code with the Read from Binary File function. Each record a /dev/input/jsX device produces follows the kernel's documented struct js_event layout: 8 bytes, little-endian on the Pi, made up of a 4-byte millisecond timestamp, a 16-bit signed value (the button state or axis position), a 1-byte event type (0x01 for buttons, 0x02 for axes, with 0x80 OR'd in for the synthetic events the driver sends when the file is first opened), and a 1-byte number identifying which button or axis the event came from. On open, the driver emits one of these init events per control. In my testing I used a Saitek ST90, which has 3 buttons and 3 axes, so the initial burst is six events. I tried a few other joysticks and, while not a comprehensive data set, they all reported the buttons first and then the axes. The Read from Binary File function produces a byte array that can then be indexed to get the relevant data. Here is a picture of my testing code.
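In text form, the decoding my block diagram does can be sketched in Python. The struct layout comes from the kernel's joystick API documentation; the helper name and dict keys are my own:

```python
import struct

# struct js_event: u32 time (ms), s16 value, u8 type, u8 number,
# little-endian on the Pi.
JS_EVENT_FORMAT = "<IhBB"
JS_EVENT_SIZE = struct.calcsize(JS_EVENT_FORMAT)  # 8 bytes

JS_EVENT_BUTTON = 0x01
JS_EVENT_AXIS = 0x02
JS_EVENT_INIT = 0x80  # OR'd into type for the synthetic events sent on open


def decode_event(raw):
    """Decode one 8-byte js_event record into a readable dict."""
    time_ms, value, etype, number = struct.unpack(JS_EVENT_FORMAT, raw)
    return {
        "time_ms": time_ms,
        "value": value,  # 0/1 for buttons, -32767..32767 for axes
        "is_button": bool(etype & JS_EVENT_BUTTON),
        "is_axis": bool(etype & JS_EVENT_AXIS),
        "is_init": bool(etype & JS_EVENT_INIT),
        "number": number,
    }
```

Reading from the device then amounts to `decode_event(f.read(8))` on a file opened in binary mode, which is essentially what indexing the Read from Binary File output does in LabVIEW.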
As far as experimenting goes, I found that opening and closing the file on each loop iteration is necessary; otherwise I was getting data overruns. I wired the output of the trigger button to an LED on the Pi in order to test the system without a computer in the loop. There is still some perceivable lag between the button press and the reaction even when running headless, but it is acceptable given that the robot is only used for demonstrations. I tested this on an older Raspberry Pi 2B, so I don't know whether a faster RPi 4 would improve performance. I briefly experimented with grabbing mouse and keyboard data, but decoding that data is less straightforward, so I may revisit it another day. If anyone has improvements to make the sampling faster or to decode the mouse and keyboard data, it would be great to see them. Otherwise, I hope this code helps anyone trying to read USB joystick values from a Raspberry Pi.