University: University of Leeds
Team Member(s): Chris Norman, Dominic Clark and Barnaby Cotter
Faculty Advisors: Prof Martin Levesley, Dr Pete Culmer
Country: United Kingdom
Below is a list of the applications that have been developed as part of the project.
VTOL Rig Control
The video below shows an example of how the Microsoft Kinect can be interfaced with LabVIEW to produce an intuitive control system. Using the Kinect's skeleton tracking function, we show how a Vertical Takeoff and Landing (VTOL) demonstration rig can be controlled with the body, interfacing via an NI DAQ board. Users have the option to control the rig with a hand controller, a signal generator, or through body position using the Kinect. In this last case, the angle of the spine is calculated on the fly and used to control the voltage supplied to the two motors. Additionally, the system can be operated through either a manual or a fly-by-wire control scheme.
Figure 1: Kinect and LabVIEW controlled VTOL demonstration rig.
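The spine-angle control step can be sketched as follows. This is an illustrative Python sketch rather than the LabVIEW implementation; the joint coordinates, maximum lean angle and voltage limits are all invented for the example.

```python
import math

def spine_angle(hip, shoulder):
    """Angle of the spine vector from vertical, in degrees.
    hip and shoulder are (x, y, z) joint positions from skeleton tracking."""
    dx = shoulder[0] - hip[0]
    dy = shoulder[1] - hip[1]
    return math.degrees(math.atan2(dx, dy))

def angle_to_voltage(angle_deg, max_angle=30.0, v_max=5.0):
    """Map a lean angle to a DAQ output voltage, clamped to +/- v_max."""
    v = (angle_deg / max_angle) * v_max
    return max(-v_max, min(v_max, v))
```

For example, an upright spine gives 0 V, while a 15-degree lean (half the assumed maximum) maps to half the full-scale voltage.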
3D Model Explorer
The video below demonstrates one of the applications produced using the Kinesthesia Toolkit. Users' gestures are picked up by the Kinect's skeleton tracking and used to manipulate the 3D picture control within LabVIEW. In the example program, any .stl file can be loaded into the picture control: an .stl parser finds the size of the model, and user manipulations are scaled to match. In the video, a simple cube and an .stl model of a colon, obtained through post-processing of a CT (Computed Tomography) scan, are used. This program has potential applications within operating theatres, but it is equally suited to manipulating any 3D data, such as CAD models, or, as the team has also demonstrated, navigating slides in a PowerPoint presentation.
Figure 2: Gesture based 3D picture control manipulation
In addition to developing a driver set and toolkit for the Kinect, we are targeting our application development at three key areas:
Stroke, the disturbance of blood supply to the brain, is the leading cause of disability in adults in the USA and Europe, and has a profound effect on the quality of life of those affected. The physical effects of stroke are numerous, and there exists a substantial field of study devoted to the assistance and rehabilitation of stroke sufferers. It is a complex and multidisciplinary affair, requiring constant, monitored cognitive and physical therapy. As such, there is a need for a system that provides mental stimulation to the patient whilst accurately recording body position and movement, allowing physiotherapists both to maintain patient motivation and to extract detailed information on patients' movements.
With the advent and continued development of laparoscopic (keyhole) surgery, it is essential that the operating surgeon has accurate and up-to-date information on an area that may not be directly visible to them. Before operating, the abdomen is usually inflated to provide the surgeon with the necessary space to both view and access the internal organs. However, it is difficult for the surgeon to accurately gauge the level of inflation and the working space available to them. A tool that can determine the inflation of the abdomen, and provide an estimate of the inflated volume, would increase the safety of procedures and give the surgeon more information about the patient prior to operation.
It has been suggested that a person's gait is more unique than their fingerprint. Indeed, the way in which we walk can offer insight into a number of medical problems that may take longer to present in other forms. Stroke, for example, can result in a defined limp on one side of the body, and the extent of this limp may offer further information on the extent of the stroke. Similarly, gait analysis can reveal problems in the hip, knee or ankle joints, and can suggest not only that a joint replacement is required but also which type of replacement would be suitable. A low-cost investigative tool that can be used prior to expensive specialist referrals would offer a significant benefit to clinicians.
The Microsoft Kinect has already revolutionised the gaming industry with its ability to track users' motions, marking a key movement away from traditional control systems. This project aims to take advantage of Microsoft's innovative technology and interface it with NI LabVIEW, via the development of a fully functional LabVIEW driver and toolkit. With this in place, we are developing a selection of motion tracking tools and programs specifically aimed at tackling the three challenges above:
Figure 3: Kinect Sensor
Currently, a number of technologies exist to track patient movements. However, not only can these systems cost upwards of £40,000 per camera, they also require the user to wear markers placed on the skin or clothing, and are often significantly more accurate than is necessary, providing no extra insight into patients' movements at a large capital expense. As such, a significant advantage is presented by an easy-to-use, Kinect-based system that can produce similar results at a fraction of the cost. We have developed a system that records normal camera footage of a patient alongside a full 3D rendering of the patient's skeleton, allowing the operator to rotate and explore the patient's movements. This VI embeds the skeletal data directly into an .avi file. A further VI allows this video to be reviewed, edited and saved, giving cherry-picked footage of a physiotherapy session that provides a 3D, rotatable rendering of the patient's skeleton alongside the raw video footage, as can be seen in Figure 4.
Figure 4: Video analysis suite developed for physiotherapists to play back, edit and analyse data recorded from assessments.
Further to this, a virtual stroke rehabilitation environment has been created within LabVIEW that provides patients with tasks to perform based upon the industry-standard ARAT (Action Research Arm Test). These are intended to mimic day-to-day human movements; an example of this application can be seen in Figures 5 and 6.
Figure 5: A demonstration of the Virtual ARAT test being developed, showing the user's arms moving in real time with camera control maintained by head position.
Figure 6: The 3D scene created for the Virtual ARAT tests.
Gesture based manipulation of the .stl files allows surgeons to manipulate CT scans and 3D models wirelessly without the need to leave the sterile operating environment.
As with stroke rehabilitation, gait analysis is often undertaken by applying various sensors to the patient's body, whose locations are then fed into the computer system. This is generally a long, drawn-out process which may cause patient discomfort. Using the Kinect to track the user's skeleton, gait can be analysed quickly, and important metrics concerning the patient can be calculated in LabVIEW and fed back to the operator. Figure 7 shows the real-time position tracking of a patient's right hand in the X axis. A user can select any of the 20 joints to track and access a number of metrics on that joint.
Figure 7: An example of the data that can be obtained on the fly from the system. The program allows the smoothness and magnitude of movements to be monitored, as well as the velocity and acceleration of any one of the 20 joints in the X, Y and Z axes.
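Velocity and acceleration metrics of the kind shown above can be derived from the sampled joint positions by numerical differentiation. The sketch below is a minimal Python illustration, assuming a uniformly sampled position trace at the Kinect's nominal 30 frames per second; it is not the toolkit's own implementation.

```python
def derivative(series, dt):
    """Central-difference derivative of a uniformly sampled 1-D series.
    Returns a series two samples shorter than the input."""
    return [(series[i + 1] - series[i - 1]) / (2 * dt)
            for i in range(1, len(series) - 1)]

dt = 1 / 30.0  # the Kinect skeleton stream runs at roughly 30 frames per second

# Example joint trace: position following x = 0.5 * a * t^2 with a = 9.81
positions = [0.5 * 9.81 * (i * dt) ** 2 for i in range(10)]
velocity = derivative(positions, dt)      # 8 samples
acceleration = derivative(velocity, dt)   # 6 samples, each recovering a = 9.81
```

Applying the same differentiation to each of a joint's X, Y and Z traces gives the per-axis velocity and acceleration reported to the operator.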
To assess the accuracy of the Kinect's skeletal tracking, we tested it against the system currently used by the university, Optotrak. This camera system is the current industry benchmark and is capable of extremely accurate marker tracking, down to sub-millimetre levels. The system at the university is currently set up to track arm motion in stroke patients, monitoring the speed, accuracy and fluidity of motion. Small wired sensors are attached at various points on the user's upper body; their positions are determined by a three-lens camera and relayed to a LabVIEW processing environment. The high accuracy of this system makes it an ideal validation tool, effectively providing a gold standard against which we can compare the Kinect. We used what is known as a 'far reach' exercise for our initial validation: the user sits up straight in a chair and attempts to touch a target on an LCD screen with their hand whilst maintaining a vertical trunk. At a later date, we performed full standing validation for compound upper-body movements.
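A comparison of this kind reduces to measuring the disagreement between two time-aligned joint trajectories, one from the Kinect and one from the gold-standard system. A minimal Python sketch of such a metric (the function name and trajectory format are our own, for illustration) is:

```python
def rms_error(ref, test):
    """Root-mean-square Euclidean distance between two synchronised 3-D
    joint trajectories, each a list of (x, y, z) samples."""
    assert len(ref) == len(test), "trajectories must be time-aligned"
    total = sum(sum((a - b) ** 2 for a, b in zip(p, q))
                for p, q in zip(ref, test))
    return (total / len(ref)) ** 0.5
```

Feeding the Optotrak marker trace in as the reference and the corresponding Kinect joint trace as the test series gives a single accuracy figure for the exercise.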
Below are photos of the Optotrak system being used to record our primary validation measurements.
Figure 8: Optotrak™ camera, an industry-standard motion tracking system capable of sub-millimetre accuracy.
Figure 9: A team member using the Optotrak rig developed for stroke patient reach exercises, being compared with the Kinect (visible in the upper right).
Figure 10: Optotrak rig from another angle, showing the LCD screen which the patient must attempt to touch.
Figure 11: Video footage of the Optotrak rig in use.
The toolkit is now complete, and allows the user to initialise and close the Kinect's different components (RGB camera, depth camera and skeletal tracking) through polymorphic VIs, dependent on the functionality they require.
The toolkit is formatted in accordance with the official National Instruments driver layout:
Figure 12: Operational Layout for the Kinect LabVIEW toolkit.
Figure 13 shows the block diagram of the polymorphic Kinesthesia toolkit. A developer simply has to drag these four VIs onto a block diagram and can select which data streams they wish to access.
Figure 13: The complete Kinesthesia Toolkit, allowing access to all data streams from the Kinect through the polymorphic VIs.
The Kinesthesia toolkit can be added to the LabVIEW Functions palette, allowing simple drag-and-drop access to the Kinect sensor's features.
Figure 14: Kinesthesia Toolkit accessible from the Instrument Drivers panel on the Functions Palette
A number of sub-VIs are either complete or approaching completion, relating to the processing, extraction and display of data from the Kinect. Figure 13 shows the polymorphic VIs that the user can simply drag into the block diagram; the code displayed lets the user access the video, depth and skeleton data. In the example below, the code also produces a 3D picture control plot of the skeleton on the fly. The toolkit is colour coded such that initialise, configure, read, display and close are shown in green, purple, blue, orange and red, respectively.
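The initialise, read and close sequence above can be expressed in text form as a rough analogue. The class and method names below are invented for illustration; the real toolkit is graphical LabVIEW code, and this Python sketch only mirrors the dataflow pattern of its VI chain.

```python
class KinectSession:
    """Hypothetical text-language analogue of the toolkit's VI chain:
    initialise -> read (in a loop) -> close."""

    def __init__(self):
        self.streams = set()
        self.running = False

    def initialise(self, streams):
        # Analogue of the polymorphic Initialise VI: choose which of the
        # sensor's components (rgb, depth, skeleton) to start.
        self.streams = set(streams)
        self.running = True
        return self

    def read(self):
        # Analogue of the Read VI: return one frame per requested stream.
        if not self.running:
            raise RuntimeError("sensor not initialised")
        return {s: "frame" for s in self.streams}

    def close(self):
        # Analogue of the Close VI: release the sensor.
        self.running = False

session = KinectSession().initialise(["rgb", "skeleton"])
frame = session.read()
session.close()
```

In LabVIEW the same sequence is wired left to right on the block diagram, with the read stage sitting inside a while loop.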
The system has great scope for future development, and we look forward to seeing what other people use the LabVIEW toolkit for when it is released on the LabVIEW Tools Network. In relation to furthering our own work, the system has the potential to allow a rehabilitation user to be completely independent. After the initial meeting with a physiotherapist, and with a rehabilitation regime in place, the user could perform all the tasks at home, with the data, including recorded video, sent back to the physiotherapist, who could analyse the information remotely, a practice known as telerehabilitation. This would be especially useful for users who currently struggle to leave the house, or who may feel burdened by regular rehabilitation appointments.
Work will be ongoing to allow for upper-body tracking of the skeleton as new releases of the Kinect SDK become available.