I reported what I thought was a "bug" in LabVIEW: using Mouse events (Mouse Up and Mouse Down) on a Windows 7 "touch-enabled" PC (in our case, a Dell Vostro) gave inconsistent and "wrong" results. I believe this happens because Windows "captures" the touch events and decides "for itself" whether they are Mouse Up or Mouse Down.
I realize this may not be considered a "New Idea" -- I'm posting this here on the suggestion of the NI Applications Engineer to whom I reported this "Feature".
Background -- I developed a "Push the Button" test in LabVIEW that works as follows -- one of two "buttons" on the screen lights up, and the subject is instructed to push the button if it is green, and to not push it if it is red. We time how long the subject takes to push the button, and also measure accuracy. The original test used lighted switches mounted on a board -- adding or rearranging switches involved carpentry. However, implementing this in LabVIEW, using boolean indicators as buttons and putting a conventional touch screen (which acts like a USB mouse) in front of the monitor worked very nicely. We could easily program button size, color, placement, etc., and by using Mouse Down events could tell when, and where, the subject touched the screen.
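For reference, the scoring logic of such a go/no-go trial can be sketched in ordinary code (Python here purely for illustration -- the real test is a LabVIEW VI, and all names below are made up for this sketch):

```python
def score_trial(color, reaction_s):
    """Score one "push the button" trial.

    color: "green" (subject should press) or "red" (should not press).
    reaction_s: seconds from light-up to press, or None if no press occurred.
    (Illustrative names -- not taken from the actual VI.)
    """
    if color == "green":
        # A press on green is correct; its latency is the reaction time.
        return {"correct": reaction_s is not None, "reaction_s": reaction_s}
    # Any press on red is an error; no reaction time is recorded.
    return {"correct": reaction_s is None, "reaction_s": None}

def summarize(trials):
    """Overall accuracy, plus mean reaction time on correct green trials."""
    results = [score_trial(c, r) for c, r in trials]
    accuracy = sum(t["correct"] for t in results) / len(results)
    greens = [t["reaction_s"] for t in results
              if t["correct"] and t["reaction_s"] is not None]
    mean_rt = sum(greens) / len(greens) if greens else None
    return accuracy, mean_rt
```

The point is that everything here hinges on knowing *when* the press happened -- which is exactly what the Mouse Down event gave us with the conventional touch-screen overlay.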
The only drawback was that the combination of monitor + touchscreen overlay was a little clumsy (and held together by velcro). So when an all-in-one PC with a touchscreen display (like the Dell Vostro) became available, we transferred our program to this platform.
Except it doesn't work! For example, if we use a mouse and click on a boolean control set to "Switch when pressed", the control switches when the mouse button goes down (as it should!). However, when we do the same thing with a finger, the control does not switch on touch, but only on release. The "exception" is if we leave our finger on the control -- after a second or two, a ring is drawn on the screen and the control switches, even though our finger has not moved. [Explanation -- the first scenario is a left-mouse click, the second a right-mouse click. Windows 7 interprets a long touch as "right-mouse", and cannot tell that a touch is "short" until you lift your finger from the screen.]
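A minimal model of this observed behavior makes the timing problem obvious (Python as pseudocode; the ~1 second "press and hold" threshold is approximate and, I believe, a system setting, not a value I have confirmed):

```python
PRESS_AND_HOLD_S = 1.0  # approximate threshold observed on our Vostro

def mouse_event_for_touch(touch_down_s, touch_up_s):
    """Model (as observed, not an official spec) of how Windows 7 turns a
    touch into a mouse event. Returns (event, time_delivered_s).
    """
    held_for = touch_up_s - touch_down_s
    if held_for >= PRESS_AND_HOLD_S:
        # Press-and-hold: "right-click", delivered once the threshold passes
        # (the ring animation), while the finger is still on the screen.
        return ("right-click", touch_down_s + PRESS_AND_HOLD_S)
    # Short tap: "left-click", but only delivered at release -- so a
    # "Switch when pressed" control cannot switch at the moment of touch.
    return ("left-click", touch_up_s)
```

Either way, no Mouse Down reaches my VI at the instant the finger lands, which is the timestamp my reaction-time measurement needs.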
I think this is a potential problem for LabVIEW, particularly as Touch becomes more integral to operating systems (can you say Windows 8?). Note that pre-Touch, LabVIEW "controlled" the Mouse, capturing Mouse events and presenting them to the developer, who could decide whether or not to "pass them on" to LabVIEW. For example, it is possible to detect Mouse Down on a control but, by discarding the Mouse Down event, prevent the control from being activated.
I think Windows 7 Touch is doing something like this with Touch, "capturing" the touch before "interpreting" it and sending it to LabVIEW as a Mouse action. I'd like to see LabVIEW be able to "grab" Touch events first, so that the LabVIEW developer can use (and interpret) them as needed. Otherwise, I may have to either (a) install Windows XP on this nice shiny new PC in the hopes that Touch will, in this OS, be interpreted as a Mouse Down (and my program will work correctly) or (b) give up on the Vostro and go back to my monitor + Touch Screen + Velcro.
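Incidentally, native Windows code *can* at least tell that a mouse message was synthesized from touch: Microsoft documents a signature value, `MOUSEEVENTF_FROMTOUCH` (0xFF515700), carried in the value returned by the Win32 `GetMessageExtraInfo()` function. The bit test itself is trivial (shown in Python purely as pseudocode -- on Windows the input would come from the actual API call); LabVIEW exposes nothing like it, which is exactly why I'd like first-class Touch events:

```python
# Per Microsoft's guidance on distinguishing touch/pen from mouse input,
# mouse messages that Windows synthesizes from touch or pen carry this
# signature in their "extra info" value.
MOUSEEVENTF_FROMTOUCH = 0xFF515700

def is_synthesized_from_touch(extra_info):
    """True if a mouse message's extra-info value carries the touch/pen
    signature, i.e. Windows generated it from a touch rather than a real
    mouse. (Pure bit test; on Windows, extra_info would come from
    GetMessageExtraInfo() in the message handler.)"""
    return (extra_info & MOUSEEVENTF_FROMTOUCH) == MOUSEEVENTF_FROMTOUCH
```

Even that much, surfaced on LabVIEW's Mouse Down event, would let a developer know the event came from a finger and handle it accordingly.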
I'm attaching a simple VI that will demonstrate the incompatibility between Mouse and Finger when run on a Touch-enabled Windows 7 machine.