
LabVIEW Idea Exchange

Bob_Schor

Fix inconsistency between LabVIEW mouse events and Windows 7 Touch

Status: New

I reported what I thought was a "bug" in LabVIEW: using Mouse events (Mouse Up and Mouse Down) on a touch-enabled Windows 7 PC (in our case, a Dell Vostro) gave inconsistent and "wrong" results, I believe because Windows "captures" the touch events and decides "for itself" whether they are Mouse Up or Mouse Down.

 

I realize this may not be considered a "New Idea" -- I'm posting this here on the suggestion of the NI Applications Engineer to whom I reported this "Feature".


Background -- I developed a "Push the Button" test in LabVIEW that works as follows -- one of two "buttons" on the screen lights up, and the subject is instructed to push the button if it is green, and not to push it if it is red.  We time how long the subject takes to push the button, and also measure accuracy.  The original test used lighted switches mounted on a board -- adding or rearranging switches involved carpentry.  Implementing this in LabVIEW, however, using Boolean indicators as buttons and putting a conventional touch screen (which acts like a USB mouse) in front of the monitor, worked very nicely.  We could easily program button size, color, placement, etc., and by using Mouse Down events could tell when, and where, the subject touched the screen.

 

The only drawback was that the combination of monitor + touchscreen overlay was a little clumsy (and held together by Velcro).  So when an all-in-one PC with a touchscreen display (like the Dell Vostro) became available, we transferred our program to this platform.

 

Except it doesn't work!  For example, if we use a mouse and click on a Boolean control set to "Switch when pressed", the control switches when the mouse button goes down (as it should!).  However, when we do the same thing with a finger, the control does not switch on touch, but only on release.  The "exception" is if we leave our finger on the control -- after about a second or two, a ring is drawn on the screen and the control switches, even though our finger has not moved.  [Explanation -- the first scenario is interpreted as a left-mouse click, the second as a right-mouse click.  Windows 7 interprets a long touch as "right-mouse", and cannot tell that a touch is "short" until you let go of the screen.]

 

I think this is a potential problem for LabVIEW, particularly as Touch becomes more integral to operating systems (can you say Windows 8?).  Note that pre-Touch, LabVIEW "controlled" the Mouse, capturing Mouse events and presenting them to the user, who could decide whether or not to "pass them on" to LabVIEW.  For example, it is possible to detect Mouse Down on a control, but, by dismissing the Mouse Down event, prevent the control from being activated.

 

I think Windows 7 Touch is doing something like this with Touch, "capturing" the touch before "interpreting" it and sending it to LabVIEW as a Mouse action.  I'd like to see LabVIEW be able to "grab" Touch events first, so that the LabVIEW developer can use (and interpret) them as needed.  Otherwise, I may have to either (a) install Windows XP on this nice shiny new PC in the hopes that Touch will, in this OS, be interpreted as a Mouse Down (and my program will work correctly) or (b) give up on the Vostro and go back to my monitor + Touch Screen + Velcro.
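
For readers curious what "grabbing Touch first" looks like underneath, here is a minimal Win32 sketch in C (LabVIEW itself is graphical, so C against the raw OS API has to stand in).  On Windows 7, RegisterTouchWindow() asks Windows to deliver raw WM_TOUCH messages to a window instead of synthesizing mouse gestures from the touches.  The API names below are the real Win32 ones; the window boilerplate is omitted and the handler names are my own:

#define _WIN32_WINNT 0x0601   /* Windows 7: needed for the WM_TOUCH declarations */
#include <windows.h>
#include <stdio.h>

/* Call once after the window is created: from then on this window receives
   raw WM_TOUCH messages instead of Windows-synthesized mouse gestures. */
void EnableRawTouch(HWND hwnd)
{
    RegisterTouchWindow(hwnd, TWF_WANTPALM);  /* deliver touches without the palm-rejection delay */
}

/* Handler for WM_TOUCH inside the window procedure. */
LRESULT HandleTouch(HWND hwnd, WPARAM wParam, LPARAM lParam)
{
    TOUCHINPUT inputs[16];
    UINT count = LOWORD(wParam);              /* number of touch points in this message */
    if (count > 16) count = 16;

    if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT)))
    {
        for (UINT i = 0; i < count; i++)
        {
            /* x/y arrive in hundredths of a pixel, with a millisecond timestamp --
               touch down/up is reported immediately, with no press-and-hold guessing. */
            if (inputs[i].dwFlags & TOUCHEVENTF_DOWN)
                printf("Touch DOWN at (%ld, %ld), t = %lu ms\n",
                       inputs[i].x / 100, inputs[i].y / 100, (unsigned long)inputs[i].dwTime);
            else if (inputs[i].dwFlags & TOUCHEVENTF_UP)
                printf("Touch UP   at (%ld, %ld), t = %lu ms\n",
                       inputs[i].x / 100, inputs[i].y / 100, (unsigned long)inputs[i].dwTime);
        }
        CloseTouchInputHandle((HTOUCHINPUT)lParam);
        return 0;   /* handled: Windows will not synthesize mouse events for these touches */
    }
    return DefWindowProc(hwnd, WM_TOUCH, wParam, lParam);
}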

 

I'm attaching a simple VI that will demonstrate the incompatibility between Mouse and Finger when run on a Touch-enabled Windows 7 machine.

 

7 Comments
Bob_Schor
Knight of NI

I discovered a work-around to the problem that "Touch" on the Dell Vostro 360 does not equal Mouse Down (and hence cannot be used as a LabVIEW Event).  My idea was to dual-boot to Windows XP, but Dell doesn't provide any XP drivers.  So I installed VMware and a Windows XP VM, and my test program works the way I think it should: touching the screen is a Mouse Down, removing your finger from the screen is a Mouse Up, and it doesn't matter how long you hold your finger on the screen.

So until NI figures out how to handle the "Touch" Event (I recommend they "capture" it early, treating it as Coordinates + Touch On/Off + Time, with perhaps later refinements, if warranted, for "multi-touch"), I can continue to use my "TouchScreen" program on a new and very nice PC.
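
To make that recommendation concrete, a record along those lines might look like the sketch below.  These names are purely illustrative -- this is not an existing NI API, just the shape of the data I have in mind:

/* Hypothetical "Touch" event record: Coordinates + Touch On/Off + Time,
   with a contact ID as the later multi-touch refinement.  Illustrative only. */
typedef struct {
    double x, y;          /* panel coordinates of the contact       */
    int    touchDown;     /* 1 = finger down, 0 = finger lifted     */
    double timestampMs;   /* when the contact occurred              */
    int    contactId;     /* distinguishes fingers, for multi-touch */
} TouchEvent;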

X.
Trusted Enthusiast

Just curious: what kind of temporal precision do you expect from these touch-sensitive screens? What precision do you need? Are there "user warming up" effects, as well as "tired user" ones? What's the typical delta_t between press and release for the average user? For a given user? How much of that is due to the touchscreen versus the user?

Bob_Schor
Knight of NI

I'm happy to respond to this, but this doesn't seem to be the appropriate place.  Is there an appropriate forum for such a discussion, or should we just use straight e-mail?

X.
Trusted Enthusiast

PM is a possibility, but since it might be of general interest, it could be posted in the "Breakpoint" forum (or general "LabVIEW" forum).

Bob_Schor
Knight of NI

I found a work-around!  When I examined the "Devices" in Device Manager, I found one called "Microsoft Input Configuration Device", with a driver called MTConfig.sys (MT = MultiTouch?).  If you disable the driver and reboot, the touchscreen starts acting like a Mouse, generating the expected Mouse Up and Mouse Down events, as well as providing the coordinates of the Touch -- exactly what I needed.

 

This "fix" works in Windows 7.  Who knows if it will be this simple in Windows 8?

AristosQueue (NI)
NI Employee (retired)

Bob: I've brought this idea to the attention of folks who work on events. Even though the idea hasn't received many kudos, I felt we needed to be more aware of this issue and perhaps look into it. Thanks for posting this.

Bob_Schor
Knight of NI

This is a comment on my work-around.  We found a curious thing.  For most of our application, we put indicators on the screen, detected Mouse Down and Mouse Up events, and "measured" the distance from the touch/click to the center of the indicator to decide which button the user was trying to push.  This worked just fine once we disabled multi-touch.

 

However, my student wanted to add "one more feature".  We added a control, a square button, and detected "Value Change" events on this control.  We used the Value Change event to decide which of two controls was chosen, and continued to use the Mouse events for timing and program control.  To our surprise, we would sometimes touch the screen and the system would act as though we had right-clicked the control, bringing up an unexpected menu!

 

The "fix" was easy, once we realized what was happening.  We changed the Event from Value Changed to Mouse Down? (the "filtered" version), forcing the "button" setting on the output slot to 1, "left", thus preventing the touch from being (mis-)interpreted as a right-mouse click.