OK, here's my take on things, using my C/Unix knowledge, as the functions and the time cluster are very similar to their LabVIEW counterparts.
What seems to have been forgotten is that the routine that displays a timestamp converts the number of seconds stored in the timestamp (as the number of seconds since the epoch in the GMT time zone) into a human readable form, AND IT TAKES DST INTO ACCOUNT WHEN SO DOING.
To illustrate my point, create a vi containing a double-precision numeric control and a timestamp indicator. Set the double to show 12 significant figures and the timestamp to show the time zone. Wire them together via a "To Time Stamp" conversion.
Now set the double to 3321133199 (seconds since the LabVIEW epoch, 1 Jan 1904 UTC) and run the vi. The time shows (should show) as 00:59:59 29/03/2009 GMT Standard Time. This is one second before we switched to BST this year. Now increment the double (by 1) and re-run. The time is now 02:00:00 29/03/2009 GMT Daylight Time [ugh, horrible name - most Unix implementations call the zones GMT and BST], showing that the visual representation of a timestamp is DST-aware. [This will only work if your PC is set to the UK time zone, with DST adjustment enabled.]
Now consider your conditions:
We know that if Windows does not do DST corrections, the time is correct whatever settings you use for isDST and isUTC. This is because you have disabled the DST corrections at the Windows level, so all times are stored and manipulated in GMT and isDST is ineffective. Setting isUTC also has no effect because the system is already operating in GMT (the same as UTC).
If Windows DST is enabled...
You should enable the time zone display of your timestamp indicator to see the results more clearly.
The value contained in the isDST field tells the conversion routine whether or not the time is in DST: 0 = no, 1 = yes, -1 = "I don't know, you can work it out for yourself".
Setting isDST=-1 tells the routine to use Windows's idea of whether the time is GMT or BST, so in effect it uses isDST=0 during winter and isDST=1 during summer. The time cluster is converted to a timestamp, taking the DST setting into account. When you display that timestamp, the display routine also applies DST corrections, so the whole thing remains consistent.
The results are also consistent if you specify isDST=0 (or isUTC=TRUE) in winter, or isDST=1 in summer.
So in the above cases you specify that it's 10:00 GMT in the winter and 10:00 BST in the summer, either by explicitly setting the correct value for isDST or by implicitly setting the correct value by telling Windows to do it. The readable display of the timestamp shows the same information that you input to the cluster.
However, if you use isDST=0 (or isUTC=TRUE) in summer, you are telling the system that the time you have specified is the time in GMT. When it comes to displaying the time, the display routine converts the GMT time in the timestamp to BST for you. Hence your input of 10:00 GMT will show as 11:00 "GMT Daylight Time".
If you use isDST=1 in the winter, you are telling the system that the time you have specified is the time in BST, so it applies the DST adjustment while converting to a timestamp. When you come to display the time, the system sees that it is winter and displays the time in "GMT Standard Time"; hence your input of 10:00 BST will show as 09:00 "GMT Standard Time".
So in fact all the representations are correct, provided that you know whether the time being input or displayed is GMT or BST.
Whilst the reasoning may be different, the method for consistency is clear: allow Windows to do DST corrections and set isDST to -1, or turn Windows DST corrections off.
In support of this description, my isDST usage agrees with both the new help page for LabVIEW and the Unix man page for mktime() describing the use of tm_isdst.
EDIT: Fixed typos in para 4.