Font style shortcuts should be provided on the front panel so the user can change the font style of labels/comments.
Styles like Bold, Italic, Underline, and Strikeout should be supported, along with shortcut commands to increase/decrease the size of a label/comment.
I like the 'Group Libraries' option in the VI Hierarchy window, but I'd like it a lot more if it worked for classes as well as lvlibs.
It would also be nice if there were a way to have it start up with those groups collapsed. Navigating a large VI hierarchy is painful on my laptop.
Why not have an indicator/function in LabVIEW to display yield, like in Vision Builder AI? This could display the results (Pass, Fail, Total) in a standard format. Most applications could use this, especially all Vision Development applications (which is why it is included in Vision Builder AI).
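To illustrate the idea, here is a minimal Python sketch of what such a standard yield readout would compute; the function name and output format are my own assumptions, not anything from Vision Builder AI:

```python
def format_yield(pass_count: int, fail_count: int) -> str:
    """Return a standard-format inspection summary: Pass / Fail / Total / Yield %."""
    total = pass_count + fail_count
    # Guard against division by zero before any parts have been inspected.
    yield_pct = 100.0 * pass_count / total if total else 0.0
    return f"Pass: {pass_count}  Fail: {fail_count}  Total: {total}  Yield: {yield_pct:.1f}%"

print(format_yield(98, 2))  # → Pass: 98  Fail: 2  Total: 100  Yield: 98.0%
```

The same four values are what the proposed indicator would display after each inspection cycle.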
Hello everyone,
I am addressing you in the language of Molière because I have not mastered the language of Shakespeare; please excuse me.
I am developing a graphical user interface. I need to hide certain controls and certain indicators. I cannot write "(hidden)" in the label, because I wire these controls/indicators to the subVI's connector pane. Nor can I write "(hidden)" in the caption, because the caption takes the place of the label on the connector pane (which makes no sense!).
To avoid digging through the diagram in search of the hidden element, I turned to the search tool, which in principle exists to find elements easily. I hoped to find a "search by property" field so I could tick a "hidden control/indicator" box.
So I propose several points for improvement:
- the caption should not take the place of the label on the subVI connector pane;
- the search function should allow searching by object category (control, indicator, decoration, ...);
- the search function should allow searching by property (enabled, grayed out, hidden, skip when tabbing, ...).
In short, whoever takes over the program after me will have quite some difficulty finding all the hidden indicators...
Unless I have missed something (!)
Thank you!
P.S.: I do not trust automatic translators...
Hi, the first time I looked at LabVIEW code I got the feeling that you can't figure out what it means just by looking at it carefully. You are met with all these cryptic-looking blocks that give no information at all. After some heartbreak you stumble across "Show Context Help", which should be shown by default.
From the Context Help you can go to the detailed help, but the help examples are not minimalistic; instead they are often big applications.
For instance, the example found in the help section for the Euler differential equation solver is an application that shows what can be done with LabVIEW, not a help for using that particular function.
A pretty smart thing you do is to have an example written on the panel when opening the Euler solver block. An even smarter approach would be to have an equation already in place, so that the user only has to modify it.
The key concept to make things easy is to let the user modify a suggested solution.
Couldn't all VIs accessible from the Functions palette come in that fashion? Instead of filling in some required number or string in a VI, you would get suggestions. You already have default values on the unwired inputs and outputs of a VI, and that is good, but you have to take it to yet another level.
Take, for example, when you want to plot the output from a differential equation solver. You have to transpose and do all kinds of gymnastics to get it across to an XY graph. Why can't LabVIEW suggest a solution? Or if you must cast an integer to a string, why can't that be automated, so that LabVIEW inserts the required converter automatically?
Why should the user get stuck at broken wires? Suggest something instead, and if the user is not satisfied, present valid alternatives.
Let graphical programming be easier. Compare the MATLAB command "plot(X,Y)" for a differential equation solver's output with LabVIEW's code to present it on a waveform graph (you have to put a transpose in between. Horrible!), or on an XY graph (horrible, horrible!).
I think the vector operations are horrible to understand. Use minimalistic examples accessible from the Context Help.
Make whole video courses available on YouTube, etc. Why can't the same courses you can attend in real life be available on the internet as video lectures? Isn't it better to have free access to LabVIEW education, so that many, many more people start to use it, instead of NI making peanuts from live LabVIEW education? You shouldn't stop offering instructor-led courses in a real NI classroom, but I don't think you maximize the spread of LabVIEW with courses that cost 10,000 kronor in Sweden to attend. Who can afford that after paying for the LabVIEW hardware and software?
I'm glad if this feedback of mine can help you make LabVIEW even greater,
Adding an amplitude offset terminal to the FPGA Sine Wave Generator VI would add the capability to remove the negative component of the sine wave within the VI.
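The math behind the request is just a DC offset added to each sample; with an offset at least as large as the amplitude, the output never goes negative. A minimal Python sketch (the function is hypothetical, standing in for the proposed extra terminal):

```python
import math

def sine_sample(phase: float, amplitude: float, offset: float) -> float:
    """One sample of an offset sine: offset + amplitude * sin(phase).
    If offset >= amplitude, the result is never negative."""
    return offset + amplitude * math.sin(phase)

# With amplitude 1.0 and offset 1.0 the waveform swings between 0 and 2,
# so the negative half of the sine is lifted above zero.
samples = [sine_sample(2 * math.pi * i / 8, 1.0, 1.0) for i in range(8)]
assert min(samples) >= 0.0
```

On the FPGA this would be a single adder after the existing sine lookup, which is why exposing it as a terminal seems cheap.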
One simple idea that could be helpful to all:
Many times we have an array in which we need to find the blank elements that may be present in between. For that we need to check each element and verify whether it is blank or not (for numerics, this is a comparison for zero vs. non-zero).
A function could be made available in the Array palette that outputs an array containing no blank elements.
The input would be the array from which the blank elements need to be removed.
I come from an embedded background.
I came across this idea as I became more and more comfortable using LabVIEW. It would be good to develop a system not only for testing, control, and/or industrial purposes. We could have an exclusive development environment not only for developing applications on the desktop or on embedded devices, but also at the OS (system) level.
As cloud computing catches on these days, there could also be support for the JVM, or a new virtual machine could be developed that is closer to the assembly level, so that speed is not compromised and platform-independent code can be developed using LabVIEW. If that is a proprietary issue, then a complete development environment, from applications down to the core of the OS, could be built using LabVIEW.
LabVIEW could be used in such a way that even the basic coding need not be done; it could be a complete replacement for a scripting language or a conventional language. It should be possible to develop a new standalone system using only LabVIEW, just as with C. Ubuntu is catching on these days as an intelligent system with a lot of potential to reach all computer users, so support for it could also be developed, compiled directly to assembly rather than via C or any other language.
I know the idea is quite complicated, but I think this could be a revolution in software development: a new language that goes straight to assembly on the machines.
If we want to move a decoration object on the front panel, we can use code similar to this one:
On one hand, this is quite complicated. On the other hand, you don't know at this stage which front panel decoration object you are currently moving (in this case, each one).
I think it would be quite helpful if you could create property nodes for decoration objects as you would for normal controls or indicators.
As expected, using the Value or Value (Signaling) property on a button with a latch mechanical action generates error 1193 when this code executes.
Run this code? Yes, the Run arrow is unbroken in this case. I suggest breaking it instead, just as when trying to write to a local variable of such a button.
Very often you get into the situation where you always have to do the same hundred clicks in the same order.
My suggestion: make shortcuts for every LabVIEW menu item, and also make the context menus user-definable (right-click on controls etc., e.g. "set control to typedef" or "set cluster autosize to arrange vertically"). The same applies to cluster constants on the block diagram, not only to controls/indicators on the front panel. Additionally, it might really be an option to record little macros and make those macros available via shortcuts, as you are used to in MS tools.
Comment: I know you can do a lot of this with VI Server, but nevertheless, recording macros as a built-in LabVIEW feature would just be more comfortable...
When you use clusters or enums, you will usually have to define them as type definitions. LabVIEW should do this by default, because it is the most common option, at least for clusters.
But there are other settings, like autosize, which you do not want to re-edit every time you create a new cluster type definition.
My suggestion is to make clusters (and enums) type definitions by default, and additionally to make the cluster default settings configurable (maybe this is already standard in LabVIEW, but if so, it's not well documented; at least I haven't found the option...).
Give LabVIEW full parity with the CVI 'Network Variables' API.
CVI has many useful API functions, such as adding an NSV to an existing deployed process or registering for SV change events, that are impossible in LabVIEW.
This is a bit esoteric, but those who use the DSC toolkit to read trace data from a Citadel database will find that you cannot always obtain the most current value of a trace. Here is the text describing what a ghost point is:
Citadel uses ghost points to maintain real-time awareness of traces that do not change frequently.
For example, suppose you configure a thermocouple measurement for logging with a deadband of 1°F. The thermocouple is in an environment that remains at a constant temperature most of the time, but you need to know when any major changes occur. If the temperature remains within 1°F for one hour, Citadel does not log redundant points, even if the temperature is sampled once every five seconds. However, if the logging system experiences a failure at some point during the sampling, you want to know the temperature at the time of the system failure. You also want to know the approximate time of the failure. Citadel handles this situation by logging a single ghost point for each trace. The ghost point in each trace updates constantly at a rate based on the time group specified for the trace. The ghost point always reflects the most recent time and value of a trace.
Citadel stores ghost points in a .tgpf file that updates once every ten seconds. This file ensures that Citadel has an accurate record of the value of every trace at the time of a system failure.
It would be very useful to be able to specify in the trace read VI that you would like it to return the ghost data points. Otherwise, it is very cumbersome to detect that you are missing this data point and then have to programmatically read it yourself from the originating data system.
The LabVIEW built-in semaphore is a "software semaphore" created with a simple queue.
I would like a semaphore that can manage exclusion between threads, processes, and compilation units.
Currently, the built-in semaphore cannot be taken twice by the same process, thread, or compilation unit, because it does not keep track of which process tried to take it.
You can easily test this by creating a simple VI that calls "Take Semaphore" twice... you will deadlock yourself!
I would like a one-token semaphore, or mutex, that can be taken multiple times by the owning process.
My original need was to make an existing instrument driver thread-safe, without having to build an over-complicated contraption or modify an existing application architecture!
So I started every VI of the instrument driver with a "Take Semaphore" and finished it with a "Release Semaphore"...
But some of the VIs were called by others... and then: deadlock!
The solution would be to place the take and release calls in an intelligent way, but I am sure I would forget some special cases...
Hence the idea of a new process/thread/compilation-unit mutex.
The behavior of the new semaphore would be almost the same as the existing one; the only difference is that the owning process, thread, or compilation unit would have the right to take the semaphore more than once without creating a deadlock.
I think the behavior I describe here is the behavior of most existing real-time systems, like VxWorks, iRMX, OS-9...
It would be nice to have such a tool, in order to simplify resource exclusion!
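What the post asks for is commonly called a reentrant (recursive) mutex. A short Python sketch of the distinction, using the standard `threading` module: a plain `Lock` behaves like LabVIEW's one-token semaphore and would deadlock on a second acquisition by the same caller, while an `RLock` tracks its owner and a recursion count, so nested acquisitions succeed:

```python
import threading

# RLock is a reentrant mutex: the thread that owns it may acquire it
# again without blocking; each acquire must be matched by a release.
rlock = threading.RLock()

def outer():
    with rlock:          # first acquisition by this thread
        return inner()   # calls into another routine that also locks

def inner():
    with rlock:          # re-acquired by the same owner: no deadlock
        return "done"

print(outer())  # → done
```

This is exactly the nested-VI scenario described above: `outer` and `inner` stand in for two driver VIs that each take the semaphore, and the reentrant behavior is what prevents the self-deadlock.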