09-13-2011 05:32 AM
I'm developing a LabVIEW application that needs to change the data transfer rate between the transmitter and the receiver, similar to an adaptive-rate channel: reduce the data transfer rate when the channel quality is bad, and increase it when the quality is good. My idea is to implement a queue buffer that is written every M ms and read every N ms, and to adjust M and N based on channel-quality parameters to achieve the data-rate change.
If this method is feasible, is there anything that needs improving? Are there any examples I can refer to?
Any reply will be appreciated! Please let me know if you have any questions about what I described above.
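The write-every-M-ms / read-every-N-ms queue idea can be sketched in text form (Python here, since LabVIEW diagrams are graphical; the producer/consumer pattern maps directly onto LabVIEW's Enqueue/Dequeue Element loops). The quality-to-period mapping and all names below are my own illustration, not anything from the post:

```python
import queue
import threading
import time

def period_for_quality(quality):
    # Map a channel-quality estimate in [0, 1] to a write period M in ms.
    # Illustrative mapping (an assumption, not from the post):
    # good channel (1.0) -> 10 ms, bad channel (0.0) -> 100 ms.
    q = max(0.0, min(1.0, quality))
    return 100.0 - 90.0 * q

def producer(buf, n_items, quality):
    # Write one item into the queue every M ms; M tracks channel quality.
    for i in range(n_items):
        buf.put(i)
        time.sleep(period_for_quality(quality["value"]) / 1000.0)
    buf.put(None)            # sentinel: tell the consumer we are done

def consumer(buf, read_period_ms, received):
    # Read from the queue every N ms; a growing queue backlog means
    # the reader is currently slower than the writer.
    while True:
        item = buf.get()
        if item is None:
            return
        received.append(item)
        time.sleep(read_period_ms / 1000.0)

buf = queue.Queue()
quality = {"value": 1.0}     # shared; a monitor loop could update this
received = []

t1 = threading.Thread(target=producer, args=(buf, 20, quality))
t2 = threading.Thread(target=consumer, args=(buf, 5, received))
t1.start(); t2.start()
quality["value"] = 0.2       # channel degrades -> producer slows down
t1.join(); t2.join()
print(len(received))         # all 20 items arrive, just at a lower rate
```

Note the queue decouples the two rates: no data is lost when M and N change, only the latency and throughput vary, which is exactly what the LabVIEW queue functions give you between two loops.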
09-13-2011 10:17 AM
What is the communication medium?
RS-232, telnet, TCP/IP, GPIB, etc?
09-13-2011 10:11 PM
Thanks! We use a communication cable in a LAN system.
09-14-2011 07:56 AM - edited 09-14-2011 08:00 AM
Are you referring to something similar to what was discussed in this thread?
I don't know whether you can throttle Ethernet speed using native LabVIEW functions, but a Google search on the term "throttle ethernet" produced a long list of hits. Once you find the method that is most appropriate, you may be able to use LabVIEW to make the necessary changes to the transfer speed.
In Linux, there is an app that can do it called trickle.
http://monkey.org/~marius/pages/?page=trickle
You may be able to get the source code, see how they do it, and implement the same approach in LabVIEW (or, if they have a DLL available for your OS, that would be easier). You can also create your own DLL that is called by LabVIEW.
09-16-2011 06:32 AM
Dear Ray, thank you! Your help led me in the right direction. Thanks also to "WesChiu02" in the thread you linked, as below: