LabVIEW


UDP - Max "lossless" speed?

Hey everyone.  Anybody out there know what the max lossless speed LabVIEW's UDP is capable of?  I read HERE that I can increase the buffer size to possibly help with data loss, but I've seen no change in how much data I'm receiving.  The device I'm listening to says it streams data at 600 Mb/s, but I'm counting about 1.8 Mb/s, and I keep getting Error 113.  According to the LabVIEW knowledge base, there seems to be a recommendation to use a different application to read UDP data.  Anyone care to chime in on any of this and offer advice?  I'd greatly appreciate it.
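For context, the buffer-size tweak that article describes is the OS-level socket receive buffer. Since LabVIEW code is graphical, here's a rough Python-sockets sketch of the same idea, just to illustrate what's being changed (the port number and the 8 MiB request are placeholders, not values from my setup):

import socket

# Rough sketch (Python analogue, not LabVIEW): enlarge the OS-level UDP
# receive buffer so short bursts aren't dropped before the application reads.
LOCAL_PORT = 61556  # placeholder port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Ask the OS for a multi-megabyte receive buffer; the OS may cap the request
# (e.g. net.core.rmem_max on Linux), so read the value back to see what stuck.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 8 * 1024 * 1024)
granted = sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
sock.bind(("", LOCAL_PORT))
print(f"requested 8 MiB, OS granted {granted} bytes")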

Message 1 of 11

We probably need to know much more about your application, what it does, and how it does it. Each packet has headers for each layer (Ethernet, IP, UDP, etc.), so if you are sending one byte at a time, the overhead is most of the transmission.
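To put rough numbers on that overhead (standard Ethernet/IPv4/UDP header sizes, nothing measured from your device):

# Back-of-envelope: how much of the wire rate is actual payload per datagram.
ETH_OVERHEAD = 14 + 4 + 8 + 12   # Ethernet header + FCS + preamble + inter-frame gap
IP_UDP_HEADERS = 20 + 8          # IPv4 header + UDP header

for payload in (1, 100, 548, 1472):
    wire_bytes = payload + IP_UDP_HEADERS + ETH_OVERHEAD
    efficiency = payload / wire_bytes
    print(f"{payload:5d}-byte payload -> {efficiency:.1%} of the wire rate is data")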

 

What is your connection? 1000 Mb/s? 100 Mb/s? Wi-Fi? And how are you "counting"?

Message 2 of 11

Yes, I've considered that the overhead may be factored into that 600 Mb/s.  I am using a 1000 Mb/s wired connection.  I'm counting the bytes I actually receive from UDP Read, which of course excludes the overhead.  So with the constant Error 113, I was mostly curious what LabVIEW's maximum UDP read speed is.  Trying to find some answers to the error.

Message 3 of 11

@DailyDose wrote:

Yes, I've considered that the overhead may be factored into that 600 Mb/s.  I am using a 1000 Mb/s wired connection.  I'm counting the bytes I actually receive from UDP Read, which of course excludes the overhead.  So with the constant Error 113, I was mostly curious what LabVIEW's maximum UDP read speed is.  Trying to find some answers to the error.


Have you seen this link?  Hopefully it can give you some ideas.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 4 of 11

I doubt LabVIEW does anything special or reinvents the wheel here; I am sure it just uses the standard socket libraries.

 

You still haven't told us how many bytes you receive per iteration and what you then do with them in the same loop. If you choose UDP, you need to accept loss and deal with it.
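The usual fix is a producer/consumer structure: UDP Read in a tight loop that only enqueues the datagrams, with all parsing in a separate loop. A rough textual sketch of that pattern in Python (the port and read size are placeholders, since this obviously isn't LabVIEW code):

import socket
import threading
import queue

LOCAL_PORT = 61556    # placeholder port
READ_SIZE = 65535     # large enough for any UDP datagram

datagrams = queue.Queue()

def receiver():
    # The receive loop does nothing but pull datagrams and hand them off,
    # so the OS receive buffer never backs up.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LOCAL_PORT))
    while True:
        data, _addr = sock.recvfrom(READ_SIZE)
        datagrams.put(data)

def consumer():
    while True:
        data = datagrams.get()
        # parse / log / accumulate here, at whatever pace you like
        pass

threading.Thread(target=receiver, daemon=True).start()
threading.Thread(target=consumer, daemon=True).start()
threading.Event().wait()   # keep the main thread alive

The point is that nothing slow ever sits between two consecutive reads.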

Message 5 of 11

@billko wrote:

@DailyDose wrote:

Yes, I've considered that the overhead may be factored into that 600 Mb/s.  I am using a 1000 Mb/s wired connection.  I'm counting the bytes I actually receive from UDP Read, which of course excludes the overhead.  So with the constant Error 113, I was mostly curious what LabVIEW's maximum UDP read speed is.  Trying to find some answers to the error.


Have you seen this link?  Hopefully it can give you some ideas.


That's actually the link I posted in my original post.  😉

Message 6 of 11

@altenbach wrote:

I doubt LabVIEW does anything special or reinvents the wheel here; I am sure it just uses the standard socket libraries.

 

You still haven't told us how many bytes you receive per iteration and what you then do with them in the same loop. If you choose UDP, you need to accept loss and deal with it.


Right now I'm just gathering a couple of seconds' worth of data into a shift register.  Looks like that could affect the processing speed, which is why I only do a couple of seconds.  Just getting some data to mess with and develop against.  But I can't get much further if I can't prove that I can even get the data in a consistent and reliable manner.  I'm just looping around a UDP Read and letting it go.  It seems like it gets data every 3,500 - 5,000 iterations.  I've changed some configuration settings on the device I'm talking to, and I'm now getting what appears to be between 204 and 400 bytes per "successful read."  Seems to me the device is not streaming at 600 Mb/s...  And all of this may be a non-issue.
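For my own sanity check, here's the packet rate a 600 Mb/s stream would imply at the datagram sizes I'm seeing (just back-of-envelope arithmetic):

# Packets per second needed to sustain 600 Mb/s at the payload sizes
# mentioned in this thread.
link_bits_per_s = 600e6
for payload in (204, 400, 548, 1472):
    pkts_per_s = link_bits_per_s / (payload * 8)
    print(f"{payload:4d}-byte datagrams -> ~{pkts_per_s:,.0f} packets/s to hit 600 Mb/s")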

Message 7 of 11

Here's a better question:

From what I'm reading about this device, it streams data to the same destination port but from 3 different source ports, and apparently that's your "header."  Is it possible to tell LabVIEW which source port to watch for the data coming from, instead of just which local port to listen on?
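The only thing I can think of so far is reading everything that arrives on the local port and then filtering by the sender's port, which the read function appears to report alongside the data. Something like this sockets sketch (made-up port numbers), if that's even the right approach:

import socket

LOCAL_PORT = 61556                    # placeholder local port
WANTED_SOURCE_PORTS = {5001, 5002, 5003}   # placeholder source ports

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LOCAL_PORT))
while True:
    # recvfrom reports the sender's (address, port) with every datagram,
    # so filtering by source port can happen after the read.
    data, (src_addr, src_port) = sock.recvfrom(65535)
    if src_port in WANTED_SOURCE_PORTS:
        pass  # handle datagrams from the source port(s) of interest

If there were only one source, connecting the socket to that (address, port) would let the OS do the filtering, but with three source ports it seems simpler to filter in the application (or keep one connected socket per source).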

Message 8 of 11

@DailyDose wrote:
I'm just looping around a UDP Read and letting it go.  It seems like it gets data every 3,500 - 5,000 iterations.

Hmm, so for most iterations you don't get anything? Did you set the UDP read timeout to zero for some weird reason?

Message 9 of 11

@altenbach wrote:

@DailyDose wrote:
I'm just looping around a UDP Read and letting it go.  It seems like it gets data every 3,500 - 5,000 iterations.

Hmm, so for most iterations you don't get anything? Did you set the UDP read timeout to zero for some weird reason?


So... no.  I did not wire a 0 to the timeout.  But upon further investigation of the "max size" input, I see that I misunderstood it at first to mean that the maximum was 548, which is where all my problems originated.  After changing that to a larger and more realistic value for this application, I'm no longer getting Error 113, and my loop iterations aren't completely bizarre anymore (no more getting nothing for 3,500 iterations).  And it would appear that LabVIEW is plenty capable of dealing with this streaming speed.
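For anyone who finds this later: "max size" is how many bytes one UDP Read will return, and the default of 548 is the classic conservative UDP payload size (576-byte minimum IP datagram minus the IP and UDP headers), so any datagram bigger than that was apparently what tripped Error 113 on every read. The fix, in rough sockets terms (placeholder port, not LabVIEW code):

import socket

LOCAL_PORT = 61556   # placeholder port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LOCAL_PORT))
# The size passed to the read must be at least as large as the biggest
# datagram the device sends; 65535 covers any legal UDP datagram.
data, _addr = sock.recvfrom(65535)   # was effectively recvfrom(548) before
print(f"got a {len(data)}-byte datagram")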

 

And today was another fun adventure of "Programmers who don't know what they're doing..."

Message 10 of 11