BreakPoint


Value of Side-by-Side Comparisons?

As a company that provides test instruments, we get pulled into providing proofs of concept and side-by-side comparisons all the time.  In February 2011, I decided to make a demo that compared the speed of one of our latest instruments, the NI PXIe-5665, against the Agilent PXA vector signal analyzer.  We recognized that the Agilent PXA is a world-class instrument and wanted to show our customers that we have similar performance; with the obvious advantages of the PXIe bus and faster processing (among other benefits), the NI PXIe-5665 is about 10-15 times faster for measurements such as ACLR.

 

I recorded a video showing the differences between the two instruments.  I picked several benchmarks, such as list mode, WCDMA ACP and LTE EVM, for the comparison.  The link to the video can be found here.

 

http://www.youtube.com/watch?v=9kBd2JMdFgE
(The handsome guy with the great hair in the video is me 😉 )

 

I also put in months of effort to make sure that the demo showed an equal and fair comparison, such as performing reads on both instruments, using the same number of averages, and using the same attenuation / RBW settings in all cases.  I made it a point to mention that we are not doing a spec-sheet comparison of the instruments and that both instruments have more measurement capabilities than what is shown.  The demo was for one particular test setup; the numbers shown pertain to that setup only, and better results might be achieved by both instruments for a different setup (using a different filter, for example).  The speed results are similar irrespective of the setup.
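To make concrete what "the same settings in all cases" means here, below is a minimal sketch in Python.  It is illustrative only: the actual demo was written in LabVIEW, and `configure` and `read_acp` are placeholder stubs, not real NI or Agilent driver calls; the setting values are invented for the example.

```python
# Minimal sketch of a "fair comparison" harness. Neither configure() nor
# read_acp() is a real driver call; in the actual demo these steps went
# through each instrument's own driver in LabVIEW.

COMMON_SETTINGS = {
    "rbw_hz": 30e3,        # same resolution bandwidth on both instruments (illustrative value)
    "attenuation_db": 10,  # same input attenuation on both instruments (illustrative value)
    "averages": 10,        # same number of averages on both instruments
}

def configure(instrument_name, settings):
    # Placeholder: a real implementation would push these settings to the instrument.
    print(f"configuring {instrument_name} with {settings}")

def read_acp(instrument_name):
    # Placeholder: a real implementation would initiate and fetch an ACP result.
    return {"instrument": instrument_name, "acp_db": None}

def run_comparison(instrument_names):
    """Apply identical settings to every instrument and perform a read on each,
    so neither side is benchmarked with a lighter configuration."""
    results = []
    for name in instrument_names:
        configure(name, COMMON_SETTINGS)
        results.append(read_acp(name))
    return results

if __name__ == "__main__":
    print(run_comparison(["NI PXIe-5665", "Agilent PXA"]))
```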

 

I also had many RF engineers at NI look over the comparison before going public.  There are a few instances where the two instruments implement things differently, so we picked the typical setup that a test engineer would use.  To read more about the setup of the demo, see here.

 

http://zone.ni.com/devzone/cda/tut/p/id/12651

 

I expected some rebuttal from Agilent, of course, and welcomed the discussion.  Initially the rebuttal was professional and courteous, and they made some interesting points about the demo.  You can read about it here.

 

http://www.eebeat.com/?p=3928

 

My team and I tried to address all of their questions and posted responses in the spirit of healthy competition.

However, this all seemed to change at European Microwave Week (EuMW) 2011, hosted in Manchester, UK.  I showed the same comparison demo in our booth while keeping Agilent’s response in mind (for example, I added an option to use fast mode instead of IBW mode on the PXA, which loses dynamic range but improves speed).  The consequences of showing this demo at EuMW surprised us.

 

At the show, Agilent told us that we could not show the demo in Europe (even though it is perfectly legal to do a side-by-side comparison in the UK).  I spent over an hour working with Agilent’s engineers, demonstrating that the demo is an equal and fair comparison.  We reviewed the code and even asked Agilent for suggestions to improve the demo.  In spite of this, Agilent asked the show organizers to force us to remove the demo, whether it was technically accurate or not.  We were threatened with being kicked out of the show and with having show technicians take down the demo in our booth.  The show organizers would not allow us to keep the Agilent box in our booth, even when it was turned off.

 

We ended up taking down the demo before the last day of the show.  We eventually ended on good terms with the show organizers, but it was still surprising that a large company would behave so aggressively and unprofessionally, trying to take down the demo with such tactics instead of trying to prove it technically wrong.

I created this demo to help customers make informed decisions and to provide additional options when selecting test instrumentation.

 

I want to ask you, our customers: are side-by-side comparison demos such as the one described above useful to you?

 

 

Raajit L
National Instruments
Message 1 of 28
(13,331 Views)

Raajit,

They are very useful indeed.

One thing that I, as a user, would always like to see is a comprehensive list of

 

  • what are the configuration settings for the equipment in the comparison
  • what specific parameters/specs are being compared
  • what workarounds (if any) would be available in the competitor's product to bring it closer to (if not beyond) the performance of the NI piece being demoed.

This not only contributes to the comparison being fair, but it also helps us, the users, in making the right decisions based on our applications.

 

 

Message 2 of 28
(13,257 Views)

Interestingly, when I clicked on the link to the eebeat site http://www.eebeat.com/?p=3928 I got blocked by my "Norton Safe Surf" software, declaring that it had been ID'd as a malicious site!  Boy, your interaction with Agilent must have gotten really hot!  Norton says that it has "drive-by downloads" and anti-virus redirects.

Putnam
Certified LabVIEW Developer

Senior Test Engineer North Shore Technology, Inc.
Currently using LV 2012-LabVIEW 2018, RT8.5


LabVIEW Champion



Message 3 of 28
(13,213 Views)

Raajit,

 

As a test engineer, I have typically avoided presentations of side-by-side comparisons.  The funny thing is, the presenting company's product ALWAYS seems to be the no-brainer better solution, and I (the audience) should have my head examined for even considering the other instrument.  That being said, I have listened to your presentations before and found them remarkably open and honest, so I dug into the video and discussion.  Here are my key takeaways:

  • The presentation did not make clear that the "measurement times" included transfer to the test control software.  You did both NI and Agilent a disservice with this omission.  The PXIe bus is screaming fast, and NI controllers and chassis just work by design at higher transfer speeds (not to mention instrument synchronization is orders of magnitude easier on PXIe, the cost savings of externalizing the processor and sharing it in a retaskable form, etc.... the PXI/box-instrument argument should be over.....)  Conversely, the PXA measurement times were maligned by not clarifying the "time of measurement", and IMHO Agilent was on point in their rebuttal: you left the impression that Agilent did not meet its published spec for those who might later read it.  The debate cleared up the point.  Side-by-side comparisons should carefully and completely tell what is being compared; Oranges? Apples? Granny Smith Apples?
  • The quality of the companies is important to me: who I perceive as having the better service and support team influences my purchase.  Ease of obtaining expert assistance with integrating highly complex instruments is a must.  Walking into an instrument of this class, I ASSUME that just reading the manual won't be quite enough and I'll need an app eng to assist on one or more details.  Look it up yourself, Raajit: I've never had an SSP ticket open for long at NI (though some CARs remain in development).  Agilent hasn't acted on my open ticket for 13 months; calls are answered with "that's being handled by a group overseas" and "I'll try to get a progress report and get back to you."  SELL it.

By the way.  Nice piece of gear. Congrats to the team.


"Should be" isn't "Is" -Jay
Message 4 of 28
(13,155 Views)

I enjoyed the presentation, even though I don't know anything about these kinds of measurements.

 

I agree with Jeff that I am usually very suspicious of such comparisons, because the side sponsoring the tests always wins hands down, and further examination often shows great bias.  Things are sometimes more believable if done by an independent outside party, but even that can be tricky because you never really know who's behind it (see e.g. astroturfing).

 

  • Do the speed differences really matter in practical applications?
  • Which instrument is easier to set up and configure for somebody who is not familiar with either product?
  • Are both instruments of similar vintage?

 

Overall, I truly believe that the time of monolithic instruments is over and NI is on the right track.  Congratulations on such a great product.  The presentation was cool!

 

(Maybe it should all have been programmed in VEE, just kidding! ;))

 

I am curious about the debate at the show. Did the legal teams get involved or was it just a local fistfight?

Message 5 of 28
(13,146 Views)

AlessioD,

 

Thank you for your response.  My thoughts are below.

  • what are the configuration settings for the equipment in the comparison
  • what specific parameters/specs are being compared

 

I tried to include all the high-level configuration settings in this white paper:

http://zone.ni.com/devzone/cda/tut/p/id/12651

We tried to reference the white paper everywhere the video was shown.  Unfortunately, I could not mention all the setup details in the video in the interest of keeping it short.

As you probably noticed, the two instruments have different architectures, and we have had customers ask for even more details on the exact configurations.  For example, we had a customer who asked how to get repeatability down to a standard deviation of 0.1.  This was achieved by both instruments at about 25 averages, and we are more than happy to give out more details or tweak the demo if a customer asks for it.
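To illustrate the kind of check behind that number, here is a small simulation sketch in Python (not the demo code): it estimates the standard deviation of repeated readings as a function of the averaging count.  The noise level and ACP value are invented purely for illustration; the ~25-averages figure above came from measurements on the real instruments.

```python
import random
import statistics

def simulated_acp_reading(n_averages, true_acp=-55.0, noise=0.5):
    """Stand-in for a real ACP read: averaging n acquisitions reduces the
    spread of the reported value roughly by sqrt(n). All values are invented."""
    samples = [random.gauss(true_acp, noise) for _ in range(n_averages)]
    return statistics.mean(samples)

def repeatability(n_averages, repeats=200):
    """Standard deviation of repeated readings at a given averaging count."""
    readings = [simulated_acp_reading(n_averages) for _ in range(repeats)]
    return statistics.stdev(readings)

if __name__ == "__main__":
    for n in (1, 5, 10, 25, 50):
        print(f"{n:3d} averages -> std dev ~ {repeatability(n):.2f}")
```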

See the comments section on this video for more details.
 http://www.youtube.com/all_comments?v=AkJ12ny6A2w 

The parameters and specs are also mentioned in the white paper.

Raajit L
National Instruments
Message 6 of 28
(13,112 Views)

  • what workarounds (if any) would be available in the competitor's product to bring it closer to (if not beyond) the performance of the NI piece being demoed.

We tried to incorporate all the “workarounds” suggested by Agilent to make their instrument faster, and some of these methods did make it faster, but at the cost of something else.

This is a great point you bring up.  We were aware of most of the workarounds on the PXA to make it faster; unfortunately, implementing some of these workarounds (such as the FAST ACP mode) on the PXA meant losing out on dynamic range.  While developing the demo, I wanted to show the best dynamic range possible out of both instruments for all scenarios.  We documented the workarounds in the white paper as well; the table in the white paper showing the FAST ACP mode is an example.

 

Raajit L
National Instruments
Message 7 of 28
(13,109 Views)

@LV_Pro wrote:

Interestingly, when I clicked on the link to the eebeat site http://www.eebeat.com/?p=3928 I got blocked by my "Norton Safe Surf" software, declaring that it had been ID'd as a malicious site!  Boy, your interaction with Agilent must have gotten really hot!  Norton says that it has "drive-by downloads" and anti-virus redirects.


 

Yikes! Sorry about the malicious site; I'm not sure why that happened to you.  Here is another link, on Evaluation Engineering, that mentions the same rebuttal from Agilent.  Hope this works.

 

http://www.evaluationengineering.com/index.php/industrynews/instrumentation/agilent-refutes-measurem...

 

Raajit L
National Instruments
Message 8 of 28
(13,108 Views)

@Jeff Bohrer wrote:



Nice to hear from you again, it’s been some time.  You bring up some good points for sure.

I agree that I could have done a better job earlier of making it clear that the demo applies more to automated test and not as much to debugging or benchtop testing.  The assumptions that I made for the video were:

  • The test time is the time taken to make a single measurement on a DUT (a moving average does not count as a single measurement).
  • The test time includes initiate + acquire + process + return of the measurement (see the timing sketch below).
  • To be clear, I am not transferring the entire trace from the PXA, only the ACP measurements, whereas on the PXI module I have to transfer the entire waveform (just a difference in architectures).
  • Assumed 10 averages for some kind of repeatability.
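As a rough illustration of how such a per-measurement time can be captured, here is a minimal Python sketch.  The `initiate` and `fetch_result` callables are placeholders for the actual driver calls, which differ between the PXA and the PXIe-5665; the dummy stand-ins in the usage example exist only so the sketch runs.

```python
import time

def timed_single_measurement(initiate, fetch_result):
    """Time one complete measurement as defined above:
    initiate + acquire + process + return of the result to the test program."""
    start = time.perf_counter()
    initiate()               # arm/trigger the measurement on the instrument
    result = fetch_result()  # block until the data is acquired, processed,
                             # and the measurement is returned to the software
    elapsed_s = time.perf_counter() - start
    return result, elapsed_s

if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs; real code would call the instrument driver.
    result, t = timed_single_measurement(lambda: None, lambda: -55.0)
    print(f"result: {result} dB, elapsed: {t * 1e3:.3f} ms")
```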

 

I am glad that you are happy with our support, Jeff 🙂  Having been part of our Applications Engineering team myself a few years ago, I can tell you that NI AEs will do their best to help you.

Raajit L
National Instruments
Message 9 of 28
(13,104 Views)

altenbach,

 

I am glad you enjoyed the presentation.  Below are my thoughts on your concerns.

  • Do the speed differences really matter in practical applications?

For an automated test environment, I believe that they do.  We have seen customers such as Triquint and STEricsson reduce their characterization times from a couple of weeks to a day by moving from a boxed instrument to PXI.  The video above shows just one measurement, but a company such as STEricsson repeats the above-mentioned ACP measurement 800,000 times, and that is where a difference of days comes into play (a rough illustration follows below).

Alternatively, if you are in R&D and debugging your DUT for the first time, especially if you are doing it manually, the difference of a few hundred milliseconds might not matter.
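To put the 800,000-measurement figure into perspective, here is a back-of-the-envelope calculation.  The per-measurement times below are invented round numbers chosen only to show the scale of the effect; the actual numbers for this setup are documented in the white paper.

```python
# Illustrative arithmetic only: the per-measurement times are assumed, not measured.
measurements = 800_000
boxed_time_s = 0.5   # assumed ACP time on a boxed instrument
pxi_time_s = 0.05    # assumed ACP time on PXI (roughly 10x faster)

boxed_hours = measurements * boxed_time_s / 3600   # ~111 h of pure measurement time
pxi_hours = measurements * pxi_time_s / 3600       # ~11 h of pure measurement time
print(f"boxed: ~{boxed_hours:.0f} h, PXI: ~{pxi_hours:.0f} h, "
      f"saved: ~{(boxed_hours - pxi_hours) / 24:.1f} days")
```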

  • Which instrument is easier to set up and configure for somebody who is not familiar with either product?

Honestly, if you are doing a simple measurement, just one time, not automating it, and not transferring any data back to a PC, a boxed instrument might be easier for display purposes.  However, if you are

  • automating in any way,
  • synchronizing,
  • transferring data back to a PC,
  • performing real-time processing, or
  • doing MIMO applications,

then the PXI platform provides plenty of benefits over a traditional boxed instrument.


Raajit L
National Instruments
Message 10 of 28
(13,104 Views)