Why Test Engineers Have the Biggest Data Problem

In my last blog post, I discussed the types of data test engineers are collecting. The key takeaway: we’re dealing with a lot of data.

All this data leads to my next topic: the three biggest problems test engineers face. At NI, we refer to these collectively as the Big Analog Data problem.

Finding the Valuable Data Points in a Mountain of Data

All the data from the increasingly complex tests run every day eventually needs to be stored somewhere, but it's often stored in various ways, from a local test cell machine to an individual employee’s computer. Simply locating the data you need to analyze can be a giant pain, let alone sifting through pages of meaningless file names and types without metadata for context.

We see too many of our customers wasting time because they don't have an efficient way to search their files. Even if an engineering group is lucky enough to have a centralized database for test data, they still run into difficulties accessing it because it’s not optimized for waveform data types and is rarely shared between groups.

 

All of this leads to “silos” of data that can’t be used efficiently, wasting even more time on access. In extreme cases, these problems cause companies to rerun tests because they simply can’t find data they’ve already collected.
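To make the silo problem concrete, here’s a minimal Python sketch of a lightweight metadata index that lets tests be found by attribute rather than by guessing file names. The directory root, metadata fields, and read_metadata() stub are hypothetical placeholders, not any particular NI product:

```python
import os
import sqlite3

# Minimal sketch: index scattered test-data files with searchable metadata.
conn = sqlite3.connect("test_data_index.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS test_files (
           path      TEXT PRIMARY KEY,
           operator  TEXT,
           test_rig  TEXT,
           test_date TEXT
       )"""
)

def read_metadata(path):
    """Placeholder: in practice, parse TDMS properties, a sidecar INI
    file, or a file-header convention to recover real metadata."""
    return {"operator": "unknown", "test_rig": "cell_01", "test_date": "2016-01-01"}

for root, _dirs, files in os.walk("/data/test_cells"):  # hypothetical root
    for name in files:
        path = os.path.join(root, name)
        meta = read_metadata(path)
        conn.execute(
            "INSERT OR REPLACE INTO test_files VALUES (?, ?, ?, ?)",
            (path, meta["operator"], meta["test_rig"], meta["test_date"]),
        )
conn.commit()

# Tests can now be located by attribute instead of by file name:
rows = conn.execute(
    "SELECT path FROM test_files WHERE test_rig = ?", ("cell_01",)
).fetchall()
print(rows)
```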

Validating Collected Data

An issue most engineers don’t think about until they experience it firsthand is validating the data they collect. Ideally, every test runs the way it’s intended to, but there’s no way to know unless you perform some validation steps.

 

There are countless ways to end up with incorrect data, from an improperly configured test rig to file corruption. If invalid data makes its way into analysis and decision-making, the results can be disastrous.

 

Big Analog Data validation presents extra headaches due to the sheer volume and variety of data types. A gut-wrenching example is NASA’s 1999 Mars Climate Orbiter, which burned up in the Martian atmosphere because one team’s software reported thruster impulse in imperial units (pound-force seconds) while another’s expected metric (newton-seconds), and no one caught the mismatch.

 

Manual validation processes work, but they are extremely time-consuming. To keep engineers from wasting valuable person-hours, an automated solution is usually required.
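As a rough sketch of what that automation might look like, here’s a minimal Python example of sanity checks on a single acquired channel; the specific checks and metadata keys are assumptions made for illustration:

```python
import numpy as np

def validate_measurement(samples, expected_rate_hz, expected_unit, meta):
    """Minimal sketch of automated sanity checks on one acquired channel.
    The checks and metadata keys are illustrative assumptions."""
    errors = []
    samples = np.asarray(samples, dtype=float)

    # Completeness: did the acquisition produce any data at all?
    if samples.size == 0:
        errors.append("no samples recorded")

    # Corruption: NaN or infinite values often indicate a DAQ fault.
    elif not np.all(np.isfinite(samples)):
        errors.append("non-finite samples found")

    # Rig setup: a flat-lined channel suggests a disconnected sensor.
    elif np.ptp(samples) == 0.0:
        errors.append("channel is flat-lined")

    # Unit consistency: the Mars Climate Orbiter failure mode.
    if meta.get("unit") != expected_unit:
        errors.append(f"expected unit {expected_unit!r}, got {meta.get('unit')!r}")

    # A sample-rate mismatch silently distorts later frequency analysis.
    if meta.get("sample_rate_hz") != expected_rate_hz:
        errors.append("sample rate does not match the test plan")

    return errors

# Example run with made-up data and metadata:
issues = validate_measurement(
    samples=np.sin(np.linspace(0.0, 10.0, 1000)),
    expected_rate_hz=1000,
    expected_unit="N*s",
    meta={"unit": "N*s", "sample_rate_hz": 1000},
)
print(issues or "all checks passed")
```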

Analyzing Large Volumes of Data Efficiently

Studies show that, on average, only five percent of all collected data is ever analyzed. By not analyzing more of the data you collect, you risk making decisions without considering the bigger picture.

 

A great way to illustrate this in the engineering world is the Nyquist theorem, which states that you must sample a signal at more than twice its highest frequency to represent it accurately. Without enough data points in your analysis, you may see what looks like an exponential curve (Figure 1) instead of the sine wave that’s actually there (Figure 2).

 

Figure 1. The undersampled signal appears exponential.

Figure 2. The same signal with enough data points, revealing a sine wave.
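For the curious, here’s a small illustrative Python sketch of the same pitfall; the 5 Hz signal and the two sample rates are made up for the example:

```python
import numpy as np

# Illustrative sketch: the same 5 Hz sine wave sampled below and above
# the Nyquist rate (2 * 5 Hz = 10 Hz).
SIGNAL_HZ = 5.0

def sample_sine(rate_hz, duration_s=1.0):
    """Return sample times and values for a SIGNAL_HZ sine wave."""
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    return t, np.sin(2.0 * np.pi * SIGNAL_HZ * t)

# Undersampled: a handful of points that a smooth exponential-looking
# curve could fit just as well (the situation in Figure 1).
t_low, y_low = sample_sine(rate_hz=6.0)

# Adequately sampled: enough points to reveal the true sine (Figure 2).
t_high, y_high = sample_sine(rate_hz=100.0)

print(f"undersampled points: {y_low.size}, well-sampled points: {y_high.size}")
```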

There are two reasons test engineers don’t analyze more data. The first, as I mentioned earlier, is that they can’t find the right data in the mountains of Big Analog Data they’ve collected. The second is that they’re using systems and processes that aren’t optimized for large data sets; manual calculations with inadequate tools are typically the roadblock to analyzing large quantities of data.

 

Even when the right tools are used, processing Big Analog Data can be troublesome and usually requires an automated solution in which processing is offloaded to a dedicated system.
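As one hedged example of that kind of offloading, the sketch below streams a large raw waveform file in fixed-size chunks so the summary statistics never require loading the whole data set into memory; the file name and raw float64 format are assumptions:

```python
import numpy as np

# Minimal sketch: compute summary statistics over a waveform file too
# large for memory by streaming it in fixed-size chunks.
CHUNK = 1_000_000  # samples per read

running_max = -np.inf
total = 0.0
count = 0

with open("big_waveform.f64", "rb") as f:  # hypothetical raw data file
    while True:
        chunk = np.fromfile(f, dtype=np.float64, count=CHUNK)
        if chunk.size == 0:
            break
        running_max = max(running_max, float(chunk.max()))
        total += float(chunk.sum())
        count += chunk.size

if count:
    print(f"samples: {count}, mean: {total / count:.4f}, max: {running_max:.4f}")
```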

 

In my next post, I’ll give you some options for tackling your Big Analog Data problem, so you can be sure you’re making the best data-driven decisions possible.

 

NEXT: Addressing Your Big Analog Data Challenges >>

 

Find out more about NI’s solutions for data-driven decisions >>

Daniel Parrott
Software Product Marketing - Data Management & LabVIEW
National Instruments
Comments
Member

I would also argue that many test engineers don't analyze data because they don't have a compelling reason: their managers - sometimes up to the VP level - have not pushed down the Big Questions they want answered. This is obviously much more of an organizational challenge than a technical one. Test engineers can help by asking their managers what Big Questions they have. If their managers don't have any, then go further up the chain until they get some Big Questions to ask. Conversely, the people with the Big Questions need to push their questions down all the way to the test engineer's level.

Member

I'd definitely agree, Ackbach! It takes a lot of alignment within an organization to push these Big Questions all the way down to the test engineering level. It could even be that managers have these Big Questions but don't realize they could be answered with the data test engineers are already collecting, and therefore never bring them up.

 

All goes to show that communication within the org is extremely important! 


Daniel Parrott
Software Product Marketing - Data Management & LabVIEW
National Instruments
Member

Great post! I've experienced this problem working on different projects for multiple companies. Data analysis and, more importantly, data storage is a BIG problem. The companies I have contracted for have no plan in place for data acquisition, analysis, or storage, so data-based decision-making is non-existent...until we put something in place to start the process.

Test engineers have big problems to face and solve going forward, and therefore a bright future, because we are leading the charge to put data-driven systems in place. I know I have found myself pitching an entire data-based system (data acquisition, storage, and analysis) to upper management. The future looks bright for test engineers who recognize this problem and help companies solve it!