In my last blog post, I discussed the type of data test engineers are collecting. The key takeaway: we’re dealing with a lot of data.

 

All this data leads to my next topic: the three biggest problems test engineers face. At NI, we refer to these collectively as the Big Analog Data problem.

[Infographic: Big Analog Data]

Finding the Valuable Data Points in a Mountain of Data

 

All the data from the increasingly complex tests run every day eventually needs to be stored somewhere, but it's often stored in various ways, from a local test cell machine to an individual employee’s computer. Simply locating the data you need to analyze can be a giant pain, let alone trying to sift through pages of meaningless file names and types without metadata for context.

 

We see too many of our customers wasting time because they don't have an efficient way of searching files. Even if engineering groups are lucky enough to have a centralized database to store test data, they still run into difficulties accessing it because it’s not optimized for waveform data types and rarely shared between groups.

 

All of this leads to “silos” of data that then can’t be used efficiently, causing more wasted time trying to access it. In extreme cases, these problems even cause companies to rerun tests because they simply can’t find data they’ve already collected.
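
To make the problem concrete, here is a minimal sketch of the kind of metadata-driven search that breaks down these silos. Everything in it is hypothetical: the .meta.json sidecar files, the field names, and the //test-data-server/archive share are stand-ins for whatever conventions your team uses. The point is that metadata, not file names, drives the search.

```python
import json
from pathlib import Path

def build_index(root: str) -> list[dict]:
    """Collect metadata sidecars for every test-data file under `root`."""
    index = []
    for meta_path in Path(root).rglob("*.meta.json"):
        meta = json.loads(meta_path.read_text())
        # "run42.meta.json" describes the data file "run42" next to it.
        meta["data_file"] = str(meta_path.with_suffix("").with_suffix(""))
        index.append(meta)
    return index

def search(index: list[dict], **criteria) -> list[dict]:
    """Return every record whose metadata matches all given criteria."""
    return [rec for rec in index
            if all(rec.get(k) == v for k, v in criteria.items())]

# Hypothetical usage: find every vibration test run on a given unit.
index = build_index("//test-data-server/archive")
for rec in search(index, unit_under_test="engine-7", test_type="vibration"):
    print(rec["data_file"], rec.get("operator"), rec.get("date"))
```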

Validating Collected Data

 

 

An issue that most don’t think about until they experience it firsthand is the validation of collected data. Ideally, every test runs the way it’s intended to, but there’s no way to know unless some validation steps are performed.

 

There are countless ways to get incorrect data, from improper test rig setup to data corruption. If invalid data goes on to be analyzed and used in decision-making, there could be disastrous results.

 

Big Analog Data validation presents extra headaches due to the sheer volume and variety of data types. A gut-wrenching example of this is NASA’s 1999 Mars Climate Orbiter that burned up in the Martian atmosphere because engineers failed to convert units from English to metric.

 

Manual processes work, but are extremely time-consuming. To save engineers from wasting valuable person-hours, an automated solution is usually required.
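
Automated validation doesn’t have to be elaborate to pay off. Below is a minimal sketch, assuming waveform samples in a NumPy array plus a small metadata dict; the field names and limits are placeholders, but checks like these catch the corruption, rig-setup, and unit-mismatch failures described above.

```python
import numpy as np

def validate_run(samples: np.ndarray, meta: dict) -> list[str]:
    """Return a list of failure messages; an empty list means the run passed."""
    failures = []
    # Corruption/dropout check: no NaNs or infs anywhere in the record.
    if not np.all(np.isfinite(samples)):
        failures.append("record contains NaN/inf samples (possible corruption)")
    # Rig-setup checks: signal should be neither flat-lined nor clipped.
    if np.ptp(samples) == 0:
        failures.append("flat signal - sensor may be disconnected")
    if np.any(np.abs(samples) >= meta["adc_full_scale"]):
        failures.append("samples at full scale - input may be clipped")
    # Unit check: the Mars Climate Orbiter failure mode. Reject records
    # whose declared unit isn't the one the analysis pipeline expects.
    if meta["unit"] != meta["expected_unit"]:
        failures.append(f"unit mismatch: got {meta['unit']}, "
                        f"expected {meta['expected_unit']}")
    return failures

# Hypothetical usage: this run fails only the unit check.
meta = {"adc_full_scale": 10.0, "unit": "lbf", "expected_unit": "N"}
print(validate_run(np.sin(np.linspace(0, 6.28, 1000)), meta))
```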

Analyzing Large Volumes of Data Efficiently

 

 

Studies show that on average only five percent of all data collected goes on to be analyzed. By not analyzing more of the data you collect, you risk making decisions without considering the bigger picture.

 

A great way to illustrate this in the engineering world is the Nyquist theorem, which states that you must sample a signal at more than twice its highest frequency to reconstruct it accurately. Sample too sparsely and you may see what appears to be an exponential signal (Figure 1) instead of the sine wave that’s actually there (Figure 2).

 

Figure 1. The apparent signal when undersampled
Figure 2. The sine wave that’s actually there
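
A quick numeric way to see the same effect: sample a 10 Hz sine well above and well below its Nyquist rate and compare the apparent frequency. The rates in this sketch are arbitrary, chosen only to make the aliasing obvious.

```python
import numpy as np

F_SIGNAL = 10.0                      # Hz: the sine wave that's actually there

for fs in (1000.0, 12.0):            # well above vs. below 2 * F_SIGNAL
    t = np.arange(0, 1, 1 / fs)      # one second of samples at rate fs
    x = np.sin(2 * np.pi * F_SIGNAL * t)
    # Estimate the dominant frequency from the FFT of the sampled record.
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    peak = np.argmax(spectrum[1:]) + 1        # skip the DC bin
    print(f"fs = {fs:6.1f} Hz -> apparent frequency {freqs[peak]:.1f} Hz")

# Prints ~10 Hz at fs = 1000, but ~2 Hz at fs = 12: the undersampled
# record looks like a different signal entirely.
```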

There are two reasons why test engineers don’t analyze more data. The first, as I mentioned earlier, is that they can’t find the right data in the mountains of Big Analog Data they’ve collected. The second is that they’re using systems and processes that aren’t optimized for large data sets; manual calculations with inadequate tools are typically the roadblock to analyzing large quantities of data.

 

Even when the right tools are used, processing Big Analog Data can be troublesome and usually requires an automated solution where processing can be offloaded to a dedicated system. 
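
As one illustration of what that offloading can look like, here is a sketch that reduces an arbitrarily large waveform file chunk by chunk, so memory never becomes the bottleneck. The raw float64 file format is an assumption; adapt the reader to whatever format your system writes (TDMS, HDF5, and so on).

```python
import numpy as np

def channel_stats(path: str, chunk_samples: int = 1_000_000) -> dict:
    """Running min/max/mean over a raw float64 waveform of any size."""
    n, total, lo, hi = 0, 0.0, np.inf, -np.inf
    with open(path, "rb") as f:
        while True:
            # Read at most chunk_samples values; an empty chunk means EOF.
            chunk = np.fromfile(f, dtype=np.float64, count=chunk_samples)
            if chunk.size == 0:
                break
            n += chunk.size
            total += chunk.sum()
            lo, hi = min(lo, chunk.min()), max(hi, chunk.max())
    return {"samples": n, "mean": total / n if n else float("nan"),
            "min": lo, "max": hi}
```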

 

In my next post, I’ll give you some options for tackling your Big Analog Data problem, so you can be sure you’re making the best data-driven decisions possible.

 

NEXT: Addressing Your Big Analog Data Challenges >>

 

Find out more about NI’s solutions for data-driven decisions >>


The NI Combustion Analysis System (CAS) is a low-cost, portable combustion analyzer based on the popular CompactDAQ platform. The CAS includes all the hardware, I/O, and software necessary to read, analyze, and log data from pressure sensors in an internal combustion engine. The combination of ready-to-use software and portable, modular I/O makes the CAS the perfect combustion analysis system for in-vehicle, lab, or test cell applications.

 

[Image: NI Combustion Analysis System]

 

 

Key Features:

  • Raw data logging, summary data reporting, and postprocessing
  • Empty chassis slots for customization
  • Simultaneous data monitoring at up to 1 MS/s per channel and streaming to disk (see the sketch below)
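
For readers curious what high-rate streaming looks like in code, here is a hedged sketch using the nidaqmx Python package. The device name "cDAQ1Mod1", channel, rate, loop count, and output file are placeholders, and the CAS software handles all of this out of the box; this only illustrates the continuous acquire-and-stream pattern.

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 1_000_000          # 1 MS/s per channel
CHUNK = 100_000                  # samples read per loop iteration

with nidaqmx.Task() as task, open("pressure.bin", "wb") as out:
    # One analog input channel on a hypothetical CompactDAQ module.
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")
    task.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(100):         # ~10 s of data; loop as long as needed
        data = task.read(number_of_samples_per_channel=CHUNK)
        np.asarray(data, dtype=np.float64).tofile(out)
```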
