In my last blog post, I discussed the type of data test engineers are collecting. The key takeaway: we’re dealing with a lot of data.
All this data leads to my next topic: the three biggest problems test engineers face. At
NI, we refer to these collectively as the Big Analog Data™ problem.
Finding the Valuable Data Points in a Mountain of Data
All the data from the increasingly complex tests run every day eventually needs to be stored somewhere, but it's often stored in various ways—from a local test cell machine to an individual employee’s computer. Simply locating the data you need to analyze can be a giant pain, let alone trying to sift through pages of meaningless file names and types without metadata for context.
We see too many of our customers wasting time because they don't have an efficient way of searching files. Even if engineering groups are lucky enough to have a centralized database to store test data, they still run into difficulties accessing it because it’s not optimized for waveform data types and rarely shared between groups.
All of this leads to “silos” of data that then can’t be used efficiently, causing more wasted time trying to access it. In extreme cases, these problems even cause companies to rerun tests because they simply can’t find data they’ve already collected.
Validating Collected Data
An issue that most don’t think about until they experience it firsthand is validating the data they’ve collected. Ideally, every test runs the way it’s intended to, but there’s no way to know unless validation steps are performed.
There are countless ways to get incorrect data, from improper test rig setup to data corruption. If invalid data goes on to be analyzed and used in decision-making, there could be disastrous results.
Big Analog Data validation presents extra headaches due to the sheer volume and variety of data types. A gut-wrenching example of this is NASA’s 1999 Mars Climate Orbiter that burned up in the Martian atmosphere because engineers failed to convert units from English to metric.
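The lesson generalizes: making units explicit in code turns a silent mismatch into a loud, checkable failure. Here’s a minimal sketch of the idea; the function name and unit strings are illustrative, not any particular library’s API, though the conversion factor is the standard pound-force-second to newton-second value.

```python
# Sketch: tag measurements with units so a mismatch raises an error
# instead of silently corrupting downstream analysis.

LBF_S_TO_N_S = 4.4482216152605  # 1 pound-force second in newton seconds

def to_newton_seconds(value, unit):
    """Convert an impulse measurement to SI (newton seconds)."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    raise ValueError(f"unknown impulse unit: {unit!r}")
```

An unrecognized unit stops the pipeline immediately, which is exactly the failure mode the Mars Climate Orbiter’s software lacked.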
Manual processes work, but are extremely time-consuming. To save engineers from wasting valuable person-hours, an automated solution is usually required.
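To make the idea concrete, an automated check might scan each captured waveform for the most common problems (missing samples, rate mismatches, corruption, out-of-range values) before it ever reaches analysis. This is a minimal sketch; all of the names and thresholds are hypothetical.

```python
import math

def validate_waveform(samples, fs_expected, fs_actual, vmin, vmax):
    """Return a list of problems found in one captured waveform.

    Names and thresholds here are illustrative, not an NI API.
    """
    issues = []
    if not samples:
        issues.append("no samples captured")
        return issues
    if fs_actual != fs_expected:
        issues.append(f"sample rate {fs_actual} Hz != configured {fs_expected} Hz")
    if any(math.isnan(s) for s in samples):
        issues.append("NaN values present (possible data corruption)")
    out_of_range = [s for s in samples if not (vmin <= s <= vmax)]
    if out_of_range:
        issues.append(f"{len(out_of_range)} samples outside [{vmin}, {vmax}]")
    return issues
```

Run against every file as it lands in storage, a gate like this catches bad test runs the same day instead of months later.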
Analyzing Enough of the Collected Data
The third problem is making sure enough of the data actually gets analyzed. A great way to illustrate why this matters is the Nyquist–Shannon sampling theorem, which states that you must sample a signal at more than twice its highest frequency component to reconstruct it accurately. With too few data points, you may see what appears to be an exponential signal (Figure 1) instead of the sine wave that’s actually there (Figure 2).
Figure 1: the undersampled signal. Figure 2: the fully sampled sine wave.
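The aliasing effect behind those figures is easy to reproduce numerically: a 9 Hz cosine sampled at only 10 samples per second produces exactly the same sample values as a 1 Hz cosine, so no amount of analysis can tell the two apart afterward. A quick sketch using only the standard library:

```python
import math

def sample_cosine(freq_hz, rate_hz, n_samples):
    """Sample cos(2*pi*f*t) at a fixed rate."""
    return [math.cos(2 * math.pi * freq_hz * n / rate_hz)
            for n in range(n_samples)]

# Sampling a 9 Hz tone at 10 S/s violates the Nyquist criterion
# (the rate must exceed 2 * 9 = 18 S/s), so it aliases onto 1 Hz.
tone_9hz = sample_cosine(9, 10, 20)
tone_1hz = sample_cosine(1, 10, 20)
assert all(abs(a - b) < 1e-9 for a, b in zip(tone_9hz, tone_1hz))
```

The only fix is to collect, and analyze, enough samples in the first place.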
There are two reasons why test engineers don’t analyze more data. The first, as I mentioned earlier, is being unable to find the right data in the mountains of Big Analog Data they’ve collected. But, they’re also using systems and processes that aren’t optimized for large data sets. Manual calculations with inadequate tools are typically the roadblock when it comes to analyzing large quantities of data.
Even when the right tools are used, processing Big Analog Data can be troublesome and usually requires an automated solution that offloads the work to a dedicated system.
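One common pattern in such automated systems is streaming computation: statistics are accumulated one chunk at a time, so no machine ever has to hold the full record in memory. A minimal sketch of the idea (the chunking scheme and names are hypothetical):

```python
import math

def running_rms(chunks):
    """Compute the RMS of an arbitrarily long signal one chunk at a time.

    `chunks` is any iterable of sample lists, e.g. blocks read from disk,
    so memory use stays constant regardless of total record length.
    """
    sum_of_squares, count = 0.0, 0
    for chunk in chunks:
        sum_of_squares += sum(s * s for s in chunk)
        count += len(chunk)
    return math.sqrt(sum_of_squares / count) if count else 0.0
```

Because the accumulator is tiny, the same code runs unchanged whether the record is a thousand samples on a laptop or billions on a dedicated processing server.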
In my next post, I’ll give you some options for tackling your Big Analog Data problem, so you can be sure you’re making the best data-driven decisions possible.
Wireless engineers face a big challenge today. They must prototype next-generation wireless communications systems and increasingly connected devices in a more competitive and fast-changing communications industry.
European Microwave Week (EuMW) 2017, a six-day event in Nuremberg, will focus on the future of microwave technology globally and give us a better look at how this challenge is impacting the industry today and changing the way engineers work.
Look for the following topics at EuMW 2017:
5G prototyping: the progress we have made
5G continues to capture headlines as wireless companies everywhere take on the challenge of building a 5G wireless network. NI’s engagement with industry leaders in 5G prototyping has resulted in MIMO systems with world record-breaking spectral efficiency, as well as one of the world’s fastest mmWave channel sounders.
NI at EuMW: We’ll demonstrate a real-time, 28 GHz, over-the-air prototype aligned with the Verizon 5G specification. We’ll also showcase an academic partnership enabling research on ultra-reliable, low-latency wireless communications for mobile video recording and broadcasting. Follow @NIglobal for updates during the show.
Sensor fusion test: a key part of the race towards autonomous vehicles
As automakers race to produce autonomous vehicles, sensors like cameras, lidar, GNSS, and radar are making automotive test much more complex. Due to the speed at which this industry trend is evolving, shows like EuMW help us keep up with the progress towards making sensor fusion test faster and safer. This is critical for automotive suppliers to remain competitive as we move toward more connected autonomous cars.
NI at EuMW: We’ll demonstrate an advanced driver assistance system (ADAS) test solution, developed in collaboration with Germany’s ADAS IIT consortium, for short- and long-range radar at 76–81 GHz. The solution is based on the industry-standard PXI modular instrumentation platform. Using PXI’s timing, triggering, and synchronisation capability, along with instruments from DC to RF and bus interfaces like CAN, this system provides an ideal solution for testing sensor fusion. Follow @NIglobal for updates during the show.
Also, in the MicroApps theatre, NI distinguished engineer Paul Khanna will discuss high-performance test techniques for automotive radar sensors at 12:30 p.m. on Wednesday, 11 October.
Software: the solution for faster and smarter microwave design and test innovation
As wireless capability is integrated into a dramatically growing number of devices, manufacturers increasingly need to test larger volumes of connected devices. This makes it even more important to efficiently design, deploy, and maintain automated wireless test systems. Productive development software is key to achieving the goal of efficiently creating test systems.
NI at EuMW: At NIWeek 2017, we announced LabVIEW NXG 1.0, the next generation of LabVIEW systems engineering software. LabVIEW NXG accelerates automated test system development and deployment with these essential features: guided, instrument-specific examples; test and function reuse; engineering data exploration; the ability to build scalable libraries; and remote result viewing.
This year, we’ve created a series of new technical sessions focused on Design and Architecture and Software Engineering by partnering with several subject matter experts and leveraging LabVIEW Champions in many fields and industries.
LabVIEW 2017 debuts some new capabilities designed to drastically simplify the development, deployment, and management of distributed systems. We're continuing to streamline complex system design with an open, software-centric platform.
“I'd like to think that if actual engineers were involved in more projects, we wouldn't live in a world where it's a given that most websites, applications, apps, and embedded systems are poorly designed, overly buggy, and insecure. And though one would like to imagine that all safety-critical systems, at least, are created under the aegis of engineers and engineering principles, I have my doubts.”
This is one of the reasons we exist. We supply an integrated software-based platform that enables engineers to address exactly the gap Mr. Dunn describes. We didn’t choose a path where engineers need to become software developers. Instead, our mission is to create an engineering system design tool that empowers engineers to build world-class, complex, mission-critical, software-based systems.
LabVIEW is at the core of our engineering platform, and when coupled with our modular hardware platforms, becomes a gateway to innovation that engineers the world over are using to make products safer, get to market faster, and accomplish amazing things.
LabVIEW simplifies the development of complex engineering applications. Its native graphical language uses a concept called dataflow to define execution order, and it combines that language with an open interface that integrates code from other software approaches. With this, we ensure that engineers can choose the approach they’re most familiar with for any individual component of an application.
We’ve invested 30 years into LabVIEW, making it one of the most productive tools on the planet. And we’re excited to share that the best is yet to come.