Treating Test Like a Product

Test engineers and magicians have a lot in common; they are both expected to pull rabbits out of a hat on command. But it doesn't stop there. Test engineers are expected to pull test systems out of a hat while juggling multiple projects at the same time. Unfortunately, many test engineers are nearing the end of their bag of tricks for ensuring test systems are released on time and under budget, especially as the time to release a test system is continually shrinking along with test engineering head count and budgets.

An increasing number of electronics manufacturers have discovered that treating test like a product is essential to engineering a competitive test strategy. It enables them to optimize quality, budget, release date, and the use of test resources.

Typically, the test-development phase of the NPI (new product introduction) process begins when the design process is complete. Test engineers are given the key product specs and a set of test requirements generated by R&D to ensure the product is tested properly. This is most commonly known as the "throw it over the wall" test-development strategy. Despite being an oft-ridiculed approach and the source of many engineering jokes, this test-development strategy is still widely used. Yet, as many companies are discovering, it is becoming increasingly difficult to compete in today's marketplace by using this strategy.

There are many problems with this form of sequential test development. One of the biggest is that the process practically guarantees your product release date will slip, the project will come in above budget, and corners will be cut that ultimately jeopardize product quality.

The target release date becomes difficult to meet because the development time allocated to the project is often exceeded, which shortens the time budgeted for test development. As a result, costs increase because there is no time to research alternatives to the instrumentation and measurements specified by R&D, which often relies on very expensive, high-precision instruments during design verification and validation. Additionally, there is often little to no time to evaluate reusing test systems from previous projects, and production test times are rarely reduced under compressed schedules.

Finally, quality control can be compromised by the pressure to stay on schedule and under budget. This can manifest as tests that are omitted when specific instruments are unavailable, test limits that are relaxed to avoid low first-pass yield rates, and, perhaps most dangerous of all, a test engineer who does not understand the actual design of the product under test well enough to ensure it is tested thoroughly.

Full article below:

http://zone.ni.com/devzone/cda/pub/p/id/1109
