08-09-2017 09:17 PM
With the aim of finally getting some better understanding of unit testing, I'm now trying James McNally's "Test First" development style, as detailed in How We Unit Test LabVIEW.
I've decided that I want to write a new data logging class, which will implement storage and retrieval of (numeric) data behind a simple(ish) interface. Additionally, I have public functions to read and write a collection of parameters, and to open, close and iterate the data file (iteration is used when the file grows too large, provided the max size parameter is positive).
As a starting point, I've created a parent "DiskLogger" class, which gives this interface, but most of the methods are largely unimplemented. Then, I have a "TDMSLogger" child class, which should in the future implement all of these functions for a TDMS file. Future children will use different logging formats, as needed.
I'm a little confused as to the best way to set up the hierarchy for JKI's VI Tester, to best allow me to test all of the children without creating mountains of duplicate code.
It seems like I want to test only the public interface (perhaps). With the goal of writing the test code only once, I defined a TestCase class with a DiskLogger object as a private data item, plus a Write DiskLogger accessor method.
In DiskLoggerTestSuite:New.vi, I can place the DiskLoggerTestCase constant on the left, along with an array of the different child classes, then auto-index over the child classes (with indexing disabled on the TestCase, which is no longer an array), calling the Write method to generate one test case for each child class. This is nice, and fits exactly the sort of coding that seems ideal here, in my (uneducated) opinion: to test a new child implementation using my nicely written (not yet, but...) testThisFeature.vi and testThatFeature.vi from DiskLoggerTestCase, I need only drop a new child class constant, add it to the Bundle Array, and reload the VI Tester window. Hurray!
The problem is that all of the test cases then have the same name (this is taken from the class name, seemingly via the type descriptor). Is there anything I can do about this? Clearly I should just test frequently, so that when something fails it's whatever I didn't write yet or just changed, but not being able to identify the tests doesn't seem ideal.
Perhaps I've gotten the entire Suite/Case setup wrong. I also considered creating a TestSuite for each child, each using the same TestCase (I was surprised to learn this was possible - thanks to Daklu on LAVA - Unit Testing Strategies with xUnit?), but then I end up creating lots of basically identical Suites (and eventually I'm sure to change some part of one and not another), basically just so that I can have a new class name... This seems like the definition of bad coding!
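For comparison, in text-based xUnit frameworks the naming problem is often solved by generating a distinctly-named TestCase subclass per child, instead of duplicating suites by hand. A minimal Python sketch of that idea (again, all names are hypothetical - I don't know whether VI Tester supports anything equivalent):

```python
import unittest

class DiskLogger:  # hypothetical parent class
    pass

class TDMSLogger(DiskLogger):
    pass

class XMLLogger(DiskLogger):
    pass

class DiskLoggerTests(unittest.TestCase):
    logger_class = DiskLogger  # overridden in each generated subclass

    def test_construct(self):
        # placeholder for the real per-feature tests (open/write/read/close)
        self.assertIsInstance(self.logger_class(), DiskLogger)

def make_case_class(child_cls):
    # type() builds a new TestCase subclass whose *name* embeds the child
    # class, so each child's results are identifiable in the test report
    return type(f"{child_cls.__name__}Tests",
                (DiskLoggerTests,),
                {"logger_class": child_cls})

TDMSLoggerTests = make_case_class(TDMSLogger)
XMLLoggerTests = make_case_class(XMLLogger)
```

The shared test code lives in one place (DiskLoggerTests), but the report shows "TDMSLoggerTests" and "XMLLoggerTests" rather than the same name repeated.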
Any tips/thoughts?
08-09-2017 09:47 PM - edited 08-09-2017 09:55 PM
I always bring this back to the "units of work" or use cases for the Logger hierarchy. In your example I would suggest that, by definition, this entails knowing what the children are and what is expected of them. Of course, in this example it seems you are performing Integration tests (you're touching the disk via a volatile dependency on file access through LabVIEW -> Win32), but that doesn't make the tests any less useful. For the purposes of what follows, I am assuming that we are Integration testing your public API end-to-end, which also makes the tests tolerant of any design changes you make under the API surface, and therefore more maintainable. You could add seams to make the tests "pure unit tests", but I don't see that adding any value - your dependencies, while not under your control, are pretty stable - and more abstraction just increases complexity for no real gain, since you'll still want those Integration tests to disk either way.
I'll assume we have a DiskLogger class that exposes some public API. Internally it calls some protected dynamic dispatch methods that your child classes implement. As a client I could use your API by:
When I look at units of work, I am thinking about what I want the system to achieve through the public API that is exposed. In your case there is only one API to worry about - let's say for now "Write Value to Disk" and "Read Value from Disk". Let's also say I have two children - a TDMS variant and an XML variant. As the user of your API, the units of work I can think of are:
This results in four tests in this instance, two for each child class. Each child type will generate a logged file that needs verifying in some way, and each child type will need to read in a "test" file of its format, with the test verifying that the value is correct. The test code for the former will be unique to the child variant; the latter test code will be virtually identical (separate files to read back are needed, but other than configuring this and the child type, the actual API call and return-value verification are the same) and could be encapsulated in a common SubVI that is independent of the child.
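To make the shared-SubVI idea concrete, here is a rough Python sketch of that common read-verification helper. Everything here is a hypothetical analogue (the FakeLogger is an in-memory stand-in so the sketch runs without real TDMS/XML files):

```python
class FakeLogger:
    """In-memory stand-in for a DiskLogger child, so the sketch is
    runnable without touching the disk."""
    FILES = {"tdms_fixture": [1.0, 2.0, 3.0]}  # pretend known-good file

    def open(self, path):
        self._data = self.FILES[path]

    def read(self):
        return self._data

    def close(self):
        self._data = None

def verify_read(logger, fixture_path, expected_values):
    # The common helper: only the logger instance and the fixture file
    # differ per child; the API calls and verification are identical
    logger.open(fixture_path)
    try:
        assert logger.read() == expected_values
    finally:
        logger.close()
```

Each child's read test then reduces to one call: verify_read(TDMSLogger(), tdms_fixture_path, expected_values), and likewise for the XML variant.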
How you structure the tests is up to you, but TestSuites sharing TestCases is a useful strategy. It depends on how maintainable you want your test suite to be. Much like production software design, you should only add complexity to your test "architecture" when it is warranted; YAGNI and all. In this example it could be as simple as adding the "two" tests for each child to a single test case and performing the setup inside each test method.
08-10-2017 03:04 AM
Firstly, thanks for the response.
You're quite right - although I hadn't considered it, this particular application wouldn't be true Unit Testing, but rather Integration testing. Oops... Nonetheless, I'll go ahead and try to get it going as my first significant attempt at unit testing, since it's what I'm trying to write at the moment, and there's no time like the present for self-improvement.
How you described the client using the API is what I had in mind too - my goal was to let the client simply switch the constant in one place (or perhaps use a config file and get the default class object by name/path, etc.) and have a different logging format without changing any of the other code.
I had imagined tests as follows - perhaps you could suggest if these seem reasonable, or if I'm missing the point entirely:
Apologies for the long post. I appreciate any further guidance you would be willing to offer!
08-10-2017 03:22 PM - edited 08-10-2017 03:24 PM
Some thoughts:
08-13-2017 10:05 PM
So I managed to create a handful of tests for the Open operation, using each of the file-naming formats my enum allows (along the lines of timestamped, incrementing number, etc.), and I caught a mistake in one of the cases that handles incrementing the path.
First win for my VI Tester testing!
08-14-2017 08:01 AM
That's great to hear you have already had a win!
It sounds like you are on the right track. I have to say I've always kept to the simpler approach of separate test VIs for these cases, rather than trying to do too much with test suites and inheritance - driven by keeping the test code simple so I'm not spending too long on it.
For files, while strictly an integration test, the divide is wherever you want it, so I often include these in unit tests (the downside is file access time, but that hasn't been a problem for me).
I would tend to skip testing of Close, since it is normally simple and hard to test - unless it is doing something like removing files, which I would test by looking for the files on disk.
For read/write, I think tyk007's advice is bang on. If it is a simple file format (and that format is important for external support, e.g. a user-editable file), then I will often write to disk, read it back in, and compare to known-good content. If it is just for your program, sometimes I will skip this step and test the read/write pair (since the actual data on disk is not important, as long as read and write provide the same data).
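That read/write pair test can be sketched as a simple round trip. Here is a hedged Python illustration using a made-up CSV-style format and a temp directory (none of this is real project code; it only shows the shape of the test):

```python
import csv
import os
import tempfile

def write_values(path, values):
    # Hypothetical "logger" write: one numeric value per row
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows([[v] for v in values])

def read_values(path):
    # Matching read side of the pair
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f)]

def test_round_trip():
    # The on-disk layout is never inspected; the test only checks
    # that write followed by read reproduces the original data
    values = [1.5, 2.5, -3.0]
    path = os.path.join(tempfile.mkdtemp(), "log.csv")
    write_values(path, values)
    assert read_values(path) == values
```

The nice property is that the test survives any internal format change, as long as read and write change together.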