|
Thank you, Marc. It was one of my first CodeProject articles and probably not very good.
Mircea
|
|
|
|
|
I use MSTest, but am quite strict about using it to test units, not larger lumps of functionality, so broadly speaking there is not much of a maintenance cost to keep the tests working... then it is all automated as part of the CI/CD pipeline on Azure DevOps.
|
|
|
|
|
+1 for philistine methodology
"Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming “Wow! What a Ride!" - Hunter S Thompson - RIP
|
|
|
|
|
It's the debate between "is it good enough to ship?" and "we have 500 unit tests but haven't shipped anything in 2 years". (True story).
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
Make that 2 for philistine method! Actually, my users pay good money to do the testing.
"Go forth into the source" - Neal Morse
"Hope is contagious"
|
|
|
|
|
Unit tests are often a waste of time. See the articles linked here[^]. Coplien is one of the few "gurus" for whom I have much use.
Unit tests are orthogonal to whether tests are automated. Automation, and system and regression tests, are essential to anything beyond toy projects.
|
|
|
|
|
Best unit test I have ever found = user (see idiot)
>64
Some days the dragon wins. Suck it up.
|
|
|
|
|
For algorithmic things like what you posted, unit tests are great, and I would definitely write that with a unit test "engine." That said, I also end up spending time debugging the tests, not the algorithms.
|
|
|
|
|
Writing unit tests means you have no customers
|
|
|
|
|
My experience is that most test frameworks rapidly grow into such complexity that you spend far more time on all the required red tape than on developing good tests. It may pay off for huge systems that will be in development for many years, by scores of developers, but for smaller systems you can do 99% of the same amount of testing with a much simpler infrastructure and far less test management.
Certainly: Do systematic testing! And have a setup that allows you to play old tests again - a.k.a. regression testing. Just don't let the testing infrastructure completely take over.
The important task in testing is not managing the tests, but identifying the relevant test cases: all the corner cases - and sometimes the Cartesian product of all possible cases (when that product is within reasonable limits) - how to provoke synchronization and timing issues, which stress tests are relevant, and so on. I have seen cases where far more time was spent on test management than on developing relevant tests.
Regression testing is essential (and I am surprised by how often I see new software releases with regressions from earlier releases!), but sometimes I wonder if it is getting out of hand: Some years ago, I worked in a development environment that had collected regression tests for many years. Before a release, we started the test suite before going home on Friday evening, hoping that it would complete before the Monday morning ten days later. So for bugs/failures reported by that week-plus run, there was a ten-day turnaround. We invested in the very fastest Sun machine available on the market, so that tests started on Friday afternoon completed some time on the (first) following Monday, a week earlier than with the old setup.
Yet I was asking myself if we should possibly consider reducing the amount of regression testing, or trying to make the structure more efficient. The fact is that the continuous unit, module and system tests regularly applied during development were so complete that the week-long (later: weekend-long) regression test run practically never revealed any problems.
In later jobs, I have seen tests requiring orders of magnitude more computing power than they should, due to a lack of proper unit and module tests - or rather, a lack of proper management of them. The developers do not trust that units have been properly tested, so in every module where a unit is used, the unit tests are run again, 'in this context'. Then for every (sub)system referencing a module, all the module tests are repeated, repeating all the unit tests ... and so on. The whole thing is repeated for each possible configuration / platform.
The developers are completely deaf to proposals for managing tests in a way that lets you trust yesterday's results for a unit that hasn't been modified for a month and has already been tested in the configuration you are asking about fifty times. Any proposal for a more resource-friendly test regime is considered by the developers an inappropriate interference with their 'professional' work. So, in my last job, any commit consumed several times the resources of compiling and building, just in running all the testing that the developers insisted on.
Testing is fundamental to software quality. Yet I have seen so many crazy ways of doing it that I tend to sharpen my claws every time someone insists on spending even more resources on ever more expensive (both in money and in learning and management effort), ever more complex test infrastructures.
Testing should be relativistic: make it as simple as possible, but no simpler.
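To make "much simpler infrastructure" concrete, here is roughly what I have in mind (only a sketch; the test names and the TryParseDate function are invented for illustration): a named list of checks, run in order, with a pass/fail summary and a non-zero exit code so a build script can fail the run.

using System;
using System.Collections.Generic;

// A deliberately minimal harness: a named list of checks, run in order,
// with a pass/fail summary. No framework, no runner, no test management.
static class TinyHarness
{
    static readonly List<(string Name, Action Check)> Tests = new()
    {
        ("empty input is rejected", () => Expect(!TryParseDate("", out _))),
        ("ISO date parses",         () => Expect(TryParseDate("2024-02-29", out var d) && d.Day == 29)),
        ("junk input is rejected",  () => Expect(!TryParseDate("not-a-date", out _))),
    };

    static void Main()
    {
        int failed = 0;
        foreach (var (name, check) in Tests)
        {
            try { check(); Console.WriteLine($"PASS  {name}"); }
            catch (Exception ex) { failed++; Console.WriteLine($"FAIL  {name}: {ex.Message}"); }
        }
        Console.WriteLine($"{Tests.Count - failed}/{Tests.Count} passed");
        Environment.Exit(failed == 0 ? 0 : 1);   // non-zero exit lets a build script fail the run
    }

    static void Expect(bool condition)
    {
        if (!condition) throw new Exception("expectation failed");
    }

    // Hypothetical function under test, here just wrapping DateTime.TryParse.
    static bool TryParseDate(string s, out DateTime d) => DateTime.TryParse(s, out d);
}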
|
|
|
|
|
I don't always test my code, but when I do, I do it in Production.
|
|
|
|
|
Sounds like a Corona beer commercial from “the world’s most interesting man” 😊
|
|
|
|
|
I'm both old and old-fashioned. I view the unit testing fad with the same disdain as I do Scrum. It's double the work, and I am set in my ways for testing. I build internal-facing apps only, and I just don't see the benefit of TDD. That's what users and UAT are for.
But I am impressed with your test code. Kind of already looks all unit testy to me.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
I'm old too. Had a manager who was into TDD. He said things like "write the test before the method." How in the blue heck am I supposed to write a test for something I haven't figured out what it's supposed to do yet?
Mercifully, he moved to Washington state and then Idaho. I don't have to deal with him anymore.
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
|
|
|
|
|
MarkTJohnson wrote: How in the blue heck am I supposed to write a test for something I haven't figured out what it's supposed to do yet? You don't. You must define the contract you're testing in its entirety before you can write a test for it. Otherwise (as you said), how do you know what to test? If the contract evolves, so must the tests.
/ravi
|
|
|
|
|
TDD:
1. Write a test
2. Start writing the code
3. Change the test
4. Write some more code
5. Go back to step 3 as required
6. Finish the code.
7. Fix the bugs in the test.
That said, thinking about tests early does help me think "what are the edge cases here?" etc. And a few times I do write tests first, if it is obvious what they should be. As with anything, as soon as you become pedantic you are in for pain.
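For example, when the behaviour is obvious up front, the test can be written first and doubles as the list of edge cases. A minimal MSTest sketch (the Slugify method and its rules here are invented, not from the article):

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class SlugifyTests
{
    // Written before Slugify exists: each row documents an edge case
    // the implementation has to handle.
    [DataTestMethod]
    [DataRow("Hello World", "hello-world")]   // spaces become dashes, lower-cased
    [DataRow("  trimmed  ", "trimmed")]       // leading/trailing whitespace dropped
    [DataRow("", "")]                          // empty input stays empty
    public void Slugify_HandlesEdgeCases(string input, string expected)
    {
        Assert.AreEqual(expected, TextUtil.Slugify(input));
    }
}

public static class TextUtil
{
    // Implementation written afterwards, just enough to make the tests pass.
    public static string Slugify(string input) =>
        string.Join("-",
            input.Trim().ToLowerInvariant()
                 .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries));
}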
|
|
|
|
|
lmoelleb wrote: thinking about tests early does help me think "what are the edge cases here" etc. I think of those things while writing the code inside the try{} block and inside the catch{} block. I don't see any point in doubling my work for the little, if any, benefit gained by adding a unit test.
Major companies like MS have been using TDD for years now, and their software still has bugs. I'm not impressed.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
If you had asked me 5-10 years ago, I would have said the same. The code I wrote back then was not really testable, so based on that, the cost/benefit of writing many tests was definitely bad. I have now adjusted how I write code to make it easy to test - and I find the code is easier to read as well - but all of this is obviously personal preference.
|
|
|
|
|
OriginalGriff wrote: but should I embrace unit testing? I do, because it lets me sleep at night.
But I'd be lying if I said I do TDD - I don't. I write unit tests after the fact (but I do write them), and after I've written integration tests. Why? Because I find them more valuable than unit tests, but don't consider them to be a substitute for unit tests. Integration tests first, then unit tests. At least that's how I run.
/ravi
|
|
|
|
|
What you're doing, at least in the example you've given, is not too far off from the way the "cool kids" are doing unit testing. You're doing the whole AAA thing (arrange, act, assert), you've just bundled all of your test cases into a monolithic block. Since each assertion in your example depends on the outcome of exactly one action, and no action depends on the outcome of any other action, this could be made to fit into the modern unit testing box very easily by just breaking it up into a method-per-test structure, but I couldn't really make a strong case for why you should bother. Things might change if the code under test isn't quite as simple as in your example though, for example if there are dependencies that need to be mocked/stubbed.
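To make that concrete, the mechanical split looks roughly like this (a sketch only; the Adder class and the values are invented, not taken from the article's example):

using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test.
public static class Adder
{
    public static int Add(int a, int b) => a + b;
}

// The "monolithic block" style: several independent cases in one test.
[TestClass]
public class AdderTests_Bundled
{
    [TestMethod]
    public void Add_Works()
    {
        Assert.AreEqual(3, Adder.Add(1, 2));
        Assert.AreEqual(0, Adder.Add(-1, 1));
        Assert.AreEqual(-5, Adder.Add(-2, -3));
    }
}

// The method-per-test, arrange/act/assert style: the same checks, one per
// method, so each failure is reported (and can be rerun) independently.
[TestClass]
public class AdderTests_PerMethod
{
    [TestMethod]
    public void Add_TwoPositives_ReturnsSum()
    {
        int result = Adder.Add(1, 2);    // arrange + act
        Assert.AreEqual(3, result);      // assert
    }

    [TestMethod]
    public void Add_OppositeSigns_ReturnsZero()
    {
        int result = Adder.Add(-1, 1);
        Assert.AreEqual(0, result);
    }

    [TestMethod]
    public void Add_TwoNegatives_ReturnsNegativeSum()
    {
        int result = Adder.Add(-2, -3);
        Assert.AreEqual(-5, result);
    }
}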
|
|
|
|
|
I am reading the comments here: old farts who are set in their ways, who basically already do, almost completely, what the Kool-Aid drinkers want to call by a new name: TDD. I once had an agile "evangelist" assess our methods and conclude that we already do close to agile (the same methodology I had been using for 30+ years); she left us alone and concentrated on another team, poor bastards.
Stick with what you know and do; it works, it is tried and tested, and your in-depth knowledge of how it works is invaluable.
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
That evangelist sounds like one of the better ones: recognizing that your existing methodology is OK, and not inventing a stupid recommendation to change something simply because she wants to justify her own cost. Unfortunately the SAFe consultant at my previous employer did not have that skill (though from what I hear, they got rid of that crap after I left - as soon as the manager who introduced it moved on).
|
|
|
|
|
OriginalGriff wrote: Or are you all philistines who don't automate tests at all? The one time I automated a test, it was to verify I'd addressed all of the memory leaks in my UI application, written in C#. I'd discovered that the mechanism I was using for navigation leaked memory depending upon how the user navigated the application. Fixing the problem was the death of a thousand cuts. Every navigation destination could potentially leak bindings or event handlers, causing memory to not be garbage-collected. This problem showed up only when the application had been running for several days (it's a control app for a machine).
My 'automated test' was a little bit of code that woke up every half a second and navigated to a random destination. I ran the test on a couple machines over the course of a week. At the end of the test, the UI was still bopping about a couple times a second. The peak working set was about 150% of the initial working set after the app started up.
I viewed it as a successful test.
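For what it's worth, that kind of soak test doesn't need much code. Something along these lines (a rough sketch; the INavigator interface, the destination list and the logging are assumptions, not the actual code described above) navigates at random every half second and logs the working set so slow growth becomes visible over days:

using System;
using System.Threading;
using System.Threading.Tasks;

// Assumed abstraction over the app's navigation mechanism.
public interface INavigator
{
    void NavigateTo(string destination);
}

public static class NavigationSoakTest
{
    // Wakes up every half second, jumps to a random destination, and logs
    // the process working set so a slow leak becomes visible over time.
    public static async Task RunAsync(INavigator navigator, string[] destinations, CancellationToken token)
    {
        var random = new Random();
        long initialWorkingSet = Environment.WorkingSet;

        while (!token.IsCancellationRequested)
        {
            navigator.NavigateTo(destinations[random.Next(destinations.Length)]);

            long current = Environment.WorkingSet;
            Console.WriteLine($"{DateTime.Now:u} working set {current / (1024 * 1024)} MB " +
                              $"({100.0 * current / initialWorkingSet:F0}% of start)");

            await Task.Delay(500);   // the loop condition handles cancellation on the next pass
        }
    }
}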
Software Zen: delete this;
|
|
|
|
|
For me, there are two primary benefits:
- Communicating intent -- using a well-known testing framework (as opposed to rolling your own) more easily communicates to other people what the testing code is supposed to do, because it's using a standard language.
- Integration -- such as with an automated build pipeline, or Visual Studio's Test Explorer.
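To make the integration point concrete: a test written against the standard attributes needs nothing extra to be picked up everywhere. A small MSTest example (the Invoice type and names are invented):

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class InvoiceTotalTests
{
    // Because this uses the standard [TestClass]/[TestMethod] attributes,
    // Visual Studio's Test Explorer discovers it automatically, and a
    // pipeline step such as `dotnet test` (or the VSTest task on Azure
    // DevOps) runs and reports it with no custom runner code.
    [TestMethod]
    public void Total_OfEmptyInvoice_IsZero()
    {
        var invoice = new Invoice();      // arrange
        decimal total = invoice.Total();  // act
        Assert.AreEqual(0m, total);       // assert
    }
}

// Hypothetical minimal type under test, just so the example compiles.
public class Invoice
{
    private readonly System.Collections.Generic.List<decimal> _lines = new();
    public void AddLine(decimal amount) => _lines.Add(amount);
    public decimal Total()
    {
        decimal sum = 0;
        foreach (var line in _lines) sum += line;
        return sum;
    }
}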
|
|
|
|
|
You’re basically doing all the hard work that any self-respecting dev should do with testing, so shifting to TDD would be no effort. But I would watch Ian Cooper on YouTube ("where did it all go wrong") and maybe read the best book on it (Kent Beck's), as many have misrepresented his words of wisdom. TDD will speed you up if done right.
|
|
|
|