That's a team problem then. Git can't police people into making concise, responsible commits in the right places, and wasn't designed to; likewise, that tool exists to warn people who actually care when stuff breaks. If your cowboys don't care, that's a whole other issue that needs to be figured out, because they're creating work for other people.
That doesn't mean that an API cannot change; it means that the people involved have to change it responsibly, e.g. by providing overloads, or at least by trying to keep existing signatures as intact as possible.
I'm not a fan of it, but perhaps you need something like gated check-ins. I'd much rather have the team chat where we all come to an agreement on how we're going to work together, though. You need to sell this idea of improving team cohesion up the chain, because right now the cowboys are costing the company money when other people have to deal with their fallout.
------------------------------------------------
If you say that getting the money
is the most important thing
You will spend your life
completely wasting your time
You will be doing things
you don't like doing
In order to go on living
That is, to go on doing things
you don't like doing
Which is stupid.
------------------------------------------------
OriginalGriff wrote: as best I can
Which is why I don't bother.
There is no way to test for many of the bugs I write.
Just yesterday I ran into a situation which I'm sure can't be tested statically: the problem arose because I fed two incompatible CSV files into a parser, and it blew up with an IndexOutOfRangeException. Today I'm telling the parser to catch the exception and return null.
Maybe it can protect large teams from simple mistakes made by inexperienced developers.
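For what it's worth, that particular failure mode can be made testable with a guard instead of a catch. A minimal sketch, assuming a hypothetical field-lookup helper (not the actual parser code):

```csharp
using System;

// Hypothetical helper: instead of letting an IndexOutOfRangeException escape
// when a row has fewer fields than expected, return null for a missing field.
public static class CsvRow
{
    public static string FieldOrNull(string line, int index)
    {
        string[] fields = line.Split(',');
        return (index >= 0 && index < fields.Length) ? fields[index] : null;
    }
}
```

That turns the incompatible-file case into an ordinary, assertable return value rather than a crash buried in a catch block.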
------------------------------------------------
Not only do I do it, but I ended up writing my own unit test framework (and published it on CodeProject[^]).
It is so easy to create new tests that most bugs I find end up as test cases and serve as regression tests.
Mircea
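To illustrate why adding a test can be cheap enough that every found bug becomes a regression test, here is a toy sketch (this is an illustration only, not Mircea's actual framework):

```csharp
using System;
using System.Collections.Generic;

// Toy illustration: a test is just a named delegate, so turning a freshly
// found bug into a regression test is a single Add() call.
public static class TinyTest
{
    private static readonly List<KeyValuePair<string, Action>> tests =
        new List<KeyValuePair<string, Action>>();

    public static void Add(string name, Action body) =>
        tests.Add(new KeyValuePair<string, Action>(name, body));

    // Runs every registered test; reports and counts failures instead of
    // aborting on the first one.
    public static int RunAll()
    {
        int failed = 0;
        foreach (var test in tests)
        {
            try { test.Value(); }
            catch (Exception e)
            {
                failed++;
                Console.WriteLine($"{test.Key}: {e.Message}");
            }
        }
        return failed;
    }
}
```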
------------------------------------------------
How did that article not get any votes? Well, it got mine now (a 5).
------------------------------------------------
Thank you, Marc! It was one of my first CodeProject articles and probably not very good.
Mircea
------------------------------------------------
I use MSUnit, but am quite strict about using it to test units, not larger lumps of functionality, so broadly speaking there is not much of a maintenance cost to keeping the tests working... then it is automated as part of the CI/CD pipeline on Azure DevOps.
------------------------------------------------
+1 for philistine methodology
"Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming 'Wow! What a Ride!'" - Hunter S Thompson - RIP
------------------------------------------------
It's the debate between "is it good enough to ship?" and "we have 500 unit tests but haven't shipped anything in 2 years". (True story).
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
------------------------------------------------
Make that 2 for philistine method! Actually, my users pay good money to do the testing.
"Go forth into the source" - Neal Morse
"Hope is contagious"
------------------------------------------------
Unit tests are often a waste of time. See the articles linked here[^]. Coplien is one of the few "gurus" for whom I have much use.
Unit tests are orthogonal to whether tests are automated. Automation, and system and regression tests, are essential to anything beyond toy projects.
------------------------------------------------
Best unit test I have ever found = user (see: idiot)
>64
Some days the dragon wins. Suck it up.
------------------------------------------------
For algorithmic things like what you posted, unit tests are great, and I would definitely write that with a unit test "engine." That said, I also end up spending time debugging the tests, not the algorithms.
------------------------------------------------
Writing unit tests means you have no customers
------------------------------------------------
My experience is that most test frameworks rapidly grow into such complexity that you spend far more time on all the required red tape than on developing good tests. It may pay off for huge systems that will be in development for many years, by scores of developers, but for smaller systems you can do 99% of the same amount of testing with a much simpler infrastructure and far less test management.
Certainly: do systematic testing! And have a setup that allows you to replay old tests, a.k.a. regression testing. Just don't let the testing infrastructure completely take over.
The important task in testing is not managing the tests, but identifying relevant test cases: all the corner cases, and sometimes the cartesian product of all possible cases (when that product is within reasonable limits); how to provoke synchronization and timing issues; which stress tests are relevant; and so on. I have seen cases where far more time was spent on test management than on developing relevant tests.
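The cartesian-product idea can be sketched in a few lines; the input dimensions here are made up for illustration, and each generated tuple would become one test case:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch: cross three small input dimensions so that every combination
// (3 lengths x 2 character sets x 2 newline states = 12) gets tested.
public static class Cases
{
    public static IEnumerable<Tuple<int, bool, bool>> All() =>
        from length in new[] { 0, 1, 1000 }          // empty, minimal, large
        from ascii in new[] { true, false }          // ASCII vs. non-ASCII
        from trailingNewline in new[] { true, false }
        select Tuple.Create(length, ascii, trailingNewline);
}
```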
Regression testing is essential (and I am surprised by how often I see new software releases with regressions from earlier releases!), but sometimes I wonder if it is getting out of hand. Some years ago, I worked in a development environment that had collected regression tests for many years. Before a release, we started the test suite before going home on Friday evening, hoping that it would complete before the Monday morning ten days later. So for bugs/fails reported by that week(++) run, there was a ten-day turnaround. We invested in the very fastest Sun machine available on the market, which let tests started on Friday afternoon complete some time on the (first) following Monday, a week earlier than with the old setup.
Yet I was asking myself whether we should consider reducing the amount of regression testing, or try to make the structure more efficient. The fact is that the continuous unit, module and system tests regularly applied during development were so complete that the week-long (later: weekend-long) regression test run practically never revealed any problems.
In later jobs, I have seen tests requiring orders of magnitude more power than they should have, due to a lack of proper unit and module tests, or rather, management of such. The developers do not trust that units have been properly tested, so in every module where a unit is used, the unit tests are run again, 'in this context'. Then for every (sub)system referencing a module, all the module tests are repeated, repeating all the unit tests... and so on. The whole thing is repeated for each possible configuration/platform. The developers are completely deaf to proposals for managing tests in a way that lets you trust yesterday's results for a unit that hasn't been modified for a month and has been tested in the configuration you are asking about fifty times since. Any proposal for a more resource-friendly test regime is considered, by the developers, an inappropriate interference with their 'professional' work. So, in my last job, any commit required several times the resources of the compilation and build, in doing all the testing that the developers insisted on.
Testing is fundamental to software quality. Yet I have seen so many crazy ways of doing it that I tend to sharpen my claws every time someone insists on spending even more resources on ever more expensive (in money, and in learning and management) and ever more complex test infrastructures.
Testing should be relativistic: make it as simple as possible, but no simpler.
------------------------------------------------
I don't always test my code, but when I do, I do it in Production.
------------------------------------------------
Sounds like a Corona beer commercial from “the world’s most interesting man” 😊
------------------------------------------------
I'm both old and old-fashioned. I view the unit testing fad with the same disdain as I do Scrum. It's double the work, and I am set in my ways for testing. I build internal-facing apps only, and I just don't see the benefit to TDD. That's what users and UAT are for.
But I am impressed with your test code. Kind of already looks all unit testy to me.
If you think 'goto' is evil, try writing an Assembly program without JMP.
------------------------------------------------
I'm old too. Had a manager who was into TDD. He said things like "write the test before the method". How in the blue heck am I supposed to write a test for something when I haven't figured out what it's supposed to do yet?
Mercifully he moved to Washington state then Idaho. Don't have to deal with him anymore.
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
------------------------------------------------
MarkTJohnson wrote: How in the blue heck am I supposed to write a test for something I haven't figured out what it's supposed to do yet? You don't. You must define the contract you're testing in its entirety before you can write a test for it. Otherwise (as you said), how do you know what to test? If the contract evolves, so must the tests.
/ravi
------------------------------------------------
TDD:
1. Write a test
2. Start writing the code
3. Change the test
4. Write some more code
5. Go back to step 3 as required
6. Finish the code.
7. Fix the bugs in the test.
That said, thinking about tests early does help me think "what are the edge cases here" etc. And a few times I do write tests first, if it is obvious what they should be. As with anything, as soon as you become pedantic you are in for pain.
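The "obvious contract" case can be sketched with a hypothetical Clamp helper (not anyone's real API here): its edge cases can be written down before the implementation exists, because the contract is self-evident.

```csharp
using System;

// Hypothetical helper whose contract is obvious enough to test first:
// values below min come back as min, above max as max, otherwise unchanged.
public static class MathEx
{
    public static int Clamp(int value, int min, int max) =>
        value < min ? min : (value > max ? max : value);
}
```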
------------------------------------------------
lmoelleb wrote: thinking about tests early do help me think "what are the edge cases here" etc. I think of those things while writing the code inside the try{} block and inside the catch{} block. I don't see any benefit to doubling my work for the little, if any, benefit gained by adding a unit test.
Major companies like MS have been using TDD for years now, and their software still has bugs. I'm not impressed.
If you think 'goto' is evil, try writing an Assembly program without JMP.
------------------------------------------------
If you had asked me 5-10 years ago, I would have said the same. And the code I wrote back then was not really testable, so based on that, writing many tests was definitely a bad cost/benefit trade-off. I have since adjusted how I write code to make it easy to test, and now I find the code easier to read as well, but all of this is obviously personal preference.
------------------------------------------------
OriginalGriff wrote: but should I embrace unit testing? I do, because it lets me sleep at night.
But I'd be lying if I said I do TDD - I don't. I write unit tests after the fact (but I do write them), and after I've written integration tests. Why? Because I find them more valuable than unit tests, but don't consider them to be a substitute for unit tests. Integration tests first, then unit tests. At least that's how I run.
/ravi
------------------------------------------------
What you're doing, at least in the example you've given, is not too far off from the way the "cool kids" are doing unit testing. You're doing the whole AAA thing (arrange, act, assert), you've just bundled all of your test cases into a monolithic block. Since each assertion in your example depends on the outcome of exactly one action, and no action depends on the outcome of any other action, this could be made to fit into the modern unit testing box very easily by just breaking it up into a method-per-test structure, but I couldn't really make a strong case for why you should bother. Things might change if the code under test isn't quite as simple as in your example though, for example if there are dependencies that need to be mocked/stubbed.
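For anyone unfamiliar with the pattern, a sketch of that method-per-test AAA shape, using the built-in Stack&lt;int&gt; as a stand-in for the code under test (none of this is OriginalGriff's actual code):

```csharp
using System;
using System.Collections.Generic;

// Each test method covers exactly one action, in arrange/act/assert order.
public static class StackTests
{
    public static void Push_Then_Pop_Returns_Last_Pushed()
    {
        // Arrange
        var stack = new Stack<int>();
        // Act
        stack.Push(42);
        int result = stack.Pop();
        // Assert
        if (result != 42) throw new Exception("expected 42");
    }

    public static void Pop_On_Empty_Stack_Throws()
    {
        // Arrange
        var stack = new Stack<int>();
        // Act + Assert
        try
        {
            stack.Pop();
            throw new Exception("expected InvalidOperationException");
        }
        catch (InvalidOperationException)
        {
            // expected: popping an empty Stack<T> throws
        }
    }
}
```

A real test framework would discover these methods via attributes and report each pass/fail separately; that reporting granularity is most of what the method-per-test split buys you.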