|
Oh, and please don't turn this into one of those dumb git-sucks-type debates. I'm too old for that.
Jeremy Falcon
|
|
|
|
|
IMO, it only makes sense to do unit testing when the inputs & outputs from a function/module can be specified. To take a very simple case, testing the strlen() function in C:
- Input must be a non-null pointer
- Output must be a non-negative integer
- The (output)th character of the input is a null character.
- No null characters are to be found in the range [ 0 .. (output) ) of the input.
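A spec like that maps almost directly onto a test. Here is a minimal sketch in plain C with assert.h, no particular test framework assumed (the non-negative output comes for free from size_t, and the non-null input is treated as a precondition, so only valid pointers are passed in):

#include <assert.h>
#include <string.h>

/* Check the properties listed above for a single input string. */
static void check_strlen(const char *s)
{
    size_t n = strlen(s);

    /* The (output)th character of the input is the null terminator. */
    assert(s[n] == '\0');

    /* No null characters appear in the range [0 .. output). */
    for (size_t i = 0; i < n; i++)
        assert(s[i] != '\0');
}

int main(void)
{
    check_strlen("");      /* boundary: empty string */
    check_strlen("a");     /* boundary: single character */
    check_strlen("hello"); /* typical case */
    return 0;
}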
In cases where the output is not easy to check (for example a trigonometric function), exhaustive testing is impractical. In this case, only very simple "sanity" tests can be performed.
In real-world code I usually try to test all boundary conditions, but don't try to perform exhaustive testing.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Daniel Pfeffer wrote: IMO, it only makes sense to do unit testing when the inputs & outputs from a function/module can be specified. Fo sho, that's actually a unit test. There are other types of larger tests (functional tests) that get a bit more abstract, which one can make a case for or against. But a unit test should test a very small unit. Typically that will equate to a routine, um... unless you have five-page-long functions.
Daniel Pfeffer wrote: In cases where the output is not easy to check (for example a trigonometric function), exhaustive testing is impractical. In this case, only very simple "sanity" tests can be performed. Keep in mind, I don't know trig like at all... but most testing frameworks allow you to test all kinds of output. If by not being able to test trig you mean something like a picture on the screen, you can test that too, whether it's against a fixture or something else. Or perhaps test the routine before it gets sent to a renderer and then also visually compare, and so on. It's like riding a bike, the more you do it the mo' easy it becomes to test.
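To make the fixture idea concrete, here's a sketch along the same lines as the strlen example above. my_sin() is a hypothetical routine standing in for whatever approximation is actually under test, and the tolerance is an arbitrary pick:

#include <assert.h>
#include <math.h>

/* Hypothetical routine under test; stands in for whatever approximation
 * the real code uses. */
static double my_sin(double x) { return sin(x); }

int main(void)
{
    const double pi = 3.141592653589793; /* M_PI isn't standard C, so spell it out */

    /* Fixture: known inputs paired with the outputs we expect. */
    const struct { double in, expected; } cases[] = {
        { 0.0,      0.0 },
        { pi / 6.0, 0.5 },
        { pi / 2.0, 1.0 },
    };
    const double tol = 1e-9; /* acceptable error for the approximation */

    for (unsigned i = 0; i < sizeof cases / sizeof cases[0]; i++)
        assert(fabs(my_sin(cases[i].in) - cases[i].expected) <= tol);
    return 0;
}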
Jeremy Falcon
|
|
|
|
|
One can only test a trigonometric function by comparing its results to the results of another implementation coded using a different approximation. The problem is that one has to write this additional implementation, at least doubling the work that must be performed. One can perform spot checks by comparing the results to known results calculated by another implementation, but that is hardly an exhaustive test of one's implementation.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Daniel Pfeffer wrote: One can only test a trigonometric function by comparing its results to the results of another implementation coded using a different approximation. There's nothing preventing you from unit testing that. It's called mocking, and just about every testing framework supports it. Testing approximations with even random values is completely doable in just about any testing framework.
Daniel Pfeffer wrote: One can perform spot checks by comparing the results to known results calculated by another implementation, but that is hardly an exhaustive test of one's implementation. There's always more code to write for a unit test, even if you're testing how to cross the street with grandma. That's not the point. The point is, it's worth it. And tests are an art just like software development; a test suite is as exhaustive as you make it. Just because I don't know trig doesn't mean I don't know things like cryptography and randomness. You can test that. Promise.
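As a sketch of the random-value angle (hedged again: my_sin() is a hypothetical approximation, the standard library's sin() plays the role of the reference implementation, and the tolerance and input range are arbitrary):

#include <assert.h>
#include <math.h>
#include <stdlib.h>

/* Hypothetical approximation under test (placeholder body). */
static double my_sin(double x) { return sin(x); }

int main(void)
{
    const double tol = 1e-6; /* how far the approximation may drift */
    srand(42);               /* fixed seed so any failure is reproducible */

    for (int i = 0; i < 10000; i++) {
        /* Random input in roughly [-10, 10]. */
        double x = ((double)rand() / RAND_MAX) * 20.0 - 10.0;
        assert(fabs(my_sin(x) - sin(x)) <= tol);
    }
    return 0;
}

It's not exhaustive, but comparing a few thousand points against a reference tends to catch the gross mistakes in an approximation.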
But, let's pretend you can't test that one tiny part. Just for the sake of argument. You can still test 80-90% of the rest of the application.
Edit: Btw, I hope this post didn't come across as sour, man. I never know these days, and well, most online chats are... you know.
Jeremy Falcon
|
|
|
|
|
I sit corrected.
Jeremy Falcon wrote: Btw, I hope this post didn't come across as sour man.
Not at all. We're having a civilised debate, a rarity on the Internet these days...
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
That's not an argument against writing tests; it's merely pointing out that some functions need to be tested exhaustively to be completely confident in their correctness, which may be impractical.
|
|
|
|
|
That was exactly my point.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Yay for unit tests, because I like to sleep easy at night.
Our DOD requires the creation/modification of unit tests when new functionality is implemented and existing functionality modified. We don't yet do TDD but are in the process of implementing integration test projects that would make it easy for devs to write the test before writing the code.
Note: IMHO best practices like these require the buy-in of management. Thankfully all our dev managers are ex-developers.
/ravi
|
|
|
|
|
Ravi Bhavnani wrote: Note: IMHO best practices like these require the buy in of management. Thankfully all our dev managers are ex-developers.
Upvoted for this.
Over the decades, I have tried many times to get better practices adopted in my places of employment. My attempts have failed, usually when the managers realized that they aren't a magic bullet and that there is a learning curve for adoption.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
And I upvoted your upvote... because why not.
Jeremy Falcon
|
|
|
|
|
Ravi Bhavnani wrote: Yay for unit tests, because I like to sleep easy at night. Preach brother.
Ravi Bhavnani wrote: Our DOD requires the creation/modification of unit tests when new functionality is implemented and existing functionality modified. What's DOD mean? I think Dept of Defense when I hear that. Just curious.
Ravi Bhavnani wrote: We don't yet do TDD but are in the process of implementing integration test projects that would make it easy for devs to write the test before writing the code. Be curious to know how it goes. I've never done full blown TDD (I'm stubborn), but would love to hear a use case about it.
Ravi Bhavnani wrote: Thankfully all our dev managers are ex-developers. The best ones are, buddy.
Jeremy Falcon
|
|
|
|
|
DOD = "definition of done" as applied to a work item. Before a work item can be marked complete, we require that it be unit tested and documented (this applies more to APIs).
Jeremy Falcon wrote: The best ones are, buddy. Agreed. I've found this to be the case more at early stage companies, which are the only places I've worked at since 2000.
/ravi
|
|
|
|
|
Ravi Bhavnani wrote: DOD = "definition of done" as applied to a work item. Oh crap. I should've figured that out. I need coffee. Thanks tho.
Ravi Bhavnani wrote: I've found this to be the case more at early stage companies, which are the only places I've worked at since 2000. I've been in the enterprise world for a while, but I'm starting to think you're onto something. Need a change, might have to give that a go.
Jeremy Falcon
|
|
|
|
|
Jeremy Falcon wrote: What's DOD mean? I think Dept of Defense when I hear that. Just curious.
Design or Death?
(The Software Engineer's equivalent of Publish or Perish... )
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Sometimes I am lazy and skip them - typically when I am not quite sure I have the main "flow" worked out. Not spending time on them gives a short-term benefit, but of course that has to be paid for later - so I do at least make sure to write decoupled code to which I can easily add tests. If I am reasonably certain of the flow, I write the test along with the code (sometimes even before, as in TDD, but that is rare). It is often much faster to iterate over a code block in a test than to run the whole application.
And of course, when I do go back and write the tests I skipped I find a bug or two....
In general it works as an investment: lose an hour writing a test now, or waste a day at a later time due to lack of tests... Sometimes the hour now is worth more than the day in the future. It only becomes a problem if the cost of the day in the future isn't even considered when skipping the test.
|
|
|
|
|
Same, man. Not every piece of code is tested, but the code I know has to work correctly or else... it is.
Jeremy Falcon
|
|
|
|
|
I always write tests for the small components in the code (aka unit tests) for two reasons:
1. 1 day of writing unit tests saves me a week of looking for bugs in the small crevices of a larger project
2. unit tests describe the behaviour of the component, so they double as documentation
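Point 2 is easy to show with a tiny sketch (a hypothetical parse_port() component and plain assert.h, no framework): the test names spell out the contract, so they read like documentation.

#include <assert.h>
#include <stdlib.h>

/* Hypothetical component under test: parses a TCP port from a string,
 * returning -1 for anything outside 1..65535. */
static int parse_port(const char *s)
{
    long v = strtol(s, NULL, 10);
    return (v >= 1 && v <= 65535) ? (int)v : -1;
}

/* The test names state the behaviour, so the tests double as documentation. */
static void parse_port_accepts_the_full_valid_range(void)
{
    assert(parse_port("1") == 1);
    assert(parse_port("65535") == 65535);
}

static void parse_port_rejects_out_of_range_values(void)
{
    assert(parse_port("0") == -1);
    assert(parse_port("65536") == -1);
}

int main(void)
{
    parse_port_accepts_the_full_valid_range();
    parse_port_rejects_out_of_range_values();
    return 0;
}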
Also, since I have mostly worked at small companies, there is usually nobody to double-check my code. So testing is fundamental to avoid big mistakes.
|
|
|
|
|
Nelson Goncalves Oct2022 wrote: Also, since I have mostly worked at small companies there is usually nobody to double check my code. That's a good point. I've found some of my own silly bugs that way too.
Jeremy Falcon
|
|
|
|
|
It's a "yay" from me! However I'm a bigger fan of integration testing, whereby one can test the full functionality of a system or part of it. Not a believer in TDD.
|
|
|
|
|
Fo sho, both integration testing and unit testing should happen. Usually integration testing is done by QA though.
Jeremy Falcon
|
|
|
|
|
The best use of unit testing I've seen (i.e. admired, admittedly from a distance thus far) is to create a test that breaks in a meaningful way (when fixing a bug, it tickles the bug and fails... or when adding a feature, it tries to perform the actions that are not yet implemented). Then 'fixing the bug' or 'implementing the feature' is 'done' when your test passes. The test lingers on... because it continues to pass, you know that your latest changes didn't take other parts of your code backward. A great example of this discipline in action is the main dev of jOOQ (Github link)[^] ... he pretty much doesn't start a bit of new code without an issue and a failing test.
Unit testing should absolutely not be used for things like double-checking that code does what the compiler pretty much says it will. Less is more.
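A sketch of that workflow, with a hypothetical in_range() helper and a made-up off-by-one bug report standing in for the real thing:

#include <assert.h>
#include <stdbool.h>

/* Hypothetical function that had a reported off-by-one bug: the upper
 * bound itself was rejected because the check used '<' instead of '<='. */
static bool in_range(int v, int lo, int hi)
{
    return v >= lo && v <= hi; /* the fix: '<=' rather than '<' */
}

/* Regression test written from the bug report: it failed before the fix,
 * passes after it, and lingers on so the bug can't quietly come back. */
static void in_range_accepts_the_upper_bound(void)
{
    assert(in_range(10, 0, 10));
}

int main(void)
{
    in_range_accepts_the_upper_bound();
    return 0;
}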
|
|
|
|
|
That's just TDD, isn't it? 😉
|
|
|
|
|
Yeah, kinda. I feel it's less tedious/rigorous/exhaustive than TDD as I've seen it explained. I've seen TDD promoted as an iterative design aid: you don't know exactly what you're doing, so you write a test which uses an imaginary API, then try to get the test working. Then you reflect a little more, adjust the test, and write some more primary code. There are some benefits to this, such as never having more than a very short departure from code that runs at all times. However, the test *driven* nature of it doesn't sit well with me. I like to do as much up-front design as I can: in my head, on paper, as formal requirements, whatever.
In the unit testing I admire, it's more of a "there, I deliberately broke something, and when I'm done it won't be broken anymore". You're not so much testing for correctness or using it as a design process as you are throwing spanners in your own gears and making your code cope. It now 'covers more ground' than it did previously.
|
|
|
|
|
DT Bullock wrote: Unit testing should absolutely not be used for things like double-checking that code does what the complier pretty much says it will. Less is more. Compilers can't check logic errors. Not sure if that's what you meant or not.
Jeremy Falcon
|
|
|
|