“Done”. “Done Done”. When Is A Feature Really Done?

31 Mar 2012 · CPOL · 6 min read
How Telerik distilled its long, bulleted definition of "Done Done" into a single sentence, to clearly identify the core problem being solved and make the basic premise easy for everyone to understand.

At Telerik, just like at every other software development shop, we have struggled with the definition of "Done". Throughout the years, we've had many definitions of "Done" across teams and individuals:

  • "It's done, but it's not tested" (Testing is left for the next sprint).
  • "Yeah, it's working well. There are a few blocking issues but we'll fix that later" (Bug fixing is always a "tail" task and is never budgeted as part of the v1 of a "Done" feature).
  • "Sure, it's 100% done but it doesn't have any UI yet" (Missing core deliverables of the feature like a UI for an end-user feature and defining done as having the API done).
  • "It's kinda ugly but it works, right?" (Zero focus on usability).
  • "It works on my machine so it's done" (Doesn't take into account the differences between the local and target deployment environment).
  • "Yeah, the API is a bit awkward and customers won't get it but it works" (The feature does not work in a way the customer expects it to work).

I am sure that most of you have experienced the frustration that arises when everyone's definition of "Done" is very different. Besides creating tension, it has cascading effects on planning as well as on trust between team members and with customers. After some analysis, we identified the real culprit: everyone interpreted "Done" only from their own perspective. There was no good flow from "Done" at the personal level to "Done" from the customer's point of view.

Seeing this happen, we concluded that we needed a more uniform definition of "Done" that we could push across the company. We sat down and tried to find the common denominator. While we knew there would be variations from product to product, we came up with the following generic checklist items to try to define "Done" from a product team's perspective:

  • Acceptance tests identified and written. The feature must cover the use cases/stories for the customer. Everything starts from here - picking the correct minimal set of features to accomplish the use cases is the real key, as it drives true customer value. You need to think about the problem you are solving - the feature by itself does not provide value to the customer. (A small test sketch follows this list.)
  • All of the must-have components of the feature are implemented. Every feature should deliver a complete benefit to the customer. For example, teams would often complete 9 out of 10 tasks for a feature and miss the key task that allows the feature to provide true benefit to the customer.
  • All the code and documentation artifacts for the feature have undergone review and have been merged from the feature branches to the main trunk in source control.
  • Feature is covered with appropriate amounts of unit and integration tests and you have a "green" automated build that compiles and runs these tests.
  • Feature is covered with automated UI tests to prevent future regressions caused by factors that are often outside of our control (e.g. browser updates, library upgrades, etc.).
  • Feature is tested in a near-real environment; for example, forum software should be tested with 1 million threads and hundreds of simultaneous users (the definition of "near-real" of course depends on the feature, scope and target).
  • All public APIs have a real-world example; if an example cannot be thought of, the API should not be public. A public-facing API without an example is only a regression risk.
  • Release notes (internal) and/or a video have been posted so that colleagues are aware of the feature and implementation.
  • Upgrade paths are tested and backwards compatibility is assured, otherwise breaking changes are clearly stated with outlined mitigation paths.
  • It's live in the test environment and there's a build or URL where you can see the feature live.
  • There are no blocking bugs that hinder adequate usage of the software.
  • There are examples and documentation showing how to properly use the feature.
  • And so on….
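
As an illustration of the testing-related items above, here is a minimal, hypothetical sketch of acceptance criteria expressed as automated tests. The ForumService class, its methods and the pytest tests are stand-ins invented for this article (not Telerik code); the point is that each test maps to a customer-facing story rather than to an internal detail, and a "green" automated build is simply one that compiles the code and runs tests like these successfully.

# Hypothetical sketch: acceptance criteria for a small "post a thread" feature,
# expressed as automated tests. ForumService is an illustrative stand-in.
import pytest


class ForumService:
    """Minimal stand-in for the feature under test: posting and listing threads."""

    def __init__(self):
        self._threads = []

    def create_thread(self, title):
        if not title.strip():
            raise ValueError("A thread needs a non-empty title")
        self._threads.append(title)
        return len(self._threads) - 1  # thread id

    def list_threads(self):
        return list(self._threads)


# Each test corresponds to a user story, not to an internal API detail.
def test_customer_can_post_and_see_a_thread():
    forum = ForumService()
    forum.create_thread("Is my feature Done or Done Done?")
    assert "Is my feature Done or Done Done?" in forum.list_threads()


def test_empty_titles_are_rejected():
    forum = ForumService()
    with pytest.raises(ValueError):
        forum.create_thread("   ")

Running pytest against a file like this on every commit is what turns "covered with tests" from a claim made in a stand-up into something the build server can actually verify.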

It was a long list of prescriptive guidance. Each team picked the items that made the most sense to them, but there was a common thread: the items are designed to ensure that nothing reaches a user of our products with known critical bugs, or without meeting the acceptance criteria set for it. This is the primary reason for a definition of done. It ensures everyone shares a common understanding of what it means to be done, and it ensures that the "done" work that reaches a user behaves as intended.

The above established the "framework", but we found another problem: teams were not always conscious of the entire flow of a feature, from the time it is conceived to the time a customer can start using it. The obvious way to solve the problem? Extend the definition of "Done" further. We came up with our own definition of "Done Done" to capture the notion that "Done" is really done only when a customer can benefit from the feature:

  • It's shipped. Only shipped features are really done. Even the nicest and most tested and complete feature is not truly done if it's not in the hands of customers.
  • It's discoverable. This starts much earlier, and there are many artifacts that need to be produced: marketing collateral, website copy, newsletters, promotion through social media, docs, SDK, examples, etc. It's a team effort, not just an engineering effort, and everyone needs to prepare the materials that let customers come across the feature. A feature is useless if customers don't know it exists.
  • There are mechanisms to capture feedback and/or telemetry and understand whether the feature really is used, whether it's relevant for customers and where it needs improvement, so that we can run future cycles on it (a small telemetry sketch follows this list).
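
To make the feedback and telemetry item concrete, here is a minimal, hypothetical sketch of a feature-usage tracker. The FeatureUsageTracker class, its event fields and the feature names are assumptions made up for this example; a real implementation would send events to an analytics backend rather than keep them in memory, but capturing even this much is what lets a team see whether a shipped feature is actually used.

# Hypothetical sketch: recording feature-usage events so the team can tell
# whether a shipped feature is relevant and where it needs improvement.
from collections import Counter
from datetime import datetime, timezone


class FeatureUsageTracker:
    """Records feature-usage events; in a real system these would be exported."""

    def __init__(self):
        self._events = []          # raw events, kept for later export/analysis
        self._counts = Counter()   # quick per-feature usage counts

    def record(self, feature, user_id):
        self._events.append({
            "feature": feature,
            "user": user_id,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self._counts[feature] += 1

    def usage_report(self):
        """How many times each feature was used; feeds the next planning cycle."""
        return dict(self._counts)


if __name__ == "__main__":
    tracker = FeatureUsageTracker()
    tracker.record("export-to-pdf", user_id="u-42")
    tracker.record("export-to-pdf", user_id="u-7")
    tracker.record("dark-mode", user_id="u-42")
    print(tracker.usage_report())  # {'export-to-pdf': 2, 'dark-mode': 1}

A usage report like this, reviewed alongside direct customer feedback, closes the loop: it tells us whether a "Done Done" feature deserves another iteration or can be left alone.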

After creating this comprehensive guideline, we went back to the drawing board and tried to summarize the long, bulleted definition of "Done Done" as a single sentence, just to make sure we clearly identified the core problem we are solving and made the basic premise easy for everyone to understand:

  • "Each product group at Telerik is empowered to define "Done" in a way that ensures that every feature and every release meets a minimum state of completeness that allows customers to find, properly use and genuinely appreciate the new release of our software."

Today, each team has its own definition of "Done" throughout the lifecycle, but the common denominator is that it culminates in "Done" from the perspective of our customers. It might sound like a small shift in focus and perception, but it has had a tremendous impact on results, from customer loyalty and happiness to reduced support costs due to higher quality.

This article was originally posted at http://feeds.feedburner.com/Telerik

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

