"Done" is one of the most overloaded words in custom software. Different people in the same room use it to mean wildly different things. The misalignment causes more friction and missed deadlines than any technical issue.
Here's an honest taxonomy of what "done" can mean, why the differences matter, and how to align everyone on what they're actually asking for.
The 5 versions of "done"
1. "Code complete"
A developer's standard. The function or feature has been written, compiles/runs without errors, and the developer has tested the happy path manually. This is usually 20–30% of the actual work.
2. "Tested"
The feature has been QA-tested by someone other than the developer. Edge cases have been considered. Bugs have been filed and (mostly) fixed. This is typically 50–65% of the work.
3. "Production-ready"
The feature has passed load testing and security review, has monitoring and alerting in place, is documented for support, and can survive realistic production conditions. Adds 15–25% to "tested."
4. "Launched and stable"
The feature has been in production for 2–4 weeks. Real users have hit edge cases the QA team missed. Bugs have been patched. Performance has been observed. Adds another 10–15% of effort.
5. "Owned and maintained"
A named team or person knows the code, can extend and debug it, and carries the operational responsibility. Adds a variable ongoing cost, but this is what really determines whether the feature is "done" long-term.
Why this matters
When a project plan says "the feature will be done in 6 weeks," that statement is incomplete without specifying which version of "done."
- A developer saying "the build is done" usually means version 1 (code complete) or sometimes 2 (tested).
- A product manager saying "the feature is done" usually means version 3 or 4.
- An executive saying "the project is done" usually means version 4 or 5.
A 30% gap between two people's definitions of "done" is enough to make a 6-week project feel like it slipped by two months when nothing actually went wrong: everyone was simply working toward a different finish line.
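To make the arithmetic concrete, here is a rough sketch using the effort percentages estimated above (they are illustrative ranges, not measured data). If "code complete" represents roughly 30% of the total effort, a feature declared "done" at week 6 implies a much longer path to "launched and stable":

```python
# Rough illustration of how a definition-of-"done" gap turns into
# perceived schedule slip. The 30% figure is this article's estimate
# for "code complete", not measured data.

def implied_total_weeks(elapsed_weeks, fraction_of_effort_done):
    """If elapsed_weeks of work covered fraction_of_effort_done of the
    full journey to 'done', extrapolate the total project duration."""
    return elapsed_weeks / fraction_of_effort_done

# A developer declares "done" (code complete, ~30% of effort) at week 6.
total = implied_total_weeks(6, 0.30)
remaining = total - 6
print(f"Implied total: {total:.0f} weeks, remaining: {remaining:.0f} weeks")
# → Implied total: 20 weeks, remaining: 14 weeks
```

The exact numbers matter less than the shape of the result: the project didn't slip, it was never 6 weeks long under the executive's definition in the first place.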
What we mean by "done"
For our project deliveries, "done" means version 4 — launched and stable. The deliverable includes:
- Feature in production
- Real users have used it for 2+ weeks
- Edge case bugs fixed
- Performance monitored
- Documentation written
- Team trained
Anything less and we don't bill it as complete. This is why our timelines have a "stabilisation" phase built in.
How to align with your vendor
Three questions to ask any custom software vendor:
- "When you say 'done', which of these 5 versions do you mean?" If they say "code complete" — you're going to do most of the operational work yourself. If they say "launched and stable" — they're taking real responsibility.
- "What's included in your warranty/post-launch period?" Look for at least 30 days of free bug fixes after launch. Less than that, and "done" means "you're on your own immediately."
- "What does your handover look like?" Who's on the call? What documents are produced? Who has ongoing access to engineers if you need it?
What "done" should mean for you
Pick your definition of done before signing anything. Communicate it explicitly to the vendor. Make sure the contract reflects it.
For most custom software projects, the right definition is "launched and stable." Some projects (internal tools you'll iterate on quickly) might be OK with "production-ready." Some (mission-critical infrastructure) demand "owned and maintained" with explicit transition planning.
But never accept "code complete" as the definition of done unless you have engineering capacity in-house to take it from there.
For a transparent project quote with explicit "done" criteria, our cost calculator takes 60 seconds. Or contact us for a 48-hour scope.