This week’s feature article on StickyMinds is about completion requirements, which is another thing I feel people in QA should be involved in defining. Here are some checklists to get you started thinking about what “done” really means.

Requirements

  • Are the requirements detailed and clear enough to be developed against?
  • Are the requirements detailed and clear enough to be tested against?
  • Is the origin of the requirement (potential customer, existing customer, market movement, etc.) recorded?
  • Have all outstanding questions from within the organization regarding the requirement set been addressed?
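If your tracker does not already capture these fields, it can help to treat a requirement’s origin and open questions as first-class data rather than tribal knowledge. A minimal sketch; the field names (`req_id`, `origin`, `open_questions`) are illustrative, not from any particular tool:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """Hypothetical requirement record with origin and open questions tracked."""
    req_id: str
    text: str
    origin: str                # e.g. "existing customer", "market movement"
    open_questions: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """A testable 'done' check: has wording, and nothing outstanding."""
        return bool(self.text.strip()) and not self.open_questions
```

With a structure like this, “are all outstanding questions addressed?” becomes a query instead of a meeting.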

Feature Development

  • Are new feature design documents accurate and checked into source control?
  • Are existing feature design docs that were impacted by this feature updated and in source control?
  • Are the unit tests for this feature in source control?
  • Has the new feature been integrated into the automatic build process?
  • Have the unit tests been run by Test to verify their validity and worth? Unit tests that only check “perfect” data do not really do much to increase the quality of the code base. The “perfect” condition will likely be one of the few the developer already checks during integration testing.
  • Is there a bug associated with all FIXME or related comments in the code? By definition, if a developer puts something like FIXME in their new feature code, it means that there is something to fix in there. Something to fix equals bug.
  • Are there no TODO or related comments left in the code? Again, by definition, if there is something left “TODO” in a feature, then the feature is not 100% complete.
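The FIXME and TODO checks are easy to enforce mechanically. A minimal sketch, assuming a hypothetical FIXME(BUG-123) convention for linking a comment to its bug (adapt the pattern to your own tracker’s ID format):

```python
import re
from pathlib import Path

# Assumed convention: a FIXME must cite a bug, e.g. FIXME(BUG-123).
FIXME_WITH_BUG = re.compile(r"FIXME\(BUG-\d+\)")

def check_file(path: Path) -> list[str]:
    """Return completion problems found in one source file."""
    problems = []
    for lineno, line in enumerate(path.read_text(errors="replace").splitlines(), 1):
        if "FIXME" in line and not FIXME_WITH_BUG.search(line):
            problems.append(f"{path}:{lineno}: FIXME without an associated bug")
        if "TODO" in line:
            problems.append(f"{path}:{lineno}: TODO left in 'completed' feature code")
    return problems
```

Wire a check like this into the automatic build and “done” stops being a matter of opinion.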

Feature Testing

  • Is there a new Feature Test Plan checked into source control?
  • Are test plans of other affected features updated and checked into source control?
  • Were the new feature’s test cases developed and checked into source control?
  • Were existing test cases for other features updated as a result of this new feature?
  • Is there a clear mapping between the feature test cases and the feature requirements?
  • Have final test results been archived in source control?
  • Has the feature been added to the automated testing solution? Or has a plan been recorded in source control regarding how the new feature could/should be integrated into the automated testing solution?
  • Do all modifications to the automated testing solution framework have associated unit tests? Practice what you preach.
  • Have you tested with non-ASCII data? The software market is global these days, and not everyone limits their character usage to the standard ASCII set.
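On the last point, a cheap way to catch ASCII-only assumptions is to keep a few deliberately non-ASCII strings in your standing test data. A small sketch (the sample strings are illustrative, not from any particular product):

```python
# A few non-ASCII samples worth keeping in any test-data set.
SAMPLES = [
    "café",     # Latin accents
    "Grüße",    # German umlaut and sharp s
    "日本語",    # CJK text
    "Ω≈√∫",     # math and symbol characters
]

def roundtrip_utf8(text: str) -> str:
    """Encode and decode as UTF-8 -- should be lossless for any text."""
    return text.encode("utf-8").decode("utf-8")

for s in SAMPLES:
    assert roundtrip_utf8(s) == s  # UTF-8 round-trips cleanly
    try:
        s.encode("ascii")
        raise AssertionError("sample is pure ASCII -- not a useful test")
    except UnicodeEncodeError:
        pass  # good: this sample actually exercises non-ASCII handling
```

The self-check at the bottom guards the test data itself: a “non-ASCII” sample that is secretly plain ASCII tests nothing.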

Current Product Engineering

  • Have unit tests that verify the bug no longer exists been checked into source control?
  • Have both the bug fix and unit tests been brought into the other relevant trees?
  • See the Feature Development items above
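For the first item, naming the regression test after the bug makes the link between fix and test obvious. A sketch, where parse_quantity, bug 1234, and its former rejection of zero are all hypothetical stand-ins for the real code:

```python
import unittest

def parse_quantity(text: str) -> int:
    """Hypothetical fixed function: earlier versions wrongly rejected '0'."""
    value = int(text)
    if value < 0:
        raise ValueError("quantity cannot be negative")
    return value

class TestBug1234Regression(unittest.TestCase):
    """Named after the bug it guards against, so the mapping is explicit."""

    def test_zero_is_accepted(self):
        # The original (hypothetical) bug: '0' raised instead of returning 0.
        self.assertEqual(parse_quantity("0"), 0)

    def test_negative_still_rejected(self):
        with self.assertRaises(ValueError):
            parse_quantity("-1")
```

Run it with python -m unittest as part of the build in every tree the fix was brought into, and the second item on the list gets verified for free.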

This list is only what popped into my head on the train. There is likely (absolutely) more. Add yours below.