I used to periodically clean out my mailing-list boxes and post the bits of messages I thought were interesting. Here are the highlights from the Agile-Testing list, from June 2008 to now.

  • Capturing Requirements
    • There is very little you can do to capture requirements up front – correctly. Instead we embrace change and work in small increments, getting constant feedback from the customer. … We can’t get it right up front because we didn’t understand. – Mark Levison
    • Fail Fast, High Feedback loops, Adapt to change – Matt Heusser
    • And then have the customer actually use the resulting working software right after each aspect of the feature is implemented to verify that the implementation is what they want and need. – Steve Gordon
    • … we don’t have to capture them all from the get-go, but we can make discoveries about what’s important (and what isn’t) as we go. – Michael Bolton
  • Name magic is very powerful, for good and for ill. – Michael Bolton
  • A Selenium/FIT mashup: fitinum
  • ShuHaRi
  • … you might be able to do some more proactive testing by scanning the CSS or inline “style” attributes for potential issues. The incompatibilities are relatively well-known – if you could parse the styles for “danger signals”, that would give you a hint on where to look. – Dave Rooney on cross-browser testing
  • Testing Flex Apps
  • Wisdom begins when we discover the difference between “That makes no sense” and “I don’t understand”. – Mary Doria Russell (via Ron Jeffries)
  • There’s the stuff we know we know (addressed by unit and functional tests), the stuff we know we don’t know (addressed to some extent by integration and smoke tests) and the stuff we don’t know we don’t know (addressed by exploratory testing) – Rumsfeld as interpreted by Titus Brown
  • Talking about Guidelines
  • Agile Acceptance Testing Video
  • Scott Ambler’s The Role of Testing and QA in Agile Software Development
  • Strangler Applications paper by Mike Thomas
  • Some things that make up the Testing Mindset (from Michael Bolton)
    • to think critically about the software
    • to remain doubtful and cautious in the face of certainty from others
    • to question, rather than to confirm
    • to assist the team in not being fooled
    • to help in identifying factors other than functional incorrectness that can threaten the value of the product
    • to help in identifying other usage modes of the product that can add to its value
    • to invent and/or create tools that can help to evaluate the product more quickly
    • to dig up and expose assumptions that have been buried and that might be invalid or inconsistent
    • to recognize that inductive proof that “it works” (lots of passing tests) isn’t proof at all
    • to keep searching for non-white swans, even when all our tests seem to show that all swans are white
    • to recognize that things can be different
  • Beginning with a modest goal and refactoring as I went along, I was able to construct a harness that was just powerful enough for the task at hand but flexible enough to grow to meet our future testing needs. – Kevin Lawrence
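
Dave Rooney’s cross-browser idea above can be sketched in a few lines: walk a page’s inline “style” attributes and flag CSS properties with known browser quirks. This is a minimal illustration, not his tool – the `DANGER_SIGNALS` list here is an assumed, made-up sample; a real one would come from your own browser-compatibility notes.

```python
# Minimal sketch: scan inline "style" attributes for CSS properties
# with known cross-browser quirks. DANGER_SIGNALS is illustrative only.
from html.parser import HTMLParser

DANGER_SIGNALS = {"position", "z-index", "float", "opacity"}

class StyleScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []  # (tag, property) pairs worth a closer look

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        for decl in style.split(";"):
            prop = decl.split(":", 1)[0].strip().lower()
            if prop in DANGER_SIGNALS:
                self.findings.append((tag, prop))

scanner = StyleScanner()
scanner.feed('<div style="float: left; color: red">'
             '<span style="opacity: 0.5">hi</span></div>')
print(scanner.findings)  # [('div', 'float'), ('span', 'opacity')]
```

The same pass could be pointed at parsed stylesheet rules instead of inline attributes; the point is just that the “danger signals” are well-known enough to grep for mechanically.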