The other trade rag that was queued up is Software Test & Performance. Here are my thoughts on the last three issues.

October, 2008

  • Talend seems to be a company that is trying to make a go of things using the value-added model of open source. The Data Quality Suite was just released and seems pretty interesting. Or at least it does with the database I have. Can I have a native Mac version, please?
  • There is a good overview of the hows and whys of detecting ‘borrowed’ code by Bob Zeidman called Are There Copycats In Your Company?. It gets mathy pretty quickly, but if your company cares about this sort of thing (such as when you have a closed-source application that duplicates the functionality of an open-source one), then it might be worth integrating into your testing process; a toy sketch of the similarity-scoring idea appears after this list.
  • Elfriede Dustin starts a series of articles about test automation. Check out the ‘Building Testability Into The Application’ and ‘GUI/Interface Testing Recommendations’ sections for things that make automation easier; the second sketch after this list shows one flavor of a testability hook.
  • The main idea behind Development and Testing: Two Sides of the Same Coin is that test automation projects are coding projects that happen to be done by the test team, and as such they need to be treated and developed the same way. I agree, though not necessarily with the waterfall-ish model presented.
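
To give a flavor of how the copycat-detection idea gets mathy, here is a toy similarity score over two token streams using Python’s standard difflib. This is only a sketch of the general concept, not Zeidman’s actual technique; the snippets and the threshold are invented for the demo.

    import difflib

    def similarity(a: str, b: str) -> float:
        """Score in [0, 1]; higher means more shared token structure."""
        return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()

    # Made-up snippets standing in for 'our' code and a suspect file.
    ours = "for i in range(n): total += values[i]"
    theirs = "for j in range(n): acc += values[j]"

    if __name__ == "__main__":
        score = similarity(ours, theirs)
        print("similarity: %.2f" % score)
        if score > 0.5:  # arbitrary demo threshold
            print("worth a human look")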
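
And as a minimal, entirely hypothetical sketch of what ‘building testability into the application’ can look like: the application exposes one stable, programmatic seam so automated tests can assert on state directly instead of scraping the GUI. All names here are invented.

    class LoginForm:
        """Application-side form with a hook added for test automation."""

        def __init__(self):
            self.username = ""
            self.error = None

        def submit(self, username, password):
            if not username or not password:
                self.error = "missing credentials"
                return False
            self.username = username
            self.error = None
            return True

        def test_state(self):
            # Testability hook: one stable place to read internal state,
            # so scripts do not depend on widget layout or coordinates.
            return {"username": self.username, "error": self.error}

    def test_rejects_empty_password():
        form = LoginForm()
        assert form.submit("carol", "") is False
        assert form.test_state()["error"] == "missing credentials"

    if __name__ == "__main__":
        test_rejects_empty_password()
        print("ok")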

November, 2008

  • The Testers Choice winners are listed. There doesn’t seem to be much difference between last year’s winners and this year’s. I wonder how many products actually get purchased as a result of these. I remember a story a Mercury consultant told me in late 1999: Mercury became the number-one automation platform because it ran a campaign advertising that it already was. Managers (and their influencers) saw that and figured it was a safe bet to buy the products that were best in the market; Mercury said theirs were, so they bought them, and the resulting revenue increase made Mercury number one in market share. ‘Nobody ever got fired for buying IBM’ meets test automation.
  • Step-by-Step Performance Test lists five types of performance tests and spends about half a page on each (a bare-bones sketch of the first two follows at the end of this issue’s notes). Those types are:
    • Response Time
    • Load
    • Stress
    • Stability
    • Database Volume
  • Elfriede Dustin continues her series on test automation with the various myths that surround it. Quick, how many have you been hit with? Likely all of them if you have been around for a couple of years.
    • Test Efforts are Immediately Reduced
    • Schedules are Immediately Compressed
    • Automation Tools are Easy to Use
    • All Tests Can be Automated
    • Automation Provides 100 Percent Test Coverage
    • Test Automation is the Same As Capture and Playback
    • Automated Software Testing is A Manual Tester Activity

    And of course, you yourself have to be careful of Losing Sight of the Testing Goal: Finding Defects.
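
For a feel of the first two test types (Response Time and Load), here is a bare-bones check in Python. The endpoint, worker counts, and numbers are placeholders, and a real tool does far more, but the shape is the same: time single requests, then time them under concurrency.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8080/health"  # hypothetical endpoint

    def timed_request(url=URL):
        """One request, returning elapsed seconds (Response Time)."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    def load_test(workers=20, total=1000):
        """Issue many concurrent requests and summarize (Load)."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            times = sorted(pool.map(lambda _: timed_request(), range(total)))
        return {"median": times[len(times) // 2],
                "p95": times[int(len(times) * 0.95)],
                "max": times[-1]}

    if __name__ == "__main__":
        print("single response time: %.3fs" % timed_request())
        print("under load:", load_test())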

December, 2008

  • Elfriede Dustin finishes the series on test automation by presenting the Automated Testing Lifecycle Methodology (ATLM). These are its phases:
    • Requirements Gathering: analyze automated testing needs and develop high-level test strategies
    • Design & Develop Test Cases
    • Automation framework and test script development
    • Automated test execution and results reporting
    • Program review

    It is very waterfall; in fact, she talks about gating the steps. I’m not sure how it would mesh with agile-ish environments. A toy sketch of the execution-and-reporting phase follows.
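
As a sketch of the execution-and-reporting phase only (the tests are stand-ins, and real ATLM tooling would capture far more), running a suite programmatically with Python’s unittest and emitting a one-line summary might look like this:

    import unittest

    class SmokeTests(unittest.TestCase):
        """Stand-in tests; a real suite comes from the earlier phases."""

        def test_addition(self):
            self.assertEqual(2 + 2, 4)

        def test_upper(self):
            self.assertEqual("atlm".upper(), "ATLM")

    if __name__ == "__main__":
        suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTests)
        result = unittest.TextTestRunner(verbosity=0).run(suite)
        # One line a nightly job could log or gate a build on.
        print("ran=%d failures=%d errors=%d" %
              (result.testsRun, len(result.failures), len(result.errors)))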