Saturday, June 25, 2005

10 Success Criteria For Software Development

The 10 QA traps mentioned in a previous post (10 Critical Quality Assurance Traps) put a major emphasis on test activities. However, QA is much more than quality checking. Let's illustrate this with results from the CHAOS report by the Standish Group.

Software never comes in on time or on budget, and it always breaks down. Software development projects are in chaos.

The Standish Group's research showed that a staggering 31.1% of projects are canceled before they are ever completed, and that 52.7% of projects cost 189% of their original estimates. Lost opportunity costs are not measurable.

On the success side, only 16.2% of software projects are completed on time and on budget. In larger companies the news is even worse: only 9% of their projects come in on time and on budget.

A whopping 61.5% of all large-company projects were challenged: over budget, behind schedule, and offering fewer features and functions than originally specified.

Almost a third of the challenged and impaired projects experienced cost overruns of 150% to 200%, time overruns of 200% to 300%, and delivered less than 50% of the originally specified features and functions.

Factors in challenged projects

1. Lack of User Input
2. Incomplete Requirements & Specifications
3. Changing Requirements & Specifications

Success criteria

1. User Involvement
2. Executive Management Support
3. Clear Statement of Requirements
4. Proper Planning
5. Realistic Expectations
6. Smaller Project Milestones
7. Competent Staff
8. Ownership
9. Clear Vision & Objectives
10. Hard-Working, Focused Staff

In order to make order out of the chaos, we need to examine why projects fail. Each major software failure must be investigated, studied, reported and shared. Failure begets knowledge. Out of knowledge you gain wisdom, and it is with wisdom that you can become truly successful.

Are the software development projects in your organisation successful? If there is room for improvement, do you use the top 10 success criteria above to identify which areas to focus on?

Friday, June 17, 2005

10 Critical Quality Assurance Traps

From the Quality Techniques Newsletter QTN May 2005 Issue:

These traps seem obvious, yet even great companies with great products fall into them.

1. Unclear ownership of product quality.
2. No overall test program design or goals.
3. Non-existent or ill-defined test plans and cases.
4. Testing that focuses narrowly on functional cases.
5. No ad hoc, stress or boundary testing.
6. Use of inconsistent or incorrect testing methodology.
7. Relying on inexperienced testers.
8. Improper use of tools and automation, resulting in lost time and reduced ROI.
9. No meaningful metrics for tracking bugs and driving quality back into development.
10. Incomplete regression cycle before software release.
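Trap 5, skipping boundary testing, deserves a concrete illustration: boundary tests probe a rule exactly at, just below, and just above its threshold, where off-by-one bugs typically hide. A minimal sketch, assuming a hypothetical discount rule (the rule and all names are illustrative, not from the newsletter):

```java
// Hypothetical example for trap 5: a discount rule giving 10% off on
// orders of 100 units or more, probed at the boundary.
public class BoundaryTest {

    // The rule under test (illustrative).
    static double discountFor(int units) {
        return units >= 100 ? 0.10 : 0.0;
    }

    public static void main(String[] args) {
        // Three probes straddling the threshold of 100 units.
        check(discountFor(99) == 0.0,   "just below the boundary: no discount");
        check(discountFor(100) == 0.10, "on the boundary: discount applies");
        check(discountFor(101) == 0.10, "just above the boundary: discount applies");
        System.out.println("boundary checks passed");
    }

    // Plain check helper so the example runs without the -ea flag.
    static void check(boolean condition, String description) {
        if (!condition) throw new AssertionError(description);
    }
}
```

The same at/below/above pattern applies to any threshold in a specification: string length limits, date ranges, buffer sizes.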

To avoid these traps, incorporate best practices into your quality assurance process: evaluate where you stand with quality assurance today, define your QA goals, identify the gaps in your process, and finally build a roadmap to reach those goals. Only after these steps have been taken can you avoid these quality assurance traps.

What was your worst experience?

Friday, June 10, 2005

Use Test Cases to Clarify Requirements

How can you, as a tester, reduce the problems caused by inadequate requirements processes earlier in the lifecycle? Finding bugs earlier reduces the amount of rework for developers and allows testers to focus more on acceptance-level testing.

Failed tests and re-execution of tests often result from vague requirements. Here are a couple of quick-win strategies for improvement in the requirements area. These are small, gradual steps below the management radar (no approval required), with benefits for developers, testers, and other members of the team.

Direct your attention to answering the question "What happens when ..."

When you define a test case:
  • Identify input states and then specify the expected outcome. In the process of developing system-level test cases, you can easily spot poorly documented requirements, whether it's ambiguous input conditions, missing information, or an unspecified expected outcome. Pinpoint where the requirements fail to describe the outcome.
  • Complete the test descriptions by replacing missing information with reasonable assumptions, and mark them as such. This is of enormous assistance to the manager or lead developer, who then merely has to approve the modified description rather than schedule resources to complete the product definition.
  • Share the information with developers: previewing what tests are planned gives them the opportunity to bullet-proof their code if necessary.
These shortcut actions are not sufficient to comply with CMM or other quality standards: they are not really requirements management, important boundary/performance/configuration cases may be overlooked, and they give you neither the big picture nor a full description of system behaviour. However, see them as a jumpstart to process improvement: you have to start somewhere, and they provide a baseline for future improvements.
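The steps above can be sketched in an executable form. The following is a hypothetical system-level test case in which the lockout behaviour was missing from the requirement, so the test fills the gap with a clearly flagged assumption for the lead developer to approve (the system, the stand-in implementation, and all names are invented for illustration):

```java
// Illustrative test case for an underspecified requirement. The original
// requirement only said "reject invalid passwords" and left the lockout
// behaviour unspecified.
public class LoginLockoutTest {

    // Minimal stand-in for the system under test.
    static class Login {
        private int failures = 0;

        boolean attempt(String password) {
            if ("secret".equals(password) && failures < 3) {
                return true;
            }
            failures++;
            return false;
        }
    }

    public static void main(String[] args) {
        Login login = new Login();

        // Input state: three consecutive wrong passwords.
        for (int i = 0; i < 3; i++) {
            check(!login.attempt("wrong"), "invalid password is rejected");
        }

        // Expected outcome -- ASSUMPTION: the requirement does not say what
        // happens after repeated failures; we assume the account locks after
        // three of them. Flagged for the lead developer to approve.
        check(!login.attempt("secret"), "account locked after 3 failures");

        System.out.println("all checks passed");
    }

    static void check(boolean condition, String description) {
        if (!condition) throw new AssertionError(description);
    }
}
```

The ASSUMPTION comment is the key move: the test documents the gap in the requirement instead of silently guessing, so approving it takes minutes rather than a product-definition meeting.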

Find this and other practical strategies for testers in the article: Quick Start to Quality - Five Important Test Support Practices, by Louise Tamres at StickyMinds.

To learn more about writing use cases: Writing Effective Use Cases, by Alistair Cockburn, Addison-Wesley 2001.

Friday, June 03, 2005

How to Use Design Patterns

Erich Gamma interviewed by Bill Venners:

Design patterns help people learn object-oriented thinking: leverage polymorphism, design for composition, use delegation, balance responsibilities, and provide pluggable behavior. They help where you need more flexibility: to encapsulate an abstraction, make code less coupled, preserve layers, and avoid up-calls and circular dependencies.

You only appreciate a pattern once you have felt the design pain it solves (for example, duplicated code). Do not start by immediately throwing patterns into a design; use them as you go and as you understand more of the problem. Trying to use all the patterns is a bad thing. When you notice a code smell, go to the patterns toolbox to find a solution. The goal is a clean, easy-to-understand API, not winning an I-used-the-most-patterns contest.

When you really need extensibility, then patterns provide you with a way to achieve it. But when you don't need it, you should keep your design simple and not add unnecessary levels of indirection.
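As a concrete sketch of pattern-provided extensibility, here is a minimal Strategy-style design: the pricing policy is encapsulated behind an interface, so new policies plug in without touching the client code. All class names and the pricing rules are invented for illustration, not taken from the interview:

```java
// Strategy sketch: "pluggable behavior" via composition and delegation.
public class StrategyDemo {

    // The encapsulated abstraction: how a single item is priced.
    interface PricingStrategy {
        double priceFor(double baseAmount);
    }

    // Two interchangeable behaviors.
    static class RegularPricing implements PricingStrategy {
        public double priceFor(double baseAmount) { return baseAmount; }
    }

    static class SalePricing implements PricingStrategy {
        public double priceFor(double baseAmount) { return baseAmount * 0.8; }
    }

    // The client is composed with a strategy instead of hard-coding one,
    // so adding a new pricing policy never modifies Checkout.
    static class Checkout {
        private final PricingStrategy pricing;

        Checkout(PricingStrategy pricing) { this.pricing = pricing; }

        double total(double[] items) {
            double sum = 0.0;
            for (double item : items) {
                sum += pricing.priceFor(item);  // delegate the pluggable step
            }
            return sum;
        }
    }

    public static void main(String[] args) {
        double[] items = {10.0, 20.0};
        System.out.println(new Checkout(new RegularPricing()).total(items));
        System.out.println(new Checkout(new SalePricing()).total(items));
    }
}
```

Note the flip side from the paragraph above: if there will only ever be one pricing rule, the interface and the extra classes are exactly the kind of unnecessary indirection a simple design should avoid.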

Pattern density: a design has core abstractions, and around them several other emerging design points, which in turn are materialized by pattern instances.

When you are fluent in patterns, design conversation goes really fast, enabling high-velocity design. Patterns give us a language to talk about design.

Micro-architectures: recurring design structures that give you properties like extensibility, decoupling, and, last but not least, elegance.

For a good example of pattern usage see JUnit: A Cook's Tour, an article by Erich Gamma and Kent Beck that walks you through the design of JUnit by "starting with nothing and applying patterns, one after another, until you have the architecture of the system."

Design Patterns: Elements of Reusable Object-Oriented Software, by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. The authoritative guide and without doubt one of the most influential books on software development of the past decade.

Head First Design Patterns, by Eric Freeman, Elisabeth Freeman, Kathy Sierra, and Bert Bates. A recent book that communicates the essence of design patterns in a novel and highly visual way.