Preface.
Acknowledgments.
I. REQUIREMENTS PHASE.
1. Involve Testers from the Beginning.
2. Verify the Requirements.
3. Design Test Procedures as Soon as Requirements Are Available.
4. Ensure That Requirement Changes Are Communicated.
5. Beware of Developing and Testing Based on an Existing System.
II. TEST PLANNING.
6. Understand the Task at Hand and the Related Testing Goal.
7. Consider the Risks.
8. Base Testing Efforts on a Prioritized Feature Schedule.
9. Keep Software Issues in Mind.
10. Acquire Effective Test Data.
11. Plan for the Test Environment.
12. Estimate Test Preparation and Execution.
III. THE TESTING TEAM.
13. Define the Roles and Responsibilities.
14. Require a Mixture of Testing Skills, Subject Matter Expertise, and Experience.
15. Evaluate the Testers' Effectiveness.
IV. THE SYSTEM ARCHITECTURE.
16. Understand the Architecture and Underlying Components.
17. Verify That the System Supports Testability.
18. Use Logging to Increase System Testability.
19. Verify That the System Supports Debug vs. Release Execution Modes.
V. TEST DESIGN AND DOCUMENTATION.
20. Divide and Conquer.
21. Mandate the Use of a Test Procedure Template and Other Test Design Standards.
22. Derive Effective Test Cases from Requirements.
23. Treat Test Procedures as "Living" Documents.
24. Use System Design and Prototypes.
25. Use Proven Testing Techniques When Designing Test Case Scenarios.
26. Avoid Constraints and Detailed Data Elements in Test Procedures.
27. Apply Exploratory Testing.
VI. UNIT TESTING.
28. Structure the Development Approach to Support Effective Unit Testing.
29. Develop Unit Tests in Parallel or before the Implementation.
30. Make Unit Test Execution Part of the Build Process.
VII. AUTOMATED TESTING TOOLS.
31. Be Aware of the Different Types of Testing Support Tools.
32. Consider Building a Tool Instead of Buying One.
33. Be Aware of the Impact of Automated Tools on the Testing Effort.
34. Focus on the Needs of Your Organization.
35. Test the Tools on an Application Prototype.
VIII. AUTOMATED TESTING: SELECTED BEST PRACTICES.
36. Do Not Rely Solely on Capture/Playback.
37. Develop a Test Harness When Necessary.
38. Use Proven Test Script Development Techniques.
39. Automate Regression Tests Whenever Possible.
40. Implement Automated Builds and Smoke Tests.
IX. NONFUNCTIONAL TESTING.
41. Do Not Make Nonfunctional Testing an Afterthought.
42. Conduct Performance Testing with Production-Sized Databases.
43. Tailor Usability Tests to the Intended Audience.
44. Consider All Aspects of Security, for Specific Requirements and System-Wide.
45. Investigate the System's Implementation to Plan for Concurrency Tests.
46. Set Up an Efficient Environment for Compatibility Testing.
X. MANAGING THE TEST EXECUTION.
47. Clearly Define the Beginning and the End of the Test Execution Cycle.
48. Isolate the Test Environment from the Development Environment.
49. Implement a Defect-Tracking Life Cycle.
50. Track the Execution of the Test Program.