The Review Process is a systematic quality management activity that evaluates software work products, project status, and processes to identify defects, ensure compliance, verify progress, and facilitate informed decision-making throughout the software development lifecycle.
Why Reviews Matter
Reviews serve as critical quality gates that prevent defects from propagating to later stages where they become exponentially more expensive to fix. Beyond defect detection, reviews provide:
- Visibility into actual project status for stakeholders
- Risk identification before issues become crises
- Shared understanding across team members
- Compliance evidence for regulatory requirements
- Knowledge transfer among team members
Without systematic reviews, projects suffer from undetected defects, unrealistic expectations, misaligned stakeholder assumptions, and preventable failures.
The Five Types of Software Reviews
The IEEE Standard 1028-1997 defines five distinct types of reviews, each serving a specific purpose with different participants and levels of formality:
| Review Type | Primary Purpose | Typical Participants | Formality Level |
|---|---|---|---|
| Management Review | Evaluate project status, schedule, resources, and risks | Project manager, leads, SQA, SCM | High |
| Technical Review | Assess technical approach, design decisions, and product quality | Technical peers, subject matter experts | Medium-High |
| Inspection | Detect defects in work products with rigorous process | Moderator, author, inspectors, scribe | Highest |
| Walkthrough | Educate team, find issues informally | Author as presenter, peers | Low-Medium |
| Audit | Verify compliance with standards, contracts, and requirements | Independent auditors, SQA, customer representatives | Highest |
The Core Review Process Flow
While specific review types vary in formality, they follow a common structural flow:
```text
┌───────────────────────────────────────────────────────────────────────────────────┐
│                             REVIEW PROCESS LIFECYCLE                              │
├───────────┬────────────┬────────────┬───────────┬────────────┬────────────────────┤
│  1. PLAN  │ 2. PREPARE │ 3. EXECUTE │ 4. REPORT │ 5. REWORK  │ 6. VERIFY & CLOSE  │
└───────────┴────────────┴────────────┴───────────┴────────────┴────────────────────┘
```
Phase 1: Planning
The review process begins with establishing the review framework. A review matrix is created to define which document types require which reviewer roles, similar to a RACI matrix. Key planning activities include:
- Selecting review type based on product criticality and risk
- Defining entrance criteria (e.g., document complete, peer review performed)
- Identifying participants with appropriate perspectives
- Scheduling within project timeline
- Determining focus areas and specific review objectives
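The review matrix described above can be sketched as a simple lookup from document type to required roles and entrance criteria. All document types, role names, and criteria below are illustrative assumptions, not prescribed values:

```python
# Illustrative review matrix: which document types need which reviewer
# roles, and what must be true before a review may begin. The keys and
# values here are example assumptions.
REVIEW_MATRIX = {
    "requirements_spec": {
        "review_type": "inspection",
        "required_roles": ["moderator", "author", "scribe", "reviewer"],
        "entrance_criteria": ["document_complete", "peer_review_performed"],
    },
    "design_document": {
        "review_type": "technical_review",
        "required_roles": ["moderator", "author", "reviewer"],
        "entrance_criteria": ["document_complete"],
    },
}

def can_start_review(doc_type: str, satisfied_criteria: set[str]) -> bool:
    """A review may begin only when every entrance criterion is satisfied."""
    entry = REVIEW_MATRIX[doc_type]
    return all(c in satisfied_criteria for c in entry["entrance_criteria"])
```

Encoding the matrix as data rather than prose makes it easy to enforce entrance criteria automatically in a tracking tool.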
For agile projects, reviews often target individual requirements as soon as they are “Ready for Review.” Waterfall projects typically align reviews with milestones, reviewing complete requirement sets before phase transitions.
Phase 2: Preparation
Preparation is critical for effective reviews and primarily falls to the moderator:
- Distributing materials to reviewers sufficiently in advance
- Setting review focus – what risks or concerns need coverage
- Assigning reviewer perspectives (e.g., security, performance, maintainability, testability)
- Gathering contact information and establishing communication channels
- Preparing administration – tracking logs, checklists, reporting templates
The moderator informs all participants about the process, expectations, and their specific roles before the review begins.
Phase 3: Execution
During the review meeting or asynchronous review period, participants examine the work product:
For formal reviews (Inspections/Technical Reviews):
- Moderator leads the session, keeping focus on issue identification (not solution discussion)
- Scribe documents all substantive comments and issues
- Reviewers present findings from assigned perspectives
- Author listens and asks clarifying questions, may close some issues immediately
For tool-supported reviews:
- Modern platforms like DOORS Next and Jama Connect enable concurrent review
- Reviewers can log feedback and approvals simultaneously
- All feedback associates with specific artifact revisions
- Status tracking shows progress across participants
For management reviews:
- Focus on schedule, resources, risks, and significant technical issues
- Review peer review summaries to extract issues requiring management attention
- Track open action items and resolution progress
Phase 4: Reporting
Within one working day following the review, formal documentation is produced:
Review Summary includes:
- List of identified issues (categorized by severity/priority)
- Review participants and their roles
- Product reviewed and version information
- Date and duration of review
- Disposition (Accept, Accept with changes, Re-review required)
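A minimal record capturing the review summary fields above might look like the following sketch; the class, field, and enum names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from enum import Enum

class Disposition(Enum):
    ACCEPT = "accept"
    ACCEPT_WITH_CHANGES = "accept_with_changes"
    RE_REVIEW_REQUIRED = "re_review_required"

@dataclass
class Issue:
    description: str
    severity: str  # e.g. "major" or "minor" -- scale is an assumption

@dataclass
class ReviewSummary:
    product: str
    version: str
    date: str                      # ISO date of the review
    duration_minutes: int
    participants: dict[str, str]   # participant name -> role
    issues: list[Issue] = field(default_factory=list)
    disposition: Disposition = Disposition.RE_REVIEW_REQUIRED

    def issues_by_severity(self, severity: str) -> list[Issue]:
        """Filter the issue list by severity for categorized reporting."""
        return [i for i in self.issues if i.severity == severity]
```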
For tool-based reviews, status transitions are tracked through lifecycle states: Draft → In Progress → Reviewed → Finalized (or Overdue if deadline passes).
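The lifecycle states above can be sketched as a small transition table; the exact set of allowed transitions is an assumption inferred from the states named:

```python
# Allowed review-state transitions, sketched from the states named above.
TRANSITIONS = {
    "Draft": {"In Progress"},
    "In Progress": {"Reviewed", "Overdue"},
    "Overdue": {"Reviewed"},   # a late review can still be completed
    "Reviewed": {"Finalized"},
    "Finalized": set(),        # terminal state
}

def advance(current: str, target: str) -> str:
    """Move a review to `target` only if the transition is allowed."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"Illegal transition: {current} -> {target}")
    return target
```

Validating transitions this way prevents, for example, a review jumping from Draft straight to Finalized without ever being examined.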
Phase 5: Rework
The author addresses identified issues. For formal reviews, follow-up peer reviews may be conducted based on issue importance. Management monitors that issues receive proper resolution, and SQA verifies that affected documentation is updated appropriately.
Phase 6: Verification and Closeout
The moderator determines review status at conclusion. For significant issues requiring re-review, a follow-up review is scheduled. The review is formally closed when:
- All identified issues are resolved
- Required rework is verified
- Updated documentation is complete
- Status is recorded in project tracking systems
Peer Reviews: The Foundation of Quality
Peer reviews are the most frequent and technically focused reviews. Two types operate at different cadences:
One-on-One Peer Reviews
- Frequency: Multiple times per week
- Duration: 2 hours maximum
- Process: Author works with one colleague familiar with the product
- Recording: Author keeps notes; no formal issue list
- Management oversight: Tracked through weekly status reports
Scheduled Peer Reviews
- Frequency: Biweekly
- Duration: 2 hours maximum
- Participants: 3-6 technical staff (moderator, scribe, author/developer, reviewers)
- Recording: Formal list of issues distributed to SQA and management
- Closeout: No significant open issues remain
Roles in scheduled peer reviews:
| Role | Responsibilities |
|---|---|
| Moderator | Organizes review, selects participants, schedules, runs meeting, determines status |
| Scribe | Documents all substantive comments, produces review summary within one day |
| Author/Developer | Listens, asks clarifying questions, may close some issues at meeting conclusion |
| Reviewers | Technical peers assigned specific perspectives (reusability, standards compliance, testability, etc.) |
| Project Management | Ensures reviews occur, allocates resources, monitors issue resolution |
| SQA | Participates as observer, verifies process compliance, tracks resolution |
Management Reviews
Management does not attend peer reviews, but it is responsible for ensuring that the review process functions effectively.
Project Management Reviews
- Frequency: At least every two weeks
- Participants: Project leads (Systems Engineering, Test, Software Engineering), SQA, SCM
- Focus: Track progress against software development plan; address open significant issues
- Output: Open action list maintained by SCM, distributed to project and line management
Line Management Reviews
- Division Manager: Monthly reviews
- Operations Manager: Quarterly reviews
- Group Manager: Semi-annual reviews
These reviews evaluate project status, risks, and resource needs at organizational levels above the individual project.
Formal Milestone Reviews
Major project milestones require formal reviews that serve as decision gates. NASA defines these as event-based, occurring when entrance criteria are satisfied rather than on calendar schedules.
Common Milestone Reviews
| Review | Timing | Key Assessment |
|---|---|---|
| Preliminary Design Review (PDR) | After architecture/design completion | Is the design feasible? Will requirements be met? |
| Critical Design Review (CDR) | Before implementation begins | Is design complete and detailed enough for coding? |
| Test Readiness Review | Before formal testing | Are test plans complete? Is system ready? |
| Production Readiness Review | Before release | Is product ready for deployment? |
Milestone Review Process
- Develop review plan – Define timing, participants, criteria
- Prepare materials – Progress reports, technical documents, test results
- Conduct review meeting – Project manager facilitates, team presents, stakeholders participate
- Provide evaluation – Assess against predetermined standards
- Create action plan – Address identified issues with owners and deadlines
- Track execution – Ensure actions complete before next milestone
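Steps 5 and 6 above amount to tracking action items to closure before the next gate. A minimal sketch, with field names assumed for illustration:

```python
from dataclasses import dataclass

@dataclass
class ActionItem:
    description: str
    owner: str
    deadline: str       # ISO date, e.g. "2025-06-01"
    closed: bool = False

def gate_ready(actions: list[ActionItem]) -> bool:
    """A milestone gate should not open while any action item is unclosed."""
    return all(a.closed for a in actions)
```

Giving every finding an owner and a deadline, then blocking the next milestone on closure, is what keeps milestone reviews from becoming ceremonies.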
Entrance and Exit Criteria
Effective milestone reviews use explicit criteria:
Entrance Criteria (must be satisfied before review begins):
- Required documents complete and available
- Required peer reviews conducted
- Action items from previous reviews closed
Exit Criteria (must be satisfied to pass the review):
- Review board approves product
- Open issues documented with resolution plans
- Recommendations accepted or rejected with rationale
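Entrance and exit criteria are simply two checklists evaluated at different points in the review. A sketch, with the criterion identifiers assumed for illustration:

```python
# Example criterion identifiers -- illustrative assumptions only.
ENTRANCE = ["documents_complete", "peer_reviews_done", "prior_actions_closed"]
EXIT = ["board_approval", "open_issues_have_plans", "recommendations_dispositioned"]

def unmet(required: list[str], satisfied: set[str]) -> list[str]:
    """Return the criteria still blocking the gate; empty means it may pass."""
    return [c for c in required if c not in satisfied]

# A review begins only when unmet(ENTRANCE, ...) is empty, and passes
# only when unmet(EXIT, ...) is empty.
```

Reporting *which* criteria are unmet, rather than a bare pass/fail, tells the team exactly what to finish before the review can proceed.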
Stakeholder Involvement
Regular reviews require participation from relevant stakeholders. NASA guidance identifies key stakeholder groups:
- Quality assurance
- Systems engineering
- Independent testing
- Operations
- Independent Verification and Validation
- Project and engineering management
- Other organizations performing project activities
External stakeholders (principal investigators, science community, sponsors) typically participate only in major milestone reviews rather than regular internal reviews.
For agile and hybrid environments, the review strategy should be tailored: foundational requirements may follow milestone-based reviews while evolving requirements use continuous agile reviews.
Tool Support for Reviews
Modern review tools provide significant efficiency improvements:
| Capability | Benefit |
|---|---|
| Concurrent review | Multiple reviewers provide feedback simultaneously |
| Revision tracking | Feedback tied to specific versions |
| Status dashboards | Visibility into completion progress |
| Email notifications | Automated updates for review events |
| E-signatures | Formal approval documentation |
| Baseline integration | Reviews anchored to immutable baselines |
IBM DOORS Next and Jama Connect are examples of platforms supporting formal review workflows with role-based permissions, status tracking, and reporting.
Tool tip: When creating reviews in editable streams, artifacts can change during review. For stable reviews requiring validation, create reviews from baselines instead.
Key Success Factors
For Effective Peer Reviews
- Keep meetings to 2 hours maximum
- Focus exclusively on identifying issues, not solving them
- Assign specific perspectives to each reviewer
- Management does not attend (prevents intimidation)
- Track completion through weekly status reports
For Effective Management Reviews
- Review peer review summaries to extract significant issues
- Maintain open action lists with clear owners
- Include SQA and SCM as mandatory participants
- Distribute minutes and action items promptly
General Best Practices
- Define clear, measurable review criteria upfront
- Prepare thoroughly before review meetings
- Maintain open, honest communication (no blame culture)
- Convert findings to actionable items with owners and deadlines
- Use lessons learned to continuously improve the review process
Common Pitfalls to Avoid
| Pitfall | Consequence |
|---|---|
| No preparation | Review wastes time; superficial findings |
| Too many participants | Unfocused discussion; scheduling difficulties |
| Solving problems in review | Meeting runs long; author becomes defensive |
| Management attendance at peer reviews | Team members hesitate to raise issues |
| No follow-up on findings | Issues persist; reviews lose credibility |
| Insufficient time allocated | Rush through material; miss critical defects |
Relationship to Other Processes
The review process interacts with several other project management disciplines:
- Quality Management – Reviews are primary verification and validation activities
- Configuration Management – Reviews establish baselines; baselines anchor reviews
- Risk Management – Reviews identify and track risks
- Issue Tracking – Review findings become tracked issues requiring resolution
- Audit and Compliance – Formal reviews provide compliance evidence
Summary
The review process is a systematic, multi-level quality control mechanism that spans informal peer checks through formal milestone reviews. Effective implementation requires:
- Right-sizing formality to project risk and criticality
- Clear role definitions with trained participants
- Disciplined execution following defined processes
- Proper tool support for efficiency and traceability
- Management commitment to resource allocation and follow-through
When executed well, reviews find defects early, build shared understanding, provide stakeholder confidence, and drive continuous process improvement. When neglected or performed poorly, they become wasteful ceremonies that let preventable failures slip through.