The Review Process in SPM

The Review Process is a systematic quality management activity that evaluates software work products, project status, and processes to identify defects, ensure compliance, verify progress, and facilitate informed decision-making throughout the software development lifecycle.

Why Reviews Matter

Reviews serve as critical quality gates that prevent defects from propagating to later stages where they become exponentially more expensive to fix. Beyond defect detection, reviews provide:

  • Visibility into actual project status for stakeholders
  • Risk identification before issues become crises
  • Shared understanding across team members
  • Compliance evidence for regulatory requirements
  • Knowledge transfer among team members

Without systematic reviews, projects suffer from undetected defects, unrealistic expectations, misaligned stakeholder assumptions, and preventable failures.

The Five Types of Software Reviews

The IEEE Standard 1028-1997 defines five distinct types of reviews, each serving a specific purpose with different participants and levels of formality:

Review Type | Primary Purpose | Typical Participants | Formality Level
Management Review | Evaluate project status, schedule, resources, and risks | Project manager, leads, SQA, SCM | High
Technical Review | Assess technical approach, design decisions, and product quality | Technical peers, subject matter experts | Medium-High
Inspection | Detect defects in work products with rigorous process | Moderator, author, inspectors, scribe | Highest
Walkthrough | Educate team, find issues informally | Author as presenter, peers | Low-Medium
Audit | Verify compliance with standards, contracts, and requirements | Independent auditors, SQA, customer representatives | Highest

The Core Review Process Flow

While specific review types vary in formality, they follow a common structural flow:

┌─────────────────────────────────────────────────────────────────────────────┐
│                          REVIEW PROCESS LIFECYCLE                           │
├──────────┬───────────┬───────────┬──────────┬───────────┬───────────────────┤
│ 1. PLAN  │ 2. PREPARE│ 3. EXECUTE│ 4. REPORT│ 5. REWORK │ 6. VERIFY & CLOSE │
└──────────┴───────────┴───────────┴──────────┴───────────┴───────────────────┘

Phase 1: Planning

The review process begins with establishing the review framework. A review matrix is created to define which document types require which reviewer roles, similar to a RACI matrix. Key planning activities include:

  • Selecting review type based on product criticality and risk
  • Defining entrance criteria (e.g., document complete, peer review performed)
  • Identifying participants with appropriate perspectives
  • Scheduling within project timeline
  • Determining focus areas and specific review objectives

For agile projects, reviews often target individual requirements as soon as they are “Ready for Review.” Waterfall projects typically align reviews with milestones, reviewing complete requirement sets before phase transitions.
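The review matrix mentioned above can be sketched as a simple mapping from document types to required reviewer roles. The document types and role names below are illustrative assumptions, not prescribed by any standard:

```python
# Sketch of a review matrix, similar to a RACI matrix: each document
# type maps to the reviewer roles that must participate.
# Document types and roles here are hypothetical examples.
REVIEW_MATRIX = {
    "Requirements Specification": ["Systems Engineering", "Test", "SQA"],
    "Design Document": ["Software Engineering", "Systems Engineering"],
    "Test Plan": ["Test", "SQA", "Software Engineering"],
}

def required_reviewers(document_type: str) -> list[str]:
    """Return the reviewer roles required for a given document type."""
    if document_type not in REVIEW_MATRIX:
        raise ValueError(f"No review matrix entry for: {document_type}")
    return REVIEW_MATRIX[document_type]

print(required_reviewers("Test Plan"))  # → ['Test', 'SQA', 'Software Engineering']
```

In practice the matrix would live in the project's quality plan; the point is that reviewer selection is a lookup, not an ad-hoc decision.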

Phase 2: Preparation

Preparation is critical for effective reviews and primarily falls to the moderator:

  • Distributing materials to reviewers sufficiently in advance
  • Setting review focus – what risks or concerns need coverage
  • Assigning reviewer perspectives (e.g., security, performance, maintainability, testability)
  • Gathering contact information and establishing communication channels
  • Preparing administration – tracking logs, checklists, reporting templates

The moderator informs all participants about the process, expectations, and their specific roles before the review begins.
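Assigning reviewer perspectives can be sketched as a round-robin allocation over the available reviewers; the names and perspectives below are illustrative only:

```python
from itertools import cycle

def assign_perspectives(reviewers: list[str], perspectives: list[str]) -> dict[str, str]:
    """Pair each perspective with a reviewer, cycling through reviewers
    when there are more perspectives than people."""
    pool = cycle(reviewers)
    return {perspective: next(pool) for perspective in perspectives}

# Hypothetical reviewers and the perspectives listed above.
assignments = assign_perspectives(
    ["Alice", "Bob"],
    ["security", "performance", "maintainability", "testability"],
)
print(assignments)
# → {'security': 'Alice', 'performance': 'Bob',
#    'maintainability': 'Alice', 'testability': 'Bob'}
```

A real moderator would weigh expertise rather than rotate blindly, but recording the assignment explicitly is what matters: every perspective has exactly one accountable reviewer.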

Phase 3: Execution

During the review meeting or asynchronous review period, participants examine the work product:

For formal reviews (Inspections/Technical Reviews):

  • Moderator leads the session, keeping focus on issue identification (not solution discussion)
  • Scribe documents all substantive comments and issues
  • Reviewers present findings from assigned perspectives
  • Author listens and asks clarifying questions, may close some issues immediately

For tool-supported reviews:

  • Modern platforms like DOORS Next and Jama Connect enable concurrent review
  • Reviewers can log feedback and approvals simultaneously
  • All feedback associates with specific artifact revisions
  • Status tracking shows progress across participants

For management reviews:

  • Focus on schedule, resources, risks, and significant technical issues
  • Review peer review summaries to extract issues requiring management attention
  • Track open action items and resolution progress

Phase 4: Reporting

Within one working day following the review, formal documentation is produced:

Review Summary includes:

  • List of identified issues (categorized by severity/priority)
  • Review participants and their roles
  • Product reviewed and version information
  • Date and duration of review
  • Disposition (Accept, Accept with changes, Re-review required)

For tool-based reviews, status transitions are tracked through lifecycle states: Draft → In Progress → Reviewed → Finalized (or Overdue if deadline passes).
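The status transitions above can be modeled as a small state machine. This is a sketch of the flow described in the text, not the exact rules of DOORS Next or Jama Connect; in particular, reopening an Overdue review is an assumption:

```python
# Allowed transitions for the review lifecycle sketched above.
# "Overdue" is reachable from active states once the deadline passes.
TRANSITIONS = {
    "Draft": {"In Progress"},
    "In Progress": {"Reviewed", "Overdue"},
    "Reviewed": {"Finalized", "Overdue"},
    "Finalized": set(),                # terminal state
    "Overdue": {"In Progress"},        # assumption: overdue reviews can be reopened
}

def advance(state: str, target: str) -> str:
    """Move a review to a new state, rejecting illegal transitions."""
    if target not in TRANSITIONS.get(state, set()):
        raise ValueError(f"Illegal transition: {state} -> {target}")
    return target

state = "Draft"
for nxt in ("In Progress", "Reviewed", "Finalized"):
    state = advance(state, nxt)
print(state)  # → Finalized
```

Encoding the transitions explicitly is what lets a tool reject, say, finalizing a review that was never executed.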

Phase 5: Rework

The author addresses identified issues. For formal reviews, follow-up peer reviews may be conducted based on issue importance. Management monitors that issues receive proper resolution, and SQA verifies that affected documentation is updated appropriately.

Phase 6: Verification and Closeout

The moderator determines review status at conclusion. For significant issues requiring re-review, a follow-up review is scheduled. The review is formally closed when:

  • All identified issues are resolved
  • Required rework is verified
  • Updated documentation is complete
  • Status is recorded in project tracking systems
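The closeout decision amounts to a conjunction of the four criteria above, which can be sketched as a simple check (field names are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class ReviewRecord:
    """Minimal record of a review's closeout state. Hypothetical fields."""
    open_issues: list = field(default_factory=list)
    rework_verified: bool = False
    documentation_updated: bool = False
    status_recorded: bool = False

def can_close(review: ReviewRecord) -> bool:
    """A review may be formally closed only when all criteria are met."""
    return (
        not review.open_issues
        and review.rework_verified
        and review.documentation_updated
        and review.status_recorded
    )

r = ReviewRecord(rework_verified=True, documentation_updated=True,
                 status_recorded=True)
print(can_close(r))  # → True
```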

Peer Reviews: The Foundation of Quality

Peer reviews are the most frequent and technically focused reviews. Two types operate at different cadences:

One-on-One Peer Reviews

  • Frequency: Multiple times per week
  • Duration: 2 hours maximum
  • Process: Author works with one colleague familiar with the product
  • Recording: Author keeps notes; no formal issue list
  • Management oversight: Tracked through weekly status reports

Scheduled Peer Reviews

  • Frequency: Biweekly
  • Duration: 2 hours maximum
  • Participants: 3-6 technical staff (moderator, scribe, author/developer, reviewers)
  • Recording: Formal list of issues distributed to SQA and management
  • Closeout: No significant open issues remain

Roles in scheduled peer reviews:

Role | Responsibilities
Moderator | Organizes review, selects participants, schedules, runs meeting, determines status
Scribe | Documents all substantive comments, produces review summary within one day
Author/Developer | Listens, asks clarifying questions, may close some issues at meeting conclusion
Reviewers | Technical peers assigned specific perspectives (reusability, standards compliance, testability, etc.)
Project Management | Ensures reviews occur, allocates resources, monitors issue resolution
SQA | Participates as observer, verifies process compliance, tracks resolution

Management Reviews

Management does not attend peer reviews, but managers remain responsible for ensuring the review process functions effectively.

Project Management Reviews

  • Frequency: At least every two weeks
  • Participants: Project leads (Systems Engineering, Test, Software Engineering), SQA, SCM
  • Focus: Track progress against software development plan; address open significant issues
  • Output: Open action list maintained by SCM, distributed to project and line management

Line Management Reviews

  • Division Manager: Monthly reviews
  • Operations Manager: Quarterly reviews
  • Group Manager: Semi-annual reviews

These reviews evaluate project status, risks, and resource needs at organizational levels above the individual project.

Formal Milestone Reviews

Major project milestones require formal reviews that serve as decision gates. NASA defines these as event-based, occurring when entrance criteria are satisfied rather than on calendar schedules.

Common Milestone Reviews

Review | Timing | Key Assessment
Preliminary Design Review (PDR) | After architecture/design completion | Is the design feasible? Will requirements be met?
Critical Design Review (CDR) | Before implementation begins | Is design complete and detailed enough for coding?
Test Readiness Review | Before formal testing | Are test plans complete? Is system ready?
Production Readiness Review | Before release | Is product ready for deployment?

Milestone Review Process

  1. Develop review plan – Define timing, participants, criteria
  2. Prepare materials – Progress reports, technical documents, test results
  3. Conduct review meeting – Project manager facilitates, team presents, stakeholders participate
  4. Provide evaluation – Assess against predetermined standards
  5. Create action plan – Address identified issues with owners and deadlines
  6. Track execution – Ensure actions complete before next milestone
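Steps 5 and 6 above, an action plan with owners and deadlines tracked to completion, can be sketched as follows (item descriptions, owners, and dates are illustrative):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    """One milestone-review finding converted to a tracked action."""
    description: str
    owner: str
    due: date
    closed: bool = False

def ready_for_next_milestone(items: list[ActionItem]) -> bool:
    """The gate to the next milestone: every action item is closed."""
    return all(item.closed for item in items)

def overdue(items: list[ActionItem], today: date) -> list[ActionItem]:
    """Open items whose deadline has passed."""
    return [i for i in items if not i.closed and i.due < today]

items = [
    ActionItem("Update interface spec", "Lead SE", date(2024, 3, 1), closed=True),
    ActionItem("Re-run load test", "Test Lead", date(2024, 3, 8)),
]
print(ready_for_next_milestone(items))  # → False
```

The overdue list is what a project manager would walk through at each status meeting until the gate check passes.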

Entrance and Exit Criteria

Effective milestone reviews use explicit criteria:

Entrance Criteria (must be satisfied before review begins):

  • Required documents complete and available
  • Required peer reviews conducted
  • Action items from previous reviews closed

Exit Criteria (must be satisfied to pass the review):

  • Review board approves product
  • Open issues documented with resolution plans
  • Recommendations accepted or rejected with rationale
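Entrance and exit criteria both act as gates: proceed only when every criterion holds. A minimal sketch, assuming each criterion is represented as a named boolean check:

```python
def gate(criteria: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (may_proceed, names_of_failed_criteria)."""
    failed = [name for name, met in criteria.items() if not met]
    return (not failed, failed)

# Hypothetical entrance criteria for a milestone review.
entrance = {
    "documents_complete": True,
    "peer_reviews_conducted": True,
    "previous_action_items_closed": False,
}
ok, failed = gate(entrance)
print(ok, failed)  # → False ['previous_action_items_closed']
```

Reporting *which* criteria failed, rather than a bare yes/no, is what makes the gate actionable: the review board can see exactly what must be fixed before the review is rescheduled.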

Stakeholder Involvement

Regular reviews require participation from relevant stakeholders. NASA guidance identifies key stakeholder groups:

  • Quality assurance
  • Systems engineering
  • Independent testing
  • Operations
  • Independent Verification and Validation
  • Project and engineering management
  • Other organizations performing project activities

External stakeholders (principal investigators, science community, sponsors) typically participate only in major milestone reviews rather than regular internal reviews.

For agile and hybrid environments, the review strategy should be tailored: foundational requirements may follow milestone-based reviews while evolving requirements use continuous agile reviews.

Tool Support for Reviews

Modern review tools provide significant efficiency improvements:

Capability | Benefit
Concurrent review | Multiple reviewers provide feedback simultaneously
Revision tracking | Feedback tied to specific versions
Status dashboards | Visibility into completion progress
Email notifications | Automated updates for review events
E-signatures | Formal approval documentation
Baseline integration | Reviews anchored to immutable baselines

IBM DOORS Next and Jama Connect are examples of platforms supporting formal review workflows with role-based permissions, status tracking, and reporting.

Tool tip: When creating reviews in editable streams, artifacts can change during review. For stable reviews requiring validation, create reviews from baselines instead.

Key Success Factors

For Effective Peer Reviews

  • Keep meetings to 2 hours maximum
  • Focus exclusively on identifying issues, not solving them
  • Assign specific perspectives to each reviewer
  • Management does not attend (prevents intimidation)
  • Track completion through weekly status reports

For Effective Management Reviews

  • Review peer review summaries to extract significant issues
  • Maintain open action lists with clear owners
  • Include SQA and SCM as mandatory participants
  • Distribute minutes and action items promptly

General Best Practices

  • Define clear, measurable review criteria upfront
  • Prepare thoroughly before review meetings
  • Maintain open, honest communication (no blame culture)
  • Convert findings to actionable items with owners and deadlines
  • Use lessons learned to continuously improve the review process

Common Pitfalls to Avoid

Pitfall | Consequence
No preparation | Review wastes time; superficial findings
Too many participants | Unfocused discussion; scheduling difficulties
Solving problems in review | Meeting runs long; author becomes defensive
Management attendance at peer reviews | Team members hesitate to raise issues
No follow-up on findings | Issues persist; reviews lose credibility
Insufficient time allocated | Rush through material; miss critical defects

Relationship to Other Processes

The review process interacts with several other project management disciplines:

  • Quality Management – Reviews are primary verification and validation activities
  • Configuration Management – Reviews establish baselines; baselines anchor reviews
  • Risk Management – Reviews identify and track risks
  • Issue Tracking – Review findings become tracked issues requiring resolution
  • Audit and Compliance – Formal reviews provide compliance evidence

Summary

The review process is a systematic, multi-level quality control mechanism that spans informal peer checks through formal milestone reviews. Effective implementation requires:

  1. Right-sizing formality to project risk and criticality
  2. Clear role definitions with trained participants
  3. Disciplined execution following defined processes
  4. Proper tool support for efficiency and traceability
  5. Management commitment to resource allocation and follow-through

When executed well, reviews find defects early, build shared understanding, provide stakeholder confidence, and drive continuous process improvement. When neglected or performed poorly, they become wasteful ceremonies that let preventable failures slip through.