Reviewer Distribution

1. Introduction:

This document explains how to analyze and visualize reviewer participation in your code review process using the Reviewer Distribution metric. The metric is presented as a Pareto chart that ranks reviewers by the number of reviews conducted and shows the cumulative percentage of total reviews.

This metric provides valuable insight into whether participation is balanced across team members, and it helps mitigate the risks that come with concentrated reviewer workload and siloed knowledge.

2. Definitions:

  • Reviewer: A team member who is authorized to conduct code reviews on Pull Requests (PRs).

  • Review: A single instance of a code review completed on a Pull Request.

  • Pull request approved with feedback: A PR that reviewers have commented on at least once.

  • Pull request approved without feedback: A PR that has no comments from reviewers.

  • Total Members: The total number of team members added to the team(s) where code reviews are conducted.
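
The feedback split above reduces to a simple rule: at least one reviewer comment on the PR. Below is a minimal sketch of that classification; `ApprovedPR` and its fields are hypothetical shapes for illustration, not DevDynamics' actual data model.

```python
from dataclasses import dataclass

@dataclass
class ApprovedPR:
    pr_id: int              # hypothetical PR identifier
    reviewer_comments: int  # comments left by reviewers on this PR

def feedback_bucket(pr: ApprovedPR) -> str:
    """Classify an approved PR per the definitions above:
    at least one reviewer comment means "with feedback"."""
    if pr.reviewer_comments > 0:
        return "approved_with_feedback"
    return "approved_without_feedback"
```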

3. Explanation of Charts:

KPI Card:

Detailed Chart:

The Pareto chart displays the distribution of code review participation among team members; a minimal sketch of the underlying computation follows the legend below.

  • Vertical Axis: Represents individual reviewers ranked by the number of reviews conducted (highest to lowest).

  • Horizontal Axis: Represents the cumulative percentage of total reviews completed.

  • Blue bars: PRs approved without feedback.

  • Yellow bars: PRs approved with feedback.

  • Red line: Cumulative percentage of PRs reviewed.

  • Tooltips: Hovering over a data point reveals additional information.
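
The chart's ranking and the red cumulative line can be reproduced from raw review records. Here is a minimal sketch, assuming the input is simply a list of reviewer names, one entry per completed review (the input shape is an assumption for illustration):

```python
from collections import Counter

def pareto_distribution(reviews):
    """Rank reviewers by review count (highest to lowest) and compute
    the cumulative percentage of all reviews -- the red line on the
    Pareto chart."""
    counts = Counter(reviews)
    total = sum(counts.values())
    cumulative = 0
    rows = []
    for reviewer, n in counts.most_common():
        cumulative += n
        rows.append((reviewer, n, 100.0 * cumulative / total))
    return rows

# Six reviews spread across three reviewers:
sample = ["ana", "ana", "ana", "ben", "ben", "caro"]
for reviewer, n, cum_pct in pareto_distribution(sample):
    print(f"{reviewer}: {n} review(s), {cum_pct:.0f}% cumulative")
# ana: 3 review(s), 50% cumulative
# ben: 2 review(s), 83% cumulative
# caro: 1 review(s), 100% cumulative
```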

Color Indicators:

  • Green: 45% of reviewers or more are actively participating.

  • Amber: Less than 45% but equal to or more than 30% of reviewers are participating.

  • Red: Less than 30% of reviewers are actively participating.
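
These bands amount to a simple threshold check on the share of reviewers who are active. A minimal sketch, assuming "actively participating" means at least one review in the reporting period (this document does not spell out the exact rule):

```python
def participation_color(active_reviewers: int, total_members: int) -> str:
    """Map the share of actively participating reviewers to the
    colour bands above."""
    share = active_reviewers / total_members
    if share >= 0.45:
        return "green"
    if share >= 0.30:
        return "amber"
    return "red"

print(participation_color(4, 10))  # 40% participating -> "amber"
```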

4. Interpretation:

Identifying Trends: The Pareto chart helps visualize the distribution of code review participation over time.

  • Uneven Distribution (Red/Amber): This may indicate that a small group of reviewers is handling a disproportionate share of the workload, potentially leading to fatigue, knowledge silos, and bottlenecks.

  • Balanced Distribution (Green): This suggests reviews are spread relatively evenly among a moderate number of reviewers, promoting knowledge sharing and process efficiency.

5. Key Points:

  • Balanced Participation: Monitoring Reviewer Distribution ensures all team members contribute to the code review process, mitigating risks associated with reviewer overload and knowledge concentration.

  • Knowledge Sharing: Identifying reviewers with high participation helps understand where expertise lies and can inform knowledge sharing initiatives.

  • Targeted Training: Reviewer Distribution data can be used to identify team members who might benefit from additional review training, promoting wider participation.

6. Conclusion:

Monitoring Reviewer Distribution using a Pareto chart provides valuable insights into the health of your code review process. By analyzing the distribution and taking corrective actions when necessary, you can ensure balanced participation, promote knowledge sharing, and ultimately deliver higher quality software.

Related Metrics:

  • Quality: Bug Open Count, Review Time, Cycle Time, Review Responsiveness, Review Completion

Additional Considerations:

  • Track changes in the distribution over time to identify trends and assess the effectiveness of implemented initiatives; a minimal sketch of one such trend check follows this list.

  • Conduct regular analysis to understand the reasons behind review concentration and identify specific solutions.
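
One lightweight way to track concentration over time, as suggested above, is to chart the share of reviews handled by the single busiest reviewer each month. A sketch, assuming (reviewer, date) pairs as input (an illustrative shape, not DevDynamics' API):

```python
from collections import Counter, defaultdict
from datetime import date

def top_reviewer_share_by_month(reviews):
    """For each calendar month, the fraction of reviews handled by the
    single busiest reviewer -- one simple concentration signal."""
    by_month = defaultdict(Counter)
    for reviewer, day in reviews:
        by_month[(day.year, day.month)][reviewer] += 1
    return {
        month: counts.most_common(1)[0][1] / sum(counts.values())
        for month, counts in sorted(by_month.items())
    }

sample = [("ana", date(2024, 5, 2)), ("ana", date(2024, 5, 9)),
          ("ben", date(2024, 5, 20)), ("ana", date(2024, 6, 3)),
          ("ben", date(2024, 6, 10)), ("caro", date(2024, 6, 24))]
print(top_reviewer_share_by_month(sample))
# {(2024, 5): 0.666..., (2024, 6): 0.333...}
```

A falling top-reviewer share over successive months suggests review load is spreading out; a rising share points back at the concentration risks described in the Interpretation section.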