Team Analytics

Gain insights into team performance, code quality trends, and individual contributor metrics with comprehensive analytics dashboards.

Overview

Jasper's Team Analytics provides visibility into how your team is performing across code reviews. Track quality scores and productivity metrics, and identify trends over time. Use these insights for:

  • Performance reviews and 1:1 discussions
  • Identifying team strengths and areas for improvement
  • Tracking quality improvements over time
  • Recognizing top performers
  • Resource allocation and team planning

Accessing Analytics

Navigate to analytics from the main navigation:

  1. Click Analytics in the main menu
  2. Choose from available views: Overview, My Stats, Team, or Leaderboard

Permission Note: All team members can view their own stats and team aggregate metrics. Only admins and owners can view individual contributor details and export data.

Analytics Views

Overview Dashboard

The overview provides a high-level snapshot of your organization's code review activity:

  • Total PRs Reviewed - Number of pull requests analyzed
  • Average Quality Score - Mean quality score across all reviews
  • Pass Rate - Percentage of PRs passing quality gates
  • Active Contributors - Number of unique contributors
  • Issues Resolved - Total issues addressed after reviews

My Stats

View your personal performance metrics and compare against team averages:

  • Quality Score - Your average quality score with trend indicator
  • PRs Submitted - Number of pull requests you've created
  • Pass Rate - Your quality gate pass percentage
  • First-Time Pass Rate - Percentage of your PRs that passed without rework
  • vs Team Average - Badges showing how you compare to the team
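
The "vs Team Average" badges boil down to the gap between your metric and the team mean. A minimal sketch of how such a badge might be derived, assuming a simple tolerance band (the function name and thresholds are illustrative, not Jasper's actual logic):

    # Sketch: deriving a "vs Team Average" badge. The 2-point tolerance band
    # and labels are assumptions for illustration, not Jasper's actual logic.
    def comparison_badge(personal: float, team_avg: float, tolerance: float = 2.0) -> str:
        """Label a personal metric relative to the team average."""
        delta = personal - team_avg
        if abs(delta) <= tolerance:
            return "on par with team"
        return f"{delta:+.1f} vs team avg"

    print(comparison_badge(87.5, 82.0))  # "+5.5 vs team avg"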

Team Analytics

Aggregate metrics across all team members (admin view includes individual breakdowns):

  • Team Quality Trends - Line chart showing quality over time
  • Issue Distribution - Breakdown by severity (critical, high, medium, low)
  • PR Activity - Volume of PRs by day/week/month
  • Top Issue Categories - Most common types of issues found

Leaderboard

See how contributors rank across different metrics:

  • Quality Score - Highest average quality scores
  • PRs Submitted - Most active contributors
  • Pass Rate - Best quality gate success rate
  • First-Time Pass Rate - Cleanest code submissions
  • Lines Changed - Most code output
  • Active Days - Most consistent contributors
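
Conceptually, each leaderboard is a best-first sort on a single metric. A minimal sketch with hypothetical contributor records:

    # Sketch: ranking contributors by a chosen metric, leaderboard-style.
    # The records and field names are hypothetical.
    from operator import itemgetter

    contributors = [
        {"username": "alice", "quality_score": 91.2, "prs_submitted": 34},
        {"username": "bob",   "quality_score": 84.7, "prs_submitted": 52},
        {"username": "carol", "quality_score": 88.9, "prs_submitted": 41},
    ]

    def leaderboard(rows: list[dict], metric: str) -> list[dict]:
        """Return contributors sorted best-first on the given metric."""
        return sorted(rows, key=itemgetter(metric), reverse=True)

    for rank, row in enumerate(leaderboard(contributors, "quality_score"), start=1):
        print(rank, row["username"], row["quality_score"])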

Metrics Explained

Quality Metrics

Metric            Description                      How It's Calculated
Quality Score     Overall code quality rating      0-100 score based on issue severity and count
Total Issues      All issues found in reviews      Sum of critical + high + medium + low issues
Resolution Rate   Percentage of issues addressed   Resolved issues / Total issues
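
Jasper doesn't publish the exact weighting behind the 0-100 score, so the severity penalties in this sketch are placeholder assumptions; the resolution rate follows the formula in the table:

    # Sketch: quality metrics from the table above. The per-severity penalties
    # are assumed values for illustration, not Jasper's published weights.
    SEVERITY_PENALTY = {"critical": 15, "high": 8, "medium": 3, "low": 1}

    def quality_score(issues: dict[str, int]) -> float:
        """0-100 score: start at 100, subtract a penalty per issue by severity."""
        penalty = sum(SEVERITY_PENALTY[sev] * count for sev, count in issues.items())
        return max(0.0, 100.0 - penalty)

    def resolution_rate(resolved: int, total: int) -> float:
        """Resolved issues / Total issues, as a percentage."""
        return 100.0 * resolved / total if total else 100.0

    print(quality_score({"critical": 1, "high": 2, "medium": 4, "low": 3}))  # 54.0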

Productivity Metrics

Metric                Description                  How It's Calculated
PRs Submitted         Pull requests created        Count of PRs with reviews
Pass Rate             Quality gate success         Passed PRs / Total PRs
First-Time Pass       PRs passing without rework   No-rework PRs / Total PRs
Lines Changed         Code output volume           Lines added + Lines removed
Avg Processing Time   Review completion speed      Average seconds per review
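
The productivity ratios are simple divisions over PR records. A sketch (the rework_rounds field is hypothetical; Jasper may track rework differently):

    # Sketch: Pass Rate and First-Time Pass over hypothetical PR records.
    def as_pct(numerator: int, total: int) -> float:
        return 100.0 * numerator / total if total else 0.0

    prs = [
        {"passed": True,  "rework_rounds": 0},   # passed first time
        {"passed": True,  "rework_rounds": 2},   # passed after rework
        {"passed": False, "rework_rounds": 1},   # failed the quality gate
    ]
    passed = sum(pr["passed"] for pr in prs)
    no_rework = sum(pr["passed"] and pr["rework_rounds"] == 0 for pr in prs)
    print(f"Pass rate: {as_pct(passed, len(prs)):.1f}%")           # 66.7%
    print(f"First-time pass: {as_pct(no_rework, len(prs)):.1f}%")  # 33.3%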

Time Periods

Filter analytics by different time ranges:

  • Last 7 days - Recent activity snapshot
  • Last 30 days - Monthly performance (default)
  • Last 90 days - Quarterly trends
  • This Month - Current calendar month
  • This Quarter - Current quarter
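
Each filter maps to a UTC date window. A sketch of the date arithmetic (the period keys are illustrative, not API values):

    # Sketch: mapping the named filters above to (start, end) UTC dates.
    from datetime import date, datetime, timedelta, timezone

    def period_range(period: str) -> tuple[date, date]:
        today = datetime.now(timezone.utc).date()
        if period == "last_7_days":
            return today - timedelta(days=7), today
        if period == "last_30_days":   # the default view
            return today - timedelta(days=30), today
        if period == "last_90_days":
            return today - timedelta(days=90), today
        if period == "this_month":
            return today.replace(day=1), today
        if period == "this_quarter":
            first_month = 3 * ((today.month - 1) // 3) + 1
            return today.replace(month=first_month, day=1), today
        raise ValueError(f"unknown period: {period}")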

Charts & Visualizations

Quality Trend Chart

Line chart showing quality score over time. Hover over data points to see exact values and dates. Compare individual performance against team average.

Issue Distribution Chart

Doughnut chart breaking down issues by severity. Useful for understanding what types of problems are most common in your codebase.

PR Activity Chart

Bar chart showing PR volume over time. Helps identify busy periods and team velocity patterns.

Contributor Details (Admin)

Admins can view detailed analytics for individual contributors:

  1. Go to Analytics → Team
  2. Click on a contributor's name
  3. View their complete performance history

The contributor detail page shows:

  • All metrics with trend indicators
  • Quality score history chart
  • Issue breakdown by category
  • Comparison to team averages
  • Recent review history

Exporting Data

Export analytics data for reporting or further analysis (admin only):

  1. Navigate to the analytics view you want to export
  2. Click the Export button
  3. Choose CSV format
  4. Download includes all visible data plus additional fields

Export includes:

  • Contributor identifier (GitHub username)
  • All quality and productivity metrics
  • Time period and snapshot date
  • Team comparison data
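
Once downloaded, the CSV is straightforward to post-process. A sketch using Python's standard library (the filename and column names are assumptions; check the header row of your actual export):

    # Sketch: loading an exported analytics CSV for further analysis.
    import csv

    with open("jasper_analytics_export.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Example: average quality score across exported contributors.
    scores = [float(r["quality_score"]) for r in rows if r.get("quality_score")]
    if scores:
        print(f"{len(scores)} contributors, avg quality {sum(scores) / len(scores):.1f}")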

ClickUp Integration Metrics

If your organization has ClickUp integration enabled, additional metrics are available:

  • Task Link Rate - Percentage of PRs linked to ClickUp tasks
  • Ticket Quality Score - Quality of linked ticket descriptions
  • Valid Tickets - Tickets meeting quality standards
  • Warning Tickets - Tickets with minor issues
  • Invalid Tickets - Tickets failing quality checks
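
Task Link Rate, for example, is just the share of PRs carrying a ClickUp task link. A sketch with hypothetical records:

    # Sketch: Task Link Rate = linked PRs / total PRs. Records are hypothetical.
    prs = [
        {"number": 101, "clickup_task": "CU-868"},
        {"number": 102, "clickup_task": None},     # no linked task
        {"number": 103, "clickup_task": "CU-901"},
    ]
    linked = sum(1 for pr in prs if pr["clickup_task"])
    print(f"Task Link Rate: {100.0 * linked / len(prs):.0f}%")  # 67%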

See ClickUp Integration for setup instructions.

Data Collection

Analytics data is collected and aggregated daily:

  • Snapshot Time - Data is aggregated at 2:00 AM UTC each day
  • Historical Data - Snapshots are retained based on your plan
  • Real-Time - Current day data is calculated on demand

Backfilling Historical Data

If you need to regenerate historical analytics (e.g., after adjusting settings):

  1. Go to Analytics → Team
  2. Click Backfill Data (admin only)
  3. Select the number of days to regenerate
  4. Wait for the job to complete

Permissions

Action                  Required Permission
View own stats          analytics.view (all users)
View team analytics     analytics.team-view (all users)
View all contributors   analytics.all-view (admin/owner)
Export analytics        analytics.export (admin/owner)
Backfill data           analytics.export (admin/owner)
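
These rules map naturally to a lookup table in code. A sketch (the permission strings come from the table above; the helper itself is illustrative):

    # Sketch: permission checks mirroring the table above.
    REQUIRED_PERMISSION = {
        "view_own_stats":        "analytics.view",
        "view_team_analytics":   "analytics.team-view",
        "view_all_contributors": "analytics.all-view",
        "export_analytics":      "analytics.export",
        "backfill_data":         "analytics.export",
    }

    def can(user_permissions: set[str], action: str) -> bool:
        """True if the user holds the permission the action requires."""
        return REQUIRED_PERMISSION[action] in user_permissions

    member = {"analytics.view", "analytics.team-view"}
    print(can(member, "export_analytics"))  # False: export is admin/owner only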

Best Practices

Using Analytics Effectively

  • Focus on trends - Single data points can be misleading; look at patterns over time
  • Context matters - A low pass rate on a legacy codebase isn't necessarily bad
  • Celebrate improvement - Use the data to recognize growth, not just performance
  • Balance metrics - High volume + low quality isn't better than moderate volume + high quality

For Performance Reviews

  • Use 90-day or quarterly views for comprehensive assessment
  • Compare against team averages, not arbitrary standards
  • Consider the complexity of work assigned
  • Look at improvement trajectory, not just absolute numbers

For Team Planning

  • Identify patterns in quality issues to guide training
  • Use PR volume data for capacity planning
  • Track quality trends when rolling out new practices
  • Set realistic quality gate thresholds based on historical data
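
On the last point, one pragmatic approach is to anchor the gate near a low percentile of recent scores so most current work already clears it. A sketch (the 25th-percentile choice is an assumption, not a Jasper recommendation):

    # Sketch: suggest a quality gate threshold from historical scores.
    historical_scores = [88, 91, 74, 82, 95, 79, 85, 90, 68, 87]

    def suggested_threshold(scores: list[int], percentile: float = 0.25) -> int:
        """Pick the score at the given percentile of the sorted history."""
        ranked = sorted(scores)
        return ranked[int(percentile * (len(ranked) - 1))]

    print(suggested_threshold(historical_scores))  # 79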