Measurement Framework

How to Measure Developer Onboarding: 8 Metrics That Matter

What gets measured gets improved. Most companies track time-to-hire but not time-to-productivity. These 8 engineering-specific metrics give you a complete picture of onboarding effectiveness with actionable benchmarks.

1

Time to First Commit

Calendar days from start date to first PR merged into the main branch.

Benchmark

Top teams: 1-3 days. Good: 3-5 days. Median: 7-14 days. Needs work: 15+ days.

The single best leading indicator of onboarding effectiveness. A long time-to-first-commit signals environment setup issues, access provisioning delays, or lack of pre-identified first tasks. It correlates strongly with overall ramp time: developers who commit early reach full velocity faster.

How to Measure

Track the date of the first merged PR for each new hire in your version control system. Most engineering analytics tools (LinearB, Jellyfish, Swarmia) can calculate this automatically.
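If you prefer to compute this yourself rather than use an analytics tool, the calculation is a simple date difference. A minimal sketch, assuming you have exported each hire's start date and first merged-PR date (the `hires` dict below is hypothetical data, and the benchmark buckets mirror the ones above):

```python
from datetime import date

def time_to_first_commit(start: date, first_merged_pr: date) -> int:
    """Calendar days from start date to first PR merged into main."""
    return (first_merged_pr - start).days

# Hypothetical export: {hire: (start_date, date_of_first_merged_pr)}
hires = {
    "alice": (date(2024, 3, 4), date(2024, 3, 6)),
    "bob": (date(2024, 3, 4), date(2024, 3, 20)),
}

for name, (start, first_pr) in hires.items():
    days = time_to_first_commit(start, first_pr)
    # Bucket against the benchmarks from this article
    status = ("top" if days <= 3 else
              "good" if days <= 5 else
              "needs work" if days >= 15 else
              "median")
    print(f"{name}: {days} days ({status})")
```

The same date arithmetic works regardless of whether the dates come from a Git log export, the GitHub API, or a spreadsheet.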

How to Improve

Automate dev environment setup, pre-provision accounts, pre-identify a good first issue, and have the buddy ready for rapid code review.

2

Time to First Deploy

Calendar days from start date to first production deployment (or staging if production access is restricted).

Benchmark

Mature platforms: 1-5 days. Typical: 10-20 days. Lagging: 30+ days.

Measures the combined effectiveness of dev environment, CI/CD pipeline, deployment permissions, and onboarding documentation. If the developer cannot deploy by week two, something structural is blocking them.

How to Measure

Track first deployment via your CI/CD pipeline logs. Match new hire start dates against deployment records.
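The matching step is a first-occurrence join: reduce the deployment log to each person's earliest deploy, then diff against start dates. A sketch with hypothetical log records (names and dates are illustrative, not from any real pipeline):

```python
from datetime import date

# Hypothetical records pulled from CI/CD logs: (username, deploy_date)
deploys = [
    ("carol", date(2024, 4, 2)),
    ("dave", date(2024, 4, 3)),
    ("carol", date(2024, 4, 9)),
]
start_dates = {"carol": date(2024, 4, 1), "dave": date(2024, 3, 18)}

# Keep only the earliest deployment per person
first_deploy = {}
for user, d in deploys:
    if user not in first_deploy or d < first_deploy[user]:
        first_deploy[user] = d

for user, start in start_dates.items():
    days = (first_deploy[user] - start).days
    print(f"{user}: first deploy on day {days}")
```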

How to Improve

Self-service deployment pipelines, clear deployment documentation, and sandbox environments for safe first deploys.

3

Time to Full Velocity

Weeks from start date until the developer consistently delivers at 80-90% of the team average velocity (measured by story points, PRs merged, or comparable output metric).

Benchmark

Junior: 16-26 weeks. Mid-level: 10-16 weeks. Senior: 6-10 weeks. Staff: 12-24 weeks (different measure: impact, not code output).

The ultimate outcome metric for onboarding effectiveness. Directly determines the salary ramp-up cost. Every week shaved off this number saves $2,000-$5,000 depending on salary level.

How to Measure

Compare rolling 2-week velocity average against the team's average. Mark the point where the new hire consistently hits 80%+ for two consecutive sprints.
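The "80%+ for two consecutive sprints" rule can be automated as a streak check over per-sprint output. A minimal sketch, assuming you have the new hire's per-sprint story points and a team average (the numbers below are hypothetical):

```python
def weeks_to_full_velocity(hire_velocity, team_avg, threshold=0.8, consecutive=2):
    """Return the 1-indexed sprint at which the new hire completes
    `consecutive` sprints at `threshold` of team average velocity,
    or None if the streak is never reached."""
    streak = 0
    for sprint, points in enumerate(hire_velocity, start=1):
        if points >= threshold * team_avg:
            streak += 1
            if streak == consecutive:
                return sprint  # streak completed this sprint
        else:
            streak = 0  # any miss resets the streak
    return None

# Hypothetical per-sprint story points; team average is 20 points
sprints = [2, 5, 9, 13, 17, 18, 19]
print(weeks_to_full_velocity(sprints, team_avg=20))  # → 6
```

Convert the sprint index to weeks by multiplying by your sprint length.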

How to Improve

Structured 30-60-90 plans, buddy system, codebase documentation, and progressively scoped work (small tasks first, features later).

4

Ramp-Up Cost Ratio

Total onboarding cost divided by annual salary. Expressed as a percentage.

Benchmark

Well-structured programmes: 15-20%. Typical: 20-30%. Poor onboarding: 30-45%.

Normalises onboarding cost across salary levels, making it comparable across roles and seniority. A ratio above 30% suggests systemic onboarding inefficiency. Below 15% likely means you are under-investing.

How to Measure

Use the calculator on this site to estimate total cost, then divide by the annual salary.
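The ratio itself is one division; the value is in applying it consistently across roles. A sketch with hypothetical figures:

```python
def ramp_up_cost_ratio(onboarding_cost: float, annual_salary: float) -> float:
    """Total onboarding cost as a percentage of annual salary."""
    return 100 * onboarding_cost / annual_salary

# Hypothetical inputs: £30k estimated onboarding cost, £120k salary
ratio = ramp_up_cost_ratio(onboarding_cost=30_000, annual_salary=120_000)
print(f"{ratio:.0f}%")  # prints 25% — within the 'typical' 20-30% band
```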

How to Improve

Every strategy that reduces ramp time reduces this ratio. Focus on the highest-leverage interventions: automated setup, buddy system, structured plans.

5

Mentor Time Consumed

Hours per week of senior developer time dedicated to mentoring, pairing, and reviewing the new hire's work.

Benchmark

Week 1: 15-20 hrs (junior) / 4-8 hrs (senior). Should decline to under 3 hrs/week by month three for all levels.

Mentor time is the second-largest cost component of onboarding. If it is not declining over time, the onboarding programme may be insufficient. If it is too low in week one, the new hire may not be getting enough support.

How to Measure

Have mentors log hours weekly for the first 90 days. Even a rough estimate is better than no tracking.
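Once hours are logged, the month-three benchmark (under 3 hrs/week for all levels) is easy to check programmatically. A sketch assuming a simple list of weekly logged hours (the sample log is hypothetical):

```python
def month_three_ok(weekly_hours, target=3.0):
    """Check the month-three benchmark: average logged mentor hours
    in weeks 9-12 should be under `target` hrs/week."""
    month_three = weekly_hours[8:12]  # weeks 9-12, zero-indexed slice
    return sum(month_three) / len(month_three) < target

# Hypothetical 12-week log for a junior hire, declining as expected
log = [18, 15, 12, 10, 8, 6, 5, 4, 3, 2.5, 2, 2]
print(month_three_ok(log))  # → True
```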

How to Improve

Buddy rotation (spreads the load), recorded walkthroughs (reduce repeated explanations), ADRs (architecture decision records, which answer 'why' questions), and async-first question channels.

6

New Hire Onboarding NPS

Net Promoter Score from a survey asking: 'How likely are you to recommend our onboarding process to a friend joining the company?' (0-10 scale).

Benchmark

50+ is good. 70+ is excellent. Below 30 indicates significant problems.

Captures the subjective experience of onboarding, which correlates with retention and engagement. A technically efficient onboarding can still feel terrible to the new hire. This metric catches the human side.

How to Measure

Send a 5-question survey at day 30 and day 90. Include NPS, open-ended feedback, and specific satisfaction questions about buddy, documentation, and environment setup.
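NPS uses the standard formula: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6); passives (7-8) count in the denominator only. A sketch with hypothetical survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical day-30 survey scores on the 0-10 scale
responses = [9, 10, 8, 7, 10, 6, 9, 3]
print(nps(responses))  # → 25, below the 'good' threshold of 50
```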

How to Improve

Act on the feedback. The most common complaints are: environment setup pain, unclear expectations, insufficient buddy availability, and too many meetings in week one.

7

90-Day Retention Rate

Percentage of new hires still employed at 90 days.

Benchmark

Excellent: 97%+. Good: 93-97%. Industry average: 78%. Poor: below 75%.

Early attrition destroys onboarding ROI. If 22% of hires leave within 90 days, you are re-paying the full recruiting and onboarding cost for more than one in five hires. This is the lagging indicator that validates all other onboarding investments.

How to Measure

Track against hire date. Segment by team, manager, and seniority to identify patterns.
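One subtlety when computing this from HRIS data: only count cohorts old enough to have reached day 90, or recent hires will inflate the rate. A sketch, assuming an export of start and termination dates (the records below are hypothetical):

```python
from datetime import date

def ninety_day_retention(hires, today):
    """Percent of hires still employed at day 90, counting only
    cohorts that started at least 90 days before `today`."""
    eligible = [(s, t) for s, t in hires if (today - s).days >= 90]
    retained = sum(1 for s, t in eligible
                   if t is None or (t - s).days >= 90)
    return 100 * retained / len(eligible)

# Hypothetical HRIS export: (start_date, termination_date or None if active)
hires = [
    (date(2024, 1, 8), None),                # retained
    (date(2024, 1, 15), date(2024, 2, 20)),  # left at day 36
    (date(2024, 2, 5), None),                # retained
    (date(2024, 6, 1), None),                # too recent: excluded
]
print(f"{ninety_day_retention(hires, today=date(2024, 6, 15)):.0f}%")
```

Segmenting by team or manager is then just a matter of filtering `hires` before calling the function.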

How to Improve

Structured onboarding (82% higher retention per Brandon Hall Group), regular check-ins, buddy system, and realistic job previews during recruiting.

8

Sprint Velocity Recovery

Weeks until team velocity returns to the pre-hire baseline after adding a new team member.

Benchmark

Good: 3-4 weeks. Typical: 5-8 weeks. Poor: 10+ weeks.

Adding a new team member temporarily reduces team velocity by 25-40%. How quickly the team recovers indicates how well the new hire is being integrated and how little the addition disrupts existing workflows.

How to Measure

Track team velocity (story points completed) for 4 weeks before the hire joins, then monitor the recovery curve.
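Computing the recovery point mirrors the full-velocity streak check: establish the baseline from the four pre-hire weeks, then find when post-hire velocity sustains it. A sketch with hypothetical weekly story-point totals:

```python
def weeks_to_recovery(weekly_velocity, baseline, consecutive=2):
    """Weeks after the hire joins until team velocity matches the
    pre-hire baseline for `consecutive` consecutive weeks."""
    streak = 0
    for week, points in enumerate(weekly_velocity, start=1):
        streak = streak + 1 if points >= baseline else 0
        if streak == consecutive:
            return week
    return None

pre_hire = [39, 41, 40, 40]               # 4 weeks before the hire joins
baseline = sum(pre_hire) / len(pre_hire)  # 40.0
post_hire = [28, 30, 34, 38, 41, 42, 43]  # hypothetical recovery curve
print(weeks_to_recovery(post_hire, baseline))  # → 6
```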

How to Improve

Assign new hires to separate onboarding tasks initially, limit their code review load on the team, and have the buddy handle most of the integration overhead.

Metrics Dashboard Template

Track all 8 metrics in a single dashboard. Suggested layout:

Leading Indicators (top row)

Time to first commit, time to first deploy, mentor time consumed. These predict future onboarding quality.

Outcome Metrics (middle row)

Time to full velocity, ramp-up cost ratio, sprint velocity recovery. These measure actual onboarding impact.

Experience and Retention (bottom row)

New hire NPS, 90-day retention rate. These capture the human side and long-term return on onboarding investment.

Frequently Asked Questions

Which metric should I start tracking first?
Start with time-to-first-commit (#1) and 90-day retention rate (#7). These are the leading and lagging indicators that give you the best signal with the least effort. Time-to-first-commit is easy to measure from version control data, and retention is already tracked by HR.
How do these metrics relate to DORA metrics?
Time-to-first-commit and time-to-first-deploy are DORA-adjacent metrics. DORA measures team-level delivery performance (deployment frequency, lead time, MTTR, change failure rate). Onboarding metrics measure individual ramp within that team context. A team with strong DORA metrics will typically have faster onboarding because the infrastructure (CI/CD, testing, monitoring) is mature.
How often should I review onboarding metrics?
Track per-hire for time-to-first-commit and mentor time. Review aggregate metrics (velocity recovery, retention, NPS) quarterly. Report to engineering leadership quarterly with trends and proposed improvements. If you are hiring more than 5 engineers per quarter, monthly reviews are worthwhile.
What tools can track these metrics automatically?
Jellyfish, LinearB, and Swarmia can track time-to-first-commit and velocity automatically from Git data. For NPS and mentor time, use a simple spreadsheet or survey tool (Typeform, Google Forms). 90-day retention comes from HRIS data. See the tools comparison page for detailed platform evaluations.