
Performance Reviews for Engineers

Run fair and effective performance review cycles with calibration, feedback, and growth-oriented conversations

The Purpose of Performance Reviews

Performance reviews exist to serve two purposes: providing feedback that helps engineers grow, and making fair decisions about compensation, promotion, and role changes. These two purposes are often in tension. A review process that optimizes purely for feedback may lack the rigor needed for fair compensation decisions; one that optimizes for decisions may create an environment where people game metrics rather than focus on genuine growth.

As a Tech Lead, you may not own the review process (that typically falls to the Engineering Manager), but you provide critical input on technical performance, and your feedback significantly influences review outcomes. Understanding how to assess and communicate technical performance fairly is essential.

Principles of Fair Reviews

  • No Surprises: Nothing in a review should be the first time the engineer hears it. All feedback should have been given in real time throughout the review period.
  • Evidence-Based: Assessments should be backed by specific examples, not general impressions; one way to collect those examples is sketched after this list.
  • Calibrated: The same level of performance should receive the same rating regardless of who the reviewer is.
  • Growth-Oriented: The review should provide a clear path forward, not just a backward-looking assessment.
  • Bidirectional: Engineers should have the opportunity to self-assess and provide feedback on their management.
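
Evidence-based reviews depend on having concrete examples ready when the cycle begins, which means collecting them as they happen. Here is a minimal sketch, in Python, of one way to keep a running evidence log; the structure and field names are illustrative assumptions, not part of any prescribed tool.

```python
# A minimal, hypothetical evidence log for review-period note-keeping.
# The structure and field names are illustrative, not a prescribed tool.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EvidenceNote:
    when: date        # when the event happened
    area: str         # e.g. "technical execution", "mentorship"
    observation: str  # what happened, stated concretely
    impact: str       # why it mattered

@dataclass
class ReviewLog:
    engineer: str
    notes: list[EvidenceNote] = field(default_factory=list)

    def add(self, when: date, area: str, observation: str, impact: str) -> None:
        """Jot a note in the moment, while details are fresh."""
        self.notes.append(EvidenceNote(when, area, observation, impact))

    def by_area(self, area: str) -> list[EvidenceNote]:
        """Pull every note for one assessment area at review-writing time."""
        return [n for n in self.notes if n.area == area]

# Usage: record events as they happen, query by area when writing the review.
log = ReviewLog("alex")
log.add(date(2024, 3, 4), "technical execution",
        "Led the gRPC migration design review",
        "unblocked dependent teams ahead of schedule")
for note in log.by_area("technical execution"):
    print(f"{note.when}: {note.observation} ({note.impact})")
```

The habit matters more than the tooling: a spreadsheet or a plain text file per engineer works just as well, provided each note is dated, specific, and tied to impact.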

The Review Cycle

Typical Review Timeline

| Phase | Timeline | Activities |
| --- | --- | --- |
| Self-Assessment | Week 1-2 | Engineer reflects on accomplishments and growth areas |
| Peer Feedback | Week 1-2 | Collect 360 feedback from collaborators |
| Manager Assessment | Week 2-3 | Manager writes review incorporating all inputs |
| Calibration | Week 3 | Managers align ratings across teams for consistency |
| Delivery | Week 4 | 1-on-1 meeting to deliver the review and discuss growth |

Writing Effective Technical Assessments

When providing input on an engineer's technical performance, be specific and evidence-based. Use concrete examples from the review period.

## Technical Assessment: Good Example

### Technical Execution (Exceeds Expectations)
Led the migration of the payment service from REST to gRPC,
reducing p99 latency from 450ms to 120ms. Designed the
migration strategy that allowed zero-downtime cutover.
Code quality was consistently high with thorough test coverage.

### Technical Leadership (Meets Expectations)
Wrote 2 RFCs this cycle (caching strategy, API versioning).
The caching RFC was well-received and adopted. The API
versioning RFC needed significant revision after review,
suggesting more upfront research on alternatives would help.

### Mentorship (Below Expectations)
Committed to mentoring a junior engineer but canceled 3 of 6
scheduled pairing sessions. The junior engineer expressed
frustration about inconsistent support. This is an area for
focused improvement next quarter.

---

## Technical Assessment: Bad Example

### Technical Execution
"Did good work this quarter."

### Technical Leadership
"Wrote some RFCs."

### Mentorship
"Could be better at mentoring."

The Calibration Process

Calibration ensures that ratings are consistent across managers and teams. Without calibration, some managers rate generously and others rate harshly, creating unfair outcomes.

  • Level anchoring: Define what "Meets Expectations" looks like for each engineering level. A mid-level engineer meeting expectations looks different from a senior engineer meeting expectations.
  • Cross-team comparison: Managers present their ratings and supporting evidence to peer managers. Outliers are discussed and adjusted; a simple outlier check is sketched after this list.
  • Promotion calibration: Promotion candidates are evaluated against a consistent bar. "Would this person's work qualify as their target level at another company?"
  • Bias checks: Look for patterns in ratings across demographics. Unconscious bias can systematically underrate certain groups.
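
To make the cross-team comparison step concrete, here is a minimal sketch assuming ratings have been mapped to a numeric 1-5 scale. The manager names, ratings, and threshold are all hypothetical; the check flags candidates for discussion and does not adjust anything automatically.

```python
# A minimal sketch of a pre-calibration sanity check. Assumes ratings
# are on a numeric 1-5 scale; all data and the threshold are hypothetical.
from statistics import mean

ratings_by_manager = {
    "manager_a": [3, 4, 3, 3, 4],
    "manager_b": [5, 5, 4, 5],  # possibly lenient
    "manager_c": [2, 3, 2, 2],  # possibly harsh
}

overall = mean(r for rs in ratings_by_manager.values() for r in rs)
THRESHOLD = 0.75  # deviation from the overall mean that warrants discussion

for manager, rs in ratings_by_manager.items():
    delta = mean(rs) - overall
    if abs(delta) > THRESHOLD:
        print(f"{manager}: mean {mean(rs):.2f} vs overall {overall:.2f} "
              f"({delta:+.2f}) -- bring supporting evidence to calibration")
```

A flag is a prompt, not a verdict: one team may genuinely have stronger performers, which is exactly why the calibration discussion centers on evidence rather than on the numbers alone.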

Delivering the Review

  • Schedule a dedicated 45-60 minute meeting. Do not rush this.
  • Share the written review in advance so the engineer can read and process it before the conversation.
  • Start with the overall assessment and key themes, then dive into details.
  • For each area of improvement, provide a specific, actionable suggestion and offer support.
  • End with a forward-looking discussion: "What do you want to focus on in the next review period?"
  • Listen more than you talk. The engineer may have context you are missing.

Common Review Mistakes

  • Recency bias: Over-weighting the last month of the review period. Keep notes throughout the cycle.
  • Leniency bias: Giving everyone positive reviews to avoid difficult conversations. This hurts high performers who deserve differentiation.
  • Halo/horn effect: Letting one strong or weak area color the assessment of all areas.
  • Comparing to yourself: Evaluating engineers based on how you would do the work. Evaluate against the expectations for their level.
  • Ignoring glue work: Undervaluing essential but less visible work like documentation, mentoring, code reviews, and incident response.

Summary

Performance reviews are a powerful tool when done well and a source of demotivation when done poorly. The keys are specificity, fairness, consistency, and a genuine focus on growth. As a Tech Lead, your detailed technical input is essential for accurate assessments. Keep notes throughout the review period, provide real-time feedback, and ensure the formal review contains no surprises.
