Best Practices
Recommendations for running effective People Review campaigns and calibration sessions.
Campaign Design
Timing and Frequency
| Practice | Recommendation |
|---|---|
| Frequency | Annual for comprehensive reviews, twice-yearly for high-potential populations |
| Duration | 4-6 weeks for reviews, 1-2 weeks for calibration |
| Timing | Align with business planning cycles |
| Avoid | Busy periods (year-end close, major launches) |
Scope Decisions
┌─────────────────────────────────────────────────────────────────┐
│ CAMPAIGN SCOPE GUIDELINES │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Start Small, Scale Up │
│ ├── Pilot with one department │
│ ├── Expand to similar functions │
│ └── Roll out organization-wide │
│ │
│ Natural Groupings │
│ ├── Same level employees together │
│ ├── Same function for comparability │
│ └── Manageable size (30-80 per campaign) │
│ │
│ Avoid │
│ ├── Mixing very different job families │
│ ├── Too few (< 15) - not enough data │
│ └── Too many (> 100) - calibration unwieldy │
│ │
└─────────────────────────────────────────────────────────────────┘
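If campaign groups are built from an HRIS export, the size guidelines above are easy to check before launch. A minimal sketch (illustrative Python; the group names, headcounts, and the helper `check_scope` are hypothetical, not part of the product):

```python
# Illustrative sketch: flag proposed campaign groups that fall outside
# the scope guidelines above. Group names and headcounts are hypothetical.
SOFT_MIN, SOFT_MAX = 30, 80     # manageable calibration size
HARD_MIN, HARD_MAX = 15, 100    # beyond these, reconsider the scope

def check_scope(group: str, headcount: int) -> str:
    if headcount < HARD_MIN:
        return f"{group}: {headcount} people - too few for meaningful comparison"
    if headcount > HARD_MAX:
        return f"{group}: {headcount} people - split into smaller campaigns"
    if SOFT_MIN <= headcount <= SOFT_MAX:
        return f"{group}: {headcount} people - within the recommended range"
    return f"{group}: {headcount} people - workable, but review the grouping"

proposed_groups = {"Engineering ICs": 64, "Sales EMEA": 12, "All Operations": 140}
for name, count in proposed_groups.items():
    print(check_scope(name, count))
```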
Evaluation Criteria
Keep It Simple
✓ DO: Enable 2-4 core criteria
• Performance (almost always)
• Potential (for development focus)
• Risk of Loss (if retention is priority)
✗ DON'T: Enable everything
• Creates assessment fatigue
• Dilutes focus
• Extends completion time
Custom Fields Strategy
| Use Case | Field Type | Example |
|---|---|---|
| Qualitative insights | Text | "Development recommendations" |
| Specific competencies | Scale | "Leadership readiness (1-5)" |
| Yes/No decisions | Scale (2-point) | "Ready for promotion" |
Limit: no more than 2-3 custom fields per campaign
Matrix Configuration
Choosing the Right Size
9-Box (Recommended for most)
├── Simple to understand
├── Clear categories
├── Easy calibration discussions
└── Industry standard
16-Box / 25-Box (Use sparingly)
├── Large populations needing differentiation
├── Mature talent management cultures
├── Risk of over-complexity
└── Requires more calibration time
Axis Selection
┌─────────────────────────────────────────────────────────────────┐
│ RECOMMENDED AXIS COMBINATIONS │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Standard Talent Review │
│ └── Performance (Y) vs Potential (X) │
│ │
│ Retention Focus │
│ └── Impact of Loss (Y) vs Risk of Loss (X) │
│ │
│ Development Focus │
│ └── Current Skills (Y) vs Learning Agility (X) │
│ │
│ Leadership Pipeline │
│ └── Results Delivery (Y) vs Leadership Capability (X) │
│ │
└─────────────────────────────────────────────────────────────────┘
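Whichever axes you choose, a matrix placement is conceptually just two axis ratings bucketed into grid bands. The sketch below is illustrative only, assuming 1-5 ratings, a 3x3 (9-box) grid, and arbitrary band cut-offs; it is not the product's placement logic.

```python
# Illustrative sketch: bucket two 1-5 axis ratings into a 3x3 (9-box) grid.
# Axis meanings and band cut-offs are assumptions, not the product's logic.
def to_band(score: float) -> str:
    """Collapse a 1-5 rating into low / medium / high."""
    if score <= 2:
        return "low"
    if score <= 3.5:
        return "medium"
    return "high"

def nine_box_cell(y_score: float, x_score: float) -> tuple[str, str]:
    """Return (Y band, X band), e.g. Performance (Y) vs Potential (X)."""
    return to_band(y_score), to_band(x_score)

print(nine_box_cell(4.5, 4.0))  # ('high', 'high')  - top-right cell
print(nine_box_cell(2.0, 4.5))  # ('low', 'high')   - high potential, lower performance
```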
Review Assignment
Hierarchy Level Selection
| Level | When to Use | Considerations |
|---|---|---|
| N (Direct Manager) | Default choice | Most familiar with employee |
| N+1 (Skip-level) | For senior roles | Broader perspective |
| N+2 | Executive reviews | Strategic view needed |
Handling Exceptions
New Employees (< 3 months)
├── Include with caveat
├── Manager notes limited observation
└── Consider excluding from calibration decisions
Matrix Organizations
├── Identify primary reporting relationship
├── Consider input from secondary managers
└── Document relationship type
Vacant Manager Positions
├── Assign acting manager
├── Document temporary arrangement
└── Follow up after role filled
Calibration Sessions
Session Structure
┌─────────────────────────────────────────────────────────────────┐
│ EFFECTIVE CALIBRATION AGENDA │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Opening (10 min) │
│ ├── Review objectives │
│ ├── Set ground rules │
│ └── Clarify decision authority │
│ │
│ Distribution Overview (15 min) │
│ ├── Review aggregate placement │
│ ├── Compare to expectations │
│ └── Identify focus areas │
│ │
│ Cell Reviews (60-90 min) │
│ ├── Start with extremes (top-right, bottom-left) │
│ ├── Discuss borderline cases │
│ └── Make and document decisions │
│ │
│ Final Review (15 min) │
│ ├── Review final distribution │
│ ├── Confirm all decisions │
│ └── Note action items │
│ │
│ Closing (10 min) │
│ ├── Summarize outcomes │
│ ├── Confirm next steps │
│ └── Thank participants │
│ │
└─────────────────────────────────────────────────────────────────┘
Facilitation Tips
DO ✓
├── Start with clear ground rules
├── Focus on behaviors and results, not personality
├── Ask for specific examples
├── Encourage diverse viewpoints
├── Document all movement decisions
└── Keep to time
DON'T ✗
├── Let one voice dominate
├── Accept vague justifications
├── Skip documentation
├── Rush through important discussions
├── Allow personal attacks
└── Override data without evidence
Handling Disagreements
When Calibrators Disagree:
1. Seek Understanding
"Help me understand your perspective..."
2. Request Evidence
"What specific examples support that view?"
3. Find Common Ground
"Where do we agree about this person?"
4. Make a Decision
"Given the evidence, the placement is..."
5. Document the Discussion
"The rationale for this decision was..."
Data Quality
Rating Distribution Guidelines
┌─────────────────────────────────────────────────────────────────┐
│ HEALTHY DISTRIBUTION (Guideline, not mandate) │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Top performers (Stars, etc.) 10-20% │
│ Solid contributors 50-70% │
│ Needs development 10-25% │
│ Performance concerns 5-10% │
│ │
│ Warning Signs: │
│ ├── > 30% in top cell (rating inflation) │
│ ├── 0% in bottom cells (avoidance) │
│ ├── All scores = 3 (differentiation failure) │
│ └── Perfect correlation across managers (copying) │
│ │
└─────────────────────────────────────────────────────────────────┘
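Where placements are exported for analysis, the warning signs above translate into simple checks. A minimal sketch (illustrative Python; the cell labels, sample data, and 30% threshold mirror the guideline above rather than any fixed rule):

```python
# Illustrative sketch: test a placement export against the warning signs
# above. Cell labels and the 30% threshold follow the guideline, not a rule.
from collections import Counter

placements = ["Star", "Core", "Core", "Core", "Needs development",
              "Star", "Core", "Core", "Concern", "Core"]  # hypothetical export

counts = Counter(placements)
total = len(placements)
top_share = counts.get("Star", 0) / total
bottom_share = counts.get("Concern", 0) / total

warnings = []
if top_share > 0.30:
    warnings.append(f"{top_share:.0%} in top cell - possible rating inflation")
if bottom_share == 0:
    warnings.append("0% in bottom cells - possible avoidance")
if len(counts) == 1:
    warnings.append("everyone in the same cell - no differentiation")

print(warnings or "Distribution looks healthy against the guideline")
```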
Comment Quality Standards
STRONG COMMENTS:
• Specific examples of performance
• Clear development recommendations
• Connected to business outcomes
• Balanced perspective (strengths AND development areas)
Example:
"Successfully led Q3 product launch, exceeding targets by 15%.
Should develop stakeholder management skills for next level."
WEAK COMMENTS:
• Generic ("Good employee")
• No examples
• Only positive OR only negative
• Copy-pasted from previous reviews
Example:
"Meets expectations."
Communication
Timeline Communications
| When | What | Who |
|---|---|---|
| 2 weeks before launch | Campaign overview, expectations | All participants |
| Campaign start | Access instructions, deadline reminder | Reviewers |
| Midpoint | Progress update, completion reminder | Non-completers |
| 5 days before deadline | Final deadline reminder | Non-completers |
| After calibration | Results communication guidance | Managers |
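Where these messages are scheduled by hand, the offsets in the table can be derived from the campaign dates. A small sketch (illustrative Python; the dates are hypothetical):

```python
# Illustrative sketch: derive the communication dates in the table above
# from a campaign's start and deadline. The dates here are hypothetical.
from datetime import date, timedelta

start = date(2025, 3, 3)
deadline = date(2025, 4, 11)

schedule = {
    "Campaign overview (all participants)": start - timedelta(weeks=2),
    "Access instructions (reviewers)": start,
    "Progress update (non-completers)": start + (deadline - start) // 2,
    "Final deadline reminder (non-completers)": deadline - timedelta(days=5),
}
for message, send_on in schedule.items():
    print(f"{send_on:%Y-%m-%d}  {message}")
```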
Messaging Tips
DO:
├── Be clear about purpose and expectations
├── Explain the "why" behind the process
├── Provide support resources
├── Set realistic timeframes
└── Follow up consistently
DON'T:
├── Surprise people with assignments
├── Leave questions unanswered
├── Send too many reminders (3-4 max)
├── Promise outcomes you can't deliver
└── Forget to close the loop
Continuous Improvement
Post-Campaign Review
┌─────────────────────────────────────────────────────────────────┐
│ RETROSPECTIVE QUESTIONS │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Process │
│ ├── Was the timeline appropriate? │
│ ├── Were criteria clear and useful? │
│ └── Was the calibration session effective? │
│ │
│ Participation │
│ ├── What was the completion rate? │
│ ├── What barriers did participants face? │
│ └── Was manager engagement high? │
│ │
│ Quality │
│ ├── Was differentiation meaningful? │
│ ├── Were comments actionable? │
│ └── Did calibration add value? │
│ │
│ Outcomes │
│ ├── Did results lead to action? │
│ ├── Were stakeholders satisfied? │
│ └── What would we change? │
│ │
└─────────────────────────────────────────────────────────────────┘
Navigation
- Previous: Campaign Completion
- Next: Common Mistakes
- Back to: Documentation Index