Leading vs lagging indicators¶
There are two kinds of measurements you can take on any project:
- Lagging indicators tell you what already happened. Revenue last quarter. Tasks closed last month. Customer churn last year.
- Leading indicators tell you what's about to happen. Pipeline coverage. Oldest open task age. Account-health score.
The distinction is from Robert Kaplan and David Norton's Balanced Scorecard (Kaplan & Norton, 1996). Most teams measure lagging indicators because they are easy and obvious. But lagging indicators are by definition too late to act on.
The PMO craft is paying attention to the leading ones — the early-warning signals — so you intervene while it's still cheap.
A simple example¶
You run a small SaaS business. Two metrics you could track:
| Metric | Lagging or leading |
|---|---|
| MRR this month | Lagging |
| Trial signups this week | Leading |
If MRR is down this month, the damage is done. If trial signups are down this week, you have 30 days to react before MRR is hit.
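The lead relationship can be made concrete as a toy forecast. All numbers below (conversion rate, ARPU) are invented for illustration:

```python
# Toy forecast: trial signups lead MRR by ~30 days via a conversion rate.
signups_this_week = 40        # leading indicator, observed now
trial_to_paid_rate = 0.10     # assumed 10% trial-to-paid conversion
arpu = 50                     # assumed average revenue per paying user, $/month

# Expected MRR impact arriving roughly one trial period from now.
projected_mrr_delta = signups_this_week * trial_to_paid_rate * arpu  # 200.0
```

If signups drop 25% this week, the same arithmetic tells you roughly how much MRR is at risk next month, while there is still time to react.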
The same principle applies to project work.
Project leading indicators¶
Things to watch in dooer that predict problems before they show up in delivery metrics:
| Leading indicator | What it predicts | Where to find it in dooer |
|---|---|---|
| Oldest open task age | Scope creep + abandoned work | Manager Reports → filter by project, sort by created_at asc |
| Approvals queue depth | Lead becoming a bottleneck | /approvals count |
| Feedback Register count | Issues accumulating without resolution | Project Feedback tab |
| Effort estimate variance | Estimation discipline rotting | Effort approval logs in audit |
| Notification bell saturation | Team disengaging from the system | Per-user bell counts |
| Project status staleness | Project unloved or stuck | Last status change date per project |
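As one illustration, the first indicator in the table can be computed from a task export. The record shape below is an assumption for the sketch, not dooer's actual schema:

```python
from datetime import datetime

# Hypothetical task records; a real dooer export may use different field names.
tasks = [
    {"id": 101, "status": "open", "created_at": datetime(2024, 5, 1)},
    {"id": 102, "status": "done", "created_at": datetime(2024, 6, 10)},
    {"id": 103, "status": "open", "created_at": datetime(2024, 6, 20)},
]

def oldest_open_task_age(tasks, now):
    """Age in days of the oldest still-open task: a leading signal of
    scope creep or abandoned work."""
    open_created = [t["created_at"] for t in tasks if t["status"] == "open"]
    if not open_created:
        return 0
    return (now - min(open_created)).days

age = oldest_open_task_age(tasks, now=datetime(2024, 7, 1))  # 61 days
```

A rising value week over week is the signal; the absolute number matters less than the trend.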
Project lagging indicators¶
dooer shows these too, but they are after-the-fact:
- Tasks completed per week (throughput).
- Project completion rate per quarter.
- Total effort_hours actually spent.
- Team velocity (story points if used).
These are valuable for reporting up (to a sponsor or executive). They are not useful for operating a project — by the time they move, the cause is weeks behind.
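A lagging measure like tasks-completed-per-week is a simple tally. The completion dates below are made up for illustration:

```python
from collections import Counter
from datetime import date

# Hypothetical completion dates; real data would come from a dooer export.
completed = [date(2024, 7, 1), date(2024, 7, 3), date(2024, 7, 9), date(2024, 7, 10)]

# Tasks completed per ISO week number -- a lagging throughput measure.
throughput = Counter(d.isocalendar()[1] for d in completed)
# e.g. {27: 2, 28: 2}
```

Useful for the sponsor's slide; by the time this dips, the cause is already weeks old.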
How OKRs fit in¶
John Doerr's Measure What Matters (Doerr, 2018) popularised Objectives and Key Results (OKRs), originally developed by Andy Grove at Intel, as a way to combine the two:
- Objective — a qualitative goal (the leading aspiration).
- Key Result — a quantitative measure of progress toward the objective (a leading metric, ideally).
The OKR discipline is choosing 3–5 Key Results per Objective such that, if they move, you believe the Objective is being met. A team-lead OKR for Q3 might look like:
Objective: "Ship the new pricing flow with high confidence."
- KR1: 100% of pricing-flow tasks have full briefs (leading — predicts quality).
- KR2: Average pricing-task age stays under 14 days (leading — predicts on-time delivery).
- KR3: Customer migration completed for 80% of active accounts (lagging — measures outcome).
The mix of leading and lagging KRs is the trick. Pure-lagging OKRs measure the past. Pure-leading OKRs measure activity, not outcome. Both kinds together measure the system.
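Doerr-style grading scores each Key Result between 0.0 and 1.0 and averages them across the Objective. A minimal sketch, with invented numbers:

```python
def score_okr(key_results):
    """Average attainment across KRs, each capped at 1.0 (Doerr-style 0.0-1.0 grading)."""
    scores = [min(kr["current"] / kr["target"], 1.0) for kr in key_results]
    return sum(scores) / len(scores)

# Hypothetical end-of-quarter numbers for the pricing-flow Objective.
krs = [
    {"kr": "pricing tasks with full briefs (%)", "current": 90, "target": 100},
    {"kr": "active accounts migrated (%)",       "current": 60, "target": 80},
]

score = score_okr(krs)  # (0.9 + 0.75) / 2 = 0.825
```

In Doerr's telling, a score around 0.7 on an ambitious OKR is healthy; 1.0 across the board suggests the targets were sandbagged.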
How to instrument leading indicators in dooer¶
- Build the portfolio view (T-3.1). Each project gets one row with its key leading indicators.
- Schedule a weekly check. Open the view. Look for outliers — anything red or trending in the wrong direction.
- Act on the leading signal. Open a project where oldest-task-age is climbing. Find out why. Adjust before the lagging metrics suffer.
This is what a senior PMO operator does every Monday morning. The rest of the week is execution.
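The weekly check reduces to a small outlier scan over the portfolio rows. The row shape and thresholds below are assumptions for the sketch, not dooer defaults; tune the limits to your team's baseline:

```python
# Hypothetical portfolio rows -- one per project, with its leading indicators.
portfolio = [
    {"project": "Pricing flow", "oldest_task_days": 45, "approvals_queued": 2,  "feedback_open": 1},
    {"project": "Mobile app",   "oldest_task_days": 9,  "approvals_queued": 12, "feedback_open": 0},
]

# Illustrative thresholds; a real team would calibrate these over a few weeks.
THRESHOLDS = {"oldest_task_days": 30, "approvals_queued": 8, "feedback_open": 5}

def weekly_outliers(rows):
    """Return (project, indicator, value) for every indicator over its threshold."""
    flags = []
    for row in rows:
        for key, limit in THRESHOLDS.items():
            if row[key] > limit:
                flags.append((row["project"], key, row[key]))
    return flags

flags = weekly_outliers(portfolio)
# e.g. [("Pricing flow", "oldest_task_days", 45), ("Mobile app", "approvals_queued", 12)]
```

Each flag is a Monday-morning conversation, not a report: open the project, find out why, adjust.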
Where to read more¶
- Kaplan, R. & Norton, D. (1996). The Balanced Scorecard: Translating Strategy into Action. Harvard Business Review Press.
- Doerr, J. (2018). Measure What Matters. Portfolio. The book that popularised OKRs.
- Goldratt, E. (1990). Theory of Constraints. North River Press. A different lens on what to measure — the constraint, and nothing else.