
Quality engagement vs vanity metrics
Why real engagement can’t be measured in likes
In an age of dashboards, analytics, and real-time reporting, it has never been easier to measure activity.
And it has never been easier to mistake that activity for engagement.
Likes, impressions, reach, views, shares — these numbers dominate how communication success is reported. They are neat, comparable, and reassuring. They also tell us very little about whether engagement has actually occurred.
Because visibility is not trust.
And attention is not legitimacy.
The rise of vanity metrics
Vanity metrics dominate because they answer easy questions:
- How many people saw this?
- How many reacted?
- How far did the message travel?
These metrics are useful in marketing contexts, where awareness is often the primary objective. But when imported uncritically into community and stakeholder engagement, they become a poor proxy for success.
A project can achieve impressive reach and still leave communities confused, sceptical, or disengaged. A post can perform well and yet fail to surface the concerns that matter most.
In engagement work, what matters is not how many people noticed a message, but how people experienced the process.
What vanity metrics don’t capture
Vanity metrics are silent on the questions that actually determine long-term outcomes:
- Did people understand the issue, not just see it?
- Were concerns raised early enough to influence decisions?
- Did trust improve, decline, or fracture?
- Did feedback materially change thinking, design, or approach?
Perhaps most importantly, vanity metrics cannot distinguish between:
- quiet understanding and quiet disengagement
- informed consent and resigned silence
Silence is often interpreted as acceptance. In reality, it is frequently a warning sign.
What quality engagement looks like
Quality engagement is harder to measure, precisely because it deals with human complexity rather than surface behaviour.
It shows up as:
- informed questions that challenge assumptions
- disagreement that is visible, documented, and taken seriously
- participation that is sustained over time, not just at peak moments
- evidence that feedback influenced decisions, not just messaging
Quality engagement is slower.
It is messier.
And it is often uncomfortable for decision-makers.
But it is also where risks are identified early, trust is built incrementally, and legitimacy is earned rather than assumed.
Why organisations chase the wrong signals
Vanity metrics persist not because they are meaningful, but because they are convenient.
They are:
- fast to collect
- easy to compare
- simple to report upward
Quality engagement, by contrast, surfaces tension. It exposes trade-offs. It challenges preferred narratives. It rarely fits neatly into a slide deck.
Yet those uncomfortable signals, dissent, uncertainty, and conditional support, are precisely what protect projects, policies, and institutions from long-term failure.
Reframing what “success” means
If we are serious about engagement, success needs to be reframed.
The key question is not:
“How many people did we reach?”
But:
“How well did we listen — and what changed as a result?”
Communities do not judge engagement by how visible it was.
They judge it by whether it felt fair, transparent, and consequential.
In the long run, legitimacy is not built through performance metrics.
It is built through listening, follow-through, and trust.
And those are the only metrics that truly last.
