What is the Watermelon Effect?

The Watermelon Effect occurs when IT service metrics report a green status while the end users of those services experience them as red. It can appear in almost every part of an IT landscape, not just the delivery of services. In Experience Management, we use the Watermelon Effect to give context to how people view traditional Service Level Agreements (SLAs) and infrastructure metrics and measures.
SLAs and measures appear to be green, yet the consumer of those services is left unfulfilled and unhappy. That is why Experience Level Agreements (XLAs) are crucial.
The Watermelon Effect in Action
A real-life example of the Watermelon Effect happened to me on my last airplane trip. The airline measures time to check in, time to board the flight, time waiting on baggage, and on-time percentage of flights. While these are useful metrics in general, they fail to capture whether I enjoyed my flight or would fly with that airline again. The airline met these metrics just fine, but my experience was lacking for a few reasons, including rude personnel, inconsistent mask policy enforcement, and nickel-and-dime pricing. The metrics need to be based on the passenger's perspective.
When we think about truly measuring IT services from the consumer's perspective, we must include measures that give context to the SLAs. In the end, it is the consumer's happiness that matters, and it will dictate whether they purchase from you again in the future.
Those "green" metrics are still important and are probably accurate. So, what do we take from this? Frankly, we must be failing to measure something we ought to. Maybe we are missing valuable context in the form of additional metrics.
In a world demanding empathy and a focus on human interaction, experience management is here to stay. It helps us strive toward excellence all the way through.
If your organization wants to have happy employees, happy customers, or simply wants to deliver excellent services, contact us today for an initial conversation.
The "watermelon effect" illustrates a situation where all targets are hit, and service level reports are "green", while users and customers demonstrate "red" levels of satisfaction. Several factors cause this effect:
- SLAs differing from consumer expectations.
- Targets being met by cheating. For example, closing incident records before the target resolution time, then opening new records with the same information and continuing the work in a new timeframe.
- Not accounting for culture, geography, season, age, or other personal factors when investigating users’ experience.
- Not including user experience in SLAs and therefore not monitoring, measuring, or analysing it. The obvious solution is to include user satisfaction targets in the SLA.
This is likely to be a useful amendment to the SLAs. However, it is important to remember that service providers may artificially inflate the satisfaction rates.
More importantly:
- measuring satisfaction does not equate to understanding users' experiences
- measuring satisfaction does not replace measuring service quality
- adding satisfaction targets does not turn an SLA into an XLA.
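The gap between green targets and red experience can be made concrete. The sketch below, with entirely hypothetical incident data and thresholds, checks SLA compliance and average user satisfaction side by side and flags the watermelon condition where the first is green and the second is not:

```python
# Hypothetical incident records: resolution time in hours against a
# 4-hour SLA target, paired with a user satisfaction score (1-5).
# All values and thresholds here are illustrative, not real targets.
incidents = [
    {"resolution_hours": 3.5, "satisfaction": 2},
    {"resolution_hours": 2.0, "satisfaction": 1},
    {"resolution_hours": 3.9, "satisfaction": 2},
    {"resolution_hours": 1.5, "satisfaction": 3},
]

SLA_TARGET_HOURS = 4.0     # assumed resolution target
SLA_GREEN_RATE = 0.95      # assumed compliance rate for a "green" report
SATISFACTION_FLOOR = 3.5   # assumed minimum average score for "green"

# Fraction of incidents resolved within the target time.
sla_compliance = sum(
    i["resolution_hours"] <= SLA_TARGET_HOURS for i in incidents
) / len(incidents)

# Average satisfaction across the same incidents.
avg_satisfaction = sum(i["satisfaction"] for i in incidents) / len(incidents)

sla_green = sla_compliance >= SLA_GREEN_RATE
satisfaction_green = avg_satisfaction >= SATISFACTION_FLOOR

if sla_green and not satisfaction_green:
    print("Watermelon effect: SLA is green, experience is red")
```

With this data every incident beats the four-hour target, so the SLA report is 100% green, yet the average satisfaction score sits well below the floor; neither number alone tells the story.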
