There’s a tension in game development between creative vision and data. Lean too far toward intuition and you risk shipping a game that frustrates players in ways you never anticipated. Lean too far toward data and you optimise the soul out of the experience.

The best approach, as with most things in development, is somewhere in the middle. Analytics should inform creative decisions without dictating them. At Relish Games, we use data as a diagnostic tool — it tells us where problems exist, but the solutions still come from design sensibility.

What to Measure (and What Not To)

The first mistake most developers make with analytics is measuring everything. Every click, every frame, every player action — piped into a database that nobody knows how to interpret. The result is data paralysis rather than data insight.

High-Value Metrics

Session length distribution: How long are players playing in a single session? Short sessions might indicate frustration or natural stopping points. Very long sessions in games designed for shorter play might indicate compulsion mechanics rather than enjoyment.

Level/area completion rates: Where do players stop progressing? A sudden drop in completion rate at a specific level signals a difficulty spike, unclear objective, or technical issue.

Feature discovery rates: Are players finding the mechanics you spent months building? If 80% of players never discover a core ability, the tutorial or signposting has failed.

Return rate: Do players come back after their first session? After their third? Retention curves reveal whether the game has staying power or just a strong first impression.

Low-Value Metrics

Total play time (without context): Meaningless without knowing whether the time was enjoyable or frustrating.

Achievement completion rates: These reflect achievement design quality, not game design quality.

Social sharing rates: More influenced by platform mechanics than game quality.

Setting Up Analytics for Indie Games

You don’t need enterprise-grade analytics infrastructure. A pragmatic indie approach:

Minimal Viable Analytics

  1. Session tracking: Start time, end time, platform, device info
  2. Milestone events: Level starts, level completes, deaths, boss encounters, key item acquisitions
  3. Quit points: Where exactly was the player when they stopped playing?
  4. Error tracking: Crashes, soft-locks, and edge-case failures
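The four categories above fit a single flat event shape. Here is a minimal sketch assuming a JSON-style event format; the field names (`session_id`, `type`, `timestamp`) are illustrative choices, not a standard:

```python
import time
import uuid

def make_event(session_id, event_type, **fields):
    """Build one flat analytics event. event_type might be
    'session_start', 'level_complete', 'death', 'quit', or 'crash'."""
    event = {
        "session_id": session_id,
        "type": event_type,
        "timestamp": time.time(),
    }
    event.update(fields)  # e.g. level="1-1", position=[412, 96]
    return event

# One session's worth of milestone events (hypothetical data)
session = str(uuid.uuid4())
events = [
    make_event(session, "session_start", platform="windows"),
    make_event(session, "level_start", level="1-1"),
    make_event(session, "death", level="1-1", position=[412, 96]),
    make_event(session, "level_complete", level="1-1"),
    make_event(session, "quit", level="1-2"),
]
```

Keeping every event flat and self-describing means the same pipeline handles all four categories, and a quit point is just the last event in a session.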

Implementation Approaches

Local logging during development: Before you set up any cloud analytics, log events to local files during playtesting. This is free, private, and gives you data immediately.
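A file-based logger can be a few lines. This sketch appends one JSON object per line; the path and helper names are made up for illustration:

```python
import json
import os
import tempfile
import time

def log_event(path, event):
    """Append one event as a JSON line. Append mode means a crash
    loses at most the final write, never the whole log."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

def read_events(path):
    """Read the log back for post-session analysis."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

log_path = os.path.join(tempfile.gettempdir(), "playtest_events.jsonl")
open(log_path, "w").close()  # start fresh for this demo
log_event(log_path, {"type": "session_start", "t": time.time()})
log_event(log_path, {"type": "death", "level": "2-1", "t": time.time()})
events = read_events(log_path)
```

The JSON-lines format also migrates cleanly: the same events can later be posted to an endpoint without reshaping.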

Lightweight event systems: A simple HTTP endpoint that receives JSON events is enough. You don’t need a third-party analytics SDK with its own privacy implications.
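On the client side, that can be as simple as batching events into one JSON POST. A sketch using only the standard library; the endpoint URL is a placeholder:

```python
import json
import urllib.request

def build_event_request(endpoint, events):
    """Package a batch of events as a JSON POST. Batch per session or
    per N events, not one request per event."""
    body = json.dumps({"events": events}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_event_request(
    "https://example.com/events",  # placeholder endpoint
    [{"type": "level_complete", "level": "1-1"}],
)
# In the game you would send with urllib.request.urlopen(req) on a
# background thread, and drop the batch silently if the send fails:
# analytics must never interrupt play.
```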

Heatmaps for spatial games: For 2D games where player position matters — platformers, exploration games, action games — recording position data and visualising it as a heatmap reveals movement patterns that verbal feedback never captures.
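The core of a heatmap is just bucketing position samples into grid cells. A sketch with a hypothetical 32-pixel cell size:

```python
from collections import Counter

def bin_positions(positions, cell_size=32):
    """Bucket (x, y) samples into grid cells. The counts are the
    heatmap; render them however you like (console, image, overlay)."""
    counts = Counter()
    for x, y in positions:
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    return counts

# Position samples from a few playtest sessions (hypothetical data)
samples = [(10, 10), (20, 15), (300, 40), (310, 35),
           (305, 44), (308, 38), (12, 18)]
heat = bin_positions(samples)
hottest = max(heat, key=heat.get)  # the most-visited cell
```

Sampling position once or twice a second is plenty; per-frame sampling just inflates the data without changing the picture.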

Interpreting Data Without Losing Your Mind

The Funnel Trap

It’s tempting to treat every metric drop as a problem to fix. But some players are supposed to quit at certain points. A hard optional boss that only 15% of players beat might be working exactly as intended if it creates stories and motivates replays.

Context matters:

  • Is the drop-off at a mandatory progression point? That’s likely a problem.
  • Is it at an optional challenge? That might be the desired difficulty curve.
  • Is it in the first 10 minutes? That’s critical — first impressions determine retention.
  • Is it after 20+ hours? That might just be natural content completion.
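That checklist can be codified into a rough triage function. The thresholds below are illustrative, not universal, and the labels are only a starting point for human judgement:

```python
def classify_dropoff(mandatory, minutes_in):
    """Rough triage for a drop-off point, mirroring the checklist
    above. Tune the thresholds to your own game's length."""
    if minutes_in <= 10:
        return "critical: first-impression problem"
    if mandatory:
        return "likely problem: mandatory progression blocked"
    if minutes_in >= 20 * 60:  # 20+ hours, expressed in minutes
        return "probably fine: natural content completion"
    return "maybe fine: optional challenge doing its job"
```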

Player Segments

Aggregate data hides individual stories. A level with a 60% completion rate might have two completely different player groups:

  • 60% who breeze through on the first attempt and 40% who never get past it (a skill-gap issue)
  • 60% who eventually complete it, split between players who finish in 5 minutes and players who grind for 30 (a pacing issue)

The same metric, two very different problems, two very different solutions.
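A simple way to surface the second pattern is to split completers around a time cutoff and compare the groups. A sketch with hypothetical completion times; a wide gap between group means suggests two populations rather than one smooth curve:

```python
from statistics import median

def split_by_time(completion_minutes, fast_cutoff=None):
    """Split completers into fast/slow groups around the median
    (or a supplied cutoff)."""
    cutoff = fast_cutoff if fast_cutoff is not None else median(completion_minutes)
    fast = [t for t in completion_minutes if t <= cutoff]
    slow = [t for t in completion_minutes if t > cutoff]
    return fast, slow

# Hypothetical completion times for one level, in minutes
times = [4, 5, 5, 6, 5, 28, 30, 27, 31, 29]
fast, slow = split_by_time(times, fast_cutoff=10)
gap = (sum(slow) / len(slow)) - (sum(fast) / len(fast))
```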

A/B Testing for Indies

Full A/B testing requires significant player volume, which most indie games don’t have at launch. But you can still use the principle:

  • Ship different difficulty curves in different demo builds
  • Test two tutorial approaches with separate playtester groups
  • Compare session metrics before and after a patch that changes a mechanic
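The before/after comparison in particular needs almost no tooling. A sketch; with indie-scale sample sizes, treat the delta as a signal to investigate, not statistical proof:

```python
from statistics import mean

def compare_sessions(before, after):
    """Compare mean session length (minutes) across a patch boundary."""
    return {
        "before_mean": mean(before),
        "after_mean": mean(after),
        "delta": mean(after) - mean(before),
    }

# Hypothetical session lengths around a difficulty patch
result = compare_sessions([12, 15, 9, 14], [18, 20, 17, 21])
```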

Practical Examples

Example 1: The Death Map

In a 2D platformer, record every player death location. Overlay those deaths on the level map. Clusters reveal problem areas. A dense cluster at a specific jump might mean the jump is unfair, or the visual cues don’t communicate the timing correctly.

What data tells you: where players die. What design tells you: why they die and what to change.
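Finding those clusters is the same grid-bucketing idea as a heatmap, plus a threshold. A sketch with made-up cell size and cluster cutoff:

```python
from collections import Counter

def death_hotspots(death_positions, cell_size=64, min_deaths=5):
    """Grid-bucket death locations and keep only cells with enough
    deaths to count as a cluster worth investigating."""
    cells = Counter()
    for x, y in death_positions:
        cells[(int(x // cell_size), int(y // cell_size))] += 1
    return {cell: n for cell, n in cells.items() if n >= min_deaths}

# Hypothetical deaths: scattered singles plus a dense cluster at one jump
deaths = [(100, 50), (700, 200)] + [(520 + i, 240) for i in range(6)]
hotspots = death_hotspots(deaths)
```

Each surviving cell maps back to a level coordinate you can walk to in the editor and judge by eye.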

Example 2: The Item Usage Matrix

Track which items or abilities players actually use. If a weapon is used by only 3% of players, it's underpowered, poorly signposted, or sitting in a location nobody finds. The data identifies the neglected item; playtesting reveals which problem applies.
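Computing the usage matrix is a per-item count over players. A sketch with hypothetical item names and a made-up 10% neglect threshold:

```python
def usage_rates(player_loadouts, catalogue):
    """Fraction of players who used each item at least once.
    player_loadouts: one set of used item names per player."""
    total = len(player_loadouts)
    return {
        item: sum(item in used for used in player_loadouts) / total
        for item in catalogue
    }

# Hypothetical data: 4 players, 3 weapons
loadouts = [{"sword", "bow"}, {"sword"}, {"sword", "bow"}, {"sword"}]
rates = usage_rates(loadouts, ["sword", "bow", "flail"])
neglected = [item for item, r in rates.items() if r < 0.1]
```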

Example 3: The Pause-and-Quit Pattern

Players who pause and then quit within 30 seconds are likely frustrated. Players who pause and quit after several minutes may be responding to real-life interruptions. The difference matters for how you interpret quit data.
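If you log pause and quit timestamps, the distinction is one subtraction. A sketch; the 30-second window is the heuristic above, and should be tuned against your own playtest observations:

```python
def classify_quit(pause_time, quit_time, frustration_window=30.0):
    """Label a pause-then-quit by how quickly the quit followed the
    pause (both timestamps in seconds)."""
    elapsed = quit_time - pause_time
    if elapsed <= frustration_window:
        return "likely frustrated"
    return "likely interrupted"
```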

Ethical Considerations

Analytics in games carry ethical responsibilities:

  • Be transparent: Tell players what you collect and why
  • Minimise data: Only collect what you’ll actually use
  • Anonymise by default: You don’t need to identify individual players for most analytics
  • Respect opt-outs: Let players disable analytics without penalty
  • Local processing where possible: Aggregate data locally and send summaries rather than raw event streams
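Local processing can be as simple as reducing a session's raw event stream to counts before anything leaves the machine. A sketch assuming the flat event shape used earlier in this post:

```python
from collections import Counter

def summarise_session(events):
    """Reduce a raw event stream to per-type counts, which is enough
    for funnels and completion rates without per-action detail."""
    return dict(Counter(e["type"] for e in events))

# Hypothetical raw stream for one session
raw = [
    {"type": "level_start", "level": "1-1"},
    {"type": "death", "level": "1-1"},
    {"type": "death", "level": "1-1"},
    {"type": "level_complete", "level": "1-1"},
]
summary = summarise_session(raw)
```

The summary is what gets sent; the raw stream never needs to leave the player's machine.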

Based on our experience:

  1. Start with playtesting, not analytics — watch real humans play your game before building any data pipeline
  2. Instrument the critical path — track the journey through your game’s mandatory content first
  3. Add spatial data for 2D games — position heatmaps are uniquely valuable for level-based games
  4. Review data weekly, not daily — daily checks encourage reactive changes; weekly reviews reveal trends
  5. Never change a design that’s working just because the data is “interesting”

The HGE engine provides the system state hooks and frame-level callbacks that make integrating lightweight analytics straightforward without disrupting your game loop.

The Balance

Data-driven design at its best illuminates blind spots. It shows you the things players experience that you, as the developer, can’t see because you know the game too well. At its worst, it reduces creative decisions to optimisation problems.

Use data to find problems. Use craft to solve them.

Discuss analytics approaches with other developers in our forum, or explore how game systems are structured in the HGE documentation.