Game development is incredibly competitive, and one of the fastest ways studios gain an edge today is through how efficiently they process player data. Traditional playtesting generates huge amounts of qualitative data—primarily player recordings—which often create analysis bottlenecks. Researchers end up manually combing through hours of footage, slowing down development decisions and delaying action.

Human-in-the-Loop (HITL) AI analysis offers a more efficient approach. By combining the speed of artificial intelligence with the contextual understanding of human researchers, teams can remove analysis bottlenecks without sacrificing depth or accuracy. This article explains how PlaytestCloud’s AI works within this HITL system, and how it accelerates player insight while keeping research grounded in real player behaviour.

Speeding Up Player Data Analysis

In our workflow, AI acts as a high-powered research assistant, taking on the repetitive, time-consuming parts of data processing so researchers and developers can focus on interpretation and action.

The AI doesn’t replace human analysis. Instead, it clears the path for humans to do what they do best: understand why players behave the way they do.

AI Annotations: Picking Out and Summarising Key Moments

Our AI reviews player recordings and automatically identifies important moments in the footage. This includes instances of:

  • Frustration: failures, confusion, and unexpected outcomes
  • Success: moments where players achieve goals or progress smoothly
  • Appreciation: expressions of positive sentiment, such as comments on music, design, or aesthetics

Instead of long, unstructured recordings, you get structured, searchable data for every participant. A 30-minute video becomes a concise collection of meaningful moments, making it far faster to review.
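To make the idea of "structured, searchable data" concrete, here is a minimal sketch of what such annotations might look like. The schema and field names are illustrative assumptions, not PlaytestCloud's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One AI-flagged moment in a recording (hypothetical schema)."""
    timestamp_s: float  # position in the recording, in seconds
    category: str       # e.g. "frustration", "success", "appreciation"
    summary: str        # short description of the moment

def moments_of(annotations, category):
    """Filter a session down to one kind of moment instead of scrubbing footage."""
    return [a for a in annotations if a.category == category]

# A 30-minute session reduced to a handful of searchable moments
session = [
    Annotation(42.0, "frustration", "Player fails the tutorial jump twice"),
    Annotation(310.5, "appreciation", "Comments positively on the soundtrack"),
    Annotation(611.0, "success", "Completes level 1 on the first attempt"),
]

frustrations = moments_of(session, "frustration")
```

With data in this shape, "show me every frustration moment across all participants" becomes a one-line query rather than hours of video review.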

Summarising Key Findings From the Playtest Video

Beyond individual annotations, AI can summarise each video’s key insights. This gives you a fast, high-level overview of what stood out in a session, without needing to watch it from start to finish.

How AI Speeds Up Analysis

Annotations

By identifying and summarising significant moments, the AI allows researchers to jump straight to the most insightful sections of each recording. You can review a long session in minutes by focusing only on the automatically compiled highlight reel.

These clips often reveal issues more clearly than text summaries, helping teams understand problems quickly and align around visual evidence.

This is especially valuable in large-scale studies, where time and resource constraints would otherwise force teams to rely more heavily on surveys or quantitative metrics. AI-enabled analysis brings qualitative observation back into the process, at scale.

First Findings

Once all videos are processed, the AI synthesises the common themes across the entire playtest group. These are delivered as First Findings: a preliminary report highlighting the most frequent issues, positive signals, and trends observed across all participants.
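The aggregation step behind First Findings can be sketched as counting how many participants exhibit each theme and surfacing the most frequent ones. The participant data and theme labels below are invented for illustration; the actual synthesis is more sophisticated than a frequency count:

```python
from collections import Counter

def first_findings(per_participant, top_n=3):
    """Rank themes by how many participants exhibited them (illustrative only)."""
    theme_counts = Counter()
    for themes in per_participant.values():
        theme_counts.update(set(themes))  # count each theme once per participant
    return theme_counts.most_common(top_n)

# Hypothetical annotation themes extracted from three participants' videos
participants = {
    "p1": ["frustration:tutorial_jump", "success:level_1"],
    "p2": ["frustration:tutorial_jump", "appreciation:soundtrack"],
    "p3": ["frustration:camera", "frustration:tutorial_jump"],
}

report = first_findings(participants)
```

Here the tutorial jump surfaces immediately as the issue affecting all three participants, which is exactly the kind of prioritised starting point a First Findings report provides.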

This immediate, high-level overview gives your team actionable starting points as soon as your playtest finishes, long before the full human-led analysis is complete. It helps teams prioritise where to focus their attention in the next iteration.

Humans Involved in the Process at Every Stage

While AI speeds things up, the analysis always remains rooted in human expertise.

Expert Oversight

PlaytestCloud’s games user researchers help design the AI workflow to ensure the outputs are useful, accurate, and aligned with the needs of game developers.

Grounded in Real Player Data

All insights come directly from representative players within our pool of 1.5 million users. The AI never hallucinates insights; it summarises real behaviours observed across verified players.

Human Interpretation

Game developers and researchers still extract deeper meaning from the data. They interpret why patterns appear, provide strategic recommendations, and communicate findings to their teams. AI gives them a head start; humans provide the insight.

TL;DR

The combination of AI-powered annotation and summarisation with expert human oversight represents a significant leap forward in qualitative research. This Human-in-the-Loop approach delivers:

  • Fast insights: preliminary, actionable findings immediately after your playtest
  • Accurate context: grounded in human interpretation and real player behaviour
  • Deeper analysis: with AI handling the heavy lifting so researchers can focus on meaning, not mechanics

By streamlining analysis, teams can react to player feedback faster and make more confident decisions, ultimately leading to better, more player-centred games.

Jozef Kulik
December 22, 2025 at 1:04 PM
Senior User Research and Accessibility Professional in the gaming industry with a focus on establishing effective feedback loops alongside people with disabilities to assist studios in making better and more accessible games. I have earned both academic and industry accolades, including the esteemed 'Best Paper Awards' at the Foundations of Digital Games Conference in 2020 and the 'Best Academic Research in Accessibility' by Can I Play That in 2023. I hold a master's degree in Clinical Neuropsychology with Distinction, as well as an undergraduate degree in Psychology from Bournemouth University, where I secured 'Best Undergraduate Research' and a fully funded Master's scholarship, fostering a multidisciplinary approach to my work in accessibility and gaming. As a proud member of both the disability and neurodivergent community, my personal journey fuels my passion for creating inclusive experiences within the gaming world.