How to Use “Bad” Playtests
At PlaytestCloud we will always replace any video you deem unusable – whether because of low audio quality or incorrect player targeting. But have you ever tried to approach these “leftovers” scientifically and use them to further improve your game? With PlaytestCloud you don’t have to feel like you’re reading tea leaves – we’re more than happy to help you better understand every byproduct of your playtest. Below you’ll find a few examples of how developers and researchers turn these “bad apples” into useful lessons.
When players have churned
In every longitudinal playtest there are at least a few players who churn – and there’s no better lesson in retention! Maybe the player said they were bored throughout the game? Maybe a specific moment in the game threw them off? Maybe the gameplay was too simplistic for your target audience? You can learn all of that by watching the videos of churned players.
When players express negative comments
Unfiltered comments are a great way to learn about your game – negative feedback may sting at first, but if you take a step back you will find plenty of valuable insights. Try to pinpoint the source of the player’s frustration: is the gameplay too repetitive or too challenging for this stage of the game? Does the mechanic work the way your target audience expects, or is it counterintuitive? Are some features too hard to find, while another is pushed too aggressively? Trace the gameplay problem to its roots by listening to how players express their annoyance.
When players are incorrectly targeted
If a player who doesn’t fit your target audience finds their way into your playtest, you can often hear words of confusion. The mechanics may be harder for them to navigate, and they won’t reach key points of the game as quickly as your intended players would. There is an advantage to testing with someone outside your core target audience, though: a fresh pair of eyes can reveal UI flaws your regular players would never notice.
When players don’t follow instructions
If you want players to explore a specific part of the game, a task-based playtest is a good way to go. But every now and then a player will skip a task or two, which can also teach you a great deal about the flow of your game. Is this step in the process actually necessary? Is there a way to make it more understandable for players so you wouldn’t need the task in the first place? Maybe the player actually explored this step in another part of the gameplay, where it felt more natural?
When you just want the results to make sense
Many studios have found value in videos of churned players and other “wrong” videos. If you don’t feel ready to analyze such findings on your own, contact us and we will be happy to connect you with one of our external researchers. These specialists will dig through the oddest of findings and provide you with a comprehensive report explaining how to put these results to good use.