A few days ago, Blizzard released an infographic detailing some statistics behind Hearthstone's Arena. Included among them was information on the best Arena players around the world, some of the most popular cards, and various other fun facts. It was well received by the community, but nonetheless, players took the chance to ask the development team for even more communication. Game Designer Dean "Iksar" Ayala, one of the usual developers leading communication efforts, was one of the first to respond.
He began by simply offering clarification on some of the stats:
Would love more feedback here as to what statistics you all think best represent who the top arena players are.
For some added clarity, the top 3 players were sorted by average wins per run with a minimum of 100 runs over that 9 month period. There were players with higher wins per run over 9 months, but most of them had <25 runs. Ranking arena players is a bit tricky because there is no MMR to rank them by. We have wins per run as a statistic, but have to pick a minimum run amount in order to have wins per run represent the best players. (One is too few, 1000 is probably too many) I'll be reading everything here and go over some of the things we tried a little while later after some more feedback rolls through.
By far the most requested feature in the thread was a monthly report similar to the one the Chinese server publishes every month. Merps, one of the biggest English Arena streamers, brings up a few good points, including the competitive nature of Arena. He explains that because there is no scheduled release of infographics like this, nor any kind of leaderboard, players like him are constantly forced to choose between having fun (by playing low-tier classes) and playing to win, in the hope that their skill is eventually proven by something Blizzard may or may not release.
I think a monthly release on the web (similar to constructed) is a reasonable request. I wonder what is the best way to track those players. Here are a couple ideas...
Highest Win Rate over X Runs Monthly. Maybe X is around 30.
Highest SCORE over X Runs Monthly. Where score is determined by a formula we think is a fair representation of the best arena players.
Some examples of SCORE Formula:
A: Wins / (Losses + 12).
The idea behind formula A would be to not have a minimum or maximum amount of runs, but to have a score formula that rewards consistency over many runs at a reasonable rate. The difference between 10 runs and 100 runs with the given formula is huge, but the difference between 200 runs and 300 runs is pretty small. This is the formula we had originally, but ended up swapping it for an easier-to-understand one. The original top player using this formula was Chessdude, who had something like a 9.3 win average over 25ish runs in 9 months.
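A quick sketch of how Formula A behaves, assuming the grouping is Wins / (Losses + 12) (the thread doesn't spell out the parentheses, so that reading is an inference from the behavior described above). The run totals below are hypothetical, using a player who averages 7 wins and 3 losses per run:

```python
def score_a(wins, losses):
    """Formula A (assumed grouping): wins divided by (losses + 12).

    The +12 constant damps small samples: with few runs the denominator
    is dominated by the constant, so the score only approaches the
    player's true win/loss ratio as runs accumulate.
    """
    return wins / (losses + 12)

# Hypothetical player averaging 7 wins and 3 losses per run.
for runs in (10, 100, 200, 300):
    wins, losses = 7 * runs, 3 * runs
    print(f"{runs} runs -> score {score_a(wins, losses):.3f}")
```

Under this reading the jump from 10 runs (about 1.67) to 100 runs (about 2.24) is large, while 200 versus 300 runs differs by only a few hundredths, matching the consistency-rewarding behavior described.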
Formula (sort of) B:
Add your highest win totals with each class together, with bonuses for multiple 12-win runs with one class. Example:
If you have one 12-win run with each class, your monthly score would be 108. So that you are motivated to keep playing after achieving this (really difficult) feat, we could award a bonus (maybe 1-2 Score) for each successive 12-win run with that class. The upside to this score is that it motivates playing with each class; the downside is it rewards playing more runs as opposed to having a high win rate. Of course, in order to achieve a high score here you would have to be an insane player, but the downsides still apply. Anyway, still looking forward to more feedback. Thanks for everything so far!
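Formula B can be sketched as follows. The nine-class roster and the 12-win ceiling are from the example above, but the data layout, function name, and the exact bonus size (here 1 point per extra 12-win run) are illustrative assumptions:

```python
def score_b(runs_by_class, bonus_per_extra_12win=1):
    """Formula B (sketch): sum each class's single best win total,
    plus a small bonus for every additional 12-win run with that class.

    runs_by_class maps a class name to a list of win totals, one per run.
    The bonus size is an assumption; Iksar suggests "maybe 1-2 Score".
    """
    total = 0
    for wins_list in runs_by_class.values():
        if not wins_list:
            continue
        best = max(wins_list)
        total += best
        # Count 12-win runs beyond the first (the first is already
        # counted as the class's best total).
        extra_twelves = sum(1 for w in wins_list if w == 12)
        if best == 12:
            extra_twelves -= 1
        total += bonus_per_extra_12win * max(extra_twelves, 0)
    return total

# One 12-win run with each of the nine classes of the era -> 9 * 12 = 108.
classes = ["Druid", "Hunter", "Mage", "Paladin", "Priest",
           "Rogue", "Shaman", "Warlock", "Warrior"]
print(score_b({c: [12] for c in classes}))  # 108
```

This reproduces the 108 figure from the example, and makes the stated downside visible: a second 12-win Mage run adds a bonus point, so total play volume, not just win rate, moves the score.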
In regard to not releasing detailed statistics on global win rates, Iksar provides pretty solid reasoning, admitting that doing so would color people's perception of the game mode and likely skew the meta as a result.
In general I think it's best for us to not bias the class you choose to pick because of the perception that that class is 1% better on average across all players. We don't matchmake based on MMR in arena, so class power level perception plays a fairly big role in the win rates of each class.
Just for example, consider the following scenario. Player A is an awesome arena player and averages around 7 wins per run. Player A follows the Hearthstone scene and is of the opinion that Mage is the best arena class by far, and picks that class whenever they see it. Player B is a player who is less experienced, and doesn't follow the Hearthstone scene at all. They don't have any idea which are the 'best' and 'worst' arena classes, and just pick whatever class they think is the coolest. Player A picks Mage, and Player B picks another class, say Shaman.... and they match against each other.
At this point, it doesn't really matter what the actual power levels of Mage and Shaman are (to a reasonable degree) because the Mage player is going to be favored due to a more experienced and informed player playing the deck. Whenever you have a system that is not matching players by some form of true skill, this is something to take into consideration. I think when Mage gets to around a 52-53% win rate it will probably be 'balanced', but it will take a while for the perception surrounding Mage and the other classes to change enough that the best players aren't picking that class at a much higher rate than the others.
He continues explaining himself in the same thread saying:
I think the point I'm trying to make is if there is a post saying Mage had the highest win rate, the people that read that post (engaged, informed, researching players) will likely play Mage more often. Because those players play Mage more often, win rates rise, then we post new data, then the same cycle repeats itself in a slightly lower capacity.
And while he admits there's already a certain amount of bias because of the perceived power levels, the development team isn't yet sure whether making that knowledge public, and letting it affect the evolution of the Arena, is something they want to do.
It appears that way yes, I think it's a question of whether or not perpetuating the cycle I describe is a good thing or not. If you played WoW at all, there are some similarities there when it comes to ranking different specs against one another. Simulation sites might say Fire Mage does 100k DPS and Frost Mage does 95k DPS. Other sites like worldoflogs would then display actual performances of Fire vs Frost and would show Fire outdps'ing Frost by a margin significantly higher than 5%. This isn't necessarily because the sims are wrong (they sometimes are) but more a result of the informed and researching population all playing Fire, while the less informed and researched population chooses Frost at a higher rate. This of course isn't the only example of how perception bias affects actual data, but it's the particular example in which I learned about it. To be clear, this sort of thing isn't a situation where we throw our hands up in the air and say "welp, it's not us it's the players!" but just another point we have to consider when trying to get arena to a place where most of the classes have a similar play rate.