The 2020 IF Reviews Competition Results are in!

Edit: Results are in message 10 of this thread

As discussed in this thread:

I am running the Best Reviews competition for the second time (the last time was in 2018).

This is sort of an experiment, so if things go terribly wrong I’ll probably not try it again. But hopefully, it works out well.

There are two categories to vote in:

Best Coverage

This is for reviewers who you feel did a great job on review coverage, meaning they either did many good reviews or helped review lesser-played games.

Scores are assigned from 1 to 10. You do not need to vote for every reviewer. Scores will be computed via a Bayesian averaging method, meaning that several ‘okay’ scores can outweigh a single score of 10.
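To illustrate the idea, here is a minimal sketch of one common Bayesian averaging scheme. The prior mean and prior weight below are hypothetical choices for illustration; the actual formula and parameters used for the competition aren’t stated in this thread.

```python
def bayesian_score(scores, prior_mean=5.0, prior_weight=3):
    """Pull a reviewer's raw average toward a global prior mean.

    With few votes, the result stays near prior_mean; each
    additional vote moves it closer to the raw average, which is
    why several 'okay' scores can beat a single score of 10.
    """
    n = len(scores)
    return (prior_weight * prior_mean + sum(scores)) / (prior_weight + n)

# One perfect 10 vs. four 8s:
one_ten = bayesian_score([10])         # (15 + 10) / 4 = 6.25
four_eights = bayesian_score([8] * 4)  # (15 + 32) / 7 ≈ 6.71
```

Under this scheme, four scores of 8 rank higher than a single 10, even though 10 is the higher individual score.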

Google form for Best Coverage:

Best Individual Reviews

This is for reviewers who wrote great in-depth analyses of games. Even one great review is reason enough to score someone highly in this category. There is no text field for naming the specific reviews you liked, but you can mention them in the comments at the bottom.

Scores are calculated similarly.

Google form for Best Individual Reviews:


I hope this goes well. Feel free to add to the comments any positive things you want me to pass on. Reviewers can vote for each other but not for themselves. If someone would lose because they voted for someone else, I’ll reverse that vote, unless we get a Gift of the Magi situation where two reviewers make each other win, in which case I’ll add a special prize for both.

Good luck!


Excited this is going live, hope the reviewers get some good attention out of it! One quick question – by when on the 2nd must ratings go in? 11:59 Eastern, as with the comp, or is there some other deadline?

That sounds like a great deadline; let’s make it that!

I’ve had 8 votes in one poll and 7 votes in the other so far, which is more than I was expecting. It would be great to have at least 20 people vote in each category.



I’m diving in, and it looks like Walter Sandsquish isn’t included in the voting forms – is that an oversight? He only had four reviews, but others with fewer are listed…

That was an oversight. You can write in a score for him, however, and if I can edit it in I will.


Is there a Miss Congeniality side contest here?


Pretty much the only people voting are other reviewers, so it’s more or less all Miss Congeniality right now.


Haha. I hadn’t thought of that. Makes sense. :slight_smile:


Okay, so there were 13 voters total in the competition. I’ll tabulate the results and put them up later today!


Here are the results! With 10 voters participating, using a Naive Bayesian scoring method, the top 5 scores for Best Coverage are:

Mike Spivey 8.08
Carl Muckenhoupt 8.01
Stian 7.64
Victor Gijsbers 7.611
DeathByTroggles 7.56

and the top 5 scorers for Best Individual Reviews are:

Victor Gijsbers 8.11
Mike Spivey 8.09
Patrick Mooney 7.939
Wade Clark 7.933
Carl Muckenhoupt 7.8

I plan on contacting winners individually for selection from the prize pool. Prizes will be awarded one at a time, offered to Best Coverage’s winner first, then Best Individual Reviews, and back and forth. Prizes can be declined, and any unused prizes will be incinerated.

Congratulations to the winners and thanks to those who participated!


Congrats to the much-deserving winners, and to all the reviewers! I tried to read as many reviews as I could over the past couple of days, and there was some really insightful feedback and legitimately great writing in many of them.

Also since apparently my role in this thread is to be an annoying nitpicker, should Carl’s coverage rating actually be 8.11 rather than 8.011, since Mike S is in second with 8.08?


Sorry, it is probably just a typo, but in this table Mike Spivey has a better score than Carl Muckenhoupt.


Oh no! You’re right. I’m so sorry; I’ll correct that right now.


Sorry to keep being That Guy, but now the top two are both Mike Spivey.

Maybe this is for the best, though, since I realized that in my earlier post I neglected to thank you, Brian, for organizing this. Keeping on top of it all must have been a lot of work on top of everything you did for the Comp itself, but as an author, it’s been so awesome to have a bunch of great reviews, so fostering more of them is deeply appreciated. This has also been a great way to help pass the time between the Comp closing and the results post. So kudos and thanks!


Thank you! I also plan on passing on the additional positive comments from the forms. Thanks for your help!


Congrats @Spike, well deserved! I personally thought your review of my game was written better than my game itself! :smile:


Congratulations @Spike, I’ve enjoyed reading all of your reviews this year!


The window of opportunity for voting was really short, between the time the main competition ended and the results of this side competition were announced. I kind of missed the window.

Looking back, I’m not sure how I would have evaluated the reviews anyway. I’ve read only a few of the other reviewers (mostly those who promote their reviews within this forum) and only a small number of reviews (mostly of games I’d already finished playing).

I am duly impressed with Mike, Victor, and Carl for placing in both categories. However, I remain with the same nagging sense I had at the beginning of this that the goals of “wide coverage” and “deep, thoughtful reviews” are at odds with one another. Reading several of Patrick Mooney’s reviews (which came to my attention only after his work was recognized in this side competition), I think they are amazingly thoughtful and entertaining. Best of show. But understandably, he only published nine of them (by the spreadsheet count). Good reviews take time to write, in some cases more time than was spent playing the actual game.


It all depends on our desirable outcomes. My desirable outcome was that each person who entered IFComp would have several people write reviews about their game. In other words, I guess my biggest hope was to increase the minimum number of reviews per game.

In that sense, the competition failed. In 2019, three games got only 3 reviews, the lowest count that year. This year, 5 or 6 games got only 3 reviews, so the number of ‘low-review’ games actually increased.

In fact, my true goal was never to increase the quality of reviews, only to make the lower-scoring people feel noticed. But in all the discussion leading up to the comp, feedback was that people wanted reviewers evaluated on quality more than quantity.

I don’t know if I’ll run it again in the future, but I would definitely let someone take it up and transform it.


There was a more than 25% increase in games this year, and reviews per game were already on a downward trend. That the number of reviews pretty much kept up this year is surely a success!