The 2020 IF Reviews Competition Results are in!

Pretty much the only people voting are other reviewers, so it’s more or less all Miss Congeniality right now.


Haha. I hadn’t thought of that. Makes sense. :slight_smile:


Okay, so there were 13 voters total in the competition. I’ll tabulate the results and put them up later today!


Here are the results! With 10 voters participating, using a Naive Bayesian scoring method, the top 5 scores for Best Coverage are:

Mike Spivey 8.08
Carl Muckenhoupt 8.01
Stian 7.64
Victor Gijsbers 7.611
DeathByTroggles 7.56

and the top 5 scorers for Best Individual Reviews are:

8.11 Victor Gijsbers
8.09 Mike Spivey
7.939 Patrick Mooney
7.933 Wade Clark
7.8 Carl Muckenhoupt
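(For the curious: the post doesn’t spell out the formula, but one common “Bayesian average” approach works by pulling each reviewer’s raw mean rating toward a prior mean, so reviewers with few ratings aren’t unfairly ranked on tiny samples. A minimal sketch, with made-up prior values — the actual prior mean and weight used for these results aren’t stated:)

```python
def bayesian_average(ratings, prior_mean=5.0, prior_weight=5.0):
    # Blend the prior mean with the observed ratings: a reviewer with
    # few ratings stays close to the prior, while one with many
    # ratings converges toward their raw mean.
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + len(ratings))

# With no ratings at all, the score is just the prior mean:
print(bayesian_average([]))  # 5.0

# Eight consistently high ratings pull the score well above the prior:
print(round(bayesian_average([9, 8, 9, 8, 9, 8, 9, 8]), 2))  # 7.15
```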

I plan on contacting winners individually for selection from the prize pool. Prizes will be awarded one at a time, offered to Best Coverage’s winner first, then Best Individual Reviews, and back and forth. Prizes can be declined, and any unused prizes will be incinerated.

Congratulations to the winners and thanks to those who participated!


Congrats to the much-deserving winners, and to all the reviewers! I tried to read as many as I could over the past couple of days, and there was some really insightful feedback and legit great writing in many of the reviews.

Also since apparently my role in this thread is to be an annoying nitpicker, should Carl’s coverage rating actually be 8.11 rather than 8.011, since Mike S is in second with 8.08?


Sorry, it is probably just a typo, but in this table, Mike Spivey has a better score than Carl Muckenhoupt.


Oh no! You’re right. I’m so sorry, I’ll correct that right now.


Sorry to keep being That Guy, but now the top two are both Mike Spivey.

Maybe this is for the best, though, since I realized that in my earlier post I neglected to thank you, Brian, for organizing this. Keeping on top of it all must have been a bunch of work on top of everything you did for the Comp itself, but as an author, it’s been so awesome to have a bunch of great reviews, so fostering more of them is deeply appreciated. And this has been a great way to help pass the time between the Comp closing and the results post. So kudos and thanks!


Thank you! I also plan on passing on the additional positive comments from the forms. Thanks for your help!


Congrats @Spike, well deserved! I personally thought your review of my game was written better than my game itself! :smile:


Congratulations @Spike, I’ve enjoyed reading all of your reviews this year!


The window of opportunity for voting was really short, between the time the main competition ended and the results of this side competition were announced. I kind of missed the window.

Looking back, I’m not sure how I would have evaluated the reviews anyway. I’ve only read a small number of the other reviewers (mostly those that promote reviews within this forum) and only a small number of reviews (mostly those I’d already finished playing).

I am duly impressed with Mike, Victor and Carl for placing in both categories. However, I remain with the same nagging sense I had at the beginning of this that the goals of “wide coverage” and “deep thoughtful reviews” are at odds with one another. Reading several of Patrick Mooney’s reviews (which came to my attention only after his work was recognized in this side competition) I think they are amazingly thoughtful and entertaining. Best of show. But understandably, he only published nine of them (by the spreadsheet count). Good reviews take time to write, in some cases more time than was spent playing the actual game.


It all depends on our desirable outcomes. My desirable outcome was that each person who entered IFComp would have several people write reviews about their game. In other words, I guess my biggest hope was to increase the minimum number of reviews per game.

In that sense, the competition failed. In 2019, three games got 3 ratings, the lowest of that year. This year, 5 of 6 games got only 3 ratings, so the number of ‘low-review’ games actually increased.

In fact, my true goal was never to increase the quality of reviews, only to make the lower-scoring people feel noticed. But in all the discussion leading up to the comp, feedback was that people wanted reviewers evaluated on quality more than quantity.

I don’t know if I’ll run it again in the future, but I would definitely let someone take it up and transform it.


There was a more than 25% increase in games this year, and reviews per game were already on a downward trend.
That the number of reviews pretty much kept up this year is surely a success!


I definitely think the individual reviewers excelled themselves and did better than past years, and we even drew in some new reviewers who I really admire (like @RadioactiveCrow). So you’re right!


Then I feel like I’ve helped you achieve your goal, having reviewed eight of the games with five or fewer ratings (marked yellow and orange on the spreadsheet).

The distribution of reviews is itself an interesting topic for reflection. Short, well-written games garner far more reviews than longer, even better-written games. (Cursed Pickle of Shireton: only five reviews. Little Girl in Monsterland: three reviews. Where the wind once blew free: five reviews. Compare Congee with seventeen and Amazing Quest with twenty-one!)

On this point, I admired The0didactus’s goal of reviewing only massive games. However, looking now, I’m not sure whether those reviews made it onto the spreadsheet, or how many were completed.

Games playable within the browser get played more often than those that require a download. I would like to play Jim Aikin’s game (Captivity) eventually, but I have avoided it up to now (five reviews).


First of all, many thanks @mathbrush for your compliment and encouragement, and for running this side comp. I also really appreciate you passing along the feedback you received to me; it was helpful and good to hear. I definitely think the side comp did its job of driving engagement in the main comp and bringing attention to writing reviews as a valuable part of the community. I also loved being introduced to reviewers I didn’t know existed, who post their reviews primarily on IntFiction or their own blogs. I really hope the competition returns in one form or another next year.

I will say, though, that when it finally came time to vote, it was a little overwhelming. There were so many reviewers to vote on, most of whom I didn’t know and had never read before, that I felt like I needed to read a ton of reviews before voting to be well informed. This was an exhausting proposition after having spent two months playing and reviewing 40 games. In the end I sampled from some of the reviewers who had written more than 30 reviews, reading their reviews of my favorite games, rating them, and leaving the others blank. I’m glad I did, as I discovered some reviewers I hadn’t read before that I will definitely be going back to in the future. But I probably only got through 5-6 in this manner before just running out of energy for it.

A couple of ideas for changes have popped into my head, so I’ll just throw them out there and see if y’all like any of them.

  1. Maybe the Best Coverage award could just go to whoever wrote the most reviews: simply list the top 5 most prolific reviewers.
  2. Or maybe the Best Coverage prize can be awarded by lottery where you get one entry for each review you write.
  3. Best Reviewer could be done Spring Thing style, where other reviewers and comp authors (together or have separate categories for each) nominate reviewers for the award and the top five reviewers with the most nominations go to the final vote.
  4. Or Best Reviewer could be done like Victor’s Top 50 list, submit up to 10 names of reviewers you like, each mention counts as one vote, most votes wins.
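Idea 4 above amounts to a simple mention count. A minimal sketch of that tally in Python, with made-up ballots and names:

```python
from collections import Counter

# Hypothetical ballots: each judge submits up to 10 reviewer names.
ballots = [
    ["Alice", "Bob", "Carol"],
    ["Bob", "Carol"],
    ["Carol", "Dan"],
]

# Each mention counts as one vote; the most-mentioned reviewer wins.
tally = Counter(name for ballot in ballots for name in ballot)
print(tally.most_common(2))  # [('Carol', 3), ('Bob', 2)]
```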

As a judge who upped the number of reviews I wrote this year from 5 to 15, I’d say part of the reason I did it was due to the momentum you helped create to get more public reviews written. So I think this effort had a positive impact compared to if you hadn’t done it—if not in the minimum number of reviews per game, then at least in the total number of reviews. For the next comp where I don’t have a game in the mix, I’m hoping to target 25 reviews!

Btw, the reason I didn’t vote in this side comp was that I didn’t feel like I could evaluate the large number of reviews for each reviewer (sometimes 50+ each) without spending a lot of time going through them—and if I hypothetically had that time, I could spend it instead reviewing more games. That’s not a knock on the side comp, just how things played out for me. I appreciate the awareness it created, and I hope the comp organizers and folks in the community continue looking for ways to incentivize judging and reviewing.


Congratulations to all participants! This is a great event in which I sadly did not take part this year. I’ll make up for that by judging in IFComp2021 and reviewing the games as I play them.

I have certainly enjoyed the flood of reviews of the new games. Maybe this challenge was helpful in generating so many of them? I’ll have to take a few days off next week to peruse the lot of them.


Voting on the prolific bunch, I just tried to vote on the threads I was still following as of the start of the comp, which by definition became the ones I liked. So people who started reviewing later missed out with me. Patrick Mooney’s reviews especially, I didn’t know about 'til this thread (I know he’s in the detailed camp like me, not the prolific camp).

I have no stats to back up my observations: My impression has been that as the number of games in each comp started to rise more sharply, I felt there was a dip in reviewing for a couple / a few years, but then there was an upswing to match the number of games. If most games get more and longer reviews than they used to, that seems to be pretty obviously good, and that’s where I feel we’re at.

Thanks for running it Brian and thanks for forwarding me the comments.