The IFComp 2020 Best Reviews Competition (Feedback requested in post 38)

I tend to hang out on IFDB more than here, and so I really appreciate @DeathByTroggles’ reviews. He is pretty prolific, but his reviews are not skimpy. He obviously puts a great deal of effort and thought into each one, and he frequently catches something that I missed.

Even though I haven’t read the review threads here as much as I’ve read IFDB, I have read @VictorGijsbers’ reviews. I love his philosopher’s approach and appreciate his robust reviews as well.

5 Likes

Thanks for the kind words! I regret losing steam halfway through the competition and not reviewing nearly as many games as I wanted to. (It was a combination of a break away from the computer, lots of work, lots going on in the family, and, I think, in the end also a sort of growing and paradoxical dread that I wouldn’t be able to review as many games as I wanted to.)

I’ve been trying not to be too hard on myself.

6 Likes

Let’s try to fill the spreadsheet out as much as possible. I’ve been checking up on some of the people already listed and found they had more reviews than were logged, so I could use some help.

The spreadsheet will be how I make the voting form, with only people listed on the spreadsheet eligible for votes.

2 Likes

Same here. I had delusions of getting through them all during the first week of the comp, then wanted to do half. I think I’ll hit 40. My drive definitely fell off after about a month, for many of the same reasons.

4 Likes

I’ve kept mine up to date. Could I be helpful in getting someone else’s review list up to date as well? Or should each reviewer fill out their own part of the spreadsheet?

1 Like

It could help to look over people who don’t frequent the forums. If there’s a name you don’t recognize, you could check the links to see if anything has popped up on that site.

I just finished a scrub of the review spreadsheet. I checked all reviewers who published their reviews on IFDB or blogs, plus I double-checked The Short Game’s entries (in the running for my vote for Best Coverage). I figured those posting reviews here have probably seen the reminders, and I’m not sure how to double-check those who review on Twitch (though they have my thanks for extending IF to that platform).

For those I did look at, I checked that the most recent review they had posted on IFDB or their blog was logged properly; if it wasn’t, I logged it and kept going backwards until I started finding reviews that had already been logged. I ended up adding seven to the spreadsheet (as of this moment, basically all the public reviews posted since my review of Tavern Crawler).

We are at well over 1,000 total reviews (774 already public). That’s amazing! And every single game has a minimum of three public reviews!

7 Likes

Wonderful! Thank you so much. I added at least one Twitch reviewer’s reviews, so I’m glad I could help. That’s a lot of work, and I appreciate it!

Just a few thoughts before the Best Reviews Competition starts:

The point system of this competition appears to be the same as IFComp’s. Does it also apply to how the final result is calculated?

I don’t intend to downvote any reviewer; I just want to mention some possible implications.

I can imagine most people will only give points to the reviewers they like. However, if the final result is based on average scores (like IFComp), then some might feel it is fairer to rate all the reviewers they have read, both good and less good. But most might not feel like downrating any reviewer, even one they feel doesn’t deserve a high rating. This could skew the result a lot.

So perhaps it is better to base the result on the sum of scores instead of the average score? (Perhaps you have already decided this; I just wanted to mention it.)

I think the average score is excellent for game comps, but for reviewers I think the sum of ratings is better, as all ratings then reflect something positive. What do you think?
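
To make the skew concrete, here is a quick sketch with hypothetical numbers (the ballots are made up, not real data): if most voters only rate the reviewers they like, averaging lets a single enthusiastic 10 beat four 9s, while summing rewards being widely read.

```python
# Hypothetical ballots: voters only rate the reviewers they like.
a = [9, 9, 9, 9]  # widely read reviewer, four positive ratings
b = [10]          # reviewer rated once, by one enthusiastic voter

print(sum(a) / len(a), sum(b) / len(b))  # averages: 9.0 vs 10.0 -> b wins
print(sum(a), sum(b))                    # sums:     36  vs 10   -> a wins
```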

1 Like

I’ll chime in to say that @FriendOfFred wrote an excellent review and analysis of The Eidolon’s Escape.

3 Likes

That’s an interesting question. What if some of the reviewers are better-known than others?

Perhaps it would be better to make it like XYZZY voting, where you vote in multiple stages, picking two favorites in the first and one favorite in the second.

That said, I think I should keep it the same way I advertised (though I didn’t realize that I was supposed to make it tomorrow, and was planning on making it today). If it runs again, I might change it.

1 Like

Looking through the spreadsheet now, I discovered a blog of great reviews I hadn’t seen before: The Stack by Carl Muckenhoupt. His reviews are not on IFDB and his blog is not on Planet IF; otherwise I would surely have read them already. Is it customary for anyone to submit blogs for inclusion on Planet IF, or only for the blog’s author to do so?

3 Likes

Also, I can’t see @patrick_mooney’s reviews, which I enjoyed a lot, in the spreadsheet. Is this deliberate, or have I perhaps overlooked something?

2 Likes

I don’t believe it’s deliberate. The spreadsheet is community-run, and the community just seems not to have picked up on his reviews, probably because (I think?) they started later.

So I’ll add him to the voting list.

I think what I’ll do for the votes (to address Denk’s concern) is use Bayesian averaging, like the IFDB Top 100, where you ‘add’ a certain number of average-valued votes to each reviewer’s score at the end. That makes four scores of 9 better than one score of 10, for instance.
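
Here’s a minimal sketch of that kind of smoothing. The prior vote count m and the prior mean are hypothetical placeholders, not the values IFDB actually uses; a common choice is to set the prior mean to the mean rating across all reviewers.

```python
def bayesian_average(scores, m=1, prior_mean=7.0):
    """Mean score after 'adding' m phantom votes at the prior mean.

    m and prior_mean are hypothetical values chosen for illustration.
    """
    return (sum(scores) + m * prior_mean) / (len(scores) + m)

# The prior pulls small samples toward the middle, so four 9s
# now beat a single 10:
print(bayesian_average([9, 9, 9, 9]))  # 8.6
print(bayesian_average([10]))          # 8.5
```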

I plan on making the form now, and who knows how long it’ll take, but if anyone wants to add any other reviewers last-minute, let me know!

2 Likes

Ooh, what if you take the first letter of every review that’s not on the spreadsheet and then that’s an anagram for the metapuzzle?

6 Likes

Thanks, Stian. I missed the initial announcement that there was a spreadsheet this year, but I’ve added them to the sheet now.

3 Likes

Results are now available:

5 Likes

Now that the review comp is over, it’d be useful to get and give feedback.

On my end, the biggest problems were collecting data and reporting it accurately. Collecting data with Google Forms was fairly easy, but accuracy didn’t go as well for me. I omitted some people from the forms by accident (luckily Patrick Mooney was later added and even won a prize). Worse, I calculated the winner incorrectly. Those are things I’d have to work on, perhaps by bringing on an extra person to help check for accuracy.

Getting voters was tough; only 13 people voted in the end, almost all of them other reviewers.

On your end, how did things go? Was the competition beneficial to you? Did it cause any problems? Is it a nice way to recognize helpful people, or a divisive way to split people into haves and have-nots?

And,

Should it run again next year?

1 Like

To me, it looked like there was lots of positive feedback for all the reviewers who posted on the forum, at least, so for the purpose of recognising reviewers’ efforts I’m not sure it added much, for most of us anyway. Though I’m happy for the winners and hope it felt like an extra bit of recognition for them.

What I would appreciate more is constructive criticism: reviews of reviews. It was indeed one of the prizes, but it is something every reviewer could benefit from.

3 Likes

One reason I didn’t participate in voting was that I’ve still only played a fraction of the games, and I’ve stayed away from reading lots of reviews to avoid spoilers. So I didn’t feel I had an informed enough opinion to vote fairly.

5 Likes