Thanks so much everyone for participating in this year’s SGS! As usual, we’re looking for feedback on how things went this year and how we could improve the comp going forward.
Changes this year
We dropped the number of entries allowed per author from 3 to 2. This didn’t result in a bump in votes per game through Itch, but it did finally get the ribbon category votes up! The number of votes received via the Google Form was comparable to the average number of votes per game, which is excellent. I haven’t crunched the numbers yet, but there seems to be better agreement this year between the raw scores and Itch’s calculated scores. We’d like to know if people are happy with this change, or if they’d prefer to go back to the higher number of submissions allowed per author, with the understanding that some games will get less attention.
Volunteering
Having volunteers assist with vetting was a huge help this year! Thanks so much to everyone who participated. Please let us know how your experience was (either here or via DM), and share any suggestions you have for streamlining or improving the volunteer experience, or additional ways you think volunteers could help out.
A complicated situation
I will be blunt here: this is the second year in a row we’ve had people try to cheat in the ribbon awards by voting with sock accounts.
We didn’t disclose this last year because we thought it was a one-off, there was no evidence the author who benefitted was the one who submitted the fraudulent votes, and, frankly, it was done incredibly badly. EJ and I discussed it amongst ourselves and decided that discretion was the best policy: delete the votes, don’t draw attention to the fact that it’s very easy to cheat in the SGS, and move on. So that’s what we did.
Well, it happened again this year. Someone (almost certainly a different someone from last year) submitted very obviously fraudulent duplicate votes for what was almost certainly their own work. We are not publicly disclosing who the culprit was but we will be disqualifying them from participating in the SGS again.
Between this and the likely manipulation of this year’s ParserComp results (also done via Google Forms voting), we are increasingly worried that if anyone halfway smart tries to cheat the system as it stands, they just might get away with it. There are a couple of different things we could do about this, and none of them are perfect, so we’d like to get the community’s opinion on potential paths forward.
Path 1: Keep things the same and continue to administer the ribbon votes through Google Forms. If we go this route, we will make anonymized voting data available to the public and include a disclosure on the Google Form that this will be done.
Path 2: Run the voting for all categories through Itch. On the one hand, it is much more effort to vote fraudulently through Itch. On the other hand, Itch’s category voting is very much not set up for what we want to do and would require significant changes to how the comp is run. We’d have to eliminate some ribbon categories (as we currently have more than Itch supports), all games would compete in all categories, and category scores would affect final rankings. So a puzzle-focused game that wasn’t trying to have a plot would get dinged for scoring low in the Story category, and a kinetic novel would get dinged in the Puzzles category. Best Previously Unawarded Game would have to go away entirely, as there’s no way to selectively apply award categories to particular games through Itch.
Path 3: Is there something else we haven’t thought of? Please let us know.
Thanks for your time! Despite that hiccup plus a couple of others (like not getting the jam indexed until the last minute), we’re overall happy with how this year went. Please let us know what you thought, and if you have any questions/comments/concerns about something we haven’t brought up so far, please don’t be shy about sharing those either.