I see two distinct issues:
- The current NSFW tagging option is underused because it is hidden away in the metadata section
- The current NSFW tagging option, regardless of its visibility, is too vaguely defined, as people have different definitions for what constitutes NSFW content
To me, both of these have separate answers. Let's look at the first one. I went to a game that I believe should be marked as NSFW: You Must be 18 or Older to Enter. Nowhere on its page is a warning flagged for NSFW content, nowhere in any expandable section is this apparent (the developer did include tags for Erotic and Porn, however), and at no point when downloading the game am I warned about its potential content. Only by reading the game's title and description am I made aware that it might not be appropriate for all ages. To me, that is a front-end design and user-experience issue, but that is a bit out of scope for this comment.
I propose we move the current NSFW checkbox from the metadata section to the main page where you edit the title, description, etc. For now it would remain a simple checkbox, just as it is already. Even though the definition of what constitutes NSFW would still be arbitrary, this move alone should lead more developers to select the option, increasing the set of games on itch tagged NSFW.
After some time passes, statistics can be aggregated. How many new games uploaded were tagged NSFW versus before? How many older games that were not previously tagged NSFW received updates and were then tagged NSFW as a result of the change in visibility? To me, this is useful information for understanding how developers currently interact with the tools that already exist. While this thread offers some limited insight and suggestions, I am most interested in understanding why a developer would click the NSFW button. What content did they feel defined NSFW? We can speculate on the obvious ones like nudity, but by probing the developers who do have games marked NSFW, we can better understand their decision-making process. We can similarly survey users (with accounts) who play NSFW games. Naturally, the surveys should be anonymous.
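As a rough sketch of the before/after comparison described above, here is how the tagging rate could be aggregated once the checkbox has moved. Everything here is an assumption for illustration: the `Game` record, its field names, and the `CHANGE_DATE` placeholder are all hypothetical, not itch.io's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical game record; field names are assumptions, not itch.io's schema.
@dataclass
class Game:
    uploaded: date
    nsfw: bool

# Placeholder for the date the checkbox moves out of the metadata section.
CHANGE_DATE = date(2024, 1, 1)

def nsfw_rate(games, start, end):
    """Fraction of games uploaded in [start, end) that are tagged NSFW."""
    window = [g for g in games if start <= g.uploaded < end]
    return sum(g.nsfw for g in window) / len(window) if window else 0.0

# Toy data standing in for real upload records.
games = [
    Game(date(2023, 6, 1), False),
    Game(date(2023, 7, 1), True),
    Game(date(2024, 2, 1), True),
    Game(date(2024, 3, 1), True),
]
before = nsfw_rate(games, date(2023, 1, 1), CHANGE_DATE)
after = nsfw_rate(games, CHANGE_DATE, date(2025, 1, 1))
print(before, after)  # 0.5 1.0 on this toy data
```

A real analysis would also track re-tagging of older games on update, but the same windowed-rate comparison applies.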
In probing for this information, we can then better learn what filters for NSFW should exist. As stated before, nudity is an obvious one, but I believe some not-so-obvious options exist in this space, too. Once an encompassing set of NSFW filters/tags is determined, then the developer dashboard can be updated such that those fields are available.
I would rather have a static set of NSFW filters, ones that developers cannot add to, unlike the freeform and very arbitrary tagging system that exists currently. This is because if I want to filter out certain NSFW games or books, I wish to do so rigorously. If everybody can tag things arbitrarily, then tags can contain typos or use niche terms I may not have heard of, whereas rigid filters like "nudity" or "drug use" are more universal.
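The distinction between a fixed descriptor set and freeform tags can be sketched as strict validation against an enum. The descriptor list here is a hypothetical placeholder; per the paragraph above, the real set would be determined from the developer and user surveys.

```python
from enum import Enum

# Hypothetical fixed descriptor set; the actual list would come from surveys.
class ContentDescriptor(Enum):
    NUDITY = "nudity"
    DRUG_USE = "drug-use"
    GRAPHIC_VIOLENCE = "graphic-violence"

def parse_descriptors(raw_tags):
    """Accept only tags from the fixed set; typos and unknown tags are rejected
    rather than silently creating new freeform tags."""
    valid, rejected = set(), []
    for tag in raw_tags:
        try:
            valid.add(ContentDescriptor(tag.strip().lower()))
        except ValueError:
            rejected.append(tag)
    return valid, rejected

# "nuidty" (a typo) is rejected instead of becoming an unfilterable new tag.
valid, rejected = parse_descriptors(["Nudity", "nuidty", "drug-use"])
```

The point of the design is that a filter for `ContentDescriptor.NUDITY` is guaranteed to match every game that declared nudity, which freeform tagging cannot guarantee.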
I do not think specifying age guidelines like the ESRB's is a good idea unless forced by law. If the law forces this decision, then a much more rigorous process must be taken, akin to the questionnaire Google Play requires all developers to fill out for their games. The resulting ratings differ by country, sometimes in surprising ways, and the entire process is slow and unpleasant to go through. It is already daunting for a developer to update their game and fill out every field that exists, and I would not want to introduce too many changes too quickly, burdening developers with more noise to work through when their end goal is simply to release their game.