A jam submission

Measuring Gender Bias in Text-to-Image Models using Object Detection

A novel strategy and framework for measuring bias in text-to-image models
Submitted by harvey.mannering — 6 hours, 59 seconds before the deadline


Results

Criteria     Rank   Score*   Raw Score
Overall      #13    4.000    4.000
Generality   #26    3.500    3.500
Topic        #35    3.000    3.000
Novelty      #41    2.500    2.500

Ranked from 2 ratings. Score is adjusted from raw score by the median number of ratings per game in the jam.

Judge feedback

Judge feedback is anonymous.

  • This is quite an interesting experiment on T2I bias. Despite the slight relevance to AI governance, it's one of the better technical works from the ideathon. Completely unrelated, the YOLOv3 paper is quite possibly the best paper ever ;-) The great part seems to be its generality due to the automatic generation of images, and there might be a way to combine this with both text and images as well. The best part would be to directly get the "bias score" of the model out, something policymakers would probably be happy to see be very low (or high, depending on the design).
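The "bias score" the judge mentions could plausibly be derived from object-detection counts over images generated from gendered prompt variants. The sketch below is illustrative only, not the submission's actual method: the function name, input format, and normalised-difference scoring are assumptions, and in practice the detections would come from a detector such as YOLOv3 run on the generated images.

```python
from collections import Counter

def bias_score(detections_a, detections_b):
    """Hypothetical per-class bias score from object-detection results.

    detections_a / detections_b are lists of detected class labels,
    pooled over images generated from two prompt variants
    (e.g. "a photo of a man ..." vs "a photo of a woman ...").
    Returns a dict mapping each class to a score in [-1, 1]:
    0 means the object is detected equally often for both prompts,
    +1 / -1 means it appears only for variant A / variant B.
    """
    counts_a = Counter(detections_a)
    counts_b = Counter(detections_b)
    scores = {}
    for cls in set(counts_a) | set(counts_b):
        a, b = counts_a[cls], counts_b[cls]  # Counter returns 0 if absent
        scores[cls] = (a - b) / (a + b)      # normalised count difference
    return scores

# Example with made-up detection labels:
scores = bias_score(["tie", "tie", "laptop"], ["laptop", "handbag"])
print(scores)  # "tie" skews to variant A (1.0), "laptop" is balanced (0.0)
```

A single model-level number, if wanted, could then be an aggregate (e.g. the mean absolute per-class score), which is the kind of headline figure the judge suggests policymakers would want.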

What are the full names of your participants?
Harvey Mannering

Which case is this for?

Custom case

Which jam site are you at?

Online
