A jam submission

Backup Transformer Heads are Robust to Ablation Distribution

An investigation on backup heads in GPT-2 for the Indirect Object Identification task.
Submitted by satojk — 4 hours, 54 minutes before the deadline


Results

Criteria         | Rank | Score* | Raw Score
Interpretability | #1   | 4.222  | 4.222
Reproducibility  | #1   | 4.556  | 4.556
Judge's choice   | #2   | n/a    | n/a
Generality       | #10  | 2.778  | 2.778
ML Safety        | #10  | 2.778  | 2.778
Novelty          | #11  | 2.889  | 2.889

Ranked from 9 ratings. *Score is adjusted from the raw score by the median number of ratings per game in the jam.

Judge feedback

Judge feedback is anonymous.

  • Cool project! The direction that feels most exciting to me is understanding WHY backup (or backup backup!) heads react the way they do - is there a specific direction that matters? What happens if we replace the ablated head with the average of that head across a bunch of inputs of the form A & B ... A ... -> B for diff names? How are backup or backup backup heads different - does attn change? Does it have significant self-attention? The bit I found most exciting about this work is the discovery of backup backup heads - this is: a) Hilarious b) Fascinating and unexpected. (Also, hi Lucas!) -Neel
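The judge's suggested experiment — replacing an ablated head's output with the average of that head's output over many IOI-style prompts — is commonly called mean ablation. A minimal sketch of the idea, assuming per-head activations have already been collected into an array of shape [batch, n_heads, d_head] (the function and shapes here are illustrative, not the project's actual code):

```python
import numpy as np

def mean_ablate_head(acts, head):
    """Replace one attention head's output with its mean over the batch.

    acts: hypothetical per-head outputs of shape [batch, n_heads, d_head],
    collected from many prompts of the form "A & B ... A ... -> B" with
    different names. All other heads are left untouched.
    """
    ablated = acts.copy()
    ablated[:, head, :] = acts[:, head, :].mean(axis=0)
    return ablated

# Example: 8 prompts, 12 heads per layer (as in GPT-2 small), d_head = 64.
rng = np.random.default_rng(0)
acts = rng.normal(size=(8, 12, 64))
out = mean_ablate_head(acts, head=9)
```

Unlike zero ablation, this keeps the head's contribution at a "typical" value rather than deleting it outright, which may provoke a different response from the backup heads.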

Where are you participating from?
Online

What are the names of your team members?
Lucas Sato, Gabe Mukobi, Mishika Govil

What are the email addresses of all your team members?
lucasjks@gmail.com, gmukobi@stanford.edu, mishgov@stanford.edu

What is your team name?
Klein Bottle
