It was fun to be a part of this. We analyze a dataset of student-LLM interactions on programming tasks and ask: what do students get wrong when prompting?
Reposted from Carolyn Anderson:
excited to say that our Substance Beats Style paper was accepted to NAACL! We investigate *why* student-written programming prompts don't work well for LLMs, and find that while students think it's because of gaps in technical vocabulary, it's actually the information content that matters.