The University of Notre Dame has confronted a student-developed AI tool marketed as study support that faculty and administrators described as a potential cheating instrument. A 20-year-old freshman pitched an AI agent that could connect to Canvas, identify knowledge gaps, and generate study roadmaps; within an hour, the university had deleted his pitch email, disabled his account, and opened an investigation.

The project, Kerra, built by Caden Chuang, generates study guides, detailed notes, and assignment drafts while tracking deadlines and sending reminder texts, with paid premium features. Faculty critics argued that even a productivity framing can undermine learning goals. The episode fits a broader pattern of student-built AI systems forcing academic integrity responses, and it is likely to intensify debate over how institutions distinguish assistive study aids from automation that substitutes for coursework.