New warnings from academics and policymakers say synthetic media has matured to the point that ordinary viewers will be routinely fooled. A computer scientist tracking deepfakes reported that voice cloning and video realism crossed an "indistinguishable" threshold in 2025, with an estimated 8 million deepfakes online, and predicted real-time synthetic performers by 2026. New York Assemblymember and former Palantir employee Alex Bores urged adoption of cryptographic provenance standards, such as the Coalition for Content Provenance and Authenticity (C2PA), as a scalable defense, arguing that creators and platforms should attach tamper-evident metadata to images, audio, and video. For campuses, publishers, and research labs, the convergence of near-perfect deepfakes and nascent provenance standards raises immediate operational questions about media verification, academic integrity, and election security.
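The core idea behind tamper-evident provenance metadata can be sketched in a few lines. Real C2PA manifests use X.509 certificate chains and structured manifests embedded in the media file itself; the snippet below is only a simplified illustration of the underlying principle, using an HMAC in place of a public-key signature, with a hypothetical demo key and creator name.

```python
import hashlib
import hmac
import json

# Hypothetical shared key for illustration only; C2PA uses
# public-key signatures backed by certificate chains instead.
SIGNING_KEY = b"demo-key-not-for-production"

def attach_provenance(content: bytes, creator: str) -> dict:
    """Build a tamper-evident manifest: creator metadata plus a
    signature over the content hash and the metadata itself."""
    manifest = {
        "creator": creator,
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(
        SIGNING_KEY, payload, hashlib.sha256
    ).hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Recompute the hash and signature; any edit to the content
    or the claimed metadata invalidates the manifest."""
    claims = {k: v for k, v in manifest.items() if k != "signature"}
    if hashlib.sha256(content).hexdigest() != claims.get("content_sha256"):
        return False
    payload = json.dumps(claims, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

image = b"\x89PNG...original pixels"  # stand-in for real media bytes
manifest = attach_provenance(image, "campus-newsroom")
print(verify_provenance(image, manifest))               # True
print(verify_provenance(image + b"edited", manifest))   # False
```

The point the sketch makes is the one Bores raised: once metadata is cryptographically bound to content, any downstream edit is detectable, shifting the verification burden from "does this look fake?" to "does this carry valid provenance?"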