Policymakers and researchers escalated warnings about synthetic media this week, urging technical fixes and regulatory steps to blunt the deepfake threat on campuses and in public life. New York Assemblymember Alex Bores called for cryptographic provenance standards, analogous to HTTPS certificates, to tag authentic media, while researchers reported that voice cloning and video synthesis have reached near-indistinguishable quality.

Bores pointed to the open C2PA metadata standard as a practical mechanism for recording provenance and editing history, and urged platforms and creators to make provenance metadata the default. Separately, a deepfake researcher warned that advances in temporal consistency and voice cloning mean synthetic media will increasingly fool nonexpert viewers and institutions in 2026.

For university IT, communications teams, academic integrity offices, and media-studies programs, these developments underscore the need for cryptographic provenance adoption, updated verification training, and new pedagogies that teach students and staff how to assess and label media authenticity.
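The provenance model described above can be illustrated with a toy sketch: a manifest that binds a hash of the media bytes to creation metadata, so any later edit is detectable. This is illustrative only, not the actual C2PA format, which serializes manifests as signed JUMBF/CBOR structures backed by X.509 certificate chains; all names below are invented for the example.

```python
import hashlib


def make_manifest(media_bytes: bytes, creator: str, tool: str) -> dict:
    """Build a toy provenance manifest binding an asset hash to its
    creation record. (Simplified stand-in for a C2PA manifest; real
    manifests are cryptographically signed, not just hashed.)"""
    return {
        "asset_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "creator": creator,
        "history": [{"action": "created", "tool": tool}],
    }


def verify_manifest(media_bytes: bytes, manifest: dict) -> bool:
    """Check that the asset still matches the hash recorded at creation."""
    return hashlib.sha256(media_bytes).hexdigest() == manifest["asset_sha256"]


original = b"example-video-bytes"
manifest = make_manifest(original, creator="campus-media", tool="camera-app")

assert verify_manifest(original, manifest)          # untouched asset passes
assert not verify_manifest(b"edited-bytes", manifest)  # any edit breaks the binding
```

The hash gives tamper evidence; what the full C2PA standard adds on top is a digital signature over the manifest, which is what lets a verifier trust *who* made the claim, much as an HTTPS certificate identifies a site operator.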