Elon's "nudification" mess: How X normalized abuse
Author: On with Kara Swisher
Uploaded: 2026-01-22
Views: 6060
Description:
On Christmas Eve, Elon Musk’s X rolled out an in-app image editing tool that lets users alter other people’s photos and post the results directly in replies. With virtually no safeguards, it quickly became a pipeline for sexualized, non-consensual deepfakes — including imagery involving minors — delivered straight into victims’ notifications.
Renée DiResta, Hany Farid, and Casey Newton join Kara Swisher to break down the scale of the harm, why app stores and regulators failed to act quickly, and how “free speech” rhetoric is being used to defend abuse. Kara explores what real accountability could look like — and what comes next as AI image tools grow more powerful and more dangerous.
Guests:
Renée DiResta is the former technical research manager at Stanford’s Internet Observatory, where she researched online child sexual abuse material for years. She is one of the world’s leading experts on online disinformation and propaganda and the author of Invisible Rulers: The People Who Turn Lies into Reality.
Hany Farid is a professor of computer science and engineering at the University of California, Berkeley. Often described as the father of digital image forensics, he has spent decades developing tools to detect and combat CSAM.
Casey Newton is the founder of the tech newsletter Platformer and co-host of The New York Times podcast Hard Fork.
When reached for comment, a spokesperson for X referred us to a statement posted on X, which reads in part:
We remain committed to making X a safe platform for everyone and continue to have zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content. We take action to remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity, taking appropriate action against accounts that violate our X Rules. We also report accounts seeking Child Sexual Exploitation materials to law enforcement authorities as necessary.
00:00 Intro
02:38 How Grok normalized sexualized deepfakes
07:06 Why CSAM is not a “free speech” issue
08:56 Elon Musk's failed X guardrails
10:27 The misuse of “censorship” as a defense
17:10 Why it's so hard for victims to fight back
25:19 Harassment as a tool to silence women
27:23 The hypocrisy of “free speech” absolutism
37:05 What real AI safety would require
41:47 What worries experts most about what’s next with AI
Subscribe: https://goo.gl/FRleYo
Apple Podcasts: https://podcasts.apple.com/us/podcast...
Spotify: https://open.spotify.com/show/42ntT7X...
FOLLOW US
Instagram: https://www.instagram.com/onwithkaras...
TikTok: https://www.tiktok.com/@onwithkaraswi...
LinkedIn: / kara-swisher-b7213
Threads: https://www.threads.com/@onwithkarasw...
Bluesky: https://bsky.app/profile/onwithkarasw...