Taylor Swift deepfake pornography sparks renewed calls for US legislation
Fake but convincing explicit images of the pop singer were viewed tens of millions of times on X and Telegram, prompting outcry from US politicians
The rapid online spread of “deepfake” pornographic images of Taylor Swift has renewed calls, including from US politicians, to criminalise the practice, in which artificial intelligence is used to synthesise fake but convincing explicit imagery.

The technology is overwhelmingly targeted at women, and in a sexually exploitative way: a 2019 study by DeepTrace Labs, cited in the proposed US legislation, found that 96% of deepfake video content was non-consensual pornographic material.

The UK government made non-consensual deepfake pornography illegal in December 2022, in an amendment to the Online Safety Bill that also outlawed any explicit imagery taken without someone’s consent, including so-called “downblouse” photos.