The banalization of deepfake: how 'fake porn' involving famous people became content on adult sites
The rise of deepfake technology has normalized non-consensual pornography, making it increasingly accessible and allowing it to be marketed as a consumer convenience.
The rise of deepfake technology in non-consensual pornography is alarming: artificial intelligence has turned this illicit content into something readily available and disturbingly normalized. By 2026, it is feared, the conversation will center not only on the technical sophistication of these tools but also on the ease with which platforms and applications have begun to market this abuse as a product, reflecting a grave societal problem.
Services that let users manipulate images to create fake pornographic content have proliferated, promoted through advertisements on adult websites. A survey found that, as of January, more than 50 applications dedicated to this purpose were available for download on Google Play, alongside 47 similar apps on the App Store in the United States. The open commercialization of these applications turns non-consensual intimate imagery from a fringe activity into a mainstream consumer product.
The implications reach everyone from celebrities to ordinary individuals, as the same logic of synthetic, non-consensual intimate imagery is applied across demographics. Its normalization raises serious ethical questions about consent and personal privacy in the digital age, demanding urgent attention and action from policymakers and society at large.