South Korean stars are being digitally stripped. A new law aims to protect them from artificial intelligence
South Korean celebrities face a rising tide of fake nude images created with artificial intelligence, prompting a new law intended to protect them.
South Korean idols are contending with a growing problem of AI-generated fake nude images, which reportedly affect them more than any other group worldwide. According to IT security firm Security Hero, South Korean singers and actresses accounted for over half of all deepfake images created globally in 2023. The prevalence of these maliciously created images has sparked significant public concern over privacy and personal dignity, especially in a society that places a high value on public image.
In response, South Korea has introduced a groundbreaking 'constitution for artificial intelligence' designed to regulate the use of AI technologies and shield individuals from such abuses. Yet despite the new legislation, some citizens worry that the government is treating AI primarily as a growth engine rather than adequately addressing its risks and ethical ramifications.
Experts in South Korea take a nuanced view of the legislation, stressing the need to balance technological innovation against individual rights. They call for a more comprehensive approach to AI that accounts for the harms and abuses it can enable, particularly in the entertainment industry, where celebrities are increasingly vulnerable to technological exploitation. The ongoing debate underscores the challenges that arise when rapidly advancing technology collides with privacy rights and personal well-being in the digital age.