South Korean pop stars are being stripped bare by artificial intelligence. New law aims to protect them
South Korean idols are facing a surge in fake nude photos created with artificial intelligence, prompting the government to introduce a new law to protect them.
South Korean idols are increasingly falling victim to a surge of AI-generated fake nude photos. Cybersecurity firm Security Hero estimates that more than half of all deepfake images created worldwide in 2023 targeted South Korean singers and actresses, an alarming trend that has raised serious privacy and security concerns for these prominent figures in the entertainment industry.
In response, the South Korean government is proposing a new 'Constitution for Artificial Intelligence' aimed at tackling the challenges posed by AI technology, including the proliferation of deepfake content. The legislation seeks to establish frameworks and protections for people harmed by the misuse of AI, particularly in cases of digital harassment and identity theft. Even so, the law as implemented may not fully address public concerns.
Despite these efforts, many citizens perceive the government as focusing too narrowly on artificial intelligence as an engine of economic growth, while giving too little weight to the ethical implications and risks of AI technologies such as deepfakes. One expert notes a growing need for broader public discourse on AI's impact, beyond its benefits alone, to ensure comprehensive protection for individuals against such technological threats.