Between 2020 and June 2024, the Digital Sexual Crime Victim Support Center received 938,651 requests to remove illegal content, such as deepfakes and sexually abusive images. Of these, 269,917 cases (28.8%) could not be deleted, and the backlog is growing: the number of unresolved cases increased by 79.7% over two years, with 75,000 cases left unaddressed in 2023 alone. A shortage of response personnel is a major cause, and there are growing calls for stronger regulations requiring companies to comply with removal requests.
The rise in deepfake creation, driven by advances in the underlying technology, has led to a significant increase in such crimes. For instance, content created by K-pop fans, such as "fancam" videos, has been exploited for deepfake manipulation, prompting agencies like JYP Entertainment and YG Entertainment to take legal action after their artists were victimized.
Ordinary individuals have also been victimized, as in the case of a teacher whose photos were manipulated and circulated in online chat groups. Despite the growing severity of these incidents, current measures lack enforcement power and remain inadequate. Experts and lawmakers are therefore calling for stricter regulations, including fines and service suspensions for platforms that fail to remove illegal content.
Source: Daum