Google lists sites that create fake intimate images of women and children, says study
A study by FGV in Rio reveals that Google searches index sites and apps that allow for the non-consensual manipulation of images of women and children, raising concerns about online gender violence and child abuse.
A recent study by the Getulio Vargas Foundation (FGV) in Rio de Janeiro has found that Google's search engine indexes websites and applications that enable the creation of non-consensual nude images of women and children. These platforms use generative artificial intelligence and deepfake techniques to produce sexualized fake images, allowing users to manipulate photographs to simulate nudity or sexual acts. The findings also indicate that Google's autocomplete feature steers users toward these harmful sites, extending their reach and impact.
The implications are alarming: the study highlights the significant risk posed by the indexing of such sites. Yasmin Curzi, the professor who led the research, notes that this form of content amplification exponentially increases the potential for online gender violence and child abuse at scale. The study calls for immediate intervention by Google, drawing parallels to the company's prior efforts to curb the distribution of child sexual abuse material and terrorism-related content.
These findings raise pressing questions about the responsibility of tech companies to prevent the dissemination of harmful content. As the debate over online safety and digital ethics continues to evolve, stringent controls and regulation become imperative to protect vulnerable individuals from exploitation and abuse.