The lawsuit claims that Snapchat’s design and policies are “inherently flawed” and contribute to the spread of child sexual abuse material (CSAM). It argues that features such as disappearing messages encourage users to share potentially harmful content, particularly in combination with the platform’s lack of robust content moderation, and that the company has failed to adequately address child sexual exploitation. The suit, a significant development in the ongoing battle against CSAM online, was brought by the New Mexico attorney general’s office, which has been actively investigating and prosecuting child sexual exploitation cases in the state.
The issue was brought to light in part by the platform’s own employees. Internal documents obtained by The Guardian revealed that Snap employees had raised concerns about the platform’s security for years, and that the company ignored those concerns, leaving children at risk. The Guardian’s investigation found that Snap’s ephemeral messaging system, which lets users send photos and videos that disappear after a set period of time, has been exploited by predators.
The investigation further alleged that Snapchat was aware of the abuse and actively engaged in a cover-up, uncovering a pattern of failures to report CSAM to the National Center for Missing and Exploited Children (NCMEC). These failures were attributed to a combination of factors: a lack of resources, inadequate training, and a reluctance to take action.