May 25, 2022

Last year, a grieving mother sued Snap, the company behind Snapchat, claiming the platform wasn't doing enough to protect young users after her teenage son died by suicide following online bullying. Another lawsuit involving yet another suicide followed last month. In response to the first, Snap suspended the anonymous messaging apps that had enabled the bullying and promised a policy review to determine what kinds of apps should be able to use its developer tools. Today, the company released the results of that review and the changes it is making.

Effective immediately for new developers building on the Snap Kit platform, Snap will no longer allow anonymous messaging apps and will require anyone building friend-finding apps to limit those apps to users aged 18 and older. Existing developers have 30 days to comply with the new policy.

These changes apply only to third-party apps that integrate with Snapchat; they don't address broader safety concerns on Snap's own platform.

Snap says the policy update will affect only a small portion of its community of more than 1,500 developers. The ban on anonymous messaging apps will affect around 2% of them, with another 3% affected by the new requirement to age-gate their apps. The company also noted that developers who remove anonymous messaging from their apps can resubmit them for review and remain Snap Kit partners.

One app that will have to change in order to keep working with Snapchat is Sendit, which benefited greatly from Snap's earlier suspension of the anonymous messaging apps YOLO and LMK. In the months following that suspension, Sendit gained millions of downloads from teens who still wanted to ask anonymous Q&As.

The appeal of anonymous social apps is undeniable, especially for young people. But over the years, such apps have proven time and time again that they cannot be used responsibly and can have devastating consequences. From the early days of MySpace, to teen suicides linked to Ask.fm, to well-funded anonymous apps like Secret and Yik Yak (neither of which survived), anonymity in the hands of young people has been tested repeatedly and has consistently failed. Given that history, and given Snapchat's core demographic of teens and young adults, it would be irresponsible to allow this kind of activity on Snapchat.

In addition to banning anonymous messaging, Snap will now restrict friend-finding apps to adult users aged 18 and older.

Friend-finding apps are designed to connect users with strangers on Snapchat. They encourage people to share personal information and are a common way for child predators to reach younger, more vulnerable Snapchat users. The apps are often used for dating or sexting rather than "friend-finding," and can be filled with porn bots. For years, law enforcement and child safety experts have warned about child predators on Snapchat and on friend-finding apps, which have been described as "Tinder for teens."

The problems with these apps persist today. A report published last month by The Times, for example, detailed widespread sexual harassment and racism on one such app, Yubo.
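For third-party developers, complying with the friend-finding restriction essentially means putting an age gate in front of the feature. The sketch below is purely illustrative: Snap hasn't published implementation details alongside this policy, so the `UserProfile` shape, the `canUseFriendFinding` helper and the idea of a verified birth date are assumptions for the example, not part of the Snap Kit API.

```typescript
// Illustrative only: this is NOT the Snap Kit API. We assume the app
// already holds a trusted birth date for the user (e.g. from verified
// account data); how that date is obtained is outside this sketch.

const MINIMUM_AGE = 18;

interface UserProfile {
  id: string;
  birthDate: Date; // assumed to come from a verified source
}

// Compute a user's age in whole years, accounting for whether their
// birthday has already occurred in the current year.
function ageInYears(birthDate: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() &&
      now.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

// Gate the friend-finding feature entirely for users under 18,
// rather than offering a degraded version of it.
function canUseFriendFinding(user: UserProfile): boolean {
  return ageInYears(user.birthDate) >= MINIMUM_AGE;
}

// Example usage with a hypothetical minor's profile.
const user: UserProfile = { id: "user-123", birthDate: new Date(2008, 5, 14) };
if (!canUseFriendFinding(user)) {
  console.log("Friend finding is unavailable for users under 18.");
}
```

The notable design choice here is that the feature is denied outright to under-18 users, mirroring the policy's requirement, rather than being offered in a watered-down form.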

The restrictions on anonymous messaging and on friend-finding apps are the two major policy changes Snap is making today, but the company notes that developer apps must still go through a review process in which Snap learns more about each app, asks questions about its use cases and evaluates its intended integration. Snap also said it will re-review apps every six months to ensure their functionality hasn't changed in a way that violates its policies. Any developer who intentionally tries to deceive Snap will be removed from Snap Kit and the developer platform as a whole.

"As a platform that works with a wide range of developers, we are committed to fostering an ecosystem that helps apps protect the safety, privacy and well-being of users, while opening up opportunities for product innovation for developers and helping them grow their businesses," the company said of the policy update. "We believe we can do both, and we will continue to regularly review our policies, monitor app compliance, and work with developers to better protect the well-being of our community."

Snapchat platform safety still needs work

While these changes will impact third-party apps that integrate with Snapchat, the company has yet to address broader child safety concerns on its own platform, such as offering an age-appropriate experience or delivering the parental controls it has promised. Rather, today's changes represent a first step in a direction that will likely mean more work for the company on child safety.

Platform safety is already a top priority across the social media industry due to increasing regulatory pressure. In Snap's case, the company was called before Congress last fall to answer lawmakers' questions about a range of safety issues affecting the minors and young adults who use its app, including the prevalence of eating disorder content and the fact that its age gate doesn't actually block underage users.

In January, Snap was also sued by another family who lost their child to suicide after she was pressured into sending sexually explicit photos that were later leaked to her classmates. The complaint alleges that Snapchat's lack of age verification and its disappearing messages contributed to her death. The lawsuit also mentions the role played by anonymous messaging, though it does not explicitly name the third-party anonymous apps that were used.

The same month, Snap also made changes to its friend recommendation feature to make it harder for drug dealers to connect with teens on the app. That issue was the subject of an NBC News investigation, which linked Snapchat to the sale of fentanyl-laced pills to teens and young adults in more than a dozen states.

The company has also previously faced lawsuits over its "speed filter," which let users capture how fast they were moving. The filter was implicated in numerous car crashes, injuries and even deaths over the years, but wasn't disabled at driving speeds until 2021. (Snap declined to comment on the matter, as the lawsuit is still pending.)

With lawmakers finally trying to rein in Big Tech after its wild west days, when growth and engagement were invariably prioritized over user safety, Snap is preparing for change. In September, the company hired Jacqueline Beauchere as its first head of platform safety.

Snap CEO Evan Spiegel also said in October that the company is developing parental control tools. Following the launch of parental controls on TikTok and, this week, on Instagram, Snap's tools will let parents see who their teens are talking to on the app.

Snap hasn't said whether the tools will address other parental concerns, such as the ability to disable a child's access to sending or receiving disappearing messages, to restrict or require approval of friend requests, to limit a child's ability to share photos and other media, or to hide the adult-oriented (and often clickbait) content that features prominently in the app's Discover section.

"We want to provide ways for parents and teens to work together to stay safe and well online, with parents equipped to support their kids in real life," a Snap spokesperson said of the parental controls. "We hope these new tools will serve as a conversation starter between parents and their teens about online safety."

The company said the first set of parental controls is expected to launch this year. The developer policy changes are in effect now.

If you or someone you know is struggling with depression or has had thoughts of self-harm or suicide, the National Suicide Prevention Lifeline (1-800-273-8255) provides free, confidential support 24 hours a day, 7 days a week, along with best practices for professionals and resources to aid in prevention and crisis situations.
