Apple is all in on CSAM detection, but report suggests the App Store is not a child-safe place

According to a new report, Apple’s App Store allows children to access adult-only applications, and its child safety policies are not effective enough. The report claims that the current age restrictions and other child safety controls don’t work as effectively as many thought they would.

The Tech Transparency Project posted the report:

The investigation reveals major holes in the App Store’s child safety measures, showing how easy it is for young teens to access adult apps that offer dating, random chats, casual sex, and gambling, even when Apple knows the user is a minor.

The results undermine Apple’s promise that its App Store is a “safe place for kids” and that it rejects apps that are “over the line — especially when it puts children at risk.”

The group set up an Apple ID registered to a 14-year-old, presumably without parental controls. They tested nearly 80 applications that were restricted to people 17 and older and found that “the underage user could easily evade age restrictions in the vast majority of the cases.” The report said that some of the applications showed a pop-up asking the user to confirm their age, but if a child simply clicked “OK,” they could access everything without any limitations, despite the Apple ID being associated with a 14-year-old user.

“When the underage test account tried to download these adult apps, Apple served a pop-up message asking the user to confirm they were 17 or older. But if our 14-year-old user clicked “yes” to say they were 17+, Apple did nothing to prevent the download, despite knowing, by virtue of the Apple ID, that the user wasn’t old enough. That puts the onus on the apps themselves to prevent access by minors—and TTP found this system is far from reliable.”

The report also pointed out some “flaws and inconsistencies” in child safety:

“For example, 37 of the adult apps allowed registration with an Apple ID, and in each case, our 14-year-old user was able to use that method to sign up, even though Apple knew the ID was underage. If the apps asked for the user’s age, the minor would simply enter 18—again, with no intervention by Apple. The apps included HOO — Adult Hook Up & Friend Finder, Hahanono – Chat & Get Naughty, and Tinder.”

TTP also mentioned that the findings “strongly suggest” that Apple doesn’t share user age data with the apps in the App Store. The report also mentioned that some apps have 17+ ratings, but their functionality didn’t reflect it:

“Another interesting case is the chat app Yubo, which has been dubbed “Tinder for Teens.” It’s restricted to users 17 and up in the App Store, but the app itself allows users as young as 13 to register, and even says so in its terms of service.”

The report also claimed that some apps tested “appeared to be designed to minimize the possibility of learning if a user was underage.” This was seen in apps like Grindr, which told the 14-year-old to “come back later” when they entered their “true” date of birth. It’s clear that some of Apple’s child safety policies are not being applied equally to all applications, and it remains to be seen what changes Apple will make to address these issues.

Roland Udvarlaki

Roland is a technology enthusiast and software engineer based in the United Kingdom. He is also a content creator and writer, and is best known under the name “Techusiast”.

