
TikTok Bans Private Messaging For Users Under 16

TikTok will no longer allow private messaging for users under the age of 16. This is the first time a social media platform has blocked private messaging. The app, which is popular among teenagers, had 13 percent of its users aged between 12 and 15 in 2019, according to a survey by UK regulator Ofcom.

Until now, all users have been able to send and receive messages as long as their accounts followed each other. Under the new policy, under-16s will not be able to send or receive direct messages under any circumstances, though they will still be able to post publicly in the comments section of videos.

The age limit is based on the date of birth users enter when signing up; TikTok has no way to verify that this information is true, so approval rests purely on trust. Users already affected will receive in-app notifications soon and will lose all access to direct messages on April 30.

“The interesting thing here is that TikTok’s biggest group of users are teenagers. This restriction will impact a large number of their demographic. Also, blocking use of a core feature such as messaging between its biggest sub-set of users is a bold move,” said Matt Navarra, a social media consultant.

“Depending on how cynical you are, you could view this as TikTok following the same strategy as Facebook and others, whereby they launch new ‘digital wellbeing’ or safety features in advance of any potential regulatory hearings or investigations. It gives these platforms something to fight back with. It’s possible TikTok has observed some concerning incidents or activity on the platform and is now trying to get ahead of the issue with this new restriction,” he added.

Andy Burrows, head of child safety online policy at the National Society for the Prevention of Cruelty to Children (NSPCC), said: “This is a bold move by TikTok as we know that groomers use direct messaging to cast the net widely and contact large numbers of children. Offenders are taking advantage of the current climate to target children spending more time online. But this shows proactive steps can be taken to make sites safer and frustrate groomers from being able to exploit unsafe design choices. It’s time tech firms did more to identify which of their users are children and make sure they are given the safest accounts by default.”

John Carr, secretary of the British Children’s Charities’ Coalition on Internet Safety, said research from the years when Facebook was the dominant app among children showed that in some countries about 80 percent of children above eight years old had a Facebook account, with about two-thirds of that number in the UK.

“It’s good that TikTok are showing an awareness of these issues but without having any meaningful way of checking children’s ages it’s a lot less than it appears to be. No one has researched specifically for TikTok but all the evidence that we have shows there are gigantic numbers of under-age children on the site,” he said.

Social media platforms have in recent years taken steps to regulate use among young, vulnerable people. In 2018, Facebook introduced rules making WhatsApp available only to over-16s across the EU.
