The video-sharing app TikTok is under investigation in the UK for how it handles the personal data of its young users, and whether it prioritises the safety of children on its social network.

Elizabeth Denham, the information commissioner, told a parliamentary committee the investigation began in February, prompted by a multi-million dollar fine from the US Federal Trade Commission (FTC) for similar violations.

“We are looking at the transparency tools for children,” Denham said on Tuesday. “We’re looking at the messaging system, which is completely open, we’re looking at the kind of videos that are collected and shared by children online. We do have an active investigation into TikTok right now, so watch this space.”

As well as general concerns about how private data was collected, the commissioner said there were concerns about how the open messaging system allowed any adult to message any child.

She said the company was potentially violating the general data protection regulation (GDPR) which “requires the company to provide different services and different protections for children”.

Children and tech

Laws governing children’s relationship with technology vary worldwide, and are rarely enforced. The de facto age for many online services is 13, set by the US Children’s Online Privacy Protection Act in 1998, which prevents sites from targeting children, or knowingly allowing children to provide information online without parental consent. The burden of garnering that consent and the low returns for building services for children has meant, however, that providers tended to turn a blind eye to under-13s on their sites, neither catering for them nor policing their presence.

That said, tech aimed more explicitly at children has blossomed recently, and legislation that aims to protect children from potential harm has been passed. Schoolchildren in France are barred by law from using their phones in school.

Such laws are countered by efforts on the part of companies such as Facebook and Google to attract new users while young. Facebook offers Messenger Kids, which lets children speak to contacts vetted by their parents, while Google’s YouTube has a Kids app that offers copious parental controls and the ability to filter videos for all but the most child-safe content – although the filters, which are run by an algorithm, haven’t always been successful, prompting the company to announce a human-curated version.

In February, Bytedance, the Chinese firm that owns TikTok, was fined a record $5.7m (£4.2m) for illegally collecting personal information from children under 13. The FTC said the company had previously been aware that “a significant percentage of users were younger than 13”, the age at which US laws mandate strict data protections, “and received thousands of complaints from parents that their children under 13 had created Musical.ly accounts”. Musical.ly was the previous name of TikTok.

Despite that, the FTC’s chair, Joe Simons, said, “they still failed to seek parental consent before collecting names, email addresses and other personal information from users under the age of 13”.

Bytedance, a private startup based in Beijing, has a valuation of $75bn, based primarily on the extraordinary growth of TikTok, and its Chinese equivalent, Douyin. The app is popular among teenagers and pre-teens for its combination of music and meme-based humour.

In April this year, Lil Nas X found fame overnight when his track Old Town Road was used extensively in 15-second clips on the social network. That enthusiasm took the artist to the top of the US Billboard Hot 100 chart.

A company can be fined up to €20m, or 4% of annual global turnover, whichever is higher, for violating the GDPR. As a private company, Bytedance does not have to disclose its revenue, so it is unknown how high such a fine could be.

TikTok has been approached for comment.


