TikTok has been fined €14.5 million by the UK's data watchdog because it "did not do enough" to protect children's data or to keep underage users off its platform in the first place.
The Information Commissioner's Office (ICO) confirmed that TikTok allowed up to 1.4 million kids under the age of 13 to use its platform in 2020, in breach of its own terms of service.
It found that the social media platform breached the UK General Data Protection Regulation between May 2018 and July 2020.
However, the final penalty is less than half the €30.9 million the ICO originally proposed for these breaches.
The regulator slashed the potential fine, which it first announced in September last year, after deciding not to pursue an initial finding that the company had unlawfully used "special category data".
Special category data includes ethnic and racial origin, political opinions, religious beliefs, sexual orientation, trade union membership, genetic and biometric data, and health data.
But the regulator upheld its findings that TikTok failed to ensure that users under 13 had permission from their parents or carers to use the platform.
The ICO also found that TikTok did not carry out adequate checks to identify and remove these children from its platform, despite concerns being raised with senior staff members.
"There are laws in place to ensure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws," said information commissioner John Edwards.
"As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data.
"That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.
"TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had.
"They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform”.
In a statement, TikTok said it disagreed with the ICO's decision but was pleased that the fine had been reduced.
A spokesperson said: "TikTok is a platform for users aged 13 and over. We invest heavily to help keep under 13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
"While we disagree with the ICO's decision, which relates to May 2018 - July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year.
"We will continue to review the decision and are considering next steps”.