Is TikTok safe to use? Project Clover launched to avoid further European bans
While social networks like Facebook and Twitter are stagnating, TikTok is growing fast. With such growth comes plenty of attention, and TikTok is now facing scrutiny in several areas, from the algorithmic distribution of harmful content to how it handles user data.
The US has already banned the viral video app on government devices in more than 20 states because of spying allegations stemming from its Chinese ownership. The European Commission and Council of the EU followed suit on February 23 by prohibiting staff from using TikTok on work devices, and personal devices containing work-related apps, over security concerns. US universities in Oklahoma, Alabama, and Texas, meanwhile, are restricting students from accessing the app via campus wi-fi networks.
There is little sign of any form of ban in the UK, though. Secretary of State for Science, Innovation and Technology Michelle Donelan has said it should be a matter of personal choice for government officials whether they use the app or not. TikTok has also recently attempted to assuage the privacy fears of European users with an initiative called Project Clover. This will see user data stored on servers located in Norway and Ireland, rather than further afield or in China. However, similar moves were made in the US in 2022, to seemingly little effect.
Should you quit TikTok or curb your kids’ usage? Below we’ve outlined the arguments so that you can make up your own mind.
Is TikTok harmful?
The case against TikTok can broadly be split into two categories: harmful content and privacy concerns due to its Chinese ownership.
For the former, there are good reasons to be alarmed. The Center for Countering Digital Hate has found TikTok will show children harmful content as soon as they show an interest in related topics.
Its researchers generated accounts in the US, UK, Canada, and Australia, on behalf of fictional 13-year-olds. They “liked” and interacted with videos related to mental health and body image, to assess how this would affect the content shown in the app’s For You feed.
The accounts were shown self-harm or eating disorder content every 206 seconds on average, and more extreme content was shown to accounts intended to represent vulnerable youths, with references to weight loss in their usernames.
TikTok has also been demonstrated to be a hive of misinformation. In September 2022, Newsguard, a service that rates news and information websites on trustworthiness, found that when searching “2022 election”, “mRNA vaccine”, and “Uvalde tx conspiracy”, 20 per cent of TikTok posts contained false or misleading information.
The concern about Chinese ownership is the bigger picture. Governments – including in the United States where a campaign to ban TikTok is gathering pace – are concerned about it as a national security risk because it is owned by Chinese company ByteDance.
“There is clearly bipartisan support to do something about TikTok and the continued reports about harmful content and misinformation being served to users – particularly young people – will only add fuel to the fire,” said Insider Intelligence principal analyst Jasmine Enberg.
A paper by cybersecurity firm Internet 2.0 claimed the TikTok app uses “excessive” data harvesting, reaping information on user location, the contents of direct messages and more, and stores it – in part – on servers in mainland China.
TikTok admitted in November that Chinese staff can and do access user data.
A spokesperson said: “We have never provided any data to the Chinese government. We believe in the importance of storing European user data in Europe; keeping data flows outside of the region to a minimum.”
A Forbes report claimed ByteDance planned to “monitor the personal location of some specific American citizens”. TikTok denied the claims made in the article, but later sacked four employees for accessing personal data of journalists in an attempt to track down sources. This was enough for Alicia Kearns, a Conservative MP who is the chair of the foreign affairs select committee, to advocate Brits deleting the app. “What TikTok does is it gives away the data that makes you most vulnerable: who are you friends with; what are your interests; what are the interests you have that you may not want publicly disclosed; who you are having private conversations with; the locations you go to,” she told Sophie Ridge of Sky News.
“Our data is a key vulnerability and China is building a tech totalitarian state on the back of our data.”
Where the two concerns – harmful content and Chinese ownership – meet is if the Chinese government has any say in how the algorithms surface content. Could it apply pressure on ByteDance to spread propaganda or harmful content to British teens? It’s not completely far-fetched, given what we know about Russian troll farms spreading disinformation in the West and the alarming fact that seven per cent of UK adults now get their news from TikTok.
Is the criticism fair? Is TikTok safe?
These are serious points worth highlighting, but it is also worth noting how many of these safety concerns apply to other companies. If you side with US politicians who seek to ban TikTok, and who effectively booted Huawei out of the US in 2019, you should probably also avoid a host of other brands, including Honor, Xiaomi, OnePlus, Oppo, Lenovo, Realme, and ZTE. They are all Chinese.
As for harmful content, it is not as if Facebook, YouTube and Twitter haven’t had their fair share of content scandals where fake news, dangerous disinformation, and scams spread freely with harmful real-world consequences. Social media algorithms value attention and engagement above all else, and that leads to a dark place.
TikTok seems to get special attention because of its Chinese ownership, which can feel somewhat Sinophobic. If you are willing to discount the idea the Chinese government looms over the app 24/7, TikTok starts to look like just another social network.
In October, even GCHQ director Sir Jeremy Fleming said that he wouldn’t be concerned if his own children used TikTok.
But his follow-up was just as important. He said he would “speak to my child about the way in which they think about their personal data on their device”. That’s useful advice whether about TikTok or any other part of the social web.
Sir Jeremy Fleming, the head of GCHQ, would not tell his children not to use TikTok (Joe Giddens/PA)
How to make TikTok safer for your children
For all its problems, the upcoming Online Safety Bill is at least taking the issue of harmful content seriously. If passed, the legislation will mandate that social media companies actively look for illegal content, rather than relying on a reporting system to dig it out. However, provisions on “legal but harmful” content, which includes some videos related to self-harm, have been removed.
But what about the here and now? Blocking your children from using TikTok isn’t a problem-free solution. If all their friends use the platform, then you’re making their social life harder, after all. So what can you do to make it as safe as possible?
One tip is to ensure they are registered with their correct date of birth in the app. TikTok does not allow those under the age of 13 to register, and you shouldn’t let them sidestep that restriction: it’s there for a reason, after all.
But there’s another reason to be truthful with ages. Accounts for those aged 13 to 15 are set to “private” by default, limiting who can view any content they post. And even if these accounts are made public, their content will not be shared to the For You feed.
Direct messaging is switched off for under-16s, and those under 18 cannot live stream or receive Virtual Gifts, which make up TikTok’s tipping economy.