TikTok Wednesday revealed some of the elusive workings of the prized algorithm that keeps hundreds of millions of users worldwide hooked on the viral video app.
TikTok’s algorithm uses machine learning to determine what content a user is most likely to engage with and serve them more of it, by finding videos that are similar or that are liked by people with similar preferences.
- When users open TikTok for the first time, they are shown 8 popular videos featuring different trends, music, and topics. After that, the algorithm will continue to serve the user new batches of 8 videos based on which videos the user engages with and how they interact with the app.
- The algorithm identifies similar videos to those that have engaged a user based on video information, which could include details like captions, hashtags or sounds. Recommendations also take into account user device and account settings, which include data like language preference, country setting, and device type.
- Once TikTok collects enough data about the user, the app is able to map a user’s preferences in relation to similar users and group them into “clusters.” Simultaneously, it also groups videos into “clusters” based on similar themes, like “basketball” or “bunnies.”
- Using machine learning, the algorithm serves users videos from the content clusters that nearby clusters of similar users have liked.
- TikTok’s logic aims to avoid redundancies that could bore the user, like seeing multiple videos with the same music or from the same creator.
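The steps above can be sketched in simplified form. This is a hypothetical illustration, not TikTok's actual system: the user clusters, video tags, and the cosine-similarity matching are all invented for the example, and a production system would learn these groupings with large-scale machine learning rather than hard-code them.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two engagement vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical user clusters: each vector counts engagement with the
# content themes ("basketball", "bunnies", "cooking"), in that order.
USER_CLUSTERS = {
    "hoops_fans": [9, 1, 0],
    "pet_lovers": [0, 8, 2],
}
CLUSTER_INDEX = {"basketball": 0, "bunnies": 1, "cooking": 2}

# Candidate videos, each tagged with a theme cluster and a creator.
VIDEOS = [
    {"id": "v1", "cluster": "basketball", "creator": "a"},
    {"id": "v2", "cluster": "basketball", "creator": "a"},
    {"id": "v3", "cluster": "basketball", "creator": "b"},
    {"id": "v4", "cluster": "bunnies", "creator": "c"},
]

def recommend(user_vector, n=3):
    """Serve videos from the themes favored by the user's nearest user
    cluster, skipping back-to-back videos from the same creator."""
    # 1. Map the user to the most similar user cluster.
    nearest = max(USER_CLUSTERS,
                  key=lambda c: cosine(user_vector, USER_CLUSTERS[c]))
    weights = USER_CLUSTERS[nearest]
    # 2. Rank candidates by how much that cluster engages with their theme.
    ranked = sorted(VIDEOS,
                    key=lambda v: -weights[CLUSTER_INDEX[v["cluster"]]])
    # 3. Redundancy pass: avoid two consecutive picks from one creator.
    feed, last_creator = [], None
    for v in ranked:
        if v["creator"] == last_creator:
            continue
        feed.append(v["id"])
        last_creator = v["creator"]
        if len(feed) == n:
            break
    return nearest, feed

print(recommend([5, 1, 0]))  # → ('hoops_fans', ['v1', 'v3', 'v4'])
```

Note how the redundancy pass drops "v2": it comes from the same creator as the just-served "v1", so the feed mixes in a different creator first, mirroring the anti-boredom logic described above.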
Yes, but: TikTok concedes that its ability to nail users’ preferences so effectively means that its algorithm can produce “filter bubbles,” reinforcing users’ existing preferences rather than showing them more varied content, widening their horizons, or offering them opposing viewpoints.
- The company says that it’s studying filter bubbles, including how long they last and how a user encounters them, to get better at breaking them when necessary.
- Since filter bubbles can reinforce conspiracy theories, hoaxes and other misinformation, TikTok’s product and policy teams study which accounts and video information — themes, hashtags, captions, and so on — might be linked to misinformation.
- Videos or creators linked to misinformation are sent to the company’s global content reviewers so they can be handled before they are distributed to users on the main feed, called the “For You” page.
The briefing also featured updates about TikTok’s data, privacy and security practices.
- The company says it tries to triage and prevent incidents on its platform by working to detect patterns of problems before they spread.
- TikTok’s chief security officer, Roland Cloutier, said it plans to hire more than 100 data, security and privacy experts by year’s end in the U.S.
- He also said that the company will be building a monitoring, response and investigative center in Washington, D.C. to actively detect and respond to critical incidents in real time.
The big picture: TikTok’s U.S. public policy head, Michael Beckerman, says that TikTok’s transparency efforts are meant to position the company as a leader in Silicon Valley.
- “We want to take a leadership position and show more about how the app works,” he said. “For us, we’re new, and we want to do this because we don’t have anything to hide. The more we’re talking to and meeting with lawmakers, the more comfortable they are with the product. That’s the way it should be.”