The Federal Trade Commission said last week it found that several social media and streaming services engaged in a “vast surveillance” of consumers, including minors, collecting and sharing more personal information than most users realized.

The findings come from a study of how nine companies — including Meta, YouTube and TikTok — collected and used consumer data. The sites, which mostly offer free services, profited off the data by feeding it into advertising that targets specific users by demographics, according to the report. The companies also failed to protect users, especially children and teens.

The FTC said it began its study nearly four years ago to offer the first holistic look into the opaque business practices of some of the biggest online platforms that have created multibillion-dollar ad businesses using consumer data. The agency said the report showed the need for federal privacy legislation and restrictions on how companies collect and use data.

“Surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” said Lina Khan, the FTC’s chair, in a statement.

Legislation has failed

Tech giants are under intense scrutiny for abuses of privacy and have in recent years been blamed in part for contributing to a mental health crisis among young people and children. But despite multiple proposals in Congress for stricter privacy and children’s online safety protections, nearly all legislative attempts at regulating Big Tech have failed.

Efforts by the companies to police themselves also haven’t worked, the FTC concluded in its report. “Self-regulation has been a failure,” it added.

Google, which owns YouTube, “has the strictest privacy policy in our industry — we never sell people’s personal information and we don’t use sensitive information to serve ads,” said José Castañeda, a spokesperson for Google. He added, “We prohibit ad personalization for users under 18 and we don’t personalize ads to anyone watching ‘made for kids’ content on YouTube.”

Discord’s head of U.S. and Canadian public policy, Kate Sheerin, said in a statement that the FTC’s report “lumps very different models into one bucket and paints a broad brush.” She added that Discord does not run a formal digital advertising service.

TikTok and Meta, which owns Instagram, WhatsApp, Messenger and Facebook, did not respond to requests for comment.

The study

In December 2020, the agency opened its inquiry into the nine companies that operate 13 platforms. The FTC requested data from each company for operations between 2019 and 2020, and then studied how the companies had collected, used and retained that data.

Included in the study were the streaming platform Twitch, which is owned by Amazon, the messaging service Discord, the photo- and video-sharing app Snapchat, and the message board Reddit. Twitter, now renamed X, also provided data.

The study did not disclose company-by-company findings. Twitch, Snap, Reddit and X did not respond to requests for comment.

The companies have also argued that they have tightened their data collection policies since the study was conducted.

What was collected

The FTC found that the companies voraciously consumed data about users, and often bought information about people who weren’t users through data brokers. They also gathered information from accounts linked to other services.

Most of the companies collected users’ age, gender and the language they spoke. Many platforms also obtained information on education, income and marital status. The companies didn’t give users easy ways to opt out of data collection and often retained sensitive information much longer than consumers would expect, the agency said.

The companies used data to create profiles on users — often merging the information they gathered with information on habits collected on other sites — to serve up ads.

The agency also found that many of the sites claimed to restrict access for users younger than 13, yet many children remained on the platforms. Teens were also treated like adults on many of the apps and subjected to the same data collection.

Many of the companies couldn’t tell the FTC how much data they were collecting, according to the study.