Washington: The Federal Trade Commission said Thursday it found that several social media and streaming services engaged in a "vast surveillance" of consumers, including minors, collecting and sharing more personal information than most users realized.

The findings come from a study of how nine companies -- including Meta, YouTube and TikTok -- collected and used consumer data. The sites, which mostly offer free services, profited off the data by feeding it into advertising that targets specific users by demographics, according to the report. The companies also failed to protect users, especially children and teens.

The FTC said it began its study nearly four years ago to offer the first holistic look into the opaque business practices of some of the biggest online platforms, which have built multibillion-dollar ad businesses on consumer data. The agency said the report showed the need for federal privacy legislation and restrictions on how companies collect and use data.

"Surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," Lina Khan, the FTC's chair, said in a statement.

Tech giants are under intense scrutiny over privacy abuses, and in recent years they have been blamed in part for a mental health crisis among young people and children that some social scientists and the surgeon general have linked to the rampant use of social media and smartphones. But despite multiple proposals in Congress for stricter privacy and children's online safety protections, nearly all legislative attempts at regulating Big Tech have failed.

Efforts by the companies to police themselves also haven't worked, the FTC concluded in its report. "Self-regulation has been a failure," it added.

Google, which owns YouTube, "has the strictest privacy policy in our industry -- we never sell people's personal information and we don't use sensitive information to serve ads," said José Castañeda, a spokesperson for Google. He added, "We prohibit ad personalization for users under 18 and we don't personalize ads to anyone watching 'made for kids' content on YouTube."

Discord's head of U.S. and Canadian public policy, Kate Sheerin, said in a statement that the FTC's report "lumps very different models into one bucket and paints a broad brush." She added that Discord does not run a formal digital advertising service.

TikTok and Meta, which owns Instagram, WhatsApp, Messenger and Facebook, did not immediately respond to requests for comment.

In December 2020, the agency opened its inquiry into the nine companies, which operate 13 platforms. The FTC requested data from each company on its operations in 2019 and 2020, then studied how the companies had collected, used and retained that data.

Included in the study were the streaming platform Twitch, which is owned by Amazon; the messaging service Discord; the photo- and video-sharing app Snapchat; and the message board Reddit. Twitter, now renamed X, also provided data.

The study did not disclose company-by-company findings. Twitch, Snap, Reddit and X did not immediately respond to requests for comment.

Companies have argued that they have tightened their data collection policies since the study was conducted. This week, Meta announced that the accounts of Instagram users younger than 18 will be made private by default in the coming weeks, meaning only followers approved by an account holder may see their posts.

The FTC found that the companies voraciously consumed data about users and often bought information about people who weren't users from data brokers. They also gathered information from accounts linked to other services.

Most of the companies collected users' age, gender and language. Many platforms also obtained information on education, income and marital status. The companies didn't give users easy ways to opt out of data collection and often retained sensitive information much longer than consumers would expect, the agency said.

The companies used the data to create profiles of users -- often merging it with information about their habits collected on other sites -- to serve up ads.

The agency also found that many of the sites claimed to restrict access to users younger than 13, yet many children remained on the platforms. Teens were also treated like adults on many of the apps, subjecting them to the same data collection as adults.

Many of the companies couldn't tell the FTC how much data they were collecting, according to the study.

The FTC last year proposed changes to strengthen child privacy regulations, and lawmakers are seeking to extend child privacy protections to users younger than 18. In 2022, Khan opened a regulatory effort to create rules for companies that show advertising based on users' browsing or search history.

The agency has previously filed complaints against several tech companies for privacy violations. In late 2022 it reached a $520 million settlement with Epic Games for allegedly violating a child privacy law and deceiving consumers with unwarranted charges. That same year, the FTC fined Twitter $150 million for using security data about users for behavioral advertising.