A United States court has revived a lawsuit against social media platform TikTok, holding that the company can be sued over the death of a 10-year-old girl.
According to a report by The Times of India, the lawsuit, filed by the girl's mother, claims that TikTok recommended a dangerous 'Blackout Challenge' to 10-year-old Nylah Anderson, who died after attempting it.
The lawsuit alleges that the platform's recommendation algorithm played a crucial role in pushing the challenge to the girl.
Though internet companies typically enjoy legal protection from liability for user-generated content, the court held that in this case the protection does not extend to algorithmic recommendations.
Under earlier rulings, social media platforms were shielded from liability for failing to stop users from accessing harmful content.
The decision draws on a recent Supreme Court ruling on social media platforms' content-moderation practices.
Applying that reasoning, the court concluded that TikTok's algorithm reflects the company's own editorial judgements, and therefore cannot be protected by the existing legal shield.
According to TOI, the court ruled that the platform can be held responsible for the content it suggests to its users.
Separately, Nepal, which had banned the app in November citing concerns around its misuse, lifted the ban on Thursday.
The US Justice Department has also filed a lawsuit against the platform and its parent company ByteDance for failing to protect children's privacy, alleging that TikTok violated the Children's Online Privacy Protection Act, which requires services aimed at children to obtain parental consent before collecting personal information from users under age 13.