Union Minister Ashwini Vaishnaw Calls for Stricter Accountability for Social Media Platforms
Government's Stance on Social Media Responsibility
New Delhi: Union Minister Ashwini Vaishnaw emphasized the need for social media companies to be accountable for the content they disseminate. He stated that a standing committee has already proposed stringent legislation to ensure these platforms are held responsible.
Earlier this week, the government cautioned online platforms, particularly social media networks, about potential legal repercussions if they do not address obscene, vulgar, and illegal content.
"Social media must take responsibility for the material they share. Intervention is necessary," Vaishnaw remarked during a Ministry of Electronics and IT (Meity) event.
His comments came in response to concerns regarding the AI application Grok, which has been generating inappropriate images of women.
Rajya Sabha member Priyanka Chaturvedi has also reached out to the minister, urging immediate action against the rising misuse of AI applications to create and share indecent images of women on social media.
"The standing committee has suggested the formulation of a robust law to ensure social media platforms are accountable for their published content," Vaishnaw added.
The Parliamentary Standing Committee on Information and Broadcasting has recommended that the government enhance accountability for social media and intermediary platforms concerning the spread of fake news and misinformation.
The committee has backed stakeholder suggestions to enforce algorithmic transparency, impose stricter fines on repeat offenders, establish an independent regulatory authority, and deploy AI technology to combat misinformation.
On December 29, MeitY instructed social media companies to promptly reassess their compliance measures and take action against inappropriate content, warning that failure to do so could lead to legal action.
This advisory was issued after MeitY observed that social media platforms were not adequately addressing obscene and unlawful content.
Dhruv Garg, a partner at public policy firm IGAP, noted that the MeitY advisory does not introduce new legal obligations but reinforces that safe harbor protections depend on strict compliance with the IT Rules, 2021.
"Major social media intermediaries must adhere to stricter due diligence standards and implement automated content moderation tools. The advisory indicates that in light of the rampant circulation of obscene content, merely reactive content removal is insufficient; platforms must proactively meet their legal obligations or risk criminal prosecution," he stated.
Sanjeev Kumar, Senior Partner at Luthra and Luthra Law Offices India, highlighted that the MeitY advisory clearly states that non-compliance with the IT Act and the IT Rules, 2021 could lead to legal consequences, including prosecution under the IT Act, the Bharatiya Nyaya Sanhita, 2023 (BNS), and other applicable criminal laws. These consequences extend to intermediaries, platforms, and users alike.
"This could also result in the loss of safe harbor protections under Section 79, exposing non-compliant entities to direct liability. The cumulative effect of these regulations increases legal, financial, and reputational risks, making compliance not just a legal obligation but a business necessity," he added.