Why is it in the news?
- On September 24, 2024, Telegram CEO Pavel Durov announced significant changes to the platform’s privacy policy, stating that user data, including phone numbers and IP addresses, will now be shared with authorities in response to valid legal requests.
- This marks a notable departure from previous guidelines that prioritized the protection of private chats.
More about the news
- In early September, Telegram updated its FAQ, removing assurances about private chat protection and encouraging users to report illegal content.
- Additionally, the ‘People Nearby’ feature has been replaced with a ‘Businesses Nearby’ feature, allowing verified businesses to showcase products and accept payments. Durov stated that these changes aim to deter criminal misuse of the platform, particularly for selling illegal goods.
- Previously, Telegram’s cooperation with authorities was limited to terror-related inquiries, but this policy now covers all criminal activities. The platform also plans to use AI to identify and remove problematic content from its search features.
Comparative Approaches to Content Moderation
- In contrast to Telegram, end-to-end encrypted messaging apps like Signal emphasize user privacy and do not actively monitor usage. Signal, a non-profit that collects minimal user data, has nevertheless faced internal scrutiny over its ability to prevent abuse.
- Telegram’s features, including very large groups and a searchable platform, make it more appealing to anti-social elements, raising concerns about content moderation. While Signal limits groups to 1,000 members, Telegram allows groups of up to 200,000, facilitating the spread of illegal content.
- Signal, moreover, offers no way to discover such groups; Telegram’s search functions, by contrast, make problematic content easier to find, attracting heightened scrutiny to the platform.
- WhatsApp, despite offering end-to-end encryption, collects user metadata and employs content moderators who can review messages flagged in user reports, illustrating yet another approach to balancing privacy and moderation.
Regulatory Obligations for Intermediaries in India
- In India, intermediaries like Telegram must comply with national regulations and respond promptly to complaints about unlawful content.
- Section 79 of the Information Technology Act, 2000 provides ‘safe harbour’ protection for intermediaries, and by extension their executives, shielding them from liability for third-party content if they can demonstrate a lack of knowledge of it or show that they exercised due diligence.
- This means Durov could argue that he is not responsible for unlawful user-posted content. However, upon notification, Telegram is required to promptly remove such content and implement preventive measures.
- The Indian government can also mandate the removal of unlawful content, raising concerns about censorship and potential overreach.
- To comply with these regulations, Telegram has appointed a grievance officer to address reports of content violations, similar to Meta and Google, which also offer channels for Indian users to report issues.