What Telegram’s recent policy shift means for cyber crime


Since its launch in August 2013, Telegram has become the go-to messaging app for privacy-focused users. To start using the app, users can sign up with either their real phone number or an anonymous number purchased on the Fragment blockchain marketplace. In the latter case, the Telegram account cannot be linked to the user’s real phone number or any other personally identifiable information (PII).


Telegram has also long been known for its hands-off moderation policy. The platform explicitly stated in its FAQ that private chats were entirely off-limits for moderation. Content moderation was instead user-driven, with reporting of illegal activity left primarily to users themselves. By contrast, many of its peers, such as WhatsApp, invest heavily in content moderation and cooperation with law enforcement.


These characteristics have also made Telegram the messaging app of choice for cyber crime and other illegal activity, including distributing malware, selling illegal goods and services, recruiting associates and coordinating cyberattacks. For more organized cyber crime groups, Telegram is a hub for sharing operational intelligence and amplifying illicit business in much the same way that legitimate organizations use mainstream channels.


However, Telegram’s approach to user privacy and content moderation changed significantly following CEO Pavel Durov’s arrest in France on August 24, 2024, with the company quietly updating its FAQ page and privacy policy in the following weeks. Although the app’s source code hasn’t changed, according to Telegram spokesperson Remy Vaughn, users can now report illegal activity for automated takedown or manual moderation. Furthermore …
