Regulators
TRAI hosts ‘Responsible AI in Telecom’ session at India AI Summit
Chair Anil Kumar Lahoti stresses trust and guardrails as AI becomes integral to networks, at the session held on 20 February 2026.
MUMBAI: AI in telecom isn’t just calling the shots anymore; it’s running the whole network show, and TRAI wants to make sure the director stays human. The Telecom Regulatory Authority of India (TRAI) convened a dedicated session on “Responsible AI in Telecom” at the India AI Impact Summit 2026, held on 20 February at Sushma Swaraj Bhawan in New Delhi. The gathering drew senior executives from telecom operators, global tech giants like Ericsson, Qualcomm, and Nokia, industry bodies such as GSMA, and government arms including DoT and C-DOT, plus international stakeholders, for frank talks on weaving AI responsibly into networks and customer-facing services.
TRAI chairman Anil Kumar Lahoti kicked off proceedings with a clear message: “Artificial Intelligence is no longer a peripheral technology for telecom, it is becoming integral to how networks are designed, managed and experienced.” He stressed that as AI influences decisions at population scale, from optimising 5G performance and predicting faults to slashing energy use, boosting customer experience, and cracking down on spam, trust must be the cornerstone. Efficiency gains, he said, must be matched by transparency, accountability, human oversight, and firm guardrails that guarantee fairness, unbiased outcomes, resilience, security, and the public good.
Lahoti highlighted telecom’s role as India’s AI backbone, given the massive subscriber base, making AI-driven automation essential. He pointed to ongoing work like strengthened spam enforcement, AI filtering, and digital consent frameworks for verifiable commercial messaging. TRAI’s approach remains risk-based, favouring regulatory sandboxes to foster innovation while protecting consumers.
Two punchy panel discussions followed. “Preparing Telecom Networks for the AI Era,” chaired by TRAI Member Ritu Ranjan Mittar, featured Ericsson CTO Magnus Ewerbring, Qualcomm VP Vinesh Sukumar, Nokia SVP Pasi Toivanen, and Tejas Sr VP Shantigram Jagannath. They unpacked AI adoption in networks, transparency through explainable systems, responsibility-by-design, environmental sustainability, security, and AI-native architectures reshaping 5G management.
The second panel, “Building Customer Trust through AI-driven Operations,” led by TRAI Member Dr M P Tangirala, included GSMA APAC head Julian Gorman, C-DOT CEO Rajkumar Upadhyay, Vodafone India CTSO Mathan Babu Kasilingam, and DoT TEC Sr DDG Syed Tausif Abbas. Topics spanned accountability in automated decisions, transparency in customer engagement, ethical frameworks for spam prevention, standards for an AI incident database (especially vital for critical infrastructure), and responsible scaling of AI in 5G/6G for fraud detection and analytics.
The session wrapped as a timely reminder: AI can supercharge telecom, but only if trust, collaboration between regulators, industry, and tech players, and balanced governance keep pace. In a country where networks touch over a billion people daily, getting this right isn’t optional; it’s the line between seamless connectivity and digital chaos.
Regulators
WhatsApp agrees to follow CCI data safeguards after Supreme Court hearing
Company assures compliance by 16 March 2026, ending long-running privacy row
NEW DELHI: In a notable climbdown, WhatsApp and its parent company Meta have told the Supreme Court of India that they will comply with the data privacy safeguards ordered by the Competition Commission of India by 16 March 2026.
The assurance draws a curtain, at least for now, on the long-running dispute over WhatsApp’s 2021 privacy policy update that sparked outrage for its so-called take-it-or-leave-it approach to data sharing.
At the heart of the controversy was a simple question: should users be forced to share their data with other Meta platforms such as Facebook and Instagram in order to keep using WhatsApp? Regulators said no. WhatsApp has now agreed.
Under the revised framework, users will get meaningful consent options. They can opt in or opt out of sharing their data with Meta companies for purposes beyond running WhatsApp’s core services, including advertising. The company must also spell out, in plain terms, what data is being shared, with whom, and why.
Crucially, consent will not be a one-way street. Users will be able to withdraw it at any time. Access to WhatsApp cannot be tied to agreeing to share data for non-essential purposes. In short, no more bundled bargains.
The case has taken a winding legal route. In November 2024, the CCI fined Meta Rs 213.14 crore for abusing its market dominance and imposed a five-year ban on data sharing for advertising. A year later, the National Company Law Appellate Tribunal upheld the fine but replaced the blanket ban with a regime centred on user consent safeguards, later clarifying that these apply to both advertising and non-advertising data.
Earlier this month, the Supreme Court delivered a sharp rebuke, reportedly calling the mandatory data sharing a mockery of constitutionalism and likening it to a polite form of theft. Facing the risk of having their appeal dismissed, Meta and WhatsApp withdrew their plea for interim relief and agreed to implement the safeguards by mid-March.
For millions of Indian users, it is about control. The decision promises greater say over how personal data travels across the Meta ecosystem. For rivals in the digital advertising market, it is about fair play. The CCI had argued that WhatsApp’s vast user base gave Meta an edge that competitors could not match.
The broader backdrop is India’s Digital Personal Data Protection Act, still in the process of full implementation. This case sets an early benchmark for how global technology firms may be expected to handle user data in the country.
The main appeal challenging the legality of the 2021 policy and the Rs 213 crore fine is still pending before the Supreme Court. For now, however, WhatsApp has agreed to rewrite the rules of consent in India. Whether that marks a turning point or merely a pause in the battle remains to be seen.