
iWorld

MCA and Meta collaborate to introduce WhatsApp Helpline


Mumbai: The Misinformation Combat Alliance (MCA) and Meta are working to launch a dedicated fact-checking helpline on WhatsApp to combat media generated using artificial intelligence that may deceive people on matters of public importance, commonly known as deepfakes, and to help people connect with verified, credible information. The helpline will open to the public in March 2024.

The industry-leading initiative will allow the MCA and its associated network of independent fact-checkers and research organisations to address viral misinformation – particularly deepfakes. People will be able to flag deepfakes by sending them to the WhatsApp chatbot, which will offer multilingual support in English and three regional languages (Hindi, Tamil and Telugu).

The MCA will set up a central ‘deepfake analysis unit’ to manage all inbound messages received on the WhatsApp helpline. It will work closely with member fact-checking organizations, industry partners and digital labs to assess and verify the content and respond to messages accordingly, debunking false claims and misinformation.


The program follows a four-pillar approach – detection, prevention, reporting and driving awareness – around the escalating spread of deepfakes, while building a critical instrument that lets citizens access reliable information to fight such misinformation. With millions of Indians using WhatsApp, the collaboration between Meta and the MCA represents a continued effort to empower users with tools to verify information on the service.

Commenting on the partnership, Meta director of public policy, India, Shivnath Thukral said, “We recognize the concerns around AI-generated misinformation and believe combatting this requires concrete and cooperative measures across the industry. Our collaboration with MCA to launch a WhatsApp helpline dedicated to debunking deepfakes that can materially deceive people is consistent with our pledge under the Tech Accord to Combat Deceptive Use of AI in 2024 Elections. As a company that has been at the cutting edge of AI development for more than a decade, we remain committed to work with industry stakeholders to introduce common technical standards for AI detection, transparency solutions and policies, along with empowering people on our platforms with resources and tools that make it simpler for them to identify content that has been generated using AI tools and curb the spread of misinformation.”

“The Deepfakes Analysis Unit (DAU) will serve as a critical and timely intervention to arrest the spread of AI-enabled disinformation among social media and internet users in India. Its formation highlights the collaboration and whole-of-society approach to foster a healthy information ecosystem that the MCA was set up for. The initiative will see IFCN signatory fact-checkers, journalists, civic tech professionals, research labs and forensic experts come together, with Meta’s support. We hope the DAU will become a trusted resource for the public to discern between real and AI-generated media and we invite more stakeholders to be a part of the initiative,” said Misinformation Combat Alliance president Bharat Gupta.


Meta’s fact-checking program in India includes partnerships with 11 independent fact-checking organizations that help users identify, review and verify information and prevent the spread of misinformation on its platforms. On WhatsApp, Meta encourages users to double-check information that sounds suspicious or inaccurate by sending it to WhatsApp tiplines. People can also follow dedicated fact-checking organizations on WhatsApp Channels to receive verified, accurate and timely updates. In addition to the fact-checking program, WhatsApp addresses misinformation by limiting forwards and actively constraining virality on the platform.

Meta’s approach to addressing deceptive synthetic media has several components: investigating deceptive behaviors such as fake accounts and misleading manipulated media; its third-party fact-checking program, in which fact-checkers rate misinformation, including content that has been edited or synthesized in a way that could mislead people; and engagement with academia, government and industry. The company recently announced an AI labeling policy: in the coming months, it will label images that users post to Facebook, Instagram and Threads when it can detect industry-standard indicators that they are AI-generated.

Meta has also pledged to help prevent deceptive AI content from interfering with this year’s global elections. The “Tech Accord to Combat Deceptive Use of AI in 2024 Elections” is a set of commitments to deploy technology countering harmful AI-generated content meant to deceive voters. Signatories, including Meta, pledge to work collaboratively on tools to detect and address online distribution of such AI content, drive educational campaigns, and provide transparency, among other concrete steps.


The MCA is a cross-industry alliance bringing together companies, organizations, institutions, industry associations and other entities to collectively fight misinformation and its impact. It currently has 16 members, including fact-checking organizations, media outlets and civic tech groups, and is inviting strategic partners to collaborate in this industry-wide initiative to combat misinformation and create an enlightened, informed society.


iWorld

WPP Opendoor and Snapchat launch AI Lens for Prime Video India

Generative AI Lens personalises content discovery with real-time user integration.


MUMBAI: In the age of main characters, Prime Video is handing users the script and the spotlight. WPP Opendoor, WPP’s dedicated Amazon unit, has teamed up with Snapchat to roll out an India-first generative AI-powered Lens for Prime Video’s latest campaign, ‘Stories for Your Every Era… it’s on Amazon Prime’. The activation taps into the rising “era-core” trend, where identities shift with moods, moments and mindsets, and content is expected to keep up.

The Lens does exactly that. Using generative AI, it places users directly into the worlds of popular Prime Video titles such as Maxton Hall, Beast Games, The Boys and The Traitors, embedding their faces into key visuals in real time. The result is less browsing, more becoming.

The idea is rooted in a behavioural shift: audiences increasingly see themselves as the centre of their own narratives, especially on social platforms. By turning viewers into participants, the campaign blurs the line between content discovery and content experience.


It also introduces a layer of personalisation that goes beyond algorithms. Whether someone identifies with a “trust no-one era” or an “infinite aura era”, the Lens curates recommendations that align with that evolving identity, making discovery feel intuitive rather than instructed.

This marks a shift in how streaming platforms approach engagement. Instead of pushing titles, the focus is on pulling users into the story itself, transforming passive scrolling into interactive storytelling.

The collaboration also underscores how platforms like Snapchat are becoming key playgrounds for content marketing, particularly when paired with emerging technologies like generative AI. The format is native, immersive and built for participation – three things traditional discovery often struggles to deliver.


In a crowded streaming landscape, where attention is the real currency, Prime Video’s bet is clear: if viewers feel the story is about them, they are far more likely to press play.


Copyright © 2026 Indian Television Dot Com PVT LTD
