iWorld
Instagram cracks the code on iPhone video brilliance
MUMBAI: Every iPhone video carries a secret ingredient: tiny packets of data that tell screens exactly how bright and vivid to display footage. For three years, Instagram binned them. Not anymore.
Chris Ellsworth, Cosmin Stejerean and Hassene Tmar—engineers at Meta—have cracked a thorny problem that made high dynamic range (HDR) videos look drab on Instagram’s iOS app, particularly when viewed in dim light or at low screen brightness. The culprits were two pieces of metadata embedded in iPhone recordings: Dolby Vision, which enhances colour, brightness and contrast, and ambient viewing environment (amve), which adjusts rendering based on lighting conditions.
Since 2022, Instagram has supported HDR video. But its encoding pipeline, built on FFmpeg, an open-source tool, stripped out both metadata types. The result? Pictures that looked nothing like their creators intended.
The fix required surgical precision across three stages of video processing. First, the team convinced FFmpeg’s developers to add amve support in 2024. The data proved remarkably consistent—every frame carried identical values—allowing a quick workaround whilst proper support landed.
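The per-frame consistency the team relied on can be sketched in a few lines. This is an illustrative sketch, not Meta's actual code: given per-frame amve records, verify the invariant the article describes (every frame carries identical values) and collapse them to a single stream-level value.

```python
def collapse_amve(per_frame_amve):
    """Collapse identical per-frame amve metadata to one stream-level value.

    per_frame_amve: list of dicts, e.g.
        {"ambient_illuminance": 314,
         "ambient_light_x": 15635,
         "ambient_light_y": 16450}
    Returns the shared value, or raises if frames disagree.
    """
    if not per_frame_amve:
        return None
    first = per_frame_amve[0]
    # The workaround is only safe if every frame really does agree.
    if any(frame != first for frame in per_frame_amve[1:]):
        raise ValueError("amve metadata varies across frames; cannot collapse")
    return first


# Example: three frames with identical (illustrative) values collapse to one record.
frames = [{"ambient_illuminance": 314,
           "ambient_light_x": 15635,
           "ambient_light_y": 16450}] * 3
print(collapse_amve(frames))
```

The field names and values above are placeholders for illustration; the real amve payload is defined as an SEI message in the HEVC specification.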
Dolby Vision proved trickier. iPhones encode video in HEVC format using profile 8.4, but Instagram delivers AV1 and VP9 codecs instead. Meta collaborated with Dolby and FFmpeg developers to implement profile 10, allowing Dolby Vision metadata to travel within AV1 streams. They also built custom extraction tools to feed metadata into Apple’s display layer, since Instagram decodes video independently rather than using Apple’s standard player.
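The codec-to-profile translation the article describes can be summarised as a lookup. A minimal sketch, assuming the mapping stated above (profile 8.4 for HEVC capture, profile 10 for AV1 delivery); the function and dictionary names are hypothetical, not part of any real pipeline:

```python
# Per the article: iPhone captures carry Dolby Vision profile 8.4 in HEVC,
# and the FFmpeg/Dolby work added profile 10 so the metadata can travel in
# AV1 streams. VP9 deliveries are not said to carry Dolby Vision metadata,
# so they fall through to an error here.
DV_PROFILE_FOR_CODEC = {
    "hevc": "8.4",  # capture format on iPhone
    "av1": "10",    # delivery format after the profile 10 work
}


def delivery_profile(codec):
    """Return the Dolby Vision profile used with a given codec, if any."""
    profile = DV_PROFILE_FOR_CODEC.get(codec)
    if profile is None:
        raise ValueError(f"no Dolby Vision profile mapping for {codec!r}")
    return profile


print(delivery_profile("av1"))  # "10"
```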
Then came a nasty surprise. Initial tests showed viewers watched less video with Dolby Vision enabled. The metadata added roughly 100 kilobits per second to file sizes—enough to slow loading times and send impatient scrollers elsewhere. Meta’s solution: a compressed format that slashed the overhead to 25 kbps, at the cost of another 2,000 lines of code for compression and decompression.
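A back-of-envelope check makes the stakes concrete. Using the figures above and a hypothetical 30-second reel (the clip length is an assumption, not from the article):

```python
def metadata_bytes(bitrate_kbps, duration_s):
    """Bytes of metadata carried over a clip at a given overhead bitrate."""
    return bitrate_kbps * 1000 * duration_s // 8


reel = 30  # hypothetical 30-second reel
before = metadata_bytes(100, reel)  # uncompressed Dolby Vision overhead
after = metadata_bytes(25, reel)    # compressed format

print(f"uncompressed: {before:,} B, compressed: {after:,} B, "
      f"saved: {before - after:,} B per clip")
```

At 100 kbps the metadata alone costs 375,000 bytes for such a clip; the compressed format cuts that to 93,750 bytes—a saving that compounds across every video a scroller loads.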
The second test vindicated the effort. Viewers spent more time watching HDR videos, particularly in low-light conditions where proper metadata reduced eye strain. Instagram for iOS now delivers Dolby Vision metadata on all AV1 encodings derived from iPhone HDR uploads, making it Meta’s first app with full support. Facebook Reels is next in line.
The broader web remains a problem child. Browser and display support for Dolby Vision is patchy, meaning most readers cannot see the difference on this page. For that, you will need an iPhone and Instagram.
Three years to fix discarded data. But for anyone squinting at their screen in bed, scrolling through Reels at 2am, it matters rather a lot.
iWorld
Meta signs multiyear AI deal with News Corp
Agreement worth up to $50 million annually covers WSJ, New York Post and UK titles.
MUMBAI: Meta just bought itself a front-row seat to the newsroom: when AI needs facts, even Zuckerberg is willing to pay the subscription fee. Meta Platforms has signed a multiyear artificial-intelligence content licensing agreement with News Corp that could be worth up to $50 million (£39 million) a year, The Wall Street Journal reported on 25 February 2026. The deal, expected to run for at least three years, grants Meta access to News Corp’s US and UK content, including The Wall Street Journal and New York Post, for training AI models and powering real-time information retrieval in its products.
Australian mastheads such as the Daily Telegraph and Herald Sun are not included. News Corp CEO Robert Thomson revealed the arrangement during a Morgan Stanley technology conference in San Francisco, describing news organisations as a vital “input company” in the AI ecosystem. “We’re essentially an input company,” he said. “The great threat in the age of AI is going to be to what you might call output companies.”
Thomson emphasised the value of reliable journalism as foundational infrastructure for AI systems, noting regular conversations with Meta CEO Mark Zuckerberg via WhatsApp and ongoing talks with OpenAI’s Sam Altman. He added that News Corp is in “advanced stage” negotiations for additional deals, promising further announcements soon.
The agreement follows News Corp’s 2024 five-year partnership with OpenAI (reportedly worth more than $250 million) and reflects Meta’s broader push to secure content licences. The company has already confirmed deals with People Inc, USA Today, CNN and Fox News, though financial terms remain undisclosed.
Publishers remain divided: some pursue partnerships for revenue, while others litigate. News Corp subsidiaries have sued Perplexity over copyright infringement, and The New York Times is suing OpenAI and Microsoft—yet the same NYT struck a separate AI licensing deal with Amazon reportedly worth $20–25 million annually.
Thomson summed up the dual strategy as “woo or sue”: commercial agreements where possible, legal action when content is used without permission.
In an AI race where data is oxygen, Meta isn’t just training models; it’s buying the raw material for tomorrow’s answers, one headline at a time.