iWorld

Instagram cracks the code on iPhone video brilliance

MUMBAI: Every iPhone video carries a secret ingredient: tiny packets of data that tell screens exactly how bright and vivid to display footage. For three years, Instagram binned them. Not anymore.

Chris Ellsworth, Cosmin Stejerean and Hassene Tmar—engineers at Meta—have cracked a thorny problem that made high dynamic range (HDR) videos look drab on Instagram’s iOS app, particularly when viewed in dim light or at low screen brightness. The culprits were two pieces of metadata embedded in iPhone recordings: Dolby Vision, which enhances colour, brightness and contrast, and ambient viewing environment (amve), which adjusts rendering based on lighting conditions.
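The amve payload itself is tiny. Per the ITU-T H.274 specification it describes the nominal viewing environment as an illuminance value plus a chromaticity point, stored as scaled integers. A minimal sketch of decoding those raw fields (field names and scale factors per H.274; the example values are illustrative, not taken from any real iPhone file):

```python
from dataclasses import dataclass

# Units per ITU-T H.274: illuminance in 0.0001-lux steps,
# chromaticity coordinates (CIE 1931 x, y) in 0.00002 steps.
@dataclass
class AmbientViewingEnvironment:
    ambient_illuminance: int
    ambient_light_x: int
    ambient_light_y: int

    def lux(self) -> float:
        return self.ambient_illuminance * 0.0001

    def chromaticity(self) -> tuple:
        return (self.ambient_light_x * 0.00002,
                self.ambient_light_y * 0.00002)

# Illustrative values only: roughly a 314-lux, D65-like environment.
amve = AmbientViewingEnvironment(3139000, 15635, 16450)
print(amve.lux())
print(amve.chromaticity())
```

With data this small, the cost of carrying it is negligible; the damage comes entirely from discarding it.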

Since 2022, Instagram has supported HDR video. But its encoding pipeline, built on FFmpeg, an open-source tool, stripped out both metadata types. The result? Pictures that looked nothing like their creators intended.

The fix required surgical precision across three stages of video processing. First, the team convinced FFmpeg’s developers to add amve support in 2024. The data proved remarkably consistent—every frame carried identical values—allowing a quick workaround whilst proper support landed.
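Because every frame carried identical values, a pipeline can safely read the metadata once and re-attach a single stream-level copy. A sketch of that consistency check, assuming per-frame side data has already been extracted into a list (the function name and data shape here are hypothetical, not Meta's actual code):

```python
def collapse_amve(per_frame_amve: list) -> dict:
    """Verify every frame carries the same amve values, then
    return a single stream-level copy of the metadata."""
    if not per_frame_amve:
        raise ValueError("no amve side data found")
    first = per_frame_amve[0]
    if any(frame != first for frame in per_frame_amve[1:]):
        # The shortcut is only valid while the metadata is constant.
        raise ValueError("amve values vary between frames")
    return first

frames = [{"illuminance": 3139000, "x": 15635, "y": 16450}] * 3
print(collapse_amve(frames))
```

The guard matters: if a future capture device ever varied amve per frame, a silent first-frame shortcut would reintroduce exactly the kind of rendering drift the fix was meant to remove.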

Dolby Vision proved trickier. iPhones encode video in HEVC format using profile 8.4, but Instagram delivers AV1 and VP9 codecs instead. Meta collaborated with Dolby and FFmpeg developers to implement profile 10, allowing Dolby Vision metadata to travel within AV1 streams. They also built custom extraction tools to feed metadata into Apple’s display layer, since Instagram decodes video independently rather than using Apple’s standard player.
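The core obstacle is that each Dolby Vision profile is tied to a base-layer codec, so the metadata cannot simply be copied from an HEVC stream into an AV1 one. A simplified lookup capturing the two profiles the article names (the table is a rough summary drawn from the facts above, not Dolby's full profile matrix):

```python
# Rough summary of the two Dolby Vision profiles in play here:
# profile 8.4 rides on an HEVC base layer (iPhone capture),
# profile 10 rides on an AV1 base layer (Instagram delivery).
DV_PROFILES = {
    "8.4": {"base_codec": "HEVC"},
    "10":  {"base_codec": "AV1"},
}

def can_carry_dv(profile: str, codec: str) -> bool:
    """A stream can natively carry Dolby Vision metadata only if
    the profile's base codec matches the delivery codec."""
    info = DV_PROFILES.get(profile)
    return info is not None and info["base_codec"] == codec

print(can_carry_dv("8.4", "AV1"))  # why re-muxing alone fails
print(can_carry_dv("10", "AV1"))   # the profile Meta adopted
```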

Then came a nasty surprise. Initial tests showed viewers watched less video with Dolby Vision enabled. The metadata added roughly 100 kilobits per second to file sizes—enough to slow loading times and send impatient scrollers elsewhere. Meta’s solution: implement a compressed format that slashed overhead to 25 kbps, requiring another 2,000 lines of code for compression and decompression.
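The bandwidth stakes are easy to make concrete. At the figures quoted, the uncompressed metadata costs roughly 750 KB per minute of video, versus about 188 KB after compression (a back-of-envelope calculation, treating 1 kbps as 1,000 bits per second):

```python
def metadata_bytes_per_minute(kbps: float) -> float:
    # kbps -> bits per second -> bytes, over 60 seconds
    return kbps * 1000 / 8 * 60

for label, rate in [("uncompressed", 100), ("compressed", 25)]:
    kb = metadata_bytes_per_minute(rate) / 1000
    print(f"{label}: {kb:.1f} KB per minute of video")
```

Per clip the savings look trivial, but multiplied across billions of daily plays on mobile connections, a 75 kbps reduction is the difference between a feature that ships and one that stalls the feed.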

The second test vindicated the effort. Viewers spent more time watching HDR videos, particularly in low-light conditions where proper metadata reduced eye strain. Instagram for iOS now delivers Dolby Vision metadata on all AV1 encodings derived from iPhone HDR uploads, making it Meta’s first app with full support. Facebook Reels is next in line.

The broader web remains a problem child. Browser and display support for Dolby Vision is patchy, meaning most readers cannot see the difference on this page. For that, you will need an iPhone and Instagram.

Three years to fix discarded data. But for anyone squinting at their screen in bed, scrolling through Reels at 2am, it matters rather a lot.

Meta plans 8,000 layoffs in new AI-led restructuring wave

First phase from May 20 may cut 10 per cent workforce amid AI pivot.

MUMBAI: At Meta, the future may be artificial but the cuts are very real. The social media giant is reportedly preparing a fresh round of layoffs, with an initial wave expected to impact around 8,000 employees as it doubles down on its artificial intelligence ambitions. According to a Reuters report, the first phase of job cuts is slated to begin on May 20, targeting roughly 10 per cent of Meta’s global workforce. With nearly 79,000 employees on its rolls as of December 31, the move marks one of the company’s most significant workforce reductions in recent years.

And this may only be the beginning. Sources indicate that additional layoffs are being planned for the second half of the year, although the scale and timing remain fluid and are likely to be shaped by how Meta’s AI capabilities evolve in the coming months. Earlier reports had suggested that total cuts in 2026 could reach 20 per cent or more of its workforce.

The restructuring comes as chief executive Mark Zuckerberg continues to steer the company towards an AI-first operating model, committing hundreds of billions of dollars to the transition. Internally, this shift is already visible: teams within Reality Labs have been reorganised, engineers have been moved into a newly formed Applied AI unit, and a Meta Small Business division has been created to align with broader structural changes.

The trend is hardly isolated. Across the tech sector, companies are trimming headcount while investing aggressively in automation. Amazon, for instance, has reportedly cut around 30,000 corporate roles, nearly 10 per cent of its white-collar workforce, citing efficiency gains driven by AI. Data from Layoffs.fyi shows over 73,000 tech employees have already lost jobs this year, compared with 153,000 in all of 2024.

For Meta, the move echoes its earlier “year of efficiency” in 2022–23, when about 21,000 roles were eliminated amid slowing growth and market pressures. This time, however, the backdrop is different. The company is financially stronger, generating over $200 billion in revenue and $60 billion in profit last year, with shares up 3.68 per cent year-to-date, though still below last summer’s peak.

That contrast underlines the shift underway. These layoffs are less about survival and more about reinvention. As Meta restructures itself around AI, from autonomous coding agents to advanced machine learning systems, the question is no longer whether the company will change, but how many roles will be left unchanged when it does.

Copyright © 2026 Indian Television Dot Com PVT LTD