iWorld
Hotstar and the art of managing traffic spikes
MUMBAI: Ajit Mohan sits back in his chair on the 26th floor of Urmi Estate in Mumbai, his chest swelling with pride as he reads what he has just posted on LinkedIn. “Reading about YouTube TV crashing for the England vs Croatia game and being reminded again that scaling for live is no accident. Feel proud about Hotstar Tech and our VIVO IPL 2018 scale,” the post states.
What the CEO of Hotstar is referring to is the huge spike of 10.33 million concurrent viewers that India’s leading video streamer handled during the IPL 2018 final.
“Over the past three and a half years, we have built live tech that is truly world class and that can handle massive surges in traffic. It is not just about scaling the video infrastructure; it is about making sure all parts of our tech can scale, including the gaming and social TV experience. I do think we have built something unique and special in live tech and we are proud that the bar has been set by an Indian service,” he says.
In fact, Ajit has over the past year or so invested heavily in tech resources – in terms of teams and in-house hardware, monitoring tools, and what have you. So much so that most of the Hotstar tech today is run by its own engineers with very little reliance on third parties.
Today Hotstar’s command centre in Mumbai hosts more than 100 techies, most of them youngsters between the ages of 23 and 35. “It’s the youngsters who are driving leapfrogs in innovation,” says Star India managing director Sanjay Gupta.
Cubicles are buzzing with data scientists, programmers and hardware and software geeks peering at screens, monitoring hotspots where traffic is unusually high and ensuring that Hotstar stays up at all times. “We want to be, and probably are, the gold standard in streaming experience – not just in India but in the world as well,” says Ajit.
It is this almost maniacal obsession with giving Hotstar users a consistent streaming experience while they are watching live cricket or shows from its linear channels that has made it the envy of the likes of Netflix, whose CEO Reed Hastings has referred to it on several occasions during investor calls and briefings.
Compare that with larger companies such as Optus down under, which simply collapsed, unable to bear the weight of a few thousand subscribers during the group phase of the FIFA World Cup 2018. Customers were subjected to repeated drop-outs or blurred, low-quality streaming, with the spinning progress wheel going on for minutes on end. They came out in hordes slamming the service, labelling it #FloptusSport. So much so that Optus was forced to turn off the pay button, give free access to subscribers until 31 August and even issue refunds. It will also be offering customers the first three rounds of the Premier League for free.
Another major which simply disintegrated during the current football frenzy was media tech titan Google’s YouTube TV, which costs viewers a hefty $40 a month. Customers were once again left frustrated when the service got logjammed, unable to handle the thousands of concurrent live streams. YouTube apologised profusely but to no avail. Soccer fans took it to the cleaners. Tweeted one of them: “..it’s completely down. If Google can’t keep it online in a surge like this, nobody can.”
Google engineers could probably knock on Hotstar’s door and learn a trick or two from Ajit and his tech team. That would likely give their customers a better video experience.
Akash Saksena, one of the Hotstar engineers, posted a blog on what went into making Hotstar the smooth streamer it turned out to be during the World Cup. Read on to find out more.

“Your cloud provider also has physical limits on how much you can auto-scale. Work with them closely and ensure you make the right projections ahead of time. Even then, nothing can make it better for you if you are inefficient per server. This calls for rabid tuning of all your system parameters. Moving from development to production environments requires knowledge of what hardware your code will run on and then tweaking it to suit that system. Be lean on your single server and yield results with more room to scale horizontally. Review all your configurations with a fine-tooth comb; it’ll save you the blushes in production. Each system must be tuned specifically to the traffic pattern and hardware you choose to run it on.

Three Pillars
Our platform has three core pillars: the subscription engine, the metadata engine and our streaming infrastructure. Each of these has unique scale needs and was tweaked separately. We built pessimistic traffic models for each of these, on the basis of which we came up with ladders that controlled server farms depending on the estimated concurrency. Knowing what your key pillars are and what kind of patterns they are going to experience is pivotal when it comes to tuning. One size does not fit all.

Once Only
Scaling effectively at such a scale means that you drive away as much traffic as possible from the origin servers. Depending on your business patterns, using caching strategies on the serving layer as well as smart TTL controls on the client end can give breathing room to your server systems.

Reject Early
Security is a key tenet, and we leverage this layer to also drive away traffic that doesn’t need to come to us at the top of the funnel.
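The “Once Only” idea above — answer repeated requests from a cache so they never reach the origin servers — can be sketched in a few lines. This is a minimal, hypothetical illustration (the class and key names are invented for the example, not Hotstar’s actual code):

```python
import time

class TTLCache:
    """Tiny in-memory cache with per-entry expiry.

    Illustrative sketch only: serve repeated requests from the cache
    so that within one TTL window the origin is hit exactly once.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}          # key -> (value, expiry timestamp)
        self.origin_hits = 0      # how often we had to go to origin

    def get(self, key, fetch_from_origin):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]                        # cache hit, origin untouched
        self.origin_hits += 1
        value = fetch_from_origin(key)             # cache miss: hit origin
        self._store[key] = (value, now + self.ttl)
        return value

# Ten identical requests cost the origin only one fetch.
cache = TTLCache(ttl_seconds=5)
for _ in range(10):
    cache.get("match/scorecard", lambda k: {"runs": 173})
print(cache.origin_hits)  # -> 1
```

At streaming scale the same principle sits in the CDN/serving layer rather than in application memory, but the trade-off is identical: a longer TTL buys more origin headroom at the cost of staler data.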
Using a combination of whitelisting and industry best practices, we drive away a lot of spurious traffic up front.

The Telescope
Like any other subscription platform, we’re ultimately beholden to the processing rates that our payment partners provide us. Sometimes during a peak, this might mean adding a jitter to our funnel to allow customers to follow through at an acceptable rate, enabling a higher success rate overall. Again, these funnels/telescopes are designed keeping in mind the traffic patterns that your platform will experience. Often these decisions will need to involve customer-experience changes to account for being gentler on the infrastructure.

The Latency Problem
As the leading OTT player in India, we’ve been steadily making improvements to our streaming infrastructure. It remains a simple motto: leaner on the wire, faster than broadcast. As simple as this sounds, it’s one of the most complex things to get right. Through the year we have brought down our latency from roughly 55s behind broadcast to approximately 15–20s behind broadcast, and to only a couple of seconds behind on our redone web platform. This was the result of highly meticulous measurement of how much time each segment of our encoding workflow took, and then tweaking operations and encoder settings to do better. We did this by profiling the workflow to instrument each segment. This is another classical tenet: tuning cannot happen without instrumentation. We continue to tweak bit-rate settings to provide an uncompromised experience to our customers while at the same time being efficient in bandwidth consumption for Indian conditions. Lower latencies and smarter use of player controls also help produce smoother traffic patterns, as fewer customers repeat the funnel, which can otherwise cause a lot of ripple through the whole system with its retries and the consequent additional events that pass through it.
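The “telescope” described above — pacing a burst of checkout attempts so the payment partner sees a steady, jittered flow rather than a spike — can be sketched as a simple admission schedule. The function and parameter names here are hypothetical, chosen for the example:

```python
import random

def plan_admissions(num_customers, max_per_second, jitter_seconds=0.5, seed=42):
    """Spread a burst of checkout attempts over time.

    Illustrative sketch only: instead of letting a peak slam the
    payment partner, each customer gets an admission delay so the
    funnel flows at roughly max_per_second, plus a small random
    jitter so admissions do not arrive in lockstep.
    """
    rng = random.Random(seed)
    schedule = []
    for i in range(num_customers):
        base_delay = i / max_per_second          # pace at the partner's rate
        jitter = rng.uniform(0, jitter_seconds)  # desynchronise arrivals
        schedule.append(base_delay + jitter)
    return schedule

# A burst of 1,000 customers is stretched over roughly 20 seconds.
delays = plan_admissions(num_customers=1000, max_per_second=50)
```

The customer-experience cost the blog mentions follows directly: a later admission slot means a visible wait in the payment flow, which is why such throttles usually come with a queueing or progress UI.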
Server Morghulis (aka Client Resiliency)
The Hotstar client applications have been downloaded multiple hundred million times so far. Suffice to say that when game time comes, millions of people are using Hotstar as their primary screen. Dealing with such high concurrency means that we cannot think of a classical coupling of client with back-end infrastructure. We build our client applications to be resilient and to degrade gracefully. While we maintain a very high degree of availability, we also prepare for the worst by reviewing all the client–server interactions and either indicating gracefully that the servers are experiencing high load or flipping one of a variety of panic switches in the infrastructure. These switches indicate to our client applications that they should ease off momentarily, using either exponential back-off or sometimes a custom back-off depending on the interaction, so as to build a jitter into the system that gives the back-end infrastructure time to heal. While the application has many capabilities, our primary function is to render video to our customers reliably. If things don’t look completely in control, specific functionality can degrade gracefully and keep the primary function unaffected. Ensure that the primary function always works and ensure resiliency around server touch-points. Not every failure is fatal, and using intelligent caching with the right TTLs can buy a lot of precious headroom. This is an important tenet.
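The exponential back-off with jitter that the blog describes — clients easing off so millions of retries do not land in lockstep — has a standard shape. A minimal sketch (function names and defaults are illustrative assumptions, not Hotstar’s actual values):

```python
import random

def backoff_delays(attempt_count, base_seconds=1.0, cap_seconds=30.0, seed=7):
    """Exponential back-off with "full jitter".

    Illustrative sketch only: each retry attempt may wait up to
    base * 2**attempt seconds (capped), and the actual wait is drawn
    uniformly at random from that window, so a fleet of clients
    spreads its retries out instead of hammering the servers together.
    """
    rng = random.Random(seed)
    delays = []
    for attempt in range(attempt_count):
        ceiling = min(cap_seconds, base_seconds * (2 ** attempt))
        delays.append(rng.uniform(0, ceiling))  # full jitter: uniform in [0, ceiling]
    return delays

# Five retry attempts: ceilings grow 1s, 2s, 4s, 8s, 16s; waits are randomised.
delays = backoff_delays(5)
```

A server-driven “panic switch” fits naturally on top: the response can carry a hint that tells the client to switch from this default schedule to a custom, gentler one for that particular interaction.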
eNews
How short, addictive story videos quietly colonised the Indian smartphone
A landmark Meta-Ormax study of 2,000 viewers reveals a format that is growing fast, paying slowly and consumed almost entirely in secret
MUMBAI: India has a new entertainment habit, and it arrived without anyone really noticing. Micro dramas, those short, cliffhanger-driven episodic stories built for the smartphone screen, have quietly embedded themselves into the daily routines of millions of Indians, discovered not by design but by algorithmic accident, watched not in living rooms but in bedrooms, on commutes and in the five minutes before sleep.
That, in essence, is the finding of a sweeping new audience study released by Meta and media insights firm Ormax Media at Meta’s inaugural Marketing Summit: Micro-Drama Edition. Titled “Micro Dramas: The India Story” and based on 2,000 personal interviews and 50 depth interviews conducted between November 2025 and January 2026 across 14 states, it is the most comprehensive study of the category in India to date, and its findings are striking.
Sixty-five per cent of viewers discovered micro dramas within the last year. Of those, 89 per cent stumbled upon the format through social media feeds, primarily Instagram and Facebook, without ever searching for it. The algorithm did the heavy lifting. Discovery, as the report puts it bluntly, is algorithm-led, not intent-led.
The typical viewer journey begins with accidental exposure while scrolling, moves through a cliffhanger-driven incompletion hook that makes stopping feel unfinished, and is reinforced by algorithmic repetition until habitual consumption sets in. Only then, when a platform asks for an app download or a payment, does the viewer pause. Trust, not content quality, determines what happens next, and many simply return to the free feed rather than pay. It is a funnel with a wide mouth and a narrow neck.
The numbers on consumption tell their own story. Viewers spend a median of 3.5 hours per week watching micro dramas, spread across seven to eight sessions of roughly 30 minutes each, peaking sharply between 8pm and midnight. Daytime viewing is snackable and low-commitment, squeezed into morning commutes, work breaks and coffee pauses. Night-time is where the format truly lives: private, uninterrupted and, for many viewers, socially invisible. Ninety per cent watch alone, compared to just 43 per cent for long-form OTT content. Half the audience watches during their commute, well above the 37 per cent figure for streaming platforms, a direct reflection of the format’s low time investment advantage.
The audience itself breaks into three segments. Incidental viewers, comprising 39 per cent of the total, are passive consumers who stumble in and rarely seek content actively. Intent-building viewers, the largest group at 43 per cent, are beginning to form habits and seek out episodes but remain cautious. High-intent viewers, just 18 per cent, are the ones who download apps, tolerate ads and occasionally pay: skewing male, younger and urban.
What audiences want from the content is revealing. The top three genres are romance at 72 per cent, family drama at 64 per cent and comedy at 63 per cent, precisely the same top three as Hindi general entertainment television. The format rewards emotional familiarity over complexity. Romance in particular thrives because it demands low cognitive investment, needs no elaborate world-building and plays naturally into the private, pre-sleep viewing window where inhibitions lower and emotional intimacy feels safe.
The most-recalled shows, led by Kuku TV titles such as The Lady Boss Returns, The Billionaire Husband and Kiss My Luck, share a common narrative DNA: rich-poor conflict, hidden identities, power imbalances, melodrama and cliffhangers that make stopping feel physically uncomfortable. Predictability, the research warns, is fatal. Each episode must re-earn attention from scratch.
The terminology question is telling. Despite the industry’s embrace of the phrase “micro drama,” viewers have not adopted it. They call the content “short story videos,” “short dramas,” “reels with stories” or simply “serials.” One respondent from Chennai said bluntly that “micro sounds like a scientific word.” The category is at the stage that OTT occupied in 2019 and podcasts in the same year: widely consumed, poorly named and not yet crystallised in the public imagination.
Platform awareness remains alarmingly thin. Only three platforms, Kuku TV at 78 per cent, Story TV at 46 per cent and Quick TV at 28 per cent, have crossed the 20 per cent awareness threshold. The rest languish in single digits. This creates a trust deficit that directly throttles monetisation: viewers who cannot remember which app they used are hardly primed to enter their payment details.
Yet the appetite is clearly there. Sixty-five per cent of viewers watch only Indian content, drawn by the TV-serial familiarity of the storytelling, the comfort of Hindi as a shared language and the sight of actors they half-recognise from decades of television. South languages are rising fast: Tamil, Telugu and Kannada together account for 24 per cent of first-choice viewing. And AI-generated content, still a novelty, has landed better than expected: 47 per cent of viewers call it creative and unique, with only 6 per cent actively rejecting it.
Shweta Bajpai, director, media and entertainment (India) at Meta, called micro drama “a category that is rewriting the rules of Indian entertainment,” adding that the discovery engine being social distinguishes this wave from previous content formats. Shailesh Kapoor, founder and chief executive of Ormax Media, was characteristically measured: the format, he said, is showing “the early signs of becoming a distinct content category” and, given how closely it aligns with natural mobile behaviour, “has the potential to scale very quickly.”
The format’s fundamental mechanics are working. It enters lives quietly, through boredom and a scrolling thumb, and burrows in through incompletion and habit. The challenge now is monetisation: converting a category of highly engaged but deeply anonymous viewers into paying customers who trust the platform enough to hand over their UPI credentials. The story, as any micro-drama writer knows, is only as good as the next cliffhanger. India’s platforms had better have one ready.