The Tech Giants Got Found Liable. Gaming Should Be Paying Attention.

27 March 2026

This week, two juries in the United States found Meta and Google liable for harm caused to young users. A Los Angeles jury awarded $6 million after finding Instagram and YouTube contributed to a young woman’s depression and anxiety — the woman in question had started using YouTube at age six and Instagram at nine. In a parallel case, a New Mexico jury ordered Meta to pay $375 million for enabling child exploitation on its platforms.

Meta and Google will appeal. They’ll argue, as they have throughout, that teen mental health is too complex to pin on a single app. But the verdicts exist. Juries — real people, not tech-friendly judges — looked at the evidence and said: you knew what you were doing, and you did it anyway.

Here’s why anyone who plays or makes games should be paying close attention.


Gaming Is Already in the Conversation

The games industry has long argued it’s categorically different from social media — and in many ways it is. But that distinction is getting harder to maintain in courtrooms and government offices.

George E Osborn, editor of the Video Games Industry Memo and author of the upcoming book Power Play, made the point clearly on BBC Radio 5 Live this morning. “Whether they like it or not, games are in the social media conversation right now,” he said. “The UK’s social media ban consultation explicitly says that some gaming sites are going to be examined.”

His analogy is a useful one: traditional games are like sporting stadiums — tightly controlled, clear about what’s happening inside. Social media platforms are like cities — vast, difficult to police, impossible to fully monitor. Most games sit firmly in the stadium category. But some don’t.

And therein lies the rub, as a fellow once said.


The Split That’s Coming

There are essentially two kinds of games now, and they probably shouldn’t be treated the same way in law or regulation.

The first is what most people think of when they think of games: you buy it, you download it, you play it. Single player or multiplayer, with a clear beginning, middle and end. Call of Duty. Grand Theft Auto. Space Invaders. The game wants you to finish it. It’s designed around your enjoyment, not your retention.

The second is the live service model — Fortnite being the obvious example. Always online, constantly updated, deeply social, with in-app purchases, battle passes, and engagement loops designed by the same behavioural psychologists who designed Instagram’s infinite scroll. These games don’t want you to finish. They want you to never leave.

Osborn thinks the vast majority of the industry won’t be caught up in incoming regulation. “Anything that looks and feels like a game first is probably not going to get wrapped up in this,” he said. Games with minimal or no user-to-user communication are unlikely to face the same scrutiny as social media platforms.

But platforms that have voice chat, text chat, the ability to create experiences, or anything that starts to look more like a communication platform than a game? Those are going to get a much harder look. And he wouldn’t rule out specific platforms and games being highlighted as things to watch.


Loot Boxes Are Already Changing

The clearest sign that the regulatory tide is turning is what’s happening at PEGI — the European age rating system you see on game boxes. PEGI has recently updated its approach to rate what it calls the “context” of video games: how much money you spend, who you talk to, and whether you feel pressured to play.

Loot boxes — the slot machine mechanic that’s made certain publishers billions — are now going to carry a 16 rating as standard across Europe. Not because the game itself is violent or adult in nature, but because the monetisation mechanic is considered harmful enough to restrict to older players.

That’s a significant shift. The games industry spent years arguing that loot boxes weren’t gambling. Some of those arguments held in court. But the Meta verdict isn’t about gambling — it’s about deliberately engineered compulsive behaviour. And that’s a much harder argument to win.


The Roblox Question

The most directly relevant gaming case isn’t in a courtroom yet — but it’s coming. More than 130 lawsuits are currently pending against Roblox, accusing the platform of failing to protect children from sexual exploitation. Roblox denies the claims.

But the legal framework that allowed the Meta and Google cases to proceed to trial — specifically, the argument that platform design choices constitute product liability rather than just content moderation — is exactly the framework Roblox’s accusers are using. Wednesday’s verdicts established that framework as something juries are willing to accept.

Roblox is a game. It’s also a platform where children create and share experiences, communicate with strangers, and spend real money. By Osborn’s stadium/city analogy, it’s very much a city.


What the Industry Could Learn

There’s a version of this story where the games industry gets unfairly caught in the crossfire of a fight that’s really about Instagram and TikTok. The genuine creativity, passion, and craft that go into making games — and the millions of people who play them healthily and happily — deserve to be protected from regulatory overreach.

Osborn made the point that games developers are different from social media executives in one important way: they took a pay cut to do what they do. People go into games because they love games. That passion matters, and it often shows in the product.

But passion doesn’t exempt anyone from the consequences of bad design decisions. The companies that built predatory monetisation systems — that deliberately used variable reward psychology to extract money from children, that designed engagement loops specifically to prevent players from stopping — made choices. Those choices are now being examined with the same scrutiny that Facebook’s algorithm is facing in a Los Angeles courtroom.

The simplest test for any games company right now is also the oldest one: is what you’re building good for the person playing it? Not good for retention metrics, not good for average revenue per user, not good for daily active users. Good for the actual human being sitting in front of the screen.

The industry that can honestly answer yes to that question has very little to worry about. The industry that can’t should probably be paying close attention to what just happened in Los Angeles. The beginning of the end or the end of the beginning?


Sources: NPR, Reuters, BBC Radio 5 Live. George Osborn’s book Power Play is forthcoming.