Opinion

Your body, your work?

What Hollywood’s scanning of actors could mean for the rest of us

by Alvaro M. Bedoya

Twenty years ago, Jet Li was offered a part in “The Matrix Reloaded.” It was sure to be a hit. But the martial artist said no. In his words, the producers wanted to “record and copy all of my moves into a digital library.” To Li, this would do more than copy his body; it would capture his trade. “I’ve been training my entire life,” he said. “And we martial artists could only grow older. Yet they could own [my moves] as intellectual property forever.”

Jet Li’s account is jarring. Yet it’s notable that he was told the job would involve body scanning, told how and why he would be scanned and who would own the data — and then was given the option to accept or decline.

Recent social media posts from background actors who underwent full-body scans describe a different experience. Many say they were surprised with scans once they were already on set. Many were told little to nothing about how the scans would be used or what rights, if any, they would retain. One actor claims he was scanned in the nude after being threatened with termination if he did not comply. Another says he was blackballed after asking for written assurance that the data would be used only for the episode for which he’d been hired.

Screenwriters worry about their work being digitally repurposed. They warn that the writers’ room may soon become a single writer’s desk, with one writer hired to polish up a first draft produced by ChatGPT or another large language model. Striking actors and writers are now demanding strict control over the use of AI in their negotiations with the studios.

Your body, your work: Who should decide whether those are used to train a for-profit algorithm — you? Or someone else?

The background actors’ accounts raise issues that go far beyond the film industry. In fact, they are among the first disputes in a much longer fight to control the distinguishing features of what makes us human.

Warnings about imminent AI-related job losses abound, but most experts agree that jobs involving social and emotional intelligence will be harder for AI to crack. I worry that what’s happening in the entertainment industry is part of a broader effort to digitize and appropriate our capacity for human connection, starting with precisely the workers who have the least power to say no. If the outcome of the current negotiations doesn’t do enough to protect workers, it could pave the way for a much more permissive attitude toward these kinds of scans in corporate America.

For example, a home health care company or a day care could in the future decide to record and analyze caregivers’ interactions with their clients, down to their smiles and laughs. These jobs don’t come with big paychecks. But they do represent entry-level service work that, up until now, has proved resilient to digitization.

This may seem far-fetched, but automated emotion analysis programs have been run on customer service calls for years. And more people have their images in AI training data than you may think.

The algorithms behind the current wave of generative AI applications require vast amounts of data for training. Rather than training them on the entire Internet, however, their creators train them on subsets of data copied, or “scraped,” from it.

One of those datasets was amassed by LAION, a German nonprofit organization. Last year, LAION released a collection of 5.8 billion image and text pairs culled from an even larger dataset of web snapshots prepared by a separate nonprofit, Common Crawl. Despite LAION’s recommendation that the dataset “should only be used for academic research purposes,” it appears to have been used by some of the largest for-profit generative AI companies for image generation applications.

I searched for my own name in that data and found that my own images, including one for which I hold the copyright, have been included in the database. Then I searched the database with a photo of my wife and children. I did not find them, but I did find a gallery full of other women surrounded by other happy toddlers.
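
The mechanics of that first search are mundane. LAION’s releases are accompanied by metadata tables pairing each scraped image’s URL with its caption, so a name search is little more than a string match over the captions. The Python sketch below shows roughly how that might look; the shard file name and the “URL”/“TEXT” column names are assumptions, as the exact layout varies across releases.

    # A minimal sketch of a caption search over a scraped image-text
    # dataset such as LAION. Assumes a locally downloaded parquet
    # metadata shard; the file name and column names ("URL", "TEXT")
    # are illustrative and may differ across releases.
    import pandas as pd

    SHARD = "laion_metadata_shard_00000.parquet"  # hypothetical local file

    df = pd.read_parquet(SHARD)

    # Case-insensitive match of a name against the caption column.
    name = "Alvaro Bedoya"
    hits = df[df["TEXT"].str.contains(name, case=False, na=False)]

    # Each hit pairs a caption with the URL of the scraped image.
    for _, row in hits.iterrows():
        print(row["URL"], "->", row["TEXT"][:80])

The second search, by photo rather than by name, works differently: public search tools for datasets like LAION typically compare an uploaded image against precomputed embeddings of the collection rather than matching caption text, which is why a family photo can surface look-alike strangers instead of exact matches.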

I doubt that every adult in those photos knew that their images, and the children’s, were being used to train powerful AI systems that are generating profits for their creators. There is no easy way to determine the full set of people whose images the LAION database includes, or even which AI companies are using it.

The actors and writers on strike aren’t just concerned about their faces. They are worried about how artificial intelligence will be used on their work. Over the last few months, many artists have been shocked to discover that their own copyrighted works have been used to train generative AI systems. Some photographers made similar discoveries when those systems generated digital replicas of their work — with copyright-related watermarks intact.

As a commissioner of the Federal Trade Commission, I am not charged with adjudicating intellectual property disputes. I am, however, charged with protecting competition, and I believe that some of the strikers’ claims raise competition concerns that must be taken seriously and that have implications far beyond entertainment.

Unlike other laws, the Federal Trade Commission Act doesn’t enumerate, item by item, what it prohibits. In 1914, Congress refused to list out the “unfair methods of competition” that it would ban. In 1948, the Supreme Court dryly explained that the mandate was left broad on purpose: there “is no limit to human inventiveness,” and so “a definition that fitted practices known to lead towards an unlawful restraint of trade today would not fit tomorrow’s new inventions.”

Later cases make clear that the FTC Act may prevent a powerful market participant from forcing a weaker one to act against its own interests, particularly when doing so significantly reduces competition.

When I hear allegations of background actors being effectively required to submit to full-body scans, or of writers potentially being required to feed their scripts into proprietary AI systems, those practices strike me as more than inventive.

Alvaro M. Bedoya is a commissioner at the Federal Trade Commission.