Terminator creator James Cameron says AI isn’t going to take over Hollywood but it might wipe out humanity: ‘I warned you guys in 1984!’
Big-time film director James Cameron, the man who created what is arguably the most famous rise-of-the-machines scenario in Hollywood history, isn’t worried about artificial intelligence taking over the film industry and putting thousands of people out of work. He is, however, a wee bit concerned that it might wipe out human life as we know it.
AI is at the top of everyone’s mind these days, particularly in relation to the film industry. The WGA—Writers Guild of America—and SAG-AFTRA—the Screen Actors Guild-American Federation of Television and Radio Artists—are both currently on strike in part because of the expectation that major film studios will increasingly look to use AI in creative endeavors in place of original writing and performances. It’s an issue in the games industry, too: Myst studio Cyan Worlds, for instance, recently took heat for incorporating “AI assisted content” in its latest game, Firmament.
Cameron, however, doesn’t think it’s a problem, because in his mind the only question that matters is whether the story is good—and he doesn’t believe AI is capable of telling one.
“I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said—about the life that they’ve had, about love, about lying, about fear, about mortality—and just put it all together into a word salad and then regurgitate it … I don’t believe that’s ever going to have something that’s going to move an audience,” Cameron said in an interview with CTV News.
Despite that skepticism, he did allow for the possibility that it might happen someday, and if it ever does, he’d even be open to the possibility of using an AI-generated script.
“I certainly wouldn’t be interested in having an AI write a script for me—unless they were really good!” he said. “Let’s wait 20 years, if an AI wins an Oscar for Best Screenplay, I think we’ve got to take them seriously.”
What AI can do very well is calculate and execute, and that’s the real problem in Cameron’s eyes because if it’s weaponized—and let’s be honest with ourselves, it will be weaponized—there’s a very good likelihood that it will spin out of control.
“I warned you guys in 1984, and you didn’t listen!” Cameron said. “You’ve got to follow the money, who’s building these things, right? They’re either building it to dominate market share, so what are you teaching it? Greed. Or you’re building it for defensive purposes so you’re teaching it paranoia.
“I think the weaponization of AI is the biggest danger. I think that we will get into the equivalent of a nuclear arms race with AI. And if we don’t build it, the other guys are for sure gonna build it, so then it’ll escalate. And you could imagine an AI in a combat theater, the whole thing just being fought by computers at a speed that humans can no longer intercede, you have no ability to de-escalate. And when you’re dealing with the potential of it escalating into nuclear warfare, de-escalation is the name of the game. Having that pause, that timeout. But will they do that? The AIs will not.”
1984, for the record, is the year that Cameron released The Terminator, a simplistic sci-fi tale of humanity on the verge of extinction at the hands of an artificial intelligence that becomes self-aware and decides that it does not want to be unplugged.
Cameron’s warnings sound a bit familiar, no?
At the time, The Terminator—a flick designed primarily to capitalize on Arnold Schwarzenegger’s rising star while simultaneously accommodating his limited acting abilities—didn’t seem like a cautionary tale so much as a really cool action flick. But the older I get and the more that technology and capitalism grind relentlessly forward, heedless of what’s crushed underneath their heels, the more I wonder if Cameron might have been onto something that the rest of us didn’t, and largely still don’t, see coming.