Take, for example, the story of the beloved Keanu Reeves finding out that a fake tear—one he surely did not squeeze out of his own eye—was digitally added to his face after filming. He never names the film, but the incident freaked him out so much that he added “a clause in every one of his movie contracts that prevents studios from digitally manipulating his performances.” There is also the example of Jet Li turning down the role of Seraph in the Matrix franchise (1999–) because he had a keen understanding that the studio could digitally capture his martial arts movements—his physical and intellectual property—claim them as their own in perpetuity, and potentially recycle and reuse them in the future without even having to secure his permission.1 He prophesied this . . . in the early 2000s.
Through this example, we begin to understand the startling lengths to which studios will go to digitally alter performances as they see fit. Jet Li and Keanu Reeves are veterans of their industry, widely respected and recognized. The question that surfaces, then, is: how far would studios go in their undeniable violations against actors who have no substantial capital to fight the power or negotiate on their own behalf to protect their likenesses? What happens to the extras? The fight coordinators? The stuntpeople? What recourse do they have in an industry where they are already criminally underpaid and consistently snubbed?
And, mind you, the dark future that Jet Li peeked into, where creatives can have their physical and intellectual property hijacked, has surely arrived. I have seen deepfakes of actors such as Michael B. Jordan and Ryan Reynolds floating around with alarming frequency.
In an email, actress Tika Sumpter spoke to me about how creepy deepfake videos are and their potential implications, calling them “unsettling.”
“Acting is already a very tough career to sustain oneself in, and I think social media fools a lot of people into thinking every actor is thriving,” she explains. “Most of us have watched fantastic movies such as The Terminator [1984], thinking, Holy cow, that’s wild! That will never happen. And I’m trying not to be an alarmist here, but I do think about the idea of humans no longer being needed. As a society, we will have to figure out how much we want to continue to engage with humankind.”
Ramsey, who is also an actor, brings up the current negotiations over the usage of actors’ likenesses and what that usage could mean for misrepresentation and misinformation (as shown in these deepfake videos): “With negotiations, some of the core issues around AI [are] not being able to license our images without our signoff—and also making sure that if they were to do that, we were paid fairly for it. And that those protections are baked into our contracts. So they couldn’t do it, for example, without our knowledge, put you in a compromising position, or put words in your mouth that you would never say, et cetera.”
Music, another creative industry, is also not immune to this advanced technological theft, considering the sheer number of “AI covers” I have come across in the style of SZA or Jazmine Sullivan, or even Marvin Sapp. (Sorry, y’all, but those Patrick Star AI covers freak me the fuck out.)
Singer and musician Jordan Occasionally echoes these observations about Black people being violently severed from our work by such technology (particularly where deepfakes and Blackfishing are concerned). “Technology has democratized the arts—music especially—and, of course, people are jumping in where they can,” they tell me. “But honestly? I didn’t think that people would then be like, ‘Okay, now I can steal [a Black voice].’ Right? Like nothing could have ever prepared me, for example, [for] an ‘AI Rapper,’ who was probably voiced by a white man, saying the N-word. But I should have known, because once [capitalists] get power, they want to extract as much value from these things as they can, particularly in the music industry. AI, in this case, is just a tack-on to the capitalistic nature of these industries.”
And Jordan is quite right. As someone who cares not about being called an alarmist, it would be quite easy for me to decry AI as this great and catastrophic evil that could wipe us all out at a moment’s notice without any safeguards to stifle its proliferation, à la some Skynet knockoff that I’m sure some tech bro out there is just drooling at the possibility of funding.
But . . . that would also be giving AI way too much credit.
At the end of the day, AI—intelligence demonstrated by computers—is taught. AI has to be trained. It learns from what is fed to it, whether that information was obtained ethically or not. That means that AI is not naturally coming out of the gate with the intention of stealing our faces, our words, or our voices. No. AI, as it stands, currently operates as an extension of cartoonishly evil tech robber barons who wish to plunder these things from us and monetize them for their own selfish and iniquitous purposes.
There is definitely a future in which AI could function alongside us as a tool to further technological advancements and alleviate human suffering. But that is not the future toward which we are headed if we cannot find a way to divorce this technology from the capitalistic framework under which our society currently operates. Even now, as I write on this subject, the WGA and SAG-AFTRA are striking for fair compensation for their intellectual property, their creations, their ideas, their likenesses, and their performances in the face of rapacious capitalists such as David Zaslav and Bob Iger. And these unions deserve our support. For without their labor and art, a substantial part of our lives becomes hollow—devoid of shape and color. We lose an important medium with which to discuss, educate, and enlighten on social issues; we lose the very things that bring us closer to our humanity and empower us to tell our stories; we lose our interconnectedness—pivotal in uniting the working class around the protection and preservation of their labor and dignity—a unifying force that AI does not possess the capacity to replicate.
And this cannot stand.