TV’s Age of Algorithm Anxiety


The determinism-driven paranoia of Devs dances through Season 3 of Westworld, which for the first time leaves the park of the show’s title to explore what real life looks like in a world that abuses robots. The first two seasons of Westworld explored the idea that the AI “hosts” were being tortured—not only by the physical and sexual violence they endured at the hands of the park’s glumly sadistic visitors, but by the scripted narratives, or story loops, that suppressed their agency, their ability to think or feel for themselves outside of their coding. In Season 3, whose tagline is “Free will is not free,” the show suggests for the first time that the robots aren’t the only ones whose lives conform to existing scripts. One of the new villains in the first four episodes is Serac (played by the French actor Vincent Cassel), a reclusive trillionaire who’s found an algorithm that uses data to predict the future for every human on Earth.

The show’s creators, Lisa Joy and Jonathan Nolan, seem bruised by some of the criticism of the second season, with its indecipherable strata of conceits involving the Man in Black, the mythical game-within-a-game, the potential immortality of both humans and robots, and a timeline jumpier than a flea circus. So in Season 3, everything is markedly different. In a futuristic, Blade Runner–esque Los Angeles, a new character, Caleb (played by Breaking Bad’s Aaron Paul), tries to navigate life as a veteran, unfulfilled by his day job as a construction worker and disaffected by his nighttime activities of app-enabled petty crime. Through Caleb, Westworld suggests that humans outside the park are being manipulated into following the same prewritten paths, the same “tramlines,” as Dolores (Evan Rachel Wood), Maeve (Thandie Newton), and the other hosts.

Determinism aside, this is a zanier, sillier Westworld, and much more entertaining for it. When Caleb successfully commits crimes, his app tells him, “You made bank, now get drank.” Self-flying drones transport the ludicrously wealthy from one rooftop poolside martini bar to another. Dolores apparently escaped Westworld at the end of Season 2 to joyride motorcycles in scarlet bandage dresses, as if she’d somehow teleported into the Fast and the Furious franchise. Marshawn Lynch plays a new character whose T-shirt acts as a mood ring, spelling out whether he’s angry or amused or bored.

But through it all, the show’s anxiety about free will—and its extension of that conundrum to its human characters—is apparent. That everything takes place in a near-future landscape complete with product placement for Coach and Tory Burch only makes the show’s willingness to untangle subliminal cues more ironic. How much do we actually decide things for ourselves, Westworld wonders, and how much are we steered by the systems around us? “You and I are a lot alike,” Dolores tells Caleb in one scene. “They put you in a cage, Caleb. Decided what your life should be. They did the same thing to me.” If Devs is angsting over the moral and existential significance of technology that still seems—for now—out of reach, Westworld is taking the data mining and user profiling of contemporary life to its logical, dystopian extreme. The requisite “We’re not so different, you and I” speech here comes not between hero and villain, but between human and robot.
