Cinema is big. It’s the Oscars that got small.

From the Big Screen to the Smallest: The Oscars and the Final Lament for Cinema

In 1929, the Academy Awards were born alongside the consolidation of cinema as the defining art form of the twentieth century. The Oscars did not merely honor motion pictures; they sanctified the big screen as a cathedral of light where stories were projected larger than life, and where audiences gathered together in reverent silence to be transformed. Nearly a century later, the announcement that the Oscars will move to YouTube in 2029 feels less like an adaptation and more like a capitulation. It’s a moment of inflection that reads, unmistakably, as a eulogy.

Anyone who has followed my work on Twitter or my blog for any length of time knows that I effectively gave up on the Oscars years ago. Even so, this announcement demands cultural analysis and reflection on its deeper implications. One needn’t be a devoted viewer of the ceremony to recognize the ongoing erosion of cinema itself; disengagement does not preclude clear sight, and distance often sharpens it.

There is a morbid irony in a ceremony created to celebrate cinema’s grand scale choosing to live on the smallest screen possible. The Oscars migrating to YouTube is not simply a platform change; it is a symbolic reversal of values. The institution that once affirmed spectacle, patience, and collective experience now aligns itself with the very medium that played a decisive role in cinema’s metaphoric death—fragmented attention, algorithmic taste-making, and content flattened into disposable scrolls. What was once king has voluntarily donned the motley of the court jester.

For decades, the Oscars functioned as a kind of cultural mass. Even when ratings declined, the ceremony retained its claim to seriousness. It insisted—sometimes stubbornly—that movies mattered, that craft mattered, that the labor of hundreds could still culminate in something worthy of ritual. To move this rite to YouTube is to concede that cinema no longer warrants ceremony at all. It is now content, indistinguishable from reaction videos, vlogs, and monetized outrage. The awards will play not to the gods of light and shadow, but to the lowest common denominator of engagement.

This decision cannot be disentangled from the broader arc traced in the manuscript I am presently writing, Are You Still Watching? Solving the Case of the Death of Cinema, the follow-up to my book Monsters, Madness, and Mayhem: Why People Love Horror, which releases in October 2026. The internet did not merely change distribution; it reprogrammed desire. It replaced anticipation with immediacy, reverence with irony, and stars with personalities. The movie star—once a distant, luminous figure whose very remoteness fueled myth—has been rendered obsolete by constant access (except for you, Tom Cruise, the last remaining movie star in the classical sense). When everyone is visible at all times, no one can remain larger than life. In this sense, the internet did not just kill the movie star; it dismantled the conditions required for stardom to exist.

The Golden Era understood something we have since forgotten: limitation creates meaning. The big screen mattered because it was rare. The theatrical experience mattered because it demanded surrender—of time, of attention, of comfort. The Oscars mattered because they crowned achievements that could not be reduced to metrics. Box office was discussed, but it did not dictate value. Craft, risk, and ambition still held currency. One cannot imagine the architects of Hollywood—those who built studios, nurtured stars, and believed in cinema as a national dream—viewing this moment without despair. The roll call of names etched into Oscar history now echoes like a rebuke.

The move to YouTube completes a long erosion. First came the shrinking theatrical window, then the dominance of streaming, then the rebranding of films as “content.” Each step was defended as pragmatic, inevitable, even democratic. Yet inevitability is often the language of surrender. By placing the Oscars on YouTube, the Academy signals that it no longer believes cinema deserves its own stage—literal or metaphorical. It accepts, finally, that movies are just another tile in the feed.

What makes this moment especially tragic is that it arrives cloaked in the rhetoric of accessibility. YouTube promises reach, youth, relevance. But to what end and at what cost? Cinema was never meant to be optimized for virality. Its power lay in duration, in immersion, in the audacity to ask audiences to sit still and feel deeply. An awards show on YouTube does not elevate cinema to the digital age; it drags cinema down to the logic of the internet, where attention is fleeting and meaning is provisional. Whatever the algorithm rewards will dictate the ceremony and its pageantry.

And yet, this lament is not without pride. There was a time when this industry truly was an industry of dreams. When the Oscars crowned films that expanded the language of the medium. When a win could alter a career not through branding, but through trust—trust that audiences would follow artists into challenging territory. That history cannot be erased by an algorithm, even if it can be buried beneath one.

If the Oscars moving to YouTube does not signal the death of cinema, it is difficult to imagine what would. It is the final nail not because it kills something vibrant, but because it seals a coffin long prepared. What remains will continue to exist—films will still be made, awards will still be handed out—but the animating belief that cinema is a singular, communal art form has been surrendered.

The tragedy is not that the Oscars will stream on YouTube. The tragedy is that, in doing so, they admit they no longer know what they are mourning.

This loss of self-knowledge did not arrive overnight. Long before the platform shift, the ceremony began to erode its own authority through an increasing embrace of socio-political posturing by hosts and award recipients alike. What was once a night dedicated, however imperfectly, to the celebration of films, performances, and craft gradually transformed into a sequence of soapboxes. The Oscars mistook moral exhibitionism for relevance, and in doing so alienated a broad public that tuned in not for lectures, but for an affirmation that movies themselves still mattered.

This is not an argument against artists holding convictions, nor a denial that cinema has always intersected with politics. Rather, it is an indictment of a ceremony that lost the discipline to distinguish between art and advocacy. When acceptance speeches routinely overshadowed the work being honored, the implicit message was clear: the films were secondary. Viewers responded accordingly. Ratings declined not merely because of streaming competition, but because the ceremony no longer respected its own premise. Had hosts and winners remained anchored in the films—celebrating storytelling, performance, direction, and the collaborative miracle of production—the Oscars might have retained their standing as a cultural commons rather than a partisan spectacle.

In surrendering the focus on cinema itself, the Academy weakened the very case for its continued relevance.

Progress is often invoked as an unqualified good, but history suggests it is more accurately understood as an exchange—one that invariably involves loss. Sometimes that “loss” isn’t felt immediately, but there is inevitably some mild, moderate, or significant loss somewhere. Every cultural advance carries a cost, and the measure of true progress lies in whether what is gained outweighs what is surrendered. In the case of the Oscars, the pursuit of modernity, relevance, and moral signaling came at the expense of gravitas, neutrality, and shared cultural meaning. What was gained—momentary applause within narrow circles, fleeting relevance in the news cycle—proved insufficient compensation for what was lost: broad public trust, ceremonial dignity, and the sense that this night belonged to everyone who loved movies, not just those who spoke the loudest.

When institutions confuse change with improvement, they often wake to find that they have survived only in form, not in spirit.

Taken together, the Oscars’ decline follows a macabre logic: a ceremony founded to exalt scale, craft, and collective experience gradually surrendered its authority by de-centering movies themselves—first through moral grandstanding, then through technological appeasement, and finally through full assimilation into the internet’s attention economy. Each step was justified as necessary, inclusive, or inevitable. Yet the cumulative effect was corrosive. The Oscars did not lose relevance because audiences abandoned cinema; audiences abandoned the ceremony because it no longer stood for cinema as something distinct, demanding, and worthy of reverence.

What remains is a hollowed-out ritual, stripped of its gravitational pull, migrating to YouTube not as a bold reinvention but as an admission of defeat. The move completes the journey from cathedral to feed, from shared cultural moment to algorithmic afterthought. It confirms that the Academy has chosen survival at the cost of meaning—and in doing so, has preserved the shell of the institution while relinquishing its soul.

Gloria Swanson’s Norma Desmond, reflecting on the industry’s changing fortunes, once delivered an epitaph that now feels uncomfortably prophetic: “I am big. It’s the pictures that got small.” A century after the birth of the Oscars, her words resonate with renewed clarity. Cinema did not shrink because audiences demanded less; it shrank because its stewards accepted less.

The Oscars’ migration to the smallest screen is not progress; it’s the final confirmation that something vast, communal, and luminous has been allowed to diminish, and that what replaced it was not worth the cost. A ceremony that no longer centered movies should not have been surprised when audiences stopped gathering to watch it. The move to YouTube, then, feels less like a sudden betrayal and more like the logical endpoint of a long retreat: from celebration to commentary, from reverence to rhetoric, from a shared night at the movies to just another argument in the feed.

Ryan is the general manager for 90.7 WKGC Public Media and host of the show ReelTalk “where you can join the cinematic conversations frame by frame each week.” Additionally, he is the author of the upcoming film studies book titled Monsters, Madness, and Mayhem: Why People Love Horror. After teaching film studies for over eight years at the University of Tampa, he transitioned from the classroom to public media. He is a member of the Critics Association of Central Florida and Indie Film Critics of America. If you like this article, check out the others and FOLLOW this blog! Follow him on Twitter: RLTerry1 and LetterBoxd: RLTerry

CLUE 40th Anniversary

40 Years Later, It’s Still One of the Smartest Comedies Ever Made From One of the Dumbest Possible Premises.

Clue (1985) somehow caught lightning in a bottle and has held onto it for four decades; that same lightning was shaken and thrown against the silver screen in the most delightfully chaotic ways imaginable. Forty years later, this all-star murder mystery based on the classic board game remains sharper, funnier, and more lovingly crafted than most prestige comedies released today. What should have been a disposable novelty became a masterpiece of comedic architecture, tonal discipline, and ensemble chemistry. I first discovered it on VHS from my local public library, and even then I knew I had stumbled onto something special. My sister loves it as much as I do. It’s a movie that works on you—and then keeps working every time you revisit it.

For my show ReelTalk on WKGC Public Media this week, I invited returning guest and friend of the show, film critic Sean Boelman, to join me in celebrating Clue’s 40th anniversary. You can listen to the show by clicking the appropriate link below. My article captures the highlights of what Sean and I discuss, but if you listen to the show after reading it, you’ll have a much more robust experience!

At its core, Clue commits fully to three things most comedic mysteries never attempt at the same time: total absurdity, airtight plotting, and theatrical precision. Most films in the genre pick one lane—either slapstick, or clever mystery, or witty farce—but Clue weaves them together with an elegance that belies how frantic the movie feels moment to moment. Unlike many modern adaptations drowning in CGI, brand synergy, or self-aware winking, Clue treats its ludicrous premise with the sincere craftsmanship of an Agatha Christie play; yet its apparatus is actually closer to board game play than to the typical Christie literary machinery. The humor is character-driven, rooted in rhythm, timing, and razor-sharp verbal dexterity. That sincerity, combined with its unhinged heart, is why the film remains timeless.

Much of Clue’s durability stems from how it uses language as a weapon. This is not a movie relying on boardgame nostalgia or shallow references; it is powered by dense wordplay, screwball pacing, and overlapping exchanges that feel plucked from a stage farce running at espresso speed. Every performer is asked to treat their lines with theatrical precision. The jokes arrive in layers, often stacked on top of each other, rewarding audiences who pay attention and enhancing the comedy with every rewatch. By grounding the absurdity in craft—rather than irony—the film avoids collapsing into randomness. It feels smart, not silly; intentional, not accidental. Humor this tightly constructed simply does not age.

Another reason the film works: it respects the genre it’s parodying. Clue doesn’t mock murder mysteries from a distance. It commits to the melodrama, the red herrings, the stakes—even as it gleefully skewers them. Parody only works when sincerity lies beneath the joke. Modern adaptations often fail because they either drown in self-awareness or cling to seriousness so tightly the comedy feels bolted on. Clue threads the needle by honoring the mechanics of a whodunit while joyfully stretching them to the breaking point. It loves the sandbox it’s playing in, and the audience can feel that affection.

Of course, the film’s most unforgettable asset is its ensemble cast, which may be one of the best comedic troupes ever assembled on screen. These are character actors trained in theater, sketch, and improv—who understand timing and ensemble harmony better than any star-studded ensemble today. Tim Curry’s manic precision, Madeline Kahn’s volcanic eccentricity, Michael McKean’s brilliant awkwardness, Lesley Ann Warren’s slinky aloofness—every actor is distinct, yet completely in tune with the film’s wavelength. No one competes for the spotlight; instead, every moment becomes a relay race of comedic energy. Modern ensemble films often feel like stitched-together “bits.” Clue feels alive, reactive, and musical. It is an ensemble in the purest sense.

And then, of course, there are the multiple endings—a theatrical gamble so audacious it could have sunk the film entirely. Instead, it became an iconic part of its identity. In 1985, you never knew which ending you’d get in theaters, a cheeky nod to the board game’s replayability. Instead of feeling gimmicky, it felt organic to the world of the film—a natural extension of its playful tone and farcical structure. Today, a studio would almost certainly turn the idea into a marketing ploy or streaming bonus feature, but in Clue, the endings are crafted with sincerity and precision, not cynicism. They’re not content strategy; they’re punchlines.

The film’s simplicity is another key to its longevity. Where modern game adaptations inflate themselves into lore-heavy franchises, Clue keeps everything contained in one house with one group of increasingly frantic characters. The mansion becomes a pressure cooker where personality collisions become the main spectacle. No elaborate world-building, no digital spectacle—just smart writing, sharp performances, and a commitment to letting the humor build naturally. The film’s scale is its strength.

Would Clue still find an audience today? Absolutely—although probably through a different path. Theatrical comedy has become a rare species, and a film this verbally dense might struggle to secure screen space. But word of mouth would spread like wildfire, and social media would turn its most quotable lines into instant memes. If anything, its intelligence, compact scope, and genuine ensemble work would feel refreshingly rebellious in today’s IP-heavy landscape.

What ultimately makes Clue endlessly rewatchable—more than contemporaries like Knives Out—is that it’s a comedy first and a mystery second. The joy doesn’t hinge on solving the puzzle; it hinges on watching these characters unravel in the most glorious fashion. Puzzles fade with familiarity. Brilliant performances only deepen. The more you watch Clue, the funnier it becomes.

So what is Clue’s greatest legacy? It proved something rare: that a film can be wildly silly and intellectually sharp at the same time. It’s a miracle of tonal balance, ensemble synchronicity, and writerly discipline. A movie that treats its audience with respect even as it descends into delightful chaos. A movie that should have been forgotten…yet became unforgettable.

Forty years later, Clue remains the gold standard—not because it adapts a board game faithfully, but because it transcends one. It is lightning in a bottle. And every time we open that bottle, the spark still flies.

WICKED: FOR GOOD movie musical review

Some movies soar on broomsticks; this one never quite gets off the ground.

Wicked: For Good arrives with sky-high expectations, a beloved Broadway pedigree, and a cinematic world forever shaped by the 1939 Wizard of Oz. And while the heart for the material is undeniably present—director Jon M. Chu’s affection radiates through nearly every frame—the execution is fraught with problems that prevent the film from casting the spell it so eagerly attempts. It’s a movie overloaded with spectacle yet starved of narrative discipline, regrettably proving that sometimes a production can have all the right ingredients and still mix the potion incorrectly. There’s no question Jon M. Chu loves this material—his enthusiasm is evident. But passion alone isn’t enough. The film desperately needed stronger producing and organizational forces to ground the project, refine its pacing, and balance its emotional register. Instead, we get a production that feels at once over-managed and under-shaped.

Now demonized as the Wicked Witch of the West, Elphaba lives in exile in the Ozian forest, while Glinda resides at the palace in Emerald City, reveling in the perks of fame and popularity. As an angry mob rises against the Wicked Witch, she’ll need to reunite with Glinda to transform herself, and all of Oz, for good.

The most glaring issue in this movie is the pacing. This story never needed to be two movies. One Broadway show, one complete screen adaptation—simple math. Instead, Wicked and Wicked: For Good, collectively, feel like a single narrative forcibly stretched and compressed simultaneously. Scenes either end abruptly or linger with self-importance, giving the whole film a stop-and-start rhythm that betrays any emotional momentum. Moments that should breathe are suffocated, while others that should be tightened sprawl endlessly. Narratively, the film leans heavily on contrivances rather than character and plot development. Plot turns feel telegraphed or unearned, creating a sense that events are happening because the script demands it—not because the characters have earned the journey. Emotional beats are pushed rather than developed; the film tugs at heartstrings it hasn’t taken the time to weave. Many sequences feel manipulative instead of meaningful, leaving the viewer aware of the strings being pulled rather than swept up in the melody.

The film maintains the emotional equivalent of flooring the accelerator from beginning to end. Everything is heightened, everything is urgent, everything is presented at maximum volume. Without quieter resets, the story becomes exhausting rather than exhilarating. The lack of modulation leaves little room for nuance, making even potentially impactful moments blur together into one extended crescendo.

And then there’s the Oz problem itself. It was bad enough in the first movie, but this installment amplifies the flaw. From the opening Universal logo and Wicked title card, both stylized to resemble their 1930s counterparts, it’s clear the film wants to position itself adjacent to the classic Wizard of Oz. (And yes, I am aware that the Broadway show is based on the books and not the 1939 classic, but this is a screen adaptation that is, by default, connected spiritually and literally to the events, imagery, and characterizations of the original movie. But I digress.) Whenever Wicked intersects with that iconic imagery, the visual and narrative disconnect is jarring. Tonally, textually, and aesthetically, nothing matches. Two of the most egregious examples are the Wicked Witch of the West’s castle, a location fundamentally misaligned with its 1939 counterpart in both history and design, and Glinda’s bubble. Hello??? Glinda is clearly a magical being who travels by a magical bubble; to rob her of those elements is to rob her of her original characterization. For a film so eager to evoke nostalgia, its disregard for consistency with cinema’s most beloved fantasy feels baffling.

The editing is among the film’s most distracting flaws—awkwardly timed transitions, uneven scene construction, and moments that feel spliced for convenience rather than cohesion. The cinematography dazzles with color and movement but contributes little to storytelling. It’s all flash, no narrative substance: beautiful images that ultimately amount to little more than digital confetti. And we cannot talk editing without addressing the cringeworthy CGI, the kind of digital spectacle that feels less like movie magic and more like a rough animatic accidentally exported at full resolution. Emerald City looks less like a tangible place and more like a high-end screensaver—everything polished to a rubbery sheen, with no texture, grit, or atmospheric depth. Characters often appear detached from their surroundings, as if composited into a digital diorama rather than inhabiting a lived-in world. Instead of mixing practical sets with digital enhancements, the film leans heavily on full-CG environments and even characters, so that both high-octane and intimate moments feel artificial. The fantasy world reads like a giant animated backdrop with actors placed in front of it rather than a world that feels tangible.

Not even the presence of Michelle Yeoh is enough to elevate the film’s sense of class or gravitas. Although, it’s hard to blame her, given that she’s phoning in a performance built on scraps of narrative substance. In this second installment, her character is little more than an ornament of prestige, offering neither meaningful development nor any real impact on the story. Jeff Goldblum, likewise, delivers a surprisingly muted turn, coasting on his trademark charisma without ever fully engaging. When two performers known for commanding the screen seem this disengaged, it speaks less to their abilities and more to a film that gives them virtually nothing with which to work.

Wicked: For Good reaches for greatness but ultimately fails to stick the landing. It’s a film overflowing with heart yet undercut by structural missteps, contrived plotting, mismatched continuity, and a visual approach that prizes spectacle over substance. For a story about defying gravity, it’s ironic that this adaptation never quite lifts off the ground.

NOW YOU SEE ME: NOW YOU DON’T movie review

The real magic is in how this movie made it past opening night.

“He was an illusionist…he wasn’t a very good illusionist.” Although that comedic line is delivered by Madeline Kahn in CLUE (1985), it seems rather fitting for the third installment in the Now You See Me series. Now You See Me: Now You Don’t is a cinematic magic trick without magic—an illusion performed with all the spectacle of a streaming original and none of the wonder that made the first film such a delightful surprise. What should have been a clever caper wrapped in misdirection, sleight of hand, and showmanship instead plays like an inflated pilot episode for a franchise desperate to convince us it still has something up its sleeve.

The Four Horsemen and a new generation of illusionists try to bring down a worldwide criminal network.

Let’s begin with the one element that is genuinely spellbinding: the château sequence. The production design on display here is fantastic—among the best of the year. Every room, corridor, and shadow-drenched chamber seems crafted with the meticulous eye of an artisan. Sadly, aside from the technical achievement, this setting is little more than a backdrop to the action within its labyrinthine corridors. In another movie, this location could have been a character unto itself; here, it’s merely an exceptionally beautiful stage for an otherwise uninspired performance. Still, credit where credit is due: the château alone might be the only reason this movie deserves to be seen anywhere larger than a laptop screen.

The returning cast—those legacy performers who anchored the earlier installments—slip back into their roles with charm and chemistry. Woody Harrelson has delivered entertaining performances ever since his days on Cheers, and he does so again here. The rapport among the legacy cast is believable, a once-close group of friends who haven’t seen one another in a decade, even if the screenplay underserves them. And yes, Morgan Freeman’s cameo is a genuine delight, a sprinkle of prestige the film desperately needed. But not even Freeman can pull a rabbit out of this hat.

Unfortunately, the three new teenage cast members derail what little fun the film attempts to muster. Obnoxious, self-righteous, and utterly allergic to accountability, they embody the worst tendencies of modern franchise youthification. The film props them up as the only ones who can “fix the world,” all while they display a profound lack of understanding of that very world’s complexities. It’s a toxic ideological cocktail—one part hubris, one part naïveté, shaken vigorously and served without nuance. Their presence doesn’t invigorate the franchise; it infantilizes it.

An overview of the plotting reveals that Now You Don’t contains more plot holes and narrative gaps than the O.J. jury was willing to ignore. Motivations shift without cause, twists are telegraphed from miles away, and the screenplay is so preoccupied with its social commentary that it forgets to construct a believable story around it. The message—about the wealthy exploiting the poor—is noble in theory but executed with such superficiality that it borders on parody. It’s activism-by-template, the cinematic equivalent of most “thoughts and prayers” tweets.

Worse, the film contains no meaningful tension. It coasts on comic-book logic without embracing the fun of comic-book storytelling. Stakes evaporate. Consequences vanish. Nearly every set piece feels contrived rather than orchestrated. Magic, by its nature, requires misdirection, timing, and a suspension of disbelief; this film offers none of those. What should have been a thrilling high-wire act is instead a leisurely stroll with training wheels–or perhaps a trite-cycle of a movie.

Now You See Me: Now You Don’t is a movie that wants to dazzle but barely flickers. It lacks cinematic gravitas, emotional investment, and narrative cohesion. And much like a forgettable card trick, it should ultimately disappear from cinemas—preferably before anyone attempts to resurrect this franchise again. It’s time to vanish.

PREDATOR: BADLANDS movie review

Predator: Let’s Play. When streaming content hits the big screen.

Predator: Badlands is the equivalent of a “Let’s Play,” but with bigger explosions. The nonstop action and constant motion offer little to no substantive emotional investment. You’re an observer, not a participant—which might be fine for streaming, but it’s a strange fit for cinema. The latest in the Predator franchise plays like a two-hour sizzle reel with delusions of grandeur. It’s a glossy barrage of explosions, digital dust, and quippy one-liners that evaporate before they even hit the floor. By the time the credits roll—that is, if you haven’t fallen asleep—you’ve seen everything and felt nothing. It’s not that the film is aggressively bad; it’s that it’s aggressively empty: little more than content that panders to short attention spans with shiny movement instead of meaningful momentum.

Cast out from its clan, a Predator and an unlikely synthetic ally embark on a treacherous journey in search of the ultimate adversary.

The screenplay feels like it was written by an algorithm trained on reaction videos and Reddit threads. Every line of dialogue sounds like a placeholder; it’s as if someone wrote “We’ll fix it later” or “funny line here” and never returned to the page before principal photography. There’s no sense of escalation, tension, or rhythm; it’s a series of flashy moments loosely stitched together, like a highlight reel of a game you didn’t play. Even the humor feels synthetic—much like the characters—punching at air instead of connecting with character or tone.

As for the characters, they exist mostly as camera targets. They are little more than digital avatars running, shooting, and shouting for reasons that never feel personal or compelling. The lead could be replaced by a different actor mid-film and you might not notice. This critic isn’t even convinced that Dek (our central Predator character) wasn’t entirely CGI, though it may have only been the facial area. “What’s my motivation?” Difficult to say–there wasn’t much upon which to build. Motivations are paper-thin, arcs nonexistent. The Predator itself, once a symbol of primal fear and unseen menace, now feels like a boss-level NPC waiting to be triggered by the next quick-time event.

Visually, Badlands has all the spectacle money can buy; but its spectacle is divorced from any meaningful purpose. The explosions are massive, the sound mix thunderous, and yet it’s as emotionally engaging as watching someone else play Call of Duty. Every frame screams “look at me!” without ever inviting you to feel something. The editing, too, is manic. And it’s not even as though the narrative demanded it; rather, the dynamic editing was most likely employed because the movie was terrified that you’d look away or down at your watch, which I did several times.

And maybe that’s the point. Predator: Badlands is far less a movie than a cinematic exercise in a large-scale “Let’s Play.” For those unfamiliar with the term, a Let’s Play is a type of video, usually on YouTube, of someone playing a video game, often with their reactions to the gameplay. Think of it as a passive experience of someone else’s thrill ride, the ultimate, disconnected form of living vicariously. Don’t question anything, because it won’t take long to realize that this movie is hollow. You don’t engage; you just witness. The irony is that the film could’ve been a fascinating critique of screen-mediated experiences, but it never once stops to think.

This is just the latest in a growing trend from Disney’s genre arm: a reliance on brand nostalgia and visual polish in place of storytelling. Ever since the corporate appetite turned to IP recycling, the studio has mistaken familiarity for depth. Badlands is what happens when you try to “optimize engagement” instead of crafting a narrative, resulting in the film equivalent of clickbait dressed in billion-dollar armor.

Predator: Badlands doesn’t so much hunt its audience as chase its own tail. It is fast, flashy, and utterly pointless; it desperately wants to go viral but forgets to be cinema. You don’t leave exhilarated; you leave wondering if you accidentally spent $15 to watch a YouTube compilation in IMAX.
