Cinema is big. It’s the Oscars that got small.

From the Big Screen to the Smallest: The Oscars and the Final Lament for Cinema

In 1929, the Academy Awards were born alongside the consolidation of cinema as the defining art form of the twentieth century. The Oscars did not merely honor motion pictures; they sanctified the big screen as a cathedral of light where stories were projected larger than life, and where audiences gathered together in reverent silence to be transformed. Nearly a century later, the announcement that the Oscars will move to YouTube in 2029 feels less like an adaptation and more like a capitulation. It’s a moment of inflection that reads, unmistakably, as a eulogy.

Anyone who has followed my work on Twitter or my blog for any length of time knows that I effectively gave up on the Oscars years ago. Even so, this announcement demands cultural analysis and reflection on its deeper implications. One needn’t be a devoted viewer of the ceremony to recognize the ongoing erosion of cinema itself; disengagement does not preclude clear sight, and distance often sharpens it.

There is a morbid irony in a ceremony created to celebrate cinema’s grand scale choosing to live on the smallest screen possible. The Oscars migrating to YouTube is not simply a platform change; it is a symbolic reversal of values. The institution that once affirmed spectacle, patience, and collective experience now aligns itself with the very medium that played a decisive role in cinema’s metaphoric death—fragmented attention, algorithmic taste-making, and content flattened into disposable scrolls. What was once king has voluntarily donned the motley of the court jester.

For decades, the Oscars functioned as a kind of cultural mass. Even when ratings declined, the ceremony retained its claim to seriousness. It insisted—sometimes stubbornly—that movies mattered, that craft mattered, that the labor of hundreds could still culminate in something worthy of ritual. To move this rite to YouTube is to concede that cinema no longer warrants ceremony at all. It is now content, indistinguishable from reaction videos, vlogs, and monetized outrage. The awards will play not to the gods of light and shadow, but to the lowest common denominator of engagement.

This decision cannot be disentangled from the broader arc traced in the manuscript I am presently writing, Are You Still Watching? Solving the Case of the Death of Cinema, the follow-up to my book Monsters, Madness, and Mayhem: Why People Love Horror, releasing in October 2026. The internet did not merely change distribution; it reprogrammed desire. It replaced anticipation with immediacy, reverence with irony, and stars with personalities. The movie star—once a distant, luminous figure whose very remoteness fueled myth—has been rendered obsolete by constant access (except for you, Tom Cruise; you are the last remaining movie star in the classical sense). When everyone is visible at all times, no one can remain larger than life. In this sense, the internet did not just kill the movie star; it dismantled the conditions required for stardom to exist.

The Golden Era understood something we have since forgotten: limitation creates meaning. The big screen mattered because it was rare. The theatrical experience mattered because it demanded surrender—of time, of attention, of comfort. The Oscars mattered because they crowned achievements that could not be reduced to metrics. Box office was discussed, but it did not dictate value. Craft, risk, and ambition still held currency. One cannot imagine the architects of Hollywood—those who built studios, nurtured stars, and believed in cinema as a national dream—viewing this moment without despair. The roll call of names etched into Oscar history now echoes like a rebuke.

The move to YouTube completes a long erosion. First came the shrinking theatrical window, then the dominance of streaming, then the rebranding of films as “content.” Each step was defended as pragmatic, inevitable, even democratic. Yet inevitability is often the language of surrender. By placing the Oscars on YouTube, the Academy signals that it no longer believes cinema deserves its own stage—literal or metaphorical. It accepts, finally, that movies are just another tile in the feed.

What makes this moment especially tragic is that it arrives cloaked in the rhetoric of accessibility. YouTube promises reach, youth, relevance. But to what end, and at what cost? Cinema was never meant to be optimized for virality. Its power lay in duration, in immersion, in the audacity to ask audiences to sit still and feel deeply. An awards show on YouTube does not elevate cinema to the digital age; it drags cinema down to the logic of the internet, where attention is fleeting and meaning is provisional. Whatever the algorithm demands will dictate the ceremony and its pageantry.

And yet, this lament is not without pride. There was a time when this industry truly was an industry of dreams. When the Oscars crowned films that expanded the language of the medium. When a win could alter a career not through branding, but through trust—trust that audiences would follow artists into challenging territory. That history cannot be erased by an algorithm, even if it can be buried beneath one.

If the Oscars moving to YouTube does not signal the death of cinema, it is difficult to imagine what would. It is the final nail not because it kills something vibrant, but because it seals a coffin long prepared. What remains will continue to exist—films will still be made, awards will still be handed out—but the animating belief that cinema is a singular, communal art form has been surrendered.

The tragedy is not that the Oscars will stream on YouTube. The tragedy is that, in doing so, they admit they no longer know what they are mourning.

This loss of self-knowledge did not arrive overnight. Long before the platform shift, the ceremony began to erode its own authority through an increasing embrace of socio-political posturing by hosts and award recipients alike. What was once a night dedicated, however imperfectly, to the celebration of films, performances, and craft gradually transformed into a sequence of soapboxes. The Oscars mistook moral exhibitionism for relevance, and in doing so alienated a broad public that tuned in not for lectures, but for an affirmation that movies themselves still mattered.

This is not an argument against artists holding convictions, nor a denial that cinema has always intersected with politics. Rather, it is an indictment of a ceremony that lost the discipline to distinguish between art and advocacy. When acceptance speeches routinely overshadowed the work being honored, the implicit message was clear: the films were secondary. Viewers responded accordingly. Ratings declined not merely because of streaming competition, but because the ceremony no longer respected its own premise. Had hosts and winners remained anchored in the films—celebrating storytelling, performance, direction, and the collaborative miracle of production—the Oscars might have retained their standing as a cultural commons rather than a partisan spectacle.

In surrendering the focus on cinema itself, the Academy weakened the very case for its continued relevance.

Progress is often invoked as an unqualified good, but history suggests it is more accurately understood as an exchange—one that invariably involves loss. Sometimes that “loss” isn’t felt immediately, but there is inevitably some mild, moderate, or significant loss somewhere. Every cultural advance carries a cost, and the measure of true progress lies in whether what is gained outweighs what is surrendered. In the case of the Oscars, the pursuit of modernity, relevance, and moral signaling came at the expense of gravitas, neutrality, and shared cultural meaning. What was gained—momentary applause within narrow circles, fleeting relevance in the news cycle—proved insufficient compensation for what was lost: broad public trust, ceremonial dignity, and the sense that this night belonged to everyone who loved movies, not just those who spoke the loudest.

When institutions confuse change with improvement, they often wake to find that they have survived only in form, not in spirit.

Taken together, the Oscars’ decline follows a macabre logic—a ceremony founded to exalt scale, craft, and collective experience gradually surrendered its authority by de-centering movies themselves—first through moral grandstanding, then through technological appeasement, and finally through full assimilation into the internet’s attention economy. Each step was justified as necessary, inclusive, or inevitable. Yet the cumulative effect was corrosive. The Oscars did not lose relevance because audiences abandoned cinema; audiences abandoned the ceremony because it no longer stood for cinema as something distinct, demanding, and worthy of reverence.

What remains is a hollowed-out ritual, stripped of its gravitational pull, migrating to YouTube not as a bold reinvention but as an admission of defeat. The move completes the journey from cathedral to feed, from shared cultural moment to algorithmic afterthought. It confirms that the Academy has chosen survival at the cost of meaning—and in doing so, has preserved the shell of the institution while relinquishing its soul.

Gloria Swanson’s Norma Desmond, reflecting on the industry’s changing fortunes, once delivered an epitaph that now feels uncomfortably prophetic: “I am big. It’s the pictures that got small.” A century after the birth of the Oscars, her words resonate with renewed clarity. Cinema did not shrink because audiences demanded less; it shrank because its stewards accepted less.

The Oscars’ migration to the smallest screen is not progress; it’s the final confirmation that something vast, communal, and luminous has been allowed to diminish, and that what replaced it was not worth the cost. A ceremony that no longer centered movies should not have been surprised when audiences stopped gathering to watch it. The move to YouTube, then, feels less like a sudden betrayal and more like the logical endpoint of a long retreat: from celebration to commentary, from reverence to rhetoric, from a shared night at the movies to just another argument in the feed.

Ryan is the general manager for 90.7 WKGC Public Media and host of the show ReelTalk “where you can join the cinematic conversations frame by frame each week.” Additionally, he is the author of the upcoming film studies book titled Monsters, Madness, and Mayhem: Why People Love Horror. After teaching film studies for over eight years at the University of Tampa, he transitioned from the classroom to public media. He is a member of the Critics Association of Central Florida and Indie Film Critics of America. If you like this article, check out the others and FOLLOW this blog! Follow him on Twitter: RLTerry1 and LetterBoxd: RLTerry

“A Quiet Place” horror film review

Heart-pounding. Spine-chilling. A creepy creature-feature that will leave you speechless. The demonstrable excellence in terrifying visual storytelling can effectively be summed up by the queen of silent film herself, Norma Desmond: “we didn’t need dialogue, we had faces” (Sunset Boulevard). A Quiet Place truly earns its place among “certified fresh” horror films. Not since Don’t Breathe and 10 Cloverfield Lane have I encountered such a thrillingly intelligent motion picture. Writer-director John Krasinski’s post-apocalyptic horror masterpiece showcases the power of visual storytelling within the horror genre. Furthermore, Krasinski brilliantly channeled the soul of the iconic (mostly Universal Pictures) silent and early horror films for his modern interpretation of the creature-feature. No gimmicks here. Only an incredible, immersive cinematic experience built upon the foundation of a simple plot with simple limitations. Simple plot, complex characters. That basic screenwriting principle is where so many filmmakers and writers go astray. Film is a visual medium, often supported by well-crafted, lean dialogue, and this film has visual storytelling in spades. This film represents one of the best examples of embracing the concept of “show, don’t tell.”

Shhhh. Don’t make a sound. One family finds themselves surviving a post-apocalyptic world now inhabited by an alien species that hunts by sound.

There has certainly been a resurgence of exceptional horror films over the last few years. I mentioned Don’t Breathe and 10 Cloverfield Lane earlier; we also have the Academy Award-nominated Get Out from last year and many others. While many may shrug their shoulders at horror because it is a proliferated genre with many cheap, tawdry horror flicks, this same genre can be incredibly intelligent in how it observes society and offers commentary, a new perspective, or a means to a discussion. Some of the most critically acclaimed films over the decades have been horror. Horror was among the first genres to be commercially released on film, and it has stood the test of time, providing audiences with an experience that challenges worldviews, provokes physiological responses, and fuels nightmares and imaginations.

One of the most brilliant aspects of A Quiet Place is the film’s innate ability to instantly hook the audience with loud silence. Going into the movie, audiences know that the arachnid-like creatures kill anything within earshot. Therefore, the audience hangs onto every bump, snap, or thud as the tension rises and suspense is drawn out to terrifying levels. Impeccable audience engagement. It takes a special kind of movie to completely immerse the audience into the world of the film in a multidimensional way. In terms of the viability of the film and cross-promotion, this movie certainly has what it takes to be a popular and successful adaptation for a house at Universal’s Halloween Horror Nights or Busch Gardens’ Howl-O-Scream. It definitely has a place among the best horror film experiences to date.

The successful suspense and tension building can be attributed to seldom getting a good look at the alien-arachnid-like creatures. Had the audience seen the creature repeatedly throughout the film, it would lose fright value. As Hitchcock stated, “there is nothing scarier than an unopened door.” Meaning, the filmmaker’s ability to transfer the terror on screen to the minds of the audience is far more powerful and impressive than relying upon on-the-nose scares and jump-scare gimmicks. Well-crafted suspense and rising tension carry far more weight and support a narrative much more effectively than a cheap scare. Although the atmosphere in this film may remind you of Don’t Breathe, and rightly so, Krasinski’s film does not quite measure up to the macabre, terrifying atmosphere that Fede Alvarez provided audiences; however, Krasinski’s A Quiet Place comes extremely close and deserves the accolades that it has received.

In terms of how to closely read A Quiet Place, the film provides exceptional social commentary on the perils of parenting and, by extension, protecting one’s offspring. In fact, I imagine that the experience for parents watching this film exceeds the levels of terror felt by those of us who do not have kids. There is also plenty of material on how far a parent is willing to go in order to protect their children. I also appreciate the film’s commentary on expectant mothers, and how they stop at nothing to protect their unborn children from that which seeks to do them harm. Responding to and working through grave tragedy is another heavy and shocking subject in the film. We all respond to death differently; many of us grieve differently than one another. Some bottle up all the negative feelings for fear of how to deal with them, and others blame themselves because they feel that something could have been done differently to protect a lost loved one. On a lighter note, the film also provides a metaphor for how to work with and handle your older kids when they seek to push the boundaries–boundaries that may be dangerous and place them in harm’s way. There is so much here to talk about, and I have only scratched the surface. That is why horror is the best genre for creatively exploring psycho-social constructs and other observations about humanity and the world in which we live.

Quietly make your way to your seat in the auditorium. A Quiet Place is definitely a film to be experienced on the big screen with a theatre full of others who seek to be frightened. Enjoy the refreshing originality of a film that could have so easily gone the way of so many other creature features that lack anything memorable and just blend into the background with countless others in this subgenre of horror. It may not have the well-defined external goal and end game of Don’t Breathe, but it is certainly exciting and fun! You’ll certainly be absorbed into this terrifying post-apocalyptic world, where YOU are afraid to go bump in the night.

“I hate that word [comeback]. It’s a return! …”

“…a return to the millions of people who have never forgiven me for deserting the screen.” A powerful line from the iconic Norma Desmond in Sunset Boulevard, but it also rings true for Michelle Pfeiffer, who is returning to the big screen following a self-imposed exile from Hollywood. After a long “famine” (the term Darren Aronofsky attributes to the Oscar-nominated actress’s absence), Pfeiffer is making a triumphant return to the big screen, and in BIG ways. Whether your favorite Pfeiffer performance is her universally acclaimed interpretation of Selina Kyle/Catwoman in Tim Burton’s Batman Returns or Elvira Hancock in Scarface, cinephiles and fans alike can agree that the big screen has missed Pfeiffer’s bold screen presence and incredible beauty. What makes Pfeiffer unique in the world of cinema is her ability to be incredibly ballsy and completely vulnerable at the same time. Few actresses possess the ability to be a tomboy one minute and the portrait of sensuality the next. Why would one of the brightest stars of 1980s and ’90s Hollywood slip away from the silver screen so conspicuously? The long and short of it is that she desired to make time to raise her children. In a rare interview with Vanity Fair, Pfeiffer stated that she required so many schedule and location accommodations to continue being a working mother that she became “unhireable.” Now that her children are grown and out of the house, she is ready to get back to work!

While many may be focusing on Pfeiffer’s return to the big screen–to movies that are a match for her talent–the larger picture here could be lost. Approaching 60, Pfeiffer is at the age when many actresses are either not hired as often or are placed in grandmother roles; however, she is busier than ever, and in high-profile roles in highly anticipated films. For the fans of her brilliant performance as the definitive Catwoman, she is returning to the superhero genre in the new Ant-Man and the Wasp, and more recently she commanded the screen in Murder on the Orient Express. Pfeiffer also told Variety that she would very much like to reprise her role as Catwoman in a future film, but without going to the lengths she had to before (citing placing a real bird in her mouth and the iconic sexy, but uncomfortable, costume). Pfeiffer’s return to the screen is a testament that Hollywood is beginning to recognize that older, established actresses are still bankable.

Pfeiffer comments that being an empty-nester has provided her with the push to get back out there. She wasn’t even sure that she would be able to step right back into acting; she often remarks that she sometimes feels like a fraud because she never received any formal training. Her rise from grocery store clerk to household name happened nearly overnight. Just goes to show that even though formal training and education are valuable tools in a show business professional’s tool belt, formal education itself does not an acclaimed actor make. Part of preparing to return to the superhero genre in Ant-Man and the Wasp has her poring over old comic books for her highest-profile role in more than a decade. It is clear from the few interviews Pfeiffer grants (she is self-admittedly scared of interviews) that her favorite role of her career IS Selina Kyle/Catwoman. Even today, she says that she is met by fans, young and old, of her work in that role. She quickly gives credit to Tim Burton, whose exceptional direction and creative genius were highly instrumental in what many critics call the Batman movie that typifies the franchise. So, her return to the superhero movie genre is one that is highly anticipated.

While she is excited to get back out there, she still admits that she will continue to be choosy in her roles. She is an actress who has to feel a connection to a character in order to bring it to life. Whereas before she turned down roles in The Silence of the Lambs and Thelma & Louise to make sure she had time to be a mom first and foremost, she will continue to exhibit her desire not simply to get out there and act again, but to thoroughly enjoy the characters she plays. Part of Pfeiffer’s timeless charm is her ability to be 100% sexy feminine and 100% humorous tomboy at the same time. It’s this dichotomy that gives Pfeiffer her unique blend of charisma and screen presence that commands your attention and makes her memorable. For all the qualities that make her the standout actress many of us love, she is equally humble and still learns from actresses like Judi Dench and others whom she continues to admire.

This past Halloween, I did my best to emulate her iconic Catwoman costume!

 

“Blade Runner” (1982) movie review

Still a visionary masterpiece? On the rare occasion that I do not feel compelled to see one of the weekly new releases, I enjoy taking my Thursday night and watching an older movie that would be fun to review. As it turns out, it dawned on me that I had never seen Ridley Scott’s neo-noir Blade Runner despite the fact that it is a critically acclaimed film and highly regarded by many of my contemporaries. I have found that sometimes you have seen clips, heard people reference a film, and simply heard the title so much that you think you have seen it. Then you realize that you’re familiar with the ideas, concept, or story but not the movie itself. So, I decided to watch it for Throwback Thursday and review it today. Unfortunately, I have been struggling to connect with the film the way so many other filmmakers and film lovers have. When watching a movie from 30+ years ago, I do my best to place myself in the shoes of the audience then. But I am having difficulty this time. As a peer-reviewed cinema researcher, I believe that no matter how old a film is, it should still be relevant and impact audiences many decades down the road. Truthfully, I am not entirely seeing why it is such a regarded film still to this day. However, it is definitely an artistic masterpiece due to the technical elements of the production. So in many ways, yes, it still IS an iconic visionary masterpiece; but it fails to connect or resonate with audiences today.

Travel to a dystopian Los Angeles in the year 2020, or present-day Detroit; take your pick. Many have fled the city for colonies on other planets or to the far north to escape the rampant chaos. In the early to mid 2010s, Tyrell Corporation invented Replicants (or human-like androids) to carry out menial tasks and hard labor in a modern slavery fashion. Each unit was programmed to last for a specific amount of time (roughly four years). When a small band of Replicants decides to take their lives into their own hands, they return to Earth from the planet they were slaving away on, determined to force Tyrell Corp to fix them. These Replicants, led by Roy (Rutger Hauer), will stop at nothing. Over the years, as the Replicants began to pose a threat to humanity, special operations forces known as Blade Runners were trained to “retire” the androids. Former Blade Runner Rick Deckard (Harrison Ford) has been reactivated and forced to retire the small band of Replicants that pose a threat in the already dystopian Los Angeles. Follow Deckard as he conducts an investigation and fears for his own life while he attempts to track down and “retire” the remaining Replicants before they achieve long-lasting life. All seems pretty routine until he encounters a special Replicant named Rachael (Sean Young) at Tyrell Corp.

It doesn’t take long to understand that this film is a neo-noir detective movie that takes place in a dystopian future. Neo-noir is regarded as a film noir-style movie produced after the classic film noir period (which was relatively short, roughly the 1940s–50s). This genre [although, technically, there is sufficient evidence to suggest that film noir is more of a style than a true genre] follows many of the same tropes and elements found in film noir (think Billy Wilder’s Sunset Blvd, Double Indemnity, classic detective movies, or Orson Welles). Often, the protagonist is a solitary individual who finds him or herself in over his or her head and faces or exhibits perpetual pessimism, fatality, or menace in a plot consisting of cynical attitudes and sexual motivations. From a technical perspective, film noir (or neo-noir) is stylistically dark, with high-contrast, low-key lighting, strategic shadows, and shots filled with symbolism and dichotomy. The plots are usually slow burning and contain social commentary or a self-reflexive narrative. Once analyzed as a neo-noir, this movie becomes more fascinating but still lacks the timelessness found in some of the examples mentioned earlier in this paragraph. As an artistic film, I am impressed with the vision of Ridley Scott. As a classically regarded and praised film, I am not very impressed. Although, I find that it is an excellent example of how many in the early 1980s viewed the future and that it is a fantastic example of neo-noir style filmmaking.

One of the biggest problems I had with the film is the fact that I had trouble loving the protagonist or hating the antagonist, or feeling sympathy for either of them. In screenwriting, it is imperative that the audience make a firm connection with either the protagonist or the antagonist. Note: the antagonist in a film/neo-noir is not always the “bad guy.” Even Gloria Swanson’s Norma Desmond in Sunset Blvd made a strong connection with the audience in that we feel great sympathy for her plight, yet she is the antagonist in the story–or many agree as such. Harrison Ford’s Deckard in Blade Runner never quite garnered strong support from me, in the same way his nemesis Roy failed to elicit disdain. Both Deckard and Roy are fairly static characters–meaning they lack dynamic development. There is, however, an indirect glimmer of character development in Roy at the very end that plays significantly into the plot for a brief but strategic moment. As regularly as the character of Rachael recurs throughout the narrative, she could almost be removed from the film with little change to the overarching story. For the most part, she simply exists and plays into Deckard’s motivation, but mildly so. She neither causes him to view Replicants differently nor becomes his sole goal. It is clear from early on in the plot that Deckard already had reservations about retiring Replicants. Rachael simply amplifies or intensifies feelings that were already brewing.

Looking back at movies from the mid to late 20th century that take place in the early to mid 21st century can be quite entertaining. Sometimes the future portrayed in the film, in one form or another, has actually come to pass. Other times, the future is incredibly inaccurate. The dystopian Los Angeles in Blade Runner is definitely the latter. Yes, there are themes of unchecked immigration, authoritarian power, and capitalism that can be read as not so different from today; but, for all intents and purposes, the future is much more grim in the movie than in today’s reality. Perhaps that’s why it can be difficult to connect with this movie. It takes place in a “future” that never happened, and probably won’t happen in the now near future. I think that’s the danger when writing or directing a movie set in a future that relies heavily upon technology directly related to the plot. Some movies can pull it off. Take Back to the Future, for instance. It works because the technology in the culture of the future isn’t significantly integrated into the essence of the plot, nor is it solely responsible for some dystopian world. The futuristic technology merely exists and helps to move the plot along. In Blade Runner, the whole reason for the plot is that futuristic technology in our present day has turned on its creators and become the catalyst for a world drowning in chaos.

If you have never seen Ridley Scott’s Blade Runner, I definitely encourage you to do so, especially if you enjoy film noir or neo-noir movies. It provides us with a glimpse into how the world viewed a possible future in the early 1980s, and prompts us to think about life and how we might behave if we knew that we only had a few years to live. Survival of the fittest, maybe? Or fight or flight? If I were a psychologist, I think this would be fascinating to analyze from a psycho-social perspective. At the end of the day, the film was quite the visionary masterpiece for its day and still remains a favorite of many filmmakers, scholars, and film lovers alike.

*This review is in reference to the original theatrical release