Credit: NASA, ESA, STScI

These are unprecedented times we’re living in: troubling, yet at the same time enormously fascinating. We have access to instant information on just about every topic under the sun via the World Wide Web; the Hubble Space Telescope can see 10 to 15 billion light-years away; global warming, or climate change if you will, poses an immense threat to the human species, as do global warfare and social strife; digital technology has become remarkably sophisticated in terms of biological and ecological discovery, justice, and medical advancement; scientists now believe dolphins are gangsters; the war in Afghanistan is the longest in America’s history; nutritional science is moving forward as rapidly as computer science; and Americans spend, on average, about twelve hours per person consuming media each day. Indeed, just about every phase of our lives is advancing immensely. There’s a lot going on!

So the question is: are American movies doing enough to reflect this incredible time in human history? It could be argued that the U.S. has been a world leader in cultural progress and artistic exploration for many years. Have we lost sight of the undeniable power of cinema to make a statement or advance a cause? And is that even within the realm of Hollywood’s responsibilities? Or is cinema mainly a vehicle for emotional escape and gratuitous entertainment? Think of the seventies: this country had been through the Vietnam War, the assassinations of John and Bobby Kennedy, the explosion of the drug culture, and the disenchantment of Watergate. And the movies of that time reflected what many thought to be the dissolution of the Republic. Film culture was richly stocked with projects like The Deer Hunter, Days of Heaven, Five Easy Pieces, Chinatown, One Flew Over the Cuckoo’s Nest, and Apocalypse Now. There was a palpable sense that filmmakers had a responsibility to impart something of value to the culture at large. And think of the old cliché about European films: in many circles, films from that continent have been described as deeper, more artistic, and more real. The thing about clichés is that they’re often rooted in truth.

So, what do you think? Does cinema have an obligation to portray something authentic and important about the current human condition? And, if so, which films of the last five years do you think reveal a truth or a critical insight about the zeitgeist we find ourselves in?

Please share!