The Big Question 79: Why are Christians So Often Cast as the Villains in the Movies Today?
Sep 1, 2021
Why are Christians so often cast as the villains in the movies today?
At the height of the Cold War, back in the 1970s and '80s, the villains in the movies were almost always Russian. Things have changed today.
Have you noticed how often, today, the villains in Hollywood films are Christians? They're typically portrayed as brainwashed fanatics. At one extreme, you have packs of religious zealots who go around chanting mantras and hunting down the people they don't like. At the other, you have the stereotypical religious old lady who is so pious that she sucks the fun out of everything. Both are negative stereotypes, right?
I don’t necessarily think these negative portrayals are an intentional plot by Hollywood against Christianity. Society has turned against Christianity in very direct ways over the last few decades, so there is a bias against people of faith, whether that faith is Christian or not. What Hollywood has always done is hold a mirror up to society, so when you see Christians as villains in the movies, it’s because a part of society actually does see Christians as villains in the world today.
That’s one of the key reasons why it’s popular to cast Christians in villainous roles in the movies. But there are other reasons, too.
One reason is that most of us recognize some of the negative traits Christians are given in the movies. Filmmakers like to create villains we can all identify, and sadly, we all know someone who, while professing to follow Jesus, was narrow-minded, judgmental, and even bigoted against other people. It’s a tragedy that those who call themselves Christians have so misrepresented the Christian message. And at the end of the day, there’s no better villain than someone who has fallen and betrayed their own ideals.
Another reason why Christians are so often portrayed as villains in the movies is that it’s a natural human trait to blame everything that’s wrong on an “other”, and the more powerful the “other”, the better. We no longer live in an overtly Christian society, and the Christian church is often seen as that convenient “other” to blame.
The Christian church is often seen these days as a powerful entity that has oppressed women and minorities and suppressed learning and progress throughout history. This idea, which is quite wrong, is largely due to the views of militant atheists such as Richard Dawkins and Christopher Hitchens, which have been highly publicised in recent decades. These ideas sit in the collective subconscious of our society, so Hollywood knows that making Christians the villains will resonate with audiences.
At the end of the day, I think we all need to think for ourselves. You can enjoy the movies without buying all the nonsense Hollywood tries to feed you.