Why America isn’t — and never was — a “Christian nation”

We’ve all heard it before, whether during a debate about same-sex marriage or in some passionate speech (usually given by a far right-wing conservative).

“America is a Christian nation.”

I’ve certainly heard it my fair share of times. It’s a common statement, and it’s been thrown around a lot more since a certain African-American man took office a few years ago. I recently started thinking about the concept behind the statement. I heard a guest toss it into the conversation on a CNN show last week, as a justification for employers being able to opt out of covering birth control under mandatory health insurance. It got the wheels spinning, and I realized that I beg to differ with the claim itself. America is not a Christian nation, neither in theory nor in practice. In fact, it never has been. I know that’s not a popular thing to say, especially in the “Bible Belt,” but it’s the truth. I won’t get into a discussion about whether that’s a good or bad thing (I’m sure that’ll end up as a separate post at some point), but we need to understand and agree on this basic point if we’re going to have constructive discussions about religion in this country.

Now, I know what you’re going to say.

“But the Constitution guarantees freedom of religion, which means I have the freedom to be Christian.”

And you’re right. But that doesn’t mean that everyone else has to be a Christian too. And that’s where the “Christian nation” argument starts to break down. The First Amendment literally debunks the idea that the United States was founded on the basis of one religion. I’ve beaten that dead horse far too much in previous posts, so I’ll just leave it at that. The point I’m trying to make is that even if most Americans want this country to be a Christian nation, it’s quite literally unconstitutional, at least where government is concerned. Remember what I said about a Christian nation “in theory?” This is what I meant. But the argument goes beyond that.

We always make this claim that we live in a Christian nation. But if that’s true, why do some states still have a death penalty? Why do so many churches erect huge buildings, but fail to feed the poor and clothe the homeless within their communities? Why is there an enormous wealth gap in America? Why do we treat others with contempt, hate, and malice? A Christian nation would be full of people who generously give to the less fortunate and treat others with nothing but love, as Jesus taught. Is that the kind of country we live in today? Not even close.

Instead, we live in a nation where the rich get richer, the poor get poorer, and we kill others after reading about Jesus telling people to turn the other cheek. We live in a nation where people of different faiths and backgrounds are shunned, ridiculed, and humiliated by others who claim to be “preaching the gospel.” Ironically enough, the people who get richer and support the death penalty are often the same people who insist that we live in a Christian nation. That’s like a thief saying that we live in a nation free of crime. It just doesn’t work.

Sadly, this “not a Christian nation” thing isn’t new. It’s been happening for years, centuries even. In fact, I don’t think America has ever been a Christian nation. And I love to see people argue that the Founding Fathers established the U.S. as such. They didn’t. The Founding Fathers were very intelligent men. Obviously they didn’t know everything, but they were aware of that, and they constructed the Constitution with the notion in mind that they would be gone one day, and future generations would need guidelines for growing and advancing the country. If men like that had wanted this country to be founded on the principles of the Bible, we would know beyond a shadow of a doubt. There wouldn’t even be a discussion about it. They would have written the Constitution in such a way that it clearly identifies Christianity as the official religion of the United States. But guess what? They didn’t.

In fact, they did the opposite, which actually makes all the sense in the world to me. To elaborate, let’s have a brief history lesson. Along with grievances like taxation without representation in Parliament, the Founding Fathers and other revolutionaries fought against England to gain the freedom to practice religion freely. They were quite literally trying to escape the grip of one religion, which was being imposed upon them by England. They didn’t like being told what to believe, so they fought against that and won. And you’re telling me that when the Founding Fathers met to draft the Bill of Rights and the rest of the Constitution, their first order of business was to add a clause establishing a single religion as dominant in government? Are you kidding me?

That’s the genius of the First Amendment: it exists to prevent the kind of oppression that England forced upon the colonies. Yes, that includes religious favoritism. So when we’re arguing about freedoms, don’t try to use the First Amendment as a justification for imposing purely ideological Christianity-based laws on the rest of the country. You’re turning the principle of the freedom of religion clause on its head, and trying to make everyone think that it’s just a different interpretation of the truth.

Just for the sake of argument, let’s assume that the consensus is correct, and the U.S. was founded as a Christian nation. If that’s the case, we’ve failed miserably at living up to that creed. We’ve enslaved an entire race of people, oppressed women, killed Native Americans for land, launched war after war against other nations, stolen from the poor, abused power, and committed every other sin in the book since our country’s inception. That doesn’t sound like a nation of Christians to me. We can all say that the Founders wanted the U.S. to be a Christian nation until we’re blue in the face, but that doesn’t mean that it’s true, or that they succeeded in making it so.

In case nothing I’ve said so far has offended and/or surprised you, I saved the best for last. In all honesty, I don’t want the United States to be a Christian nation. I say this as a Christian, which is sort of a contradiction in and of itself. Allow me to clarify a few things, though. First, when I say “Christian nation,” I’m referring to the idea from a legislative and legalistic standpoint. I don’t want America to be a country in which every law is based on the teachings of the Bible. This might seem like a small detail, but it makes all the difference here.

I don’t want it because it takes the wrong approach. As Christians, we’re responsible for bringing others to Christ. But I think that how we accomplish that is almost as important as the task itself. If we create more Christians, but they’re misinformed or they don’t have a real, meaningful relationship with Christ, we have failed. Legislation that’s always based on Christianity makes this worse, because we’re literally saying that it’s illegal to be anything other than a Christian. That doesn’t create a nation full of goodhearted, Christ-loving people. If that’s the approach we’re going to take, there’s almost no point in trying.

We’re defeating the entire purpose by ordering people to follow Christ. It has to be a choice, which is what makes the connection that much more powerful once the choice is made. That’s what we as Christians should strive for. In fact, endorsing the legislative approach to Christianity actually makes us look weak. If we can’t show others the love that Christ has for them by witnessing and inviting them to worship, we fail as Christians. That’s on us. And passing laws that essentially force Americans to adopt Christian beliefs won’t help create new Christians. If anything, it alienates those people even more, making it harder to reach them and bring them to Christ.

With that being said, I do think that a Christian nation, in the purely spiritual sense, would be amazing. Of course I want as many Americans as possible to fall in love with Christ. He changes lives like no other, and I’ve seen that firsthand. But if we’re trying to force them to find Jesus by basically making every sin and biblical taboo illegal, we’re doing it wrong.

Here’s a thought that I’ll leave you with: what if I told you that by trying to use laws to turn America into a Christian nation, we’re actually committing blasphemy? Think about that. By criminalizing things like same-sex marriage and discriminating against Muslims and atheists, all “in God’s name,” we’re actually disrespecting Him. If we think that actual laws are necessary to make people accept Jesus Christ, we’re spitting in God’s face. We’re essentially saying, “Hey God, you know your Bible, your son’s sacrifice, your grace, and your impact on my life? Yeah, they aren’t enough to influence a nonbeliever, so I’m going to use this worldly thing to speed up that process.” And we have the nerve to think that we’re growing His church this way?

Editor’s note: this post was inspired by an Amanda Marcotte article published on Salon.com. You can read the article in its entirety here.
