Originally Posted by jillianleab
To me there is a difference between a country which is made up of mostly Christians, and a country which governs itself with Christian morals. I do not think America does this. Sure, we base many of our laws on common Christian principles (don't murder), but we have no national religion, and we have many laws which go against Christianity as well (abortion). To me, if America were a "Christian nation," it would be a theocracy.
Certainly there are some people out there who wish we were a theocracy, and there are probably even some who think we are - but we aren't. We are a democracy which promotes religious freedom - the way it should be. Our leaders and population being mostly one religion means nothing, really.
I've heard people call this a Christian nation because a majority of our population is Christian, but I ask, what does that mean? Most of our population is also white - are we a white nation? There are more men in America than women - are we a male nation? If there are more brunettes than blondes, are we a brunette nation? A nation of the middle-aged? Should we start referring to ourselves as "America - the white, male, brunette, middle-aged Christian nation!"? Majority means little when describing your country; it's arbitrary and unimportant. Plus, it evokes an "us-versus-them" mentality, which I think (and hope) we can all agree leads to nothing but trouble.
We're Americans. We promote freedom, especially of religion. We don't need a national religion because that would hinder such freedom. If we are ever declared a Christian nation, I'm moving - the sh!t is about to hit the fan!