America: A Christian Nation?
A big
point of contention among those interested in politics is the question "Was
the United States founded as a Christian nation?" Many on the left
side of the aisle would love to say that our founding fathers were committed
secularists who rejected everything within the Bible, while those on the religious
right would say that each and every one of our founding fathers was a Bible-thumping
Christian. According to what I have researched, it's something of a mixed
bag. I would say that the United States is what it is thanks to the freedoms
that are granted through the application of the Bible and the teachings of
Jesus. If that is what it takes to make a Christian nation, then the United
States is a Christian nation.
Many people would cite the 1797
Treaty of Tripoli with the Barbary pirates, signed under President Adams, which states that the Government of the United States "is not, in any sense, founded on the Christian religion." But notice that
later treaties retreated from this statement and made no further bows to Islam.
So one could view this statement as a blunt attempt at a compromise to
protect American merchants, and that is what the flow of politics and
history indicates to me.
There are still further objections
that people can raise from history concerning our founding fathers themselves. Our founding
fathers made appeals to Almighty God, but people could argue that it wasn't
necessarily the Judeo-Christian God. Most of our political elite will make
religious appeals to sway the masses toward that politician's agenda. This
does not necessarily reveal that person's beliefs one way or the other,
however. President Obama recently invoked the name of Jesus in support of gay
marriage, which sounds like a political play. Mike Huckabee gave a speech,
almost a sermon, after the Sandy Hook shooting, and I believe that Huckabee
truly believed what he was saying.
One could speculate about the
beliefs of our leaders until the end of time, but I must move on. There is one
group of people that analysts and historians tend to gloss over, and that
is the American public. Funny, that, since the political elite draw their power
from the very public they are supposed to serve. From the
Revolution to the modern day, the majority of the American public has identified as
Christian. Though our numbers have fallen in recent years, the majority of people
still profess faith in Christ. While one can make a reasonable case that the
Constitution and the Declaration of Independence were not explicitly Christian
documents, you cannot deny that Christianity is a fundamental part of
the identity of America because of the beliefs of her people. I would make the
argument that since the United States of America is a nation of Christians,
this wonderful nation that I call home is a Christian nation.
So to recap all of this: it is
possible that the God/Creator that the Declaration and Constitution appeal to
is closer to a deistic God, but the laws and rights they set forth do seem to be
derived from Biblical law and the teachings of Jesus. The people of the
United States, however, take this God to be the Judeo-Christian God, and the
majority of the people are Christians. So is the United States a Christian
nation? The case can be made either way, so one could call it a gray area.
I would say that the evidence seems to lean toward the affirmative,
but that is the opinion of one man. God bless you and have a good rest of your day.