AMERICA FOUNDED AS A CHRISTIAN NATION? Americans are saddened at being sold out by their so-called representatives in DC. Sadder still are those who still believe the system works the way they were taught in school. It's like an elderly person with dementia who keeps going outside to water the flowers at a home that is long gone.
Controversial indeed. Who among Americans doesn't like a good yarn about good vs. evil, with the USA as the Good Guys? Count this writer among those who at one time believed the "USA was founded as a Christian nation" fairy tale.
Actually, the best way to determine whether or not the United States is a Christian nation is to look at the founding documents. Nowhere, not even once, do any of the USA's founding documents mention "Jesus" or "Christ." Pretty odd when a country doesn't mention the person it supposedly was founded for, huh?
Still, this article raises some good points.
h/t: Citizen Tom
It may seem intuitive, at first, to attempt to answer this question by focusing on government. But the best way to determine whether or not the United States is a Christian nation is to compare the philosophy of its people to the Word of God.
The Declaration of Independence states that every person has these God-given, inalienable rights: life, liberty and the pursuit of happiness. This philosophy is what we could call the "American Worldview," and it drives everything about the nation, from its economic and foreign policy to the private lives of its people. This is the atmosphere in which most of us have grown up. But can this American Worldview be called a Christian Worldview? Can we really call the United States a Christian nation?
First, what does “life” mean to a Christian? Most Americans would say we have a right to be alive, just by virtue…