Actually, America Is Too a Christian Nation
Scott Hogenson
To say the founding of the United States reflects biblical Christianity is to state the obvious.
The Declaration of Independence and the Constitution incorporated many fundamental precepts of the Reformation, and these precepts long have been recognized by American statesmen and jurists.
From overt assertions that people are “endowed by their Creator with certain unalienable rights” and are entitled to the liberties of “the Laws of Nature and of Nature’s God,” to the more subtle acknowledgement of the birth of Jesus Christ in Article VII of the Constitution, biblical Christianity was absolutely central to the American founding.
What’s more, 46 states explicitly mention God in their own constitutions, according to a report from Pew Research Center.
Clearly, Christianity remains central to our national character today.
Such observations ought not be controversial, but saying what is true can get you in big trouble these days. In this instance, the media outlet Politico is sounding alarms over some Republicans self-describing as Christian nationalists, while others are talking about formally declaring the United States to be a Christian nation.
Calling America a Christian nation today might be debatable; a 2021 Pew survey found that the share of Americans who describe themselves as Christian had fallen to 63%, down from 78% in 2007.
But our national founding and ongoing civic philosophy are unquestionably Christian. President Woodrow Wilson, a Democrat, said it most succinctly in observing, “America was born a Christian nation.”
Wilson is by no means alone in recognizing America’s Christian roots and the necessity of biblical Christianity to the United States.
President Theodore Roosevelt, a Republican, said: “The teachings of the Bible are so interwoven and entwined with our whole civic and social life that it would be literally impossible for us to figure to ourselves what that life would be if these teachings were removed.”
President Herbert Hoover, a fellow Republican, echoed Roosevelt by saying: “The whole inspiration of our civilization springs from the teachings of Christ and the lessons of the prophets. To read the Bible for these fundamentals is a necessity of American life.”
President Harry Truman traced the Bible’s importance to America back even further, to the Old Testament.
“The fundamental basis of this nation’s law was given to Moses on the Mount. The fundamental basis of our Bill of Rights comes from the teachings which we get from Exodus and St. Matthew, from Isaiah and St. Paul,” Truman, a Democrat, said.
The House of Representatives has likewise been clear in declaring the American character. A House resolution from May 1854 stated that “the belief of our people in the pure doctrines and divine truths of the gospel of Jesus Christ” was vital to the American system of government.
Presidents, members of Congress, Supreme Court justices, and many others long have recognized the role of the Bible and Christianity in the United States, both in terms of the nation’s founding and its continuation as a global beacon of liberty.
So why is it that in 2022, some are denying history and encouraging their fellow Americans to forsake our national legacy?
It’s no secret that a lot of people want an American future that is radically different from its past and present. The 1619 Project, critical race theory, and other vehicles that are built from the ground up to revile the United States and its founding are symptomatic of a deep enmity toward our nation. But these attacks also validate the truth of Christianity’s powerful influence on society, both here and across the world.
Biblical Christianity enshrines liberty and informs good government. Adherents of Marxism, socialism, communism, and other authoritarian ideologies know this too. They are well aware that the imposition of tyranny is far more difficult when the society they seek to subjugate believes in the truths of biblical Christianity.
That is why the political Left and its acolytes are so intent on slandering the faith. Marginalizing and destroying Christianity is the necessary precursor to forcing despotism on Americans.
To claim that declaring America a Christian nation amounts to the establishment of a state religion is silly. If people of other faiths wish to live here in peace, they are and always have been welcomed and protected in the practice of their faith; the First Amendment’s protection of freedom of religion is an outgrowth of the Reformation.
But America and its founding don’t cease to be Christian just because a small number of political and cultural elites cast themselves as deniers of history.
Today’s attacks on Christianity are not new. They have been going on since the first century and will continue apace, just as Jesus Christ told his disciples. Now these assaults are being propelled into our political discourse, and they will only increase.
Christians, and indeed Americans of all faiths, must not permit the whitewashing of our history or the continued erosion of the liberties articulated in biblical Christianity.