December 7, 2022

Nearly half of US adults believe America should be a ‘Christian nation’

Pew Research Center conducted this survey to explore Americans’ attitudes about religion’s role in public life. The survey asked respondents whether they think churches and other religious organizations should be involved in politics, and whether the U.S. should be a “Christian nation.”

Most U.S. adults believe America’s founders intended the country to be a Christian nation, and many say they think it should be a Christian nation today, according to a new Pew Research Center survey designed to explore Americans’ views on the topic. But the survey also finds widely differing opinions about what it means to be a “Christian nation” and to support “Christian nationalism.” 

For instance, many supporters of Christian nationhood define the concept in broad terms, as the idea that the country is guided by Christian values. Those who say the United States should not be a Christian nation, on the other hand, are much more inclined to define a Christian nation as one where the laws explicitly enshrine religious teachings.

Overall, six-in-ten U.S. adults – including nearly seven-in-ten Christians – say they believe the founders “originally intended” for the U.S. to be a Christian nation. And 45% of U.S. adults – including about six-in-ten Christians – say they think the country “should be” a Christian nation. A third say the U.S. “is now” a Christian nation.

At the same time, a large majority of the public expresses some reservations about intermingling religion and government. For example, about three-quarters of U.S. adults (77%) say that churches and other houses of worship should not endorse candidates for political offices. Two-thirds (67%) say that religious institutions should keep out of political matters rather than expressing their views on day-to-day social or political questions. And the new survey – along with other recent Center research – makes clear that there is far more support for the idea of separation of church and state than opposition to it among Americans overall.

This raises the question: What do people mean when they say the U.S. should be a “Christian nation”? Some people in this category define the concept as one where the nation’s laws are based on Christian tenets and the nation’s leaders are Christian. But it is much more common for them to see a Christian nation as one where people are broadly guided by Christian values or a belief in God, even if its laws are not explicitly Christian and its leaders can have a variety of faiths or no faith at all. Some people who say the U.S. should be a Christian nation are thinking about the religious makeup of the population; to them, a Christian nation is a country where most people are Christians. Others are simply envisioning a place where people treat each other well and have good morals.

More at: Pewresearch.org