What does it mean to claim the US is a Christian nation, and what does the Constitution say?
Many Americans believe the United States was founded as a Christian nation.
ABC News (US)