17 February 2024

What does it mean to claim the US is a Christian nation, and what does the Constitution say?

ABC News: Many Americans believe the United States was founded as a Christian nation
