What does it mean to claim the US is a Christian nation, and what does the Constitution say?
By PETER SMITH, Associated Press

Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some…