Large numbers of Americans believe the founders intended the U.S. to be a Christian nation. The belief is especially strong among Republicans and their white evangelical base.
Except that’s all false, according to the founders themselves.