Taking America Back for God: Christian Nationalism in the United States
"Taking America Back for God conclusively reveals that understanding the current cultural and political climate in the United States requires reckoning with Christian nationalism. Christian ideals and symbols have long played an important role in public life in the United States, but Christian...