For those who know that religion matters in America, and for those who wish it didn't matter so much, comes GOD IN AMERICA, a sweeping history of how religious faith has shaped the nation. Interweaving documentary footage, historical dramatizations, and interviews with religious historians, the series offers an in-depth exploration of the role religion has played in the public life of the United States.