I’m exhausted listening to Trump apologists who are minorities, particularly Blacks, push back these days by saying that all America needs is to bring God back into the nation’s life and its institutions. Accepting that premise, arguendo, where was God in 1861 at the start of the Civil War? Where was God in the years before that, when chattel slavery was being established across the nation?
Where was God during Jim and Jane Crow? Where was God during the modern-day struggles of Blacks and others to gain full citizenship in its myriad permutations? Where was God during the Trail of Tears?
This isn’t to denigrate God or a person’s sincerely held religious beliefs. But I’m trying to find a time in America when a turn to God ameliorated the deplorable conditions the nation placed upon Blacks and others. Are we too arrogant or naive to conclude that the civil rights pioneers’ pleas fell on deaf ears with God?
They were as bright as, if not brighter than, those of us today. Are we too arrogant to conclude that Native American leaders, some of whom tried to accept the Christian God, never heard from him?
The list of historical abominations is endless. A return to God in this country? When has a turn to God in this country manifested itself positively? In 1865, when the Civil War ended? God then apparently left shortly thereafter, because efforts to address the remnants of the peculiar institution ceased and were not seriously taken up again until 1964.
I’m perplexed. And serious in my confusion. When was God there for this nation when non-male minorities and others were being treated as less than full citizens and humans? At what point is the line of demarcation clear that a turn to God benefited all, as is being posited today? Please enlighten me.
– Arthur U. Tellum