I do not support an Americanized Christianity, but I am fervently contending for a Christianized America. The influence of Christianity gave America the prosperity and freedoms that have been the envy of nations all over the world. The same can be true of any nation, for as the psalmist declared, “Blessed is the nation whose God is the Lord” (Ps. 33:12a).
America is now in serious trouble because there has been a decades-long war to dismantle Christianity in America. When President Joe Biden declared March 31, Easter Sunday, as “Transgender Day of Visibility,” it was not done in a vacuum. Neither was the White House’s ban on messages about Jesus or His Resurrection on children’s eggs at the annual White House Easter Egg Roll done without clear intent.
These were merely the fruit of this 60-plus-year push to dismantle Christianity in America. We have now reached a defining moment in our history, and it is time for all freedom-loving Americans to push back with the truth about the positive, life-giving role of the Christian faith in our country.
America Was Known as a Christian Nation From Its Inception
For the first 150 years of America’s existence, no one questioned the statement, “America is a Christian nation.” This did not mean the U.S. officially sanctioned any church, sect or denomination. America was considered Christian because the vast majority of our citizens identified as Christian, and our laws and institutions were founded on Christian principles and values.