“Where is America Headed?”
I know a lot of folks are concerned about where they think America is headed. The greatest concern seems to be with moral issues, especially homosexuality. Are we, Americans in general, becoming more and more immoral? Judging by what we see in the media (entertainment industry) and hear on the news, you would think that we are.
Perhaps the time is coming when our society will mimic more and more the Greco-Roman world into which our Lord came, in which the apostles preached, and in which the first disciples were made. Maybe the time is coming when those on the Lord's side will stand out more than ever in contrast to the world around us.
I don't find that the Lord, Paul, or any of the others ever blamed our first-century brothers and sisters for the way the folks around them were living. They were never told that if they would become more active in some way, they would stem the tide, so to speak, and the Roman world would turn itself around.
I don't know that our greatest concern should be with where America is headed, but rather with where I personally am headed, as well as with the local church of which I am a member. Are we being influenced to become immoral ourselves, or are we putting on the whole armor of God and standing up to fight the good fight of faith?
We really don't know for sure where America is headed. Based on what has happened to previous nations, I would admit that it doesn't look good. (I remember hearing a lesson when I was a teenager on why Rome fell, and the reasons given were all things that America was having problems with even then.) Nations, cultures, and societies rise and fall. But Christians are part of a greater nation, a holy and royal nation with Jesus as our King. It is a nation that will never be destroyed, meaning that each of us, living holy in the Lord, will live forever with him in eternity.
So let us be "blameless and harmless, children of God without fault in the midst of a crooked and perverse generation, among whom (we) shine as lights in the world." (Phil. 2:15)