How Is The Culture Shaped, Or Not, By Religion?

A friend sent me the following quote …

Christianity began as a personal relationship with Jesus Christ. When it went to Athens, it became a philosophy. When it went to Rome, it became an organization. When it spread throughout Europe, it became a culture. When it came to America, it became a business!

… which prompted me to question and think about all the ways our culture has been affected, or even led down the path of destruction, by the evolution of religious ideals in America. Whether or not you consider yourself a Christian, you must agree that our country’s culture was founded on and shaped by Christian values and principles. Most of us believe in the general sense of right and wrong that comes from the Christian Bible. Over the decades, those principles have been attacked on a number of politically correct fronts for various reasons. Where has it led us? Do we consider ourselves better off now that we have been enlightened and are less judgmental of those who stray from the basic tenets of Christianity? Are our religious leaders no longer preaching values, selling out instead for full pews and collection plates? Do we feel, as a whole, that we have more spiritual direction and purpose in our lives now that Christianity has largely been deemed less acceptable, or politically incorrect?

It seems to me we have left values and character in the dust in exchange for the everybody-should-feel-good attitude that has waged an all-out attack on the Christian religion. You don’t have to be a Christian to appreciate the structure it brought to our culture. So, how does it help us, again, to be more secular, without a clear set of moral guidelines to live and grow by?
