What Killed Ethics in America?
It seems to me that the United States became (principally in the 20th century) a culture focused on outcomes as the primary measure of success. I would suggest that this is a result of corporate culture spreading its influence throughout our society. The dominant business model is outcome-oriented. How you get there doesn't matter so much as how much money you make, how much you sell, how successful you are.
The end result is the main thing. Everything else, including ethics and conduct, comes second. Legality matters only insofar as breaking the law could hurt the bottom line; otherwise, anything seems to go.
So how can we resurrect ethics in America? Can we teach ethics in our schools? Can we, as members of our society, model a way of living and doing business in which the process matters as much as the outcome?
And for those of us who are people of faith, what does it mean to be the body of Christ in a culture that is centered on success to the exclusion of everything else?