Religion has inspired and driven people to do truly horrible things to others, and for what? Basically because they think their invisible friend wanted them to, and that he'd eternally reward or punish them based on whether or not they obeyed him.
It's the opposite of what America does. In France the doctor comes to your house if you need it; in England dental care is free; in Germany you get alternative healthcare alongside traditional care, all free. Why don't we know this?
I am a Christian; however, I don't take the Bible literally, and I will never understand why others do. Man is flawed, and the Bible was written, translated, and interpreted by man. Of course the Bible would be flawed too!
I have no need of religion. I care about people and live according to my conscience. I don't need to pretend I'll be rewarded or punished for my behavior, and I don't need a "God" to tell me what's right or wrong.