Faith healing vs medical healing

The term faith healing is best known in connection with Christianity. A number of its adherents interpret the Bible, particularly the New Testament, as teaching belief in – and practice of – faith healing. They claim that faith alone can cure blindness, deafness, cancer, AIDS, developmental disorders, and anemia, among other diseases.