the first, to save lives; the second, to inform the public about the dangers of untested or discredited medical treatments; the third, to prevent the poor, the poorly educated, and the vulnerable from being defrauded.
The important message, of course, is that we should base our decisions on evidence. Evidence-based nursing is an important topic in *any* modern nursing school today. I have been wondering why. Is it important to stress evidence because many student nurses come into the school already indoctrinated with religious beliefs that promote superstition and false hope without evidence?