Video Library
by DrRic

What is Western Medicine?

Western medicine is the practice of healing taught in the current system of US medical schools. Graduates earn either a Doctor of Medicine (MD) or Doctor of Osteopathic Medicine (DO) degree and go on to practice as physicians or surgeons. In the US, these doctors work in an office-based practice, a hospital-based practice, or a mixture of both. Most Western medicine doctors use prescription drugs or surgery to improve the life of the patient, relying on technology and science to manage disease.
