Dentistry is the branch of medicine concerned with the study, diagnosis, prevention, and treatment of diseases, disorders, and conditions of the oral cavity, the maxillofacial area, and the adjacent and associated structures, as well as their impact on the human body.[1] Although commonly perceived as focused primarily on the teeth, dentistry is not limited to them, and it is widely considered necessary for complete overall health. Practitioners of dentistry are known as dentists.