The image of dermatology as a medical profession has changed considerably over the decades. In the 1800s and the early part of the 20th century, dermatologists were the physicians to see for venereal diseases as well as diseases of the skin, so their place as a medical necessity was secure.