Majority of U.S. Doctors No Longer White, Male

A new study finds the U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists.
