Majority of U.S. Doctors No Longer White, Male
A new study finds the U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists.