For the most part, in my opinion, doctors in the United States don't really do much work. Compared to other countries, doctors here have a lot more assistants and nurses who do most of the work. Just like dentists: they don't do anything, they just look at your mouth and say "you need braces" or "you need fillings done," and then their medical assistant does most of the work.
But I don't know, that's just my opinion.
That's got very little to do with honesty...