For men who have had female general practice (or family medicine) doctors.
Did you find them reluctant to examine or treat male-specific body parts?
If you have been seen by a female doctor for that area, did you consider her as knowledgeable as a male doctor?
I'm not being sexist. My recent experience with a female doctor was unsatisfactory, and I would like to know whether that was a rare case or a general trend. In my search for a replacement, I don't want to rule out the majority of nearby doctors if my bad experience wasn't typical.