My family doctor's practice is all female. It's strange, because I only just noticed this. They have no male doctors, no male nurses, not even any male office staff. I don't know if they planned it that way, but it makes for a very comfortable environment.
I have learned over the years that there is a fundamental difference between male and female doctors. Male doctors tend to assume that they know exactly what is wrong with you, regardless of what you say. Women doctors believe that you know what's wrong with you; they just need to get you to tell them what it is. They tend to work more one-on-one and have more patience. Take my lovely family doctor and the year from hell that led up to my diagnosis with Fibro. I was ready to give up; every test came back negative, and I broke down in tears in her office. But she stuck with me, and we did eventually find out what was wrong. I am forever grateful to her for that.
I have heard a number of stories like that from other people: that women doctors have a better bedside manner, are more likely to listen, and are less likely to dismiss something you say. I don't know if it's a nurturing instinct or what, but I know that people were insane for not letting women be doctors before. I think we do just fine, thank you.