I had some really lousy doctors in the past. They were all too busy informing me that I was fat, and that this was Evil and Wrong, to even notice or consider that I might be having sex.
But right now I see two faboo people for my health care: my internist for general health concerns and for managing my pseudo-PCOS/metabolic disorder/whatever it is this week, and an NP at Purdue's women's clinic for my gynecological needs.
Both of them have bowls of condoms, each wrapped with an instruction sheet, in their offices and waiting rooms, and my NP has lots of pamphlets and booklets lying around about various sex-related issues and STDs. So if you're too embarrassed to bring something up with them, at the very least you can grab a pamphlet or a condom.
I've usually been the one to bring up concerns about my sexual activity and how it might intersect with my health before they have, but in subsequent visits they've come back to those concerns, asking things like "So your STD screen came back negative... did your partner get tested like we were talking about?"
Neither of them has made me feel embarrassed or foolish or acted like "ew, *you* have a sex life?" whenever I've asked them anything, and they've made sure to ask me about birth control methods, medications interacting with my birth control pills, and other concerns. When I've gotten STD screens and HIV testing, they've carefully taken the time to explain what they're testing for, how they're testing for it, what each disease is, and how I could have gotten it.
While I think it's extremely important that doctors bring this up, I'd also like to put in a plug for speaking up yourself when you're in the office with your doctor. It gets easier for me to open my mouth and ask each time I go in there.