My twin sister and I first heard the gory truth about sex when...
...we were on the bus in kindergarten. We came home that afternoon and promptly told our mom that it must not be true, right? Taking it in stride, she told us that it was true, and that it might sound strange because we were kids, but that grown-ups didn't mind it.
Then she bought us the genius book Where Did I Come From--first published in 1973!--which we both still vividly remember. (Has anyone else read it? It's awesome.)
A few years later, I told my mom that some kids at school had been talking about balls. "What do you think that means?" she asked me. I rolled my eyes at such an obvious question. "Boobs!" I said and walked out of the room.
Ha, kids are so clueless!
How did your parents tell you about sex? Was it awkward? Awesome? Or did they decide not to mention anything and hope you'd just figure it out from movies, sex ed, etc.? Where did you think babies came from before you found out?
P.S. When my friend was little and her parents told her about sex, she actually threw up. She told them it was because of her chocolate milk, but it wasn't.