Well, as someone who only went to one semester of college, I clearly don't think college is necessary to survive in this world. I have never known what I wanted to do for a career, and I went to college mainly because I realized working full-time kinda sucks. I also wanted to have the college experience, which wasn't a party experience for me. I went to a religious college and, though I hung out with kids who drank, they weren't getting drunk or partying it up. It was responsible, conservative drinking.
I sometimes wish I could go back to school, but I still don't know what I want to do with myself, and I don't want to pay off a loan for years and years to come. Especially when most of the jobs I go for don't require a college education. (I've been trying to get back into the natural-foods grocery business for the past couple of years; the stores around here pay pretty well and have benefits.)
My sister was like me in that she didn't know what she wanted to do, but she enjoys learning a lot more than I do, so she went and got a degree in Social Sciences.
Neither of our parents finished high school, and they never demanded we graduate from college. I'm glad I was never pressured to finish college, though I think perhaps I might feel (or be) more mature if I had gotten a degree. Tests and papers were never my strong suit, and that's what makes me fear college, since that's all you ever seem to do there. I was a straight-B student in high school without really trying; in college I got one B, two C's, and one D. Blah.
I think education is definitely important; its main purpose is to help us make better decisions, improve our lives, and learn about the past so that we don't repeat it in the future.