It seems increasingly important these days for young adults to plan on attending college if they want to support themselves. I heard someone on television the other day say that a college degree is now viewed the way a high school diploma once was: something everyone is expected to have. But does everyone really need a college degree?
When students graduate and enter the job market, they don't yet have experience, and there is much to be said for experience and how it helps people in their respective fields. In many businesses, you have people who have worked their way up over the years, and then you have people who are fresh out of college. Which one is going to get the promotion?
Typically, it's going to be the one with the schooling, since they can gain the experience they need once the degree is earned. That's largely how it works in the professional world, and it's likely to happen more and more as the years go by. Most professions that pay reasonably now expect a degree, and of course that degree comes with a hefty price tag these days. It's something of a catch-22, isn't it? What are your views on college versus job experience?