Watching Fox News right now. Apparently, 48% of the jobs held by people with college degrees do not require a college degree.

Steve Forbes made a brilliant point: people leave school with a worthless degree and are basically indentured servants to their college loans for 20+ years.

I am starting to wonder if college is an outright scam run for the benefit of elitist professors riding the fucking dole. Outside of the medical and scientific fields, why the fuck is college even relevant? Marketing degree? LOL.

I took US History for credit, and I cannot remember one fucking thing about the course. Yet five years ago I took a tour of Gettysburg, and now I know more about the Civil War than probably 99% of those college history-degreed shitbags.