So, this thread & exchange with @Squiggle_Ds is really interesting to me.
Over the centuries, what has been the role of higher education? Did only the scions of wealthy families attend college (Oxford, Leipzig, Bologna) so they could learn Latin, astronomy, rhetoric, metaphysics, and medicine? (I gather that, even then, students tended to be sort of wild.)
And, in the Industrial Age, did colleges start to become trade schools (the various A&Ts)?
Was the (apparently) high ratio of liberal arts majors a post-WWII aberration? Or did it somehow reflect a continued concentration of the offspring of wealth in colleges?
I went to a good, small liberal arts college (where I majored in Physics), and I feel better off for it — but is that only because my parents could afford it? What if I had gone to a cheap state school and taken classes in English, religion, and art survey? Would I be just as well off and just as able to think critically? (Omitting the question of whether I can think critically now.)