Worst College Majors
The worst college majors are fields of study that leave students with relatively few job opportunities and often a distorted or ultra-liberal view of the world. A student who incurred debt to attend college is left with obligations and no practical way to pay them off.
We encourage college students and employers to improve this list over time.
- Women's studies
- Religion (often atheistic or anti-Christian in its curriculum)
- English Literature
- Interior Decorating
- Art History