I am interested in what causes someone to choose their major. I am hoping to find research showing that the choice of major is shaped by social class, culture, and the college experience, as well as the state of the economy at the time a student enters college. Are there majors that tend to serve as fallback majors? By this I mean majors where a large number of graduating students actually started in a different field of study. Your choice of major is arguably the most important decision you make in college: it not only shapes your experience during college but also strongly influences your experience after it.