Answered by Tom, Hiring Expert at VF Corporation, on Friday, September 22, 2017
In general, experience is more important than a degree. Of course, if you are in college right now, your work experience will be limited, so the degree matters more at this stage of your career and much less so later on. Having said that, more often than not, the fact that you have a degree is more important than what the degree is in. Sure, there are obvious exceptions: it is pretty hard to land a highly specialized role such as medical doctor or lawyer without the corresponding degree.
What you need to ask yourself at this point is how sure you are about what you want to do for a career. If you feel strongly about something, align your education with it; that will certainly help you. If you are unsure, I strongly encourage you to sample as many different courses as you can and gain as much experience as possible through internships, co-ops, and volunteer activities. The more exposure you get, the more you learn what you like and dislike and what you are good at (and not so good at), and all of that experience is valued by a future employer.