As a developer, I'm sure you're familiar with Stack Overflow. Besides providing answers (some more useful than others) to any code-related question you've Googled, the site also runs an annual survey of developers around the world. Some of the results are surprising, but one really stood out to me: of the 66,823 responses to the question about undergraduate majors, only 63.3 percent of professional developers reported a degree in computer science, computer engineering or software engineering. In other words, roughly 1 in 3 developers don't have one. As one of those 1 in 3, I was actually a bit relieved to find out it's really not that unusual to do a job you don't have the "right" degree for.
A good waste of time?
So knowing these numbers, there's a good chance that you, or at least one of your fellow coders in the office, have never taken a class on Big O notation or even Boolean logic. But we're all doing the same job! Personally, this fact, combined with already being a "minority" among exclusively male colleagues, has caused me to doubt my abilities. That's right: the dreaded Impostor Syndrome. All those years I could have spent learning the fundamentals of my daily work, I spent in the lab or staring through microscopes. Even back then I knew I wanted to do more with coding, so I chose a minor in bioinformatics. After graduating I worked in cancer research for 18 months. On my coffee breaks I stayed at my desk, trying to figure out the code behind the impressive image analysis software I was working with. But alas, the research funds dried up and there was no opportunity to train me on creating the cool MATLAB macros they were using.
After a long search for something more IT-related, I got into a full-time traineeship in .NET development, and after another 18 months I was a developer. Or was I? I had read a stack of books that reached up to my waist, taken over a dozen official courses and was even Microsoft certified. But I still felt way behind all those guys who had years more formal education in the subject. And in a way, at that point, I was right. The real learning didn't start until I began working alongside experienced programmers, some of whose backgrounds were even similar to mine, which was encouraging. After a while my skills started to catch up with those of my fellow devs, and at some point I was even leading my own projects.
Making it work
To some extent the Impostor Syndrome is still there, but over the years I've realized that not having a Computer Science degree isn't necessarily a disadvantage. Sure, on some days I think about how skilled I would have been if I hadn't spent all those years in Life Sciences. Obviously, the research project I was on wasn't anywhere near a cure for cancer, and as it turns out, knowing the molecular structures of amino acids is not very useful in IT or even everyday life. You could say the same of anything you've learned that isn't directly related to coding. But if 1 in 3 of us are in the same boat, doing the same job as the people who do have a "relevant" degree, we must be doing something right, mustn't we?
Fortunately, the answer is yes! You could say the relatively high number of retrained or self-taught programmers is a direct result of high demand. But if it were really that fruitless to hire "unqualified" candidates and invest time and money in having them work as developers, employers wouldn't bother. I've heard this confirmed by several people in the field, both online and in real life. In this Forbes.com article, for example, a Lead Data Scientist states that:
"The variance in how well programmers perform at work is massive and easily dwarfs any effect from education."
This one is also pretty reassuring:
"The people who get more out of a degree are the ones who mostly taught themselves anyway—they just happened to do it at a university. So it's not particularly about the formal qualification, but being motivated and eager to learn is a lot more valuable."
Some room for improvement
Okay, nice quotes and all, but how does this work out in practice? Is there really no difference in coding style, quality or efficiency? Well, you can't give a clear answer to that, because, as one of the quotes above says, there is so much variance between programmers. Speaking for myself, I know my style is a bit more pragmatic than that of some of the more theoretically educated devs. No matter how cool I think their architectural gems are, I've come to terms with the fact that you can make something slightly less elegant work just as well. As long as you're not making an unmaintainable mess, a few more lines of code aren't going to hurt anyone.
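To make that trade-off concrete, here's a toy sketch in Python (the order data and function names are made up for illustration): two equally correct ways to filter order totals, one spelled out step by step, one condensed into a comprehension. Neither is wrong; the longer one is simply easier to step through in a debugger.

```python
def totals_above_pragmatic(orders, threshold):
    """A few extra lines, but every step is easy to read and inspect."""
    result = []
    for order in orders:
        total = order["price"] * order["quantity"]
        if total > threshold:
            result.append(total)
    return result

def totals_above_clever(orders, threshold):
    """The same logic as a dense one-liner comprehension."""
    return [o["price"] * o["quantity"] for o in orders
            if o["price"] * o["quantity"] > threshold]

# Hypothetical sample data
orders = [
    {"price": 10.0, "quantity": 3},  # total 30.0
    {"price": 2.5, "quantity": 4},   # total 10.0
]

# Both produce the same answer: [30.0]
assert totals_above_pragmatic(orders, 20) == totals_above_clever(orders, 20)
```

The point isn't that one is better; it's that the "extra" lines in the first version cost nothing at runtime and can pay for themselves the first time someone has to debug it.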
Debugging is another thing. During my internship in cellular pathology it was my job to stare into a microscope full time, learning to differentiate all kinds of anomalies in human tissue. That means scanning a huge number of cells, scrutinizing every detail that might stand out. Doing this all day (alternating between tedious, fascinating and frustrating) kind of rewired my brain, and it shows in my debugging. Weird, huh? Well, mildly annoying at best and sometimes downright exasperating: instead of quickly pinpointing the origin of a bug, I tend to go over a large portion of the code and look into things that aren't actually the cause of the problem I'm trying to fix. Of course I'm working on this, but it's going to take a while to get it out of my system. On the other hand, I sometimes get to fix bugs that haven't even presented themselves yet, so there's that.
But, as you probably know, being a proper developer is about more than technical skills. No matter how much of a coding wizard you are, there are aspects that aren't covered in the average Comp Sci curriculum but will make you more valuable in your field. From what I've heard over the years, employers value your soft skills and technical skills equally. They would prefer code that is neatly written, maintainable and well documented over five lines of spaghetti that might do the same thing 10 ms faster. Taking responsibility for your actions, even as a junior, and being willing to clean up any mess you've made is a good life skill in general, but priceless to your boss and teammates. And since every one of us, CS degree or not, is trying to learn new things every day, the ability to share your knowledge is essential.
So all in all, I think it's safe to say that theoretically and practically trained developers are both assets to any employer. Even though the dreaded Impostor Syndrome isn't easy to shake, we might as well try. And being 1 in 3 is nowhere near as rare as I thought. The knowledge you gained before going into coding may seem useless at times, but it still programmed (pun may be intended) your brain to do your job, and even excel at it.
In case you're still not convinced that your odd combination of education and career doesn't have to be a disadvantage: maybe every developer is doing what they do thanks to just that. After all, if Ada Lovelace hadn't put her love for maths and art together into what she called "poetical science" back in the 1840s, she might never have seen the potential of the Analytical Engine as a general-purpose machine, the idea that eventually became the basis of modern computing. Who knows what our jobs would look like if she hadn't!