Oversold Part I: Assessing The Legitimacy Of Tech Degrees


      From aviation to retail to ecology, it’s obvious that technology has more applications than ever in our interconnected society. With this recent pervasiveness has come access to information like never before. Whether it’s sites focused on the natural sciences, like HyperPhysics, or more general, MOOC-rich nonprofits like edX.org, it’s clear that an individual has no shortage of sources for self-education, especially within STEM. Ironically, a traditional bachelor’s degree is still considered the safest route to a tech career, even though most relevant concepts are available for free. So, while the sentiment behind the degree is admirable, the notion that Computer Science can be mastered via lectures and labs is not only illusory, but I’d argue it impedes technological advancement.
I first began to question the curriculum after falling short of expectations at an internship last summer. Not only did I not receive a return offer, but my glaring lack of understanding was constantly on display; so much so that my partner would occasionally conduct subtle (and slightly condescending) assessments of my contributions to the project. Granted, my partner was kind of a jerk, but their skepticism of me was valid, and it didn’t help that they seemed to have a better grasp of our product at any given moment, despite not being a STEM major.
Naturally, my comprehension of course material, or lack thereof, seems like an apt metric of a degree’s legitimacy. Spring 2020 was the first semester of my junior year: the first semester in which all my classes were major-related. In short, it feels like I learned very little. The classes varied in difficulty, but all seemed a waste of time nevertheless.
My Software Engineering course, for example, though easy, hasn’t made me any more prepared for a professional role; rather, it has made me question whether I should even regard the field as part of STEM, given its rather pseudo-scientific, customer-oriented nature. Intro to Databases wasn’t much better, but at least I gained some familiarity with SQL and relational algebra, the latter of which is elegant in its own right.
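To show what I mean by elegant, here’s a minimal sketch of how a relational-algebra expression maps directly onto SQL. The students table and its rows are hypothetical, and I’m running the query through Python’s sqlite3 module just to keep it self-contained:

    import sqlite3

    # Relational algebra: project name from the tuples where gpa > 3.5,
    # i.e., pi_name(sigma_{gpa > 3.5}(Students)).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE students (name TEXT, gpa REAL)")
    conn.executemany(
        "INSERT INTO students VALUES (?, ?)",
        [("Ada", 3.9), ("Grace", 3.2), ("Alan", 3.7)],  # hypothetical rows
    )
    # The SQL is a one-to-one translation of the algebra:
    # SELECT is the projection, WHERE is the selection.
    for (name,) in conn.execute("SELECT name FROM students WHERE gpa > 3.5"):
        print(name)  # Ada, Alan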
More upsetting are the courses that are not only challenging, but whose content doesn’t seem applicable to the future of technology. A recent example would be my Computer Org & Architecture course and the concurrent lab. These offerings annoyed me from day one because I don’t believe there are many advancements left to be made in hardware, especially when one considers that Moore’s law doesn’t really hold up anymore. There’s even the notion that binary information storage will soon be obsolete with the advent of quantum computing, though the jury is still out on that. Regardless, these courses were time-consuming, and after continuing to perform poorly in the lecture, I withdrew on the last day of class (a move allowed given the pandemic): $500 down the drain.
Similarly, my Data Structures lab was no cakewalk, but academically, my grades were excellent; the corresponding lecture, taken a semester earlier, also yielded above-average results. Ironically, I would still consider my grasp of object-oriented programming subpar. The assignments, while good for familiarizing a student with Java/C++ semantics, were so abstract that I never knew what exactly I was constructing, and I still don’t. I just knew the goal was to produce an output free of errors, which could mostly be achieved with the aid of the professor and/or an assistant. And therein lies the issue: the disconnect between the problem and the solution.
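For contrast, here’s the kind of framing I wish those assignments had: a data structure tied to a defined application. This is my own toy sketch, not an actual course exercise; a stack put to work checking balanced brackets:

    # A stack with a concrete job: checking whether brackets are balanced.
    PAIRS = {")": "(", "]": "[", "}": "{"}

    def is_balanced(text: str) -> bool:
        stack = []
        for ch in text:
            if ch in "([{":
                stack.append(ch)  # push each opener
            elif ch in PAIRS:
                if not stack or stack.pop() != PAIRS[ch]:
                    return False  # mismatch or missing opener
        return not stack          # leftover openers mean unbalanced

    print(is_balanced("def f(x): return [x]"))  # True
    print(is_balanced("(]"))                    # False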
See, I believe a course’s teachability via classroom and textbook can be modeled as a spectrum, with the left end being most teachable and the right end being least. On the left end would be more core classes, like history and pretty much all math, given that grades in those seem directly correlated with reading and studying. I’d say the natural sciences fall in the center: although reading and lectures help, the ability to interpret and produce diagrams matters more. Moving right from there, I’d place courses like Org & Arch: a worse study-time-to-grade correlation than Physics and more ambiguity; basically, a nightmare. Then on the right end of the spectrum would be courses whose titles sound useful but that could almost be passed with little understanding of the material.
More specifically, it seems that blindly completing assignments just doesn’t promote retention or understanding in computational courses the way it does in, say, a math course. I absorbed more fundamentals outside of class, from languages (e.g., Python) to libraries (e.g., scikit-learn) to algorithms (e.g., linear regression) to version control (e.g., Git). The difference is that with personal projects, the application is generally more defined, and whether I succeed or not, concepts stick better because my brain categorizes the process as scientific rather than clerical.
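To make that concrete, here’s the sort of small, self-directed exercise I mean. It’s a minimal sketch with made-up numbers; the point is that the goal is defined up front (fit a line, then predict):

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Made-up data: hours studied vs. exam score.
    hours = np.array([[1], [2], [3], [4], [5]])
    scores = np.array([52, 60, 67, 75, 81])

    model = LinearRegression().fit(hours, scores)
    print(model.coef_[0], model.intercept_)  # slope and intercept of the fit
    print(model.predict([[6]]))              # predicted score for 6 hours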
So, what’s the alternative, you might ask? One option: those trying to start a career in tech may find a Bachelor’s in Mathematics to have less of a beta vibe, though this isn’t a guarantee. I only base that on the fact that I can speak extensively on topics from Discrete Math and Linear Algebra, despite only getting C’s in them. Meanwhile, I got amazing grades in Data Structures and still struggle to write a recursive function off the top of my head (I mean something as basic as the sketch after this paragraph). The true downside to this approach, however, is that one would still have to devote time outside of class to mastering tech concepts, making it just as unproductive as majoring in Comp Sci. Ideally, systems like GitHub in combination with gig apps like Upwork and Field Nation would be a more effective means of securing an entry-level tech role. Though, I assume success with that route hasn’t been consistent enough among jobseekers to make tech degrees irrelevant, given the institutional requirements on most job listings.
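For the record, by “a recursive function” I mean something on this level; a trivial sketch with a made-up example:

    # The level of recursion I mean: the depth of a nested list.
    def depth(x) -> int:
        if not isinstance(x, list):  # base case: a non-list has depth 0
            return 0
        # recursive case: one more than the deepest element
        return 1 + max((depth(item) for item in x), default=0)

    print(depth([1, [2, [3]]]))  # 3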
What’s more alarming is the possibility that, due to other socio-political factors, landing a job may still prove difficult, even if a prospect does all the responsible things: projects on GitHub, hours spent studying, bootcamps, certs, or whatever other training is generally advised as a means of starting a tech career. That hypothesis is addressed more fully in the second part of this post, and I hope you’ll take a look. In the meantime, thanks for allowing me to fill your head with my cynicism and, of course: stay pissed.
