By Ben Kamin
04-13-2025
8 minute read
I’m about to graduate from the University of San Francisco (USFCA) with a Computer Science degree, and I’ve been doing a lot of reflecting. Four years of lectures, coding assignments, and exams are behind me. It feels great. But now that I’m done, I've realized that most of the skills that make me feel ready for a software engineering career weren’t learned in the classroom. They came from my internships and my personal projects. This isn’t a diss against my university as much as it is a reality check about what a CS degree really is. Let me share my journey, something that might sound familiar to other CS students.
In my junior year, I landed an internship at a startup called Eridan. I've been there for a year and three months now. When I started, I was definitely a bum: I barely understood my assignments and was only just getting by. My first tasks were minor UI tweaks, adding small features to existing components. Over time, those small fixes snowballed into larger responsibilities. By the end of the summer, I wasn't just tweaking the UI; I had developed two new pages, from gathering requirements to wireframing to implementation and testing. Because of my growth and impact, I was asked to stay on beyond the summer, which meant working part-time throughout my final year of university.
After my internship, doing classwork again frustrated me. Don't get me wrong, my CS classes gave me valuable theoretical knowledge. I can implement common algorithms, understand Big-O notation, and explain the basics of compilers, web dev, and computer architecture. But when it came to practical software engineering, there were large gaps.
One major frustration was how small and contrived most of the class assignments were. Outside of three classes I've taken, the largest project I ever did was maybe a few files of code, written alone and promptly thrown away after grading. These assignments were designed to teach a specific concept, but they couldn't hold a candle to the complexity I deal with at Eridan. School never required me to think in terms of maintainability, costs, or having more than one user. I understand that a classroom can't fully simulate a production environment, but a little more realism in projects would have gone a long way. It's hard to get excited about a homework assignment after you've had the pleasure (and stress) of building something actually useful.
Before I continue, I need to go back and mention the three classes I excluded from that blanket statement. Compilers, Software Development, and my Senior Team Project were some of the best courses I took at university, for one reason: each centers on a single project that you work on throughout the entire semester. You have to write maintainable code for the sake of being able to improve it over a semester, which is much longer than the one- or two-week assignments we were given in other courses. This is good. But you still didn't have to worry about costs or users; you just had to get the tests to pass.
All that to say: universities can be slow to update their curricula, and they end up full of outdated tools and techniques, which I realized first-hand. Some of my courses taught technologies and practices from a decade ago. One of my classes in 2024, full-stack web development, spent weeks on jQuery, leaving Angular and React for the last two weeks. I know Angular is still being used. But jQuery? In 2024? I learned more about React from my own side projects and my internship than I ever did in any class. The same goes for cloud databases beyond MongoDB, or emerging technologies and techniques: these topics just weren't in the syllabi. It was frustrating to realize that my education was already outdated before I had even graduated, putting the responsibility on me to stay current. I get that fundamentals are important, but more classes on up-to-date topics that align with what's actually happening in the tech world would have been extremely useful.
Not one professor talked seriously about side projects or learning outside of classroom assignments. I have yet to see a course offer an optional track for people who want to go above and beyond without an academic reward attached, which matters most for those who genuinely want to create something unique and interesting. There is a general understanding that the job market is terrible right now, yet no one told me how to land a job until it was already time to start applying for internships in my junior year. Unless the CS curriculum radically evolves, there needs to be far more emphasis on students completing side projects, practicing LeetCode, and learning how to market themselves.
But perhaps the biggest gap was the lack of practical teamwork experience in the curriculum. Working in a team is core to software development, yet so many of my CS classes had us coding alone. We almost never used proper version control on assignments, with branches, issues, or releases. Using git to its full potential is a totally different mentality. Similarly, workflows like Scrum were never touched on in class: no stand-ups, no sprints, no task boards. You'd think software engineering courses would simulate a team project environment, but at least in my experience, they rarely did. There is a lot more to software engineering than literally writing code, and very little of it is taught in a CS degree. I think that's an issue.
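To make "using git to its full potential" concrete, here's a minimal sketch of the kind of feature-branch workflow I mean; the repo, branch, and file names are all invented for illustration:

```shell
# Minimal feature-branch workflow sketch; every name here is made up.
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email "dev@example.com"   # local identity so commits work anywhere
git config user.name "Dev"

echo "v1" > app.txt                       # existing code on the default branch
git add app.txt
git commit -qm "Initial commit"

git checkout -qb feature/login-page       # each change gets its own branch
echo "login page" > login.txt
git add login.txt
git commit -qm "Add login page"

git checkout -q -                         # back to the default branch
git merge -q --no-ff feature/login-page -m "Merge feature/login-page"
git log --oneline                         # initial, feature, and merge commits
```

On a real team the merge happens through a reviewed pull request; in class, by contrast, everything usually went straight onto one branch, if git was used at all.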
Now, at the end of my college journey, I have mixed feelings. I'm grateful for the computer science fundamentals I learned. The theory, data structures, algorithms, and systems knowledge do matter, no matter how much I wish they didn't. Not in the sense that everything you build requires a complex data structure and a clever traversal, but knowing how and when to use them is extremely valuable. The moments where you can say "I used a linked list here because it made the most sense" aren't common, but knowing how and when to reach for one is important. Yet I can't ignore that this is just a small piece of software engineering; most of the practical skills I have, I picked up outside the university curriculum. It's a bit like I majored in Computer Science but had an undeclared minor in teaching myself a career through internships and side projects.
I often wonder: what if I hadn't done these internships or personal projects? Would I feel ready to jump into a software engineering job with just my degree? Honestly, no way. I would have far too much catching up to do on things I didn't even know I didn't know. Worse, I know friends and classmates who didn't have the opportunity to intern, or who had a bad experience interning. Some of them are in a lot of trouble, desperately trying to learn frameworks and tools right as they're about to graduate. It's a tough position to be in, and I think universities could do a better job of preventing it, even if you could ultimately write it off as a skill issue for not understanding the evolving nature of getting a job in tech. Offering courses on the most up-to-date technologies is probably impossible, but I can't help dreaming that it were.
For me, having the internship in my junior year was massive. I learned so much about things I had no clue existed, let alone knew to learn. I stopped expecting the curriculum to cover everything and sought out internships precisely because I knew I wasn't getting the full picture in class. While I wish my university had properly prepared me for these internships and my eventual full-time position, I understand why it can't.
When my boss left Eridan in November, I was tasked with picking up the projects he had been the sole contributor on. All of that work fell on me. But instead of saying "that's above my pay grade," I dove into researching how I could do his job well. I've been slowly realizing what being the sole contributor means for the level of professionalism expected of you and the decision-making power you have.
I can't overstate the understanding I've gained from architecting and developing systems simply because the opportunity arose. To be fair, I could have gotten this type of experience building my own projects, but none of it was ever touched on in my coursework. There were no conversations about which hosting providers to use, or which database makes the most sense in a given situation. The farthest my classes went with databases was the basics of SQL and a pinch of MongoDB. And one of my biggest issues is that there was nothing about how to present trade-offs, or even how to identify them in the first place.
I guess this reflection made me realize that CS degrees leave people unprepared if they don't do side projects or internships, and quite frankly, that disturbs me. An education is supposed to prepare you for your career, and when it doesn't, it leaves people in shitty situations. A CS degree often leaves graduates with blind spots they don't know exist yet. Outside of tech, people generally expect to be able to get a job once they've earned their degree; in the tech industry, it's different. If that's how it's going to be, so be it, but I think that's wrong. I guess that's what I signed up for when I set my sights on being a software engineer.
TL;DR - A Computer Science degree doesn't guarantee the skills needed to be a software engineer, so learn them on your own or you won't be prepared. To universities: update your curricula and make sure they actually prepare your students for the field they're entering.