Online courses vs. colleges for software engineering


Proud dropouts sometimes swear by online courses because the project-based learning they provide is hands-on and representative of how the real world actually works. Proud academics, on the other hand, sometimes like to brag about equations and theorems that are not really relevant but still sound academically enlightening.

I’ve done numerous Udacity nanodegrees and paid Coursera courses. I’m also currently pursuing my master’s in computer science online from Georgia Tech (its OMSCS program). I’m going to contrast the two approaches, highlighting their merits and demerits.

Vocational education can be very ineffective over time

I remember sitting with a distinguished security expert in my company, along with a colleague of mine. The security expert asked my colleague, “Which programming languages do you know?” He answered, “Ruby, mainly focusing on Ruby on Rails.”

At this point, the security expert said, “It’s the way you answer the question that tells a lot about your experience.”

Although he did not elaborate, I kind of knew what he was talking about. That’s because I’ve been asked that question repeatedly in the past, and each time my response would be, “Well, it’s a tricky question to answer. I can learn the syntax of any new language pretty easily, but the ecosystem is an entirely different thing. What exactly do you mean by knowing a language?”

Vocational learning alone leaves people with a vast number of individual facts but no idea of how they tie together. Theory is the abstraction that ties those facts together, and having no exposure to it can sometimes impede your progress.

You might keep learning a new JavaScript framework each month, but in the grand scheme of things, you’re not really learning anything new. It just seems that way. What I learnt at Udacity had a very short shelf life. TensorFlow? Well, it’s PyTorch now.

College education is useless if you can’t use it

The one thing that most colleges don’t do well is teach you how to apply what you just learnt. And the claim that colleges teach you how to think about a subject is grossly false. The only real advantage is that you get exposed to the theoretical backing (which is basically other people’s way of thinking about something) for different subjects. If learnt correctly, the concepts don’t change much over time, even though programming languages might.

Want to learn how to read technical papers? Well, they might have an assignment that’d ask you to do that. But the thing is, the college doesn’t teach you how to do that. It just tells you to do it. And you end up forcing yourself to figure out how.

“Here’s Bayes’ theorem. And here’s 100 more theorems on top of it. You don’t have to memorize them. We’ll teach you how to derive them. But those derivations you might have to memorize.”

I’m more interested in how the person who first had these ideas thought about them, not in a sugar-coated interpretation written ages later. Although college education taught me the “what” of these mathematical rules, I’ve never seen an instance of the “why”, or of “why only this way?”
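
To be fair, the “derive, don’t memorize” advice does hold up for Bayes’ theorem itself: it falls out of the definition of conditional probability in two lines, which is about as close to the “why” as it gets. A minimal sketch of that derivation in LaTeX:

```latex
% The only assumption is the definition of conditional probability:
%   P(A | B) = P(A and B) / P(B), for P(B) > 0.
% Read that definition in both directions -- the joint probability
% P(A and B) is the same either way:
\begin{align}
  P(A \mid B)\, P(B) &= P(A \cap B) = P(B \mid A)\, P(A)
  \intertext{Dividing both sides by $P(B)$ gives Bayes' theorem:}
  P(A \mid B) &= \frac{P(B \mid A)\, P(A)}{P(B)}
\end{align}
```

Seen this way, there’s nothing to memorize: the theorem is just the definition of conditional probability read in both directions.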

In conclusion

Both in Udacity’s case and in Georgia Tech’s, the assignments were the only enlightening part. The theoretical videos were just about “relaying information” rather than increasing knowledge. Unless you’re already familiar with a subject, you won’t be able to understand it to your satisfaction through either of these mediums.

The best way to learn is to run your own experiments. Once you understand something that way, the understanding lasts a lifetime. Facts can change, but the governing rules, once deciphered, won’t.

See also: The best way to learn Python is not always courses