
Education

I've been out of college almost exactly four years. Looking back at my Computer Science classes, I think I can see some ways that my education was woefully inadequate. Or at least not what I wish it would have been, knowing what I (hope I) know now.

My classes were in C and C++ for the first year or two. I remember being very turned off by C++ because, as far as I knew, it didn't have a native "string" type and you always HAD to screw with null-terminated arrays of characters. I never once heard the STL mentioned, or the benefits of using it, so far as I remember. Only a couple of years after graduating did I learn that using C-style header files and functions in C++ was antiquated; it was tolerated in homework assignments and my professors didn't say much about it. I remember learning about C++ templates, but it was a chapter or two blown through in a week. We never did anything useful with them.
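
Just to make the contrast concrete, here's a small sketch I'm making up now (not anything from my actual coursework): the C-style string handling we were drilled on versus the std::string nobody ever showed us.

    #include <cstring>
    #include <iostream>
    #include <string>

    int main() {
        // What we were taught: null-terminated character arrays, fixed buffers,
        // and hoping everything fits.
        char cgreeting[32];
        std::strcpy(cgreeting, "Hello, ");
        std::strcat(cgreeting, "world");
        std::cout << cgreeting << '\n';

        // What nobody mentioned: std::string manages its own memory and
        // concatenates with a plain +=.
        std::string greeting = "Hello, ";
        greeting += "world";
        std::cout << greeting << '\n';
        return 0;
    }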

I remember distinctly someone asking a professor "which version" of C++ we were allowed to use on an assignment. He responded "standard". When pressed further, he conceded "whatever Visual C++ can handle". My university was, and as far as I know remains, a Microsoft sort of place. We all got Visual Studio "for free" (meaning it was most probably rolled into our tuition as a fee). Linux wasn't discouraged, but all assignments had to be done in VC++. I wish that had been different. I wish I'd used gcc or something once or twice. Most of the professors I liked were crusty old Unix guys who advocated *nix personally if not formally in the curriculum; it's one of the things that got me interested in Linux. I wish I'd picked their brains more.

In my last year or so of classes, the entire university was in the process of phasing out C++ and teaching all the core classes in Java. I was lucky to get a mix of Java and C and C++, if only so I could see the differences between them, and I got enough of a taste of Java to know I don't like it. I had one Perl class (the only one offered, that I'm aware of), but it was half a term and focused on CGI. I had some assembly language classes, but they were run on a little toy MIPS emulator in (of course) Windows. How bad must it be for the next generation of coders coming from my old college, who have Java as their primary or perhaps only language?

I remember learning about object oriented programming, but the only thing I actually learned from those courses was terminology: constructor, destructor, inspector, mutator. We learned about inheritance but nothing about how it can be useful. We learned about overloading and overriding operators and methods but never anything about when to do it and when not to. I think the word polymorphism was mentioned and defined, but never explained with good examples. We never touched design patterns; I read a book about those a few months ago and it was a real revelation in many ways.
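
Here's the sort of small example I wish we'd seen (a toy I'm sketching now, not anything from class): a base class with a virtual method, and calling code that never has to care which derived type it's actually holding.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    struct Shape {
        virtual double area() const = 0;
        virtual ~Shape() {}
    };

    struct Circle : Shape {
        double r;
        Circle(double r) : r(r) {}
        double area() const { return 3.14159 * r * r; }
    };

    struct Square : Shape {
        double s;
        Square(double s) : s(s) {}
        double area() const { return s * s; }
    };

    int main() {
        std::vector<Shape*> shapes;
        shapes.push_back(new Circle(1.0));
        shapes.push_back(new Square(2.0));
        for (std::size_t i = 0; i < shapes.size(); ++i)
            std::cout << shapes[i]->area() << '\n';  // the right area() gets called
        for (std::size_t i = 0; i < shapes.size(); ++i)
            delete shapes[i];
        return 0;
    }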

I learned a lot about binary trees and stacks and priority queues and graph search algorithms. But I never really learned when to use those things and when not to. I never had an assignment that said "Here's some data; pick the right data structure to model and organize and query the data and write a program that does it". In 90% of my classes I probably never wrote a program longer than 200 lines of code. My courses had me writing and re-writing and re-re-writing implementations of singly-linked lists until I was blue in the face. It's nice to know how those things work, but it's also nice to know that reinventing the wheel is stupid when you're trying to get a program done on a budget; they could have mentioned that tidbit at least in passing.
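
For instance, something like this would have made the point (a made-up exercise, not an actual assignment I had): count word frequencies from input, letting std::map do the balanced-tree work we spent whole terms reimplementing by hand.

    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        std::map<std::string, int> counts;
        std::string word;
        while (std::cin >> word)
            ++counts[word];  // the map handles lookup, insertion, and ordering

        std::map<std::string, int>::const_iterator it;
        for (it = counts.begin(); it != counts.end(); ++it)
            std::cout << it->first << " " << it->second << '\n';
        return 0;
    }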

Strangely, I never had a single database class in college. I never learned SQL until I taught myself after I graduated. If anything, I would think SQL is almost essential for any programmer to know. I never learned shell scripting either, whether bash or Perl used as a scripting language.

How about Vim? Vim has contributed greatly to whatever productivity I have at work (and home) nowadays. But the one time I remember it being mentioned in college was a professor saying something about "this really old and weird editor called VEE EYE, anyone ever heard of it?". Left to my own devices, I always used the sucktastic Visual Studio IDE, since it was the best thing I knew existed at the time. I did have one good class (again taught by one of the crusty old Unix guys) where we discussed the importance of a good editor, and how syntax highlighting and auto-indentation can help a coder catch errors and be more productive. I don't remember if Vim was mentioned by name; I think it was more advocacy of fancy IDEs along the lines of Visual Studio. But it was something I wish we'd gone into in more detail.

The things I probably value most from my formal education are the things that are least practically useful. Theory of computation stuff. Compiler design, programming language design, things like that. State machines and grammars and string parsing. The history and evolution of programming languages. How programming languages work under the hood, garbage collection algorithms, object heaps and call stacks. Those were really interesting and I think helpful in understanding what's really going on when you write and run code. But I'm unsure how much it helps in a practical sense, when your boss says "Give me this by Friday".

I'd say only within the past year have I even begun to approach the level of "adequate" programmer, and it's almost entirely thanks to things I've learned on my own since college. Maybe I'm misremembering; maybe college gave me some kind of magical foundation upon which all other knowledge has been built. Maybe there isn't enough time in college to learn the things I wish I had learned. I don't buy that.

There's no doubt the piece of paper saying that I graduated has helped me in practical ways, e.g. finding and getting a job. But I can't help but think that the things I've learned privately, just because I like learning them, have been more useful in being able to actually do my job. On that note, having started in high school with hideous old QBASIC, ventured through C and C++, hurried past Java, gone down the Perl path for a long while, briefly glanced at the likes of Haskell and PHP and many others, and ended up firmly in Ruby territory, I'm planning to revisit C++ again a bit now. Maybe it'll all make more sense now. I ordered a book last week; we'll see how it looks once it gets here.

July 07, 2007 @ 2:37 PM PDT
Category: Programming

1 Comment

Quoth Hussam on July 07, 2007 @ 10:21 PM PDT

I agree with you. Colleges only equip you with the tools out there, not the skills required. You either have to be a coder on your own or have to learn things on the job. I got chills when I thought about having to pay off all those college loans without a decent job, due to lack of skills. This is why I decided to start coding beyond school assignments and for fun.

I don't know how hard it is to pick up an open source project, but I can tell you that if it is in the later stages of development it can get hectic. This is why I decided to start on my own from scratch, but I rarely reach even a beta stage before giving up.

Visual Studio was, is, and will probably remain an industry standard. The .NET architecture allows you to program in your language of choice (it supports somewhere in the neighborhood of 20) and hence is quickly gaining ground. ASP, C#, VB, Delphi, etc. are the only names I hear when talking about the business world. Java comes in as a competitor to these. And I come across PHP, Perl, Python, Ruby, etc. only in the open source world. This leads me to believe that although I can learn a combination of PHP/MySQL for my own good, Java/Oracle and .NET/MS SQL will bring in the real cash.

It hurts to think that I have only one more year of college to go and still haven't learned anything that would really benefit me in the outside world. Maybe it's just me.