Why I believe that Python should be taught over C as a first programming language in universities.

It is (or maybe was) a worldwide trend for universities and colleges to introduce students to C as their first programming language. While some schools have since switched to Java, others still teach C as the first language a student is exposed to.

While there is nothing wrong with C (disclaimer: I love C, and I also consider it a must for software developers to learn, whether they work at a high or low level), I firmly believe that it is not suited for such a role. C, being a medium-level programming language, is somewhat high level (relative, of course, to what you compare it against) while still managing to stay relatively close to the metal. C derives much of its power from this single fact. Hardware interaction, while not useful to every single programmer, is still very useful for some tasks. One that comes to mind, due to my own work, is infrastructure software: compilers, virtual machines, and more can all benefit from being close to the metal and interacting directly with the computer’s hardware. I am sure this is not all there is to it, but the aforementioned examples serve to demonstrate the need for low-level interactivity.

However, the tradeoff for this power is that low-level facilities get in your way when you program in C. Pointers quickly come to mind: no matter how powerful they are, most people struggle to use them effectively, which is reflected in most high-level languages today, where pointers are pretty much extinct. Not only that, but to properly understand C you need at least some knowledge of computer architecture. You can still use C without it, but it’s not uncommon to see students learn that an int on x86 is 4 bytes, yet be unable to explain why if asked. Not to mention that the way students are taught C, without ever being told about the C standard library, reduces them to exercises in array manipulation that become repetitive and tedious in short order and don’t add much to their knowledge. The end result is that students feel incapable of delivering anything beyond the trivial. And then there are schools (at least here in Greece) that teach C programming on Windows.
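
To make that concrete, here is a tiny illustrative contrast of my own (not taken from any curriculum): in Python, the integer-size question a C student has to confront simply never comes up.

    # In C, a beginner must learn that an int is (typically) 4 bytes on
    # x86, and that arithmetic past its range wraps around or, for a
    # signed int, is outright undefined behavior. Python integers grow
    # as needed, so the architecture question never surfaces.
    x = 2 ** 64        # already past the range of any fixed-size C int
    print(x + 1)       # 18446744073709551617: no overflow, no sizeof

A C student writing the equivalent has to pick a wide-enough type (int? long? unsigned long long?) before writing the first line.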

On the other hand, higher-level languages such as JavaScript and Python are, at least in my opinion, better suited to an introductory course. These languages abstract the underlying architecture well enough that programming concepts can be demonstrated without any knowledge of it, so algorithms, data structures, and more can be taught without the language’s idioms getting in the way. What’s more, these languages are known to have a gentler learning curve, and, last but not least, they ship with a great standard library that students can use to deliver something useful and build real-world applications much sooner than they could in C. What’s not to love?
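
To back up that last claim with something concrete, here is a minimal sketch of my own (not from any particular course): a complete Python program that prints the ten most common words in a text file, using nothing but the standard library.

    # Count the ten most frequent words in a text file, using only
    # Python's standard library: no manual memory management and
    # no hand-rolled hash table.
    import sys
    from collections import Counter

    def top_words(path, n=10):
        with open(path, encoding="utf-8") as f:
            words = f.read().lower().split()
        return Counter(words).most_common(n)

    if __name__ == "__main__":
        for word, count in top_words(sys.argv[1]):
            print(f"{count:6d}  {word}")

A first-semester student can read every line of this, while the C version would need a hand-written hash table (or a sort over dynamically allocated strings) before it printed anything at all.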

While I personally believe that a serious programmer should know at least one of the so-called medium-level languages (C, C++), even if they learn one only to understand more about how the machine works, I firmly believe that such languages should not be used in introductory computer science courses.

So how do you feel about this? Do you think C should be taught in introductory computer science courses? Or do you feel, like me, that other languages are better suited, and why?