On Evan Miller's "You Can't Dig Upwards": C'ing Programming

This was inspired by Evan Miller's "You Can't Dig Upwards."

I urge aspiring and seasoned programmers alike to read Evan Miller's You Can't Dig Upwards in full. It is a good read and captures my general feelings about computer science.

In retrospect, learning stick shift was a prudent investment of time, even though I’ve never had to prove it to society by (for example) driving a stick-shift ambulance full of orphans while avoiding heavy gunfire. Driving stick is just a good skill to have. More people should have it, in my opinion.

Before I begin, I'd like to point out that what he said there gave me quite the awesome imagery, and I will be trying to do this in the future.

In all seriousness, his post sums up how, in the computer science world, Python is one of the first languages we learn, which almost guarantees that programmers will never learn C, and he believes this to be a bad thing.

He also notes that he is in the minority with this opinion. I'd like to point out that he's not alone; I share his opinion.

He argues that

Programmers who only know Python lack a proper mental model of how computers work

I believe this to be true. I'm not trying to shit on people who don't know C. There are very talented people who go through their entire computer science journeys without ever touching C. I'm not trying to prop myself up either. Truth be told, I barely remember C. As I write this, I am getting flashbacks to my architecture classes and the pain of debugging C code and getting constant segmentation faults. I'm pretty sure I have programmatic PTSD from my memories of C programming.
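
For anyone who has been spared this experience, here is a minimal toy example (my own, not from any class) of the kind of bug I mean. It compiles without complaint and then dies with a segmentation fault at runtime, and C gives you nothing but the crash to go on.

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    char *name = NULL;       /* a pointer, but no memory behind it */

    /* Compiles cleanly, then crashes: strcpy tries to write
       through the NULL pointer. */
    strcpy(name, "hello");

    printf("%s\n", name);    /* never reached */
    return 0;
}
```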

However, after wrestling with the devil by doing the Malloc Lab from Carnegie Mellon University, I remember feeling like a peak programmer. That lab was one of the most mentally taxing things I have done in my computer science career. That is, until I took Advanced Object Oriented Programming at my school a year later. I'm pretty sure I cried a few times during that class. That class should have been renamed "How to Build Your Own C++". But as I said, I never felt better about my skills as a programmer than I did after those classes.
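
For context, the Malloc Lab essentially asks you to write your own malloc and free. The sketch below is not the lab solution (the real thing involves free lists, block splitting, and coalescing); it is just a toy bump allocator, with made-up names like toy_malloc, to show the flavor of what you end up reasoning about: raw bytes from the operating system, alignment, and the fact that memory never cleans itself up.

```c
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>
#include <unistd.h>   /* sbrk (POSIX; old-school, but fine for a toy) */

#define ALIGNMENT 16
#define ALIGN(size) (((size) + (ALIGNMENT - 1)) & ~(size_t)(ALIGNMENT - 1))

/* Toy bump allocator: ask the OS for memory with sbrk and hand out
   aligned chunks. No free list, no reuse, no coalescing. */
void *toy_malloc(size_t size) {
    void *p = sbrk((intptr_t)ALIGN(size));   /* push the program break up */
    return (p == (void *)-1) ? NULL : p;     /* sbrk reports failure as -1 */
}

void toy_free(void *ptr) {
    (void)ptr;   /* a real allocator would track this block and reuse it */
}

int main(void) {
    int *nums = toy_malloc(4 * sizeof *nums);
    if (nums != NULL) {
        for (int i = 0; i < 4; i++)
            nums[i] = i * i;
        printf("%d %d %d %d\n", nums[0], nums[1], nums[2], nums[3]);
    }
    toy_free(nums);
    return 0;
}
```

The gap between this toy, which never gives anything back, and an allocator that actually tracks and reuses blocks is roughly where the lab's pain lives.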

But after those classes, I never did anything similar again. There's a reason I keep saying I have to re-learn everything: I don't think I retained much of those skills. I have also been slacking off on doing the appropriate work at home, but I have been doing better with it recently. You can see my post on Grading Myself Honestly on the Programmer Competency Matrix (by the way, I am a little rustier than I originally thought, so I am going to have to update it again and then haul ass into scrubbing away the programmer's rust).

Understandably, in industry, we sometimes don't "use" the skills from the computer science degree. We see the joke all the time: "We never use algorithms in industry, but we get interviewed on them!" However, I think we need an understanding of how the computer works at any level of our careers or undertakings. If we want to build good software, understanding the underlying layers is a must.
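
Here is one small, concrete example of what that understanding buys you, assuming nothing beyond standard C. The two loops below do identical arithmetic, but the first walks the array in the order C actually lays it out in memory, while the second fights the cache; on most machines the second is noticeably slower, and you can only explain why with a mental model of memory.

```c
#include <stdio.h>
#include <time.h>

#define N 4096

int main(void) {
    static int grid[N][N];   /* ~64 MB, zero-initialized */
    long sum = 0;
    clock_t t;

    t = clock();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += grid[i][j];            /* row-major: cache friendly */
    printf("row-major:    %.3fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    t = clock();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += grid[i][j];            /* column-major: cache hostile */
    printf("column-major: %.3fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    return (int)(sum & 1);   /* use sum so the compiler can't discard the loops */
}
```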

However, the culture of "just use Python," "computers are fast enough," and "throw hardware at the problem" is creating a deeper systemic problem: as a result of teaching Python first, and denigrating C, fewer and fewer people will want to become professional programmers.

I hope Evan hasn't disappeared from the planet with that statement.

I'm the type of person that really doesn't like black-boxing or abstracting, sometimes to a fault, because it affects the speed at which I do things. I like understanding the underlying. I am curious about the underlying. Maybe you could say I am a glutton for punishment. Sometimes I think Python is just magic. Shit just works. That's great and all. I'd even argue that if you want to prototype something really quickly, you should use Python. But I don't think it's conducive to learning as much as you can from computer science.
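
To illustrate the kind of magic I mean: in Python, gluing two strings together is just a + b, and the interpreter quietly handles the rest. Below is a rough sketch of what that one operator hides once you spell it out in C (the concat function is my own invention for illustration, not anybody's library).

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* What Python's `a + b` quietly does for you: measure both strings,
   allocate a buffer for the result, copy the bytes, and eventually
   give the memory back. */
char *concat(const char *a, const char *b) {
    size_t len = strlen(a) + strlen(b);
    char *result = malloc(len + 1);      /* +1 for the terminating '\0' */
    if (result == NULL)
        return NULL;                     /* allocation can fail */
    strcpy(result, a);
    strcat(result, b);
    return result;                       /* caller owns this and must free it */
}

int main(void) {
    char *s = concat("dig ", "deeper");
    if (s != NULL) {
        printf("%s\n", s);
        free(s);                         /* forget this and you leak */
    }
    return 0;
}
```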

You might argue that it's not needed, especially in today's software landscape. This is true. Technology moves quickly (how many JavaScript frameworks have come out during your reading of this?). It is currently economically infeasible to build extremely good software, especially if it runs the risk of being something no one wants. That is a huge waste of resources. So, the solution is to throw quantity at problems, not quality. However, I'd like to present a fair counterargument. If we establish that foundation of understanding how computers work, I argue that we will be better equipped to dole out solutions even faster than we do now. Sure, it's a lot slower at first, but we'll accelerate faster later. However, this is generally not how industry (and the world) works.

I believe it might be time to revisit this fast-paced world and try to slow shit down. I'm not sure how the economics would work around this, but how many articles have come out during your reading of this that lambast some company with poorly written software because there was a data leak or something? It seems like every day there is something new popping up about poorly functioning software. I'm pretty sure the economics behind poorly functioning software aren't pretty.

This especially comes to light with new technologies springing up. People don't understand the underlying but then want to build everything with it. I see this in the blockchain space: "X, but on the blockchain." How about understanding what a blockchain is first? Don't get me wrong, building stuff is great. It flexes a mental muscle. But thinking deeply about your solutions will flex an even bigger mental muscle. Deep thinking usually means, well, digging deeper into the train of thought. So, in the case of blockchain, understanding what it is, how it works, and how it was built will better equip you to produce good solutions.
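
To make that concrete for blockchain: the core idea, once you dig into it, is that every block carries the hash of the block before it, so tampering with any block invalidates everything after it. The sketch below is only a toy of that one idea; FNV-1a stands in for a real cryptographic hash like SHA-256, and a real chain adds signatures, consensus, and much more.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Toy hash (FNV-1a), seeded with the previous block's hash so the
   blocks are chained together. Not cryptographically secure. */
static uint64_t fnv1a(const void *data, size_t len, uint64_t seed) {
    const unsigned char *p = data;
    uint64_t h = seed ^ 14695981039346656037ULL;
    for (size_t i = 0; i < len; i++) {
        h ^= p[i];
        h *= 1099511628211ULL;
    }
    return h;
}

struct block {
    char data[64];
    uint64_t prev_hash;   /* link to the block before this one */
    uint64_t hash;        /* hash of this block's data plus prev_hash */
};

static void seal(struct block *b, uint64_t prev_hash) {
    b->prev_hash = prev_hash;
    b->hash = fnv1a(b->data, strlen(b->data), prev_hash);
}

int main(void) {
    struct block chain[3] = {
        { .data = "genesis" },
        { .data = "alice pays bob" },
        { .data = "bob pays carol" }
    };
    uint64_t prev = 0;
    for (int i = 0; i < 3; i++) {
        seal(&chain[i], prev);
        prev = chain[i].hash;
        printf("block %d hash: %016llx\n", i, (unsigned long long)chain[i].hash);
    }
    return 0;
}
```

Change the data in any block and every hash downstream of it changes, which is the property the whole "X, but on the blockchain" pitch ultimately rests on.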

I should point out that I am not discouraging a head-first dive into new technology. I advocate such a process. The best way to get acquainted is to dive in, but I argue that we have to go deeper after we get acquainted.

Taking this back to the context of Miller's post, understanding how a computer works will help address the quality, not the quantity.

In industry, you just need to get shit done. Python or Java gets shit done quickly. There's no need for any extra understanding beyond knowing which modules and libraries facilitate the solution. I argue that we should go deeper, whether it is a proposal at work or your own shit at home. I find programming the most fun when I am digging deeper into solutions.

There's a mistaken notion that programming is most enjoyable when you're doing a diverse array of things using the smallest amount of code, leveraging client libraries and so forth. This is true for people trying to get something done to meet a deadline; but for people motivated by learning - and who aspire to be extremely good programmers - it's actually more fun to create programs that do less work, but in a more intellectually stimulating way.

I think we can achieve both, but maybe I am just optimistic. With how fast technology is moving (I am pretty sure that was the sound of a quantum computer being tweaked), the rush to do things quickly is only going to get worse.

It's that mental model - rather than the C language itself - that will enable you to poke through the abstractions created by others, and write programs you never thought possible.

I don't know about you, but that sounds awesome to me. So, for my relearning, I have committed myself to relearning C and going deep into the underlying layers: the architecture, the operating system, the memory, and so on. It's going to be a slow process, but a quote that has really stuck with me is, "Don't overestimate what you can do in a day, but don't underestimate what you can do in a year."

