Programming Decoded: Bridging the Gap Between Human and Machine
The world of coding and programming can be an intimidating place for the uninitiated. Clear explanations of how it all works are hard to find, and most are laden with industry jargon that itself needs explaining. Consider this post the first of several in a Programming, Deconstructed series: our attempt at unpacking the topic and explaining the fundamentals of programming in a way that is accessible to everyone, regardless of their background.
Like most complex topics, knowledge about programming is cumulative. So before we dig into a discussion of basic programming concepts or compare different languages, we need to answer the most fundamental question: what is programming at the most basic level? Let's start off by talking about a computer we all love to hate—the human brain.
It’s a wildly versatile organ, allowing us to do everything from catching a football based on its initial trajectory to guessing how someone else feels based on nothing but their body language. But one of the most impressive functions of the brain is how it processes language.
When trying to understand a sentence, the brain analyzes it on two levels: semantics and syntax (note: context is also pretty important, but that's best left for a more advanced discussion of programming). Semantics refers to the meaning of individual words, while syntax refers to the rules we have for combining words into phrases and sentences, and for understanding the relationships between them. Using a combination of semantics and syntax, the human brain is able to assign meaning to words and phrases that isn’t explicitly stated.
Computer processors, on the other hand, don’t have the same ability to infer meaning from context, and that’s where programming comes in. It’s important to keep in mind that both the processor in your computer and the human brain serve a similar function: they produce an output based on an input. But they process information in fundamentally different ways. To better understand programming, we first have to understand how humans and computers interpret the world differently.
Contrary to popular belief, programming is, at its core, just creative problem solving according to a predefined set of rules. Whether it's fixing an existing tech-related headache or inventing a solution to a problem that hadn't even been defined yet, programming isn’t really about solving a computer problem; it's about using a computer to solve a real-life problem.
If you wish to make a PB&J sandwich from scratch, you must first invent the universe
The key to any type of problem solving is taking things step-by-step, and with programming it’s more like baby-step-by-baby-step. Because computers process information differently than the human brain, we have to explain things in different terms.
Let's consider the task of making a peanut butter and jelly sandwich. First, you need to define your list of ingredients: a loaf of bread, a jar of peanut butter (chunky, you monster), a jar of jelly—raspberry is the only option, as we all know—one plate, and two knives (thou shalt not double dip). After defining the ingredients, the next step is to provide a set of instructions for making the sandwich. If you're not a programmer, your instructions might look a bit like this:
1. Remove two slices of bread
2. Put the peanut butter on one slice
3. Put the jelly on the other slice
4. Put them together
5. Enjoy
If a computer followed those instructions to the letter, it wouldn't end up with anything resembling a sandwich; it might, for instance, set the entire unopened jar of peanut butter on top of a slice of bread. The difference is a matter of inference. A person is able to infer that "put the peanut butter on one slice" is really shorthand for a long series of smaller, surprisingly complex steps, whereas a computer is frustratingly literal in the way it interprets instructions. If we were to imagine the conversation between a human and a computer, it might go something like this (we'll translate the outcome into a rough code sketch after the exchange):
Human: Open the jar of peanut butter, please.
Computer: How do I do that?
Human: Twist the cap.
Computer: What does 'twist' mean?
Human: Rotate. Rotate the cap.
Computer: How much should I rotate the cap?
Human: I don't know. Three, maybe four times?
Computer: 3 or 4 radians. Got it.
Human: No. Full revolutions. Rotate the cap 1440°.
Computer: Ok. Got it. Rotating the cap 1440°. Which direction?
Human: ( ╯°□°)╯︵ ┻━┻
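By the end of that exchange, the human has finally supplied every detail the computer was missing: which part to move, which direction, and exactly how far. In code, that hard-won precision might look something like the sketch below. It's purely illustrative Python; the names (Jar, rotate_cap, open_jar) are invented for this example, not part of any real library.

```python
# Illustrative sketch only: Jar, rotate_cap, and open_jar are names
# invented for this example, not part of any real library.

class Jar:
    """A toy model of a jar with a screw-on cap."""

    def __init__(self):
        self.cap_rotation_degrees = 0
        self.is_open = False

    def rotate_cap(self, degrees, direction):
        """Rotate the cap by an exact amount in an exact direction."""
        if direction == "counterclockwise":
            self.cap_rotation_degrees += degrees
        # Assume four full turns (1440 degrees) frees the cap.
        if self.cap_rotation_degrees >= 1440:
            self.is_open = True


def open_jar(jar):
    # Everything the human kept leaving implicit is now explicit:
    # the direction, the amount, and the units.
    jar.rotate_cap(degrees=1440, direction="counterclockwise")


peanut_butter = Jar()
open_jar(peanut_butter)
print(peanut_butter.is_open)  # True
```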
And that's just getting the jar of peanut butter open. That’s how programming works: it's about thinking a few levels down and breaking actions into the simplest possible steps. The entire process is methodical, and it requires a very explicit, step-by-step breakdown to reach your exact desired outcome.
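To see how quickly that explicitness piles up, here's one way the five "human" steps might start to unfold once the hidden sub-steps are written out. Again, this is a hypothetical sketch: every function name below is made up for illustration, and each one would itself keep breaking down into smaller steps.

```python
# Hypothetical outline only: every function name below is invented for
# illustration, and each one would expand into many smaller steps.

def make_pbj_sandwich():
    # Step 1: "Remove two slices of bread" hides several actions.
    open_bag("bread")
    slice_one = take_slice("bread")
    slice_two = take_slice("bread")
    place_on("plate", slice_one)
    place_on("plate", slice_two)

    # Step 2: "Put the peanut butter on one slice" hides even more.
    open_jar("peanut butter")  # which hides the entire cap-rotation saga above
    pick_up("knife 1")
    scoop("peanut butter", "knife 1")
    spread("knife 1", slice_one)
    put_down("knife 1")

    # Step 3: the same story for the jelly, with its own knife.
    open_jar("jelly")
    pick_up("knife 2")
    scoop("jelly", "knife 2")
    spread("knife 2", slice_two)
    put_down("knife 2")

    # Step 4: "Put them together" has to say which sides face each other.
    flip(slice_two)
    place_on(slice_one, slice_two)

    # Step 5: "Enjoy" is, mercifully, left to the human.


# Minimal stand-in definitions so the outline actually runs:
def _log(*words):
    print(*words)

def open_bag(item): _log("opening the bag of", item)
def take_slice(item): _log("taking a slice of", item); return "slice of " + item
def place_on(surface, item): _log("placing", item, "on", surface)
def open_jar(item): _log("opening the jar of", item)
def pick_up(tool): _log("picking up", tool)
def scoop(item, tool): _log("scooping", item, "with", tool)
def spread(tool, target): _log("spreading the contents of", tool, "onto", target)
def put_down(tool): _log("putting down", tool)
def flip(item): _log("flipping", item)

make_pbj_sandwich()
```

Even this outline is nowhere near explicit enough for a real computer; every one of those helper calls would keep breaking down into smaller and smaller steps until nothing is left to inference.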