
Digital Realities & Abstraction

Before we move on, we need to think a little about how these digital realities come into existence.

Generally, you will find that programming textbooks describe programming in terms of abstraction. This comes from thinking about how a computer works and the process we use to create digital realities. To understand this, we need to think about what a computer is.

A computer is an electronic device. It operates by controlling voltage through its circuits. Transistors enable us to create gates that control the current. Many gates are combined to form the processing units of the computer, where the flow of current can be controlled to achieve computations. Gates are also used to alter the flow of current through memory chips, controlling which parts of the chips retain, gain, or lose current. So, at this level of thinking, a computer is just electrical current flowing through a circuit.
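To make "gates combined to achieve computations" a little more concrete, here is a small sketch in Python (chosen purely for illustration; real hardware is not written this way) that models gates as functions on 0/1 values and combines them into a half adder, one of the simplest building blocks of binary arithmetic.

```python
# Model logic gates as functions on 0/1 values -- a software stand-in for
# transistor circuits, purely for illustration.

def and_gate(a, b):
    return a & b

def xor_gate(a, b):
    return a ^ b

def half_adder(a, b):
    """Combine two gates to add two single-bit values."""
    total = xor_gate(a, b)   # the sum bit
    carry = and_gate(a, b)   # the carry bit
    return total, carry

print(half_adder(1, 1))  # (0, 1): 1 + 1 is binary 10
```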

So how does this become a digital reality?

The answer is: through abstraction. This is our ability to rise above physical events and realities, and use our thoughts and imagination to picture things in an abstract way.

Using abstraction, we can imagine that the presence of current at a certain location in the computer is a 1, whereas the absence of current at that location is a 0. This gives us a binary value, or bit. Individual bits can be grouped to create larger values. Most modern computers use groups of 64 bits as their natural unit of data, which is why they are called 64-bit computers.
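As a rough illustration (again in Python, just for convenience), grouping eight individual 0/1 values produces a single larger number:

```python
# A group of bits, written most-significant first.
bits = [1, 0, 1, 1, 0, 1, 0, 0]

# Combine the individual 0/1 values into one larger number.
value = 0
for bit in bits:
    value = value * 2 + bit

print(value)               # 180
print(int("10110100", 2))  # 180 -- Python can do the grouping for us
```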

Taking abstraction to the next level, we can use these binary values both to represent data within the computer and to control the computations occurring within its processors. One way we do this is by grouping binary values to represent a number. For example, 8 bits (a byte) gives us 256 (2⁸) unique values. Two common ways of interpreting the 256 unique values in a byte are as a number from 0 to 255, or as a number from -128 to 127. The actual values of the 8 bits are the same in both cases, but the way we interpret them changes depending on the reality we are imagining. We can abstract this idea further, using bytes to represent real numbers (numbers with decimal points) with a fixed degree of accuracy, individual characters of text, colours of images, signals of audio, and basically anything else we can think of.
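The following sketch (Python once more, just to make the idea concrete) takes 8-bit patterns and reads them in a few of the ways described above. The bits never change; only our interpretation does.

```python
# One pattern of 8 bits.
raw = bytes([0b10110100])

# Read as an unsigned number (0 to 255):
print(int.from_bytes(raw, "big", signed=False))  # 180

# Read as a signed number (-128 to 127):
print(int.from_bytes(raw, "big", signed=True))   # -76

# A different pattern read as a character of text:
print(bytes([0b01000001]).decode("ascii"))       # 'A'
```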

Abstracting further, we can combine individual values into entities like characters and levels in games, documents in a word processor, bank accounts, or any other thing we want to represent within the digital realities we are creating.
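As a sketch of this idea (Python again; the entity and its fields are hypothetical, chosen only to illustrate the point), a bank account can be pictured as a grouping of simpler values:

```python
from dataclasses import dataclass

# A hypothetical entity built from individual values -- the names and
# fields here are made up for illustration only.
@dataclass
class BankAccount:
    account_number: int
    owner_name: str
    balance_in_cents: int  # whole cents avoid rounding issues with decimals

account = BankAccount(account_number=421, owner_name="Ada", balance_in_cents=150_00)
print(account)
```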

By using abstraction, we can work at higher levels and avoid needing to think about current, bits, or any other details of how everything actually works. We can spend our time picturing the reality we want to create and crafting the code needed to produce it.

This is what software development is all about.