What is a computer?

What is a computer? You may shrug it off and say that it's the machine on which you are reading this article right now - and which can also calculate your income taxes in spreadsheets, send emails, or video call your friends. But then, I continue my Socratic questioning, what else is a computer? You may acknowledge that the brain is kind of a computer - or, at least, that it can act like a computer (perhaps not a very good one, but that's a story for another time). And what about other kinds of ‘computers’? There are quantum computers, neuromorphic computers, even DNA computers or computers made of slime mold. Even more bewildering is that some biologists think of cells as computers, and others even think the whole universe is a computer!

Clearly, the notion of computation has broadened significantly as computers have found their way into our everyday life. Slime mold is very different from the smartphone in your pocket, and both are very different from the quantum computers being built right now - so how can we justify calling them all ‘computers’? This is a fundamental question that we must think about in order to develop the computers of tomorrow - which may work quite differently from what we are used to today. Such unconventional computers are being researched and developed at the University of Groningen's CogniGron center for cognitive systems and materials.

This is the first article in a series on computation. Here, we will dive into some of the historical roots of computer science. In the second article, we will learn about the modern definition of computation - what we now call an algorithm - which also gave birth to computer science. This will serve as the starting point for subsequent articles, in which we will try to figure out which systems we can sensibly call computers and whether or not a computer can be intelligent. Perhaps we can even determine if we are living in a simulation...

Calculating with numbers

Left: the digital calculator that we are (probably) most familiar with. Credits to Recha Oktaviani via Unsplash. Right: the abacus, an ancient non-electronic calculator. Credits to Crissy Jarvis via Unsplash.

Let's begin our journey with something simpler than a computer: the calculator that you used in middle school. With a calculator, you can process numbers and do arithmetic that would make your head hurt if you tried it yourself. I mean, could you multiply two five-digit numbers in your head without getting a headache? Note that a calculator need not be electronic; remember the abacus? Nor need it be digital: the MONIAC was an analog machine that simulated economic processes with flowing water.

From numbers to logic

So how do we get from a calculator to a computer? We can ask the German mathematician and philosopher Gottfried Wilhelm Leibniz. His answer was an imagined calculator that operates on logical statements rather than numbers. The grand vision for his Calculus Ratiocinator was to answer any possible question through computation, as echoed in his motto "Calculemus", or in English:

Let us calculate (...) to see who is right.

Unfortunately, Leibniz never built this computing machine. It would take centuries for engineering practices to reach the level of precision required to build universal computers. More interesting from a mathematical perspective was the issue of representing logical statements in some mathematical language that a calculator can work with. Leibniz referred to this language as the characteristica universalis, but he was not able to construct such a language.

More than a century after Leibniz's death, George Boole made significant progress by developing a logical language in which true statements are represented as 1 and false ones as 0. He also developed a system of operations on such statements - the foundation of what we now call Boolean algebra. However monumental this was, it was merely the beginning of a long journey towards a universal language for logic. It took great mathematicians like Frege, Russell, Gödel, and Tarski well into the 20th century to fully formalize a logical system that would enable the development of the modern computer. Their work led to the modern definition of algorithms, which we will cover in a follow-up article.
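To get a feel for Boole's idea, here is a minimal Python sketch - a modern rendering, not Boole's original notation - in which truth values are just the numbers 1 and 0, and logic turns into ordinary arithmetic:

```python
# Truth values as numbers: 1 means true, 0 means false.

def NOT(x):
    return 1 - x            # negation flips 1 and 0

def AND(x, y):
    return x * y            # the product is 1 only when both inputs are 1

def OR(x, y):
    return x + y - x * y    # 1 when at least one input is 1

# "It is raining" is true (1), "I brought an umbrella" is false (0):
raining, umbrella = 1, 0
print(AND(raining, umbrella))   # 0 - "raining and umbrella" is false
print(OR(raining, umbrella))    # 1 - at least one of them is true
print(NOT(umbrella))            # 1 - "no umbrella" is true
```

Notice that a machine built to do arithmetic could, in principle, evaluate these logical statements too - exactly the bridge from calculator to computer that Leibniz dreamed of.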

Left: credits to Michael Dziedzic via Unsplash. Right: credits to Nick Fewings via Unsplash.

From logic into the world

Mathematicians were thinking about deep questions in logic and mathematics, which is wonderful, but let's face it: most people would rather watch cat videos than prove mathematical theorems. So it's a good thing that Ada Lovelace, who would surely be a cat video enthusiast if she were alive today, came along with the realization that the numbers in a calculator can be made to represent anything whatsoever - even things far removed from mathematics, like poetry or music.
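As a small illustration of Lovelace's insight - using a modern character encoding that she, of course, never knew - here is how a line of poetry becomes nothing but numbers in Python:

```python
# A line of poetry, turned into numbers via Unicode code points.
verse = "She walks in beauty"
numbers = [ord(character) for character in verse]
print(numbers)    # [83, 104, 101, 32, 119, ...]

# The very same numbers can be turned back into the poem:
print("".join(chr(n) for n in numbers))    # She walks in beauty
```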

Representing art through numbers was quite the leap at the time - and remains fascinating to think about. Nowadays we are used to looking at images and videos on our monitors, but we rarely appreciate that every image and every video is just a number behind the screens. Through clever processing of these numbers, we can display them on a screen as something meaningful - but mind you, the computer has no idea about this meaning. A computer is completely oblivious to the fact that only one of the two images below is actually meaningful to us.
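To make this concrete, here is a toy example - a made-up 3×3 "image", far smaller than any real photo - showing how a grayscale image is just a grid of numbers, and how image processing is just arithmetic on them:

```python
# A tiny made-up 3x3 grayscale "image": every number is the brightness
# of one pixel, from 0 (black) to 255 (white).
image = [
    [  0, 128, 255],
    [128, 255, 128],
    [255, 128,   0],
]

# Brightening the image is nothing but arithmetic on the numbers
# (capped at 255, the brightest possible value):
brighter = [[min(255, pixel + 50) for pixel in row] for row in image]

for row in brighter:
    print(row)
```

Real images work the same way, just with millions of pixels - and three numbers per pixel for color.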

A digitalized world

While a calculator is restricted to doing operations on numbers, a computer can be said to operate on any object that can be represented by a number. As Ada Lovelace foresaw - and as the cat videos on my laptop prove - this includes many things that seem far removed from numbers or mathematics. Indeed, our modern world never ceases to surprise us with what can be digitalized: social networks, multiplayer games, flight simulators, and much more.

For today, let us arrive at the following informal definition:

💡 A computer is a calculator that can act on objects other than numbers.

Next: algorithms

Next time, we will follow history into the field of mathematical logic, in order to see our current best answer to Leibniz's dream of a universal language and computer. This will lead us to the formal definition of an algorithm. Algorithms lie at the heart of computer science and enabled the development and use of modern computers. And yet, we will also ask: how does the concept of an "algorithm" shape the way we think about "computers"? Are there computers that cannot be described by algorithms?