
Efficiency and the Space-Time Continuum

 

A lot of computer science is about efficiency. For instance, one frequently used tool for describing the theoretical speed of algorithms is Big-O notation, which expresses how an algorithm's cost grows with the size of its input. What most people don't realize, however, is that there is often a trade-off between speed and memory, or, as I like to call it, space and time.
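A familiar illustration of this trade-off is memoization: spend memory on a table of results you have already computed so you never compute them twice. A minimal Python sketch (the Fibonacci function is just a stand-in for any expensive computation):

    from functools import lru_cache

    # Plain recursion: almost no extra memory, but it recomputes the same
    # subproblems over and over, so the running time grows exponentially.
    def fib_slow(n):
        return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

    # Memoized recursion: the cache holds one entry per distinct argument,
    # trading that extra memory for a running time linear in n.
    @lru_cache(maxsize=None)
    def fib_fast(n):
        return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

Calling fib_slow(35) takes noticeably longer than fib_fast(35), even though both compute exactly the same value; the only difference is the memory spent on the cache.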

Think of space efficiency and time efficiency as the two opposite ends of a band (a continuum). Every point in between has some mix of the two: the more time efficiency you have, the less space efficiency you have, and vice versa. The picture below illustrates this in a simple fashion:

[Figure: The Space-Time Continuum]

Algorithms like Quicksort and Mergesort are exceedingly fast, but they need extra space to do their work (Mergesort in particular allocates auxiliary storage proportional to the input for its merging). On the other side of the spectrum, Bubble Sort is exceedingly slow, but it sorts in place, taking up the minimum of space.
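A quick Python sketch of the two extremes makes this concrete: Mergesort's speed comes from recursion plus a merge step that builds new lists, while Bubble Sort only swaps elements within the original list.

    def merge_sort(a):
        """O(n log n) time, but allocates O(n) auxiliary lists while merging."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    def bubble_sort(a):
        """O(n^2) time, but sorts in place with O(1) extra space."""
        for end in range(len(a) - 1, 0, -1):
            for i in range(end):
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
        return a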

Heap Sort, for instance, strikes a very good balance between space and speed. The heap is stored in the same array that holds the data, so it needs essentially no extra space, and yet the speed of the sort is in the same order of magnitude as Quicksort and Mergesort (although it is usually somewhat slower in practice than the other two). Heap Sort has the additional benefit of being quite consistent in its speed, so it is useful in programs where timing is crucial (e.g. networking code).
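Here is a sketch of an in-place Heap Sort in Python, following the standard textbook formulation, to show that the heap really is just the array viewed a certain way:

    def heap_sort(a):
        """In-place O(n log n) sort: the heap lives inside the array itself,
        so the extra space is only O(1)."""
        def sift_down(start, end):
            root = start
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return
        n = len(a)
        for start in range(n // 2 - 1, -1, -1):   # build a max-heap
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):           # move the max to the end, shrink the heap
            a[0], a[end] = a[end], a[0]
            sift_down(0, end - 1)
        return a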

For data trees, 2-3 trees and 2-3-4 trees stay balanced, so their searches are reliably fast, unlike an ordinary binary search tree, which can become lopsided. The price is space: each node reserves room for several keys and children, and much of that room usually sits unused.
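To see where that space goes, here is a sketch of the fields a straightforward 2-3-4 node might reserve (the class and field names are mine, for illustration):

    class Node234:
        """A 2-3-4 tree node keeps room for up to 3 keys and 4 children.
        A node currently holding a single key leaves most of these slots
        empty, which is where the extra space goes."""
        def __init__(self):
            self.keys = [None, None, None]
            self.children = [None, None, None, None]
            self.num_keys = 0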

The Red-Black tree is a compromise between space and time. It is essentially a binary-tree representation of a 2-3-4 tree, so it takes up less space (there are no empty key slots to carry around), but it still retains the search efficiency of the 2-3-4 tree!
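Continuing the sketch above, a Red-Black node stores exactly one key plus a single color bit; the color is what encodes the grouping that a 2-3-4 node expresses with its extra key slots (again, the names are illustrative):

    RED, BLACK = True, False

    class RBNode:
        """One key, two children, and a one-bit color: nothing sits empty."""
        def __init__(self, key, color=RED):
            self.key = key
            self.color = color
            self.left = None
            self.right = None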

Thus, there has to be a balance between the space and time aspects of computing. Much of the research in computer science these days is devoted to time efficiency, particularly the apparent time barrier posed by NP-Complete problems (like the Traveling Salesman Problem). These days memory is cheap, and storage space almost seems to be given away.
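Even there, the space-time trade appears: the Held-Karp dynamic-programming approach to the Traveling Salesman Problem spends an exponential amount of memory on a table of partial tours in order to beat the factorial running time of brute force. A rough Python sketch (assuming at least two cities):

    from itertools import combinations

    def tsp_held_karp(dist):
        """Length of the shortest tour starting and ending at city 0.
        Uses O(n * 2^n) memory and O(n^2 * 2^n) time, versus the
        O(n!)-time, O(n)-space brute-force search over all tours."""
        n = len(dist)
        # best[(S, j)]: cheapest path from 0 through the set S (a bitmask),
        # ending at city j.
        best = {(1 << j, j): dist[0][j] for j in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                S = sum(1 << j for j in subset)
                for j in subset:
                    prev = S ^ (1 << j)
                    best[(S, j)] = min(best[(prev, k)] + dist[k][j]
                                       for k in subset if k != j)
        full = (1 << n) - 2  # every city except 0
        return min(best[(full, j)] + dist[j][0] for j in range(1, n))

For example, tsp_held_karp([[0, 2, 9], [2, 0, 6], [9, 6, 0]]) returns 17, the length of the shortest round trip through those three cities.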

With networking and robotics, however, the necessity of a balance becomes apparent. Memory on these machines is often scarce, as is processing power. Memory has to be used conservatively, or a network server could stall until an operation finishes. Robots often have to function with only the limited resources built into their own hardware, so many times they cannot spare the memory that would buy extra computing speed. In these situations, a compromise must be made.

With networking, this issue becomes mainly one of topology. Network topology is a description of how the physical connections of a network are laid out. Maybe you know the term "daisy chain": that is a kind of network topology. A daisy chain (in which the computers are connected one to the next in a single line) uses the minimum of cables, but communication is relatively slow, because a signal sent from the computer at one end to the computer at the other end must first pass through every computer in between. On the other hand, if every computer were connected to every other computer (a "fully connected mesh network"), signals would be fast, but you would use far more cables, and the network may become hard to maintain. So, in this case, space corresponds to the number of connections in the network, while time refers to the speed at which signals travel across it.
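The trade-off can be put in numbers: for n computers, a daisy chain needs n - 1 cables but a signal may need n - 1 hops, while a fully connected mesh needs n(n - 1)/2 cables and never more than one hop. A small Python sketch:

    def daisy_chain_stats(n):
        # n computers in a line: minimal cabling, but a signal may have to
        # pass through every machine in between.
        return {"links": n - 1, "worst_case_hops": n - 1}

    def full_mesh_stats(n):
        # Every computer wired to every other: at most one hop,
        # but the number of cables grows quadratically.
        return {"links": n * (n - 1) // 2, "worst_case_hops": 1}

    for n in (4, 8, 16):
        print(n, daisy_chain_stats(n), full_mesh_stats(n))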

Thus, although it may seem a trivial issue, it is really quite important, even now, to have efficiency in both space and time. The type of compromise made depends on the situation: for most programmers, time is of the essence, while in environments where memory is scarce, space is the issue. Maybe someday we'll find algorithms that are extremely efficient in both speed and memory, bridges across the Space-Time continuum.

Thanks to Hikmat A. El-Jaber for his suggestions.

Original Essay's URL: http://www.aihorizon.com/essays/basiccs/general/spacetime.htm
Written and Published by the People at AI Horizon <http://www.aihorizon.com/>