The Von Neumann Bottleneck

Published: March 1st 2023
Forgotten trailblazers (part 1): John von Neumann

John von Neumann was a 20th-century Hungarian-American polymath who made important contributions to many scientific fields, ranging from chemical engineering to information science and computing.


Perhaps his greatest contribution is the Von Neumann Architecture.


You may be wondering what it is and why it is important. Well, almost every conventional computer in the world, including the one you are using to read this blog, uses it. Programs and data are stored in memory, while programs are run and data are analysed by the central processing unit, or CPU. It was John von Neumann who came up with this design.
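
To make the idea concrete, here is a minimal sketch in Python of a stored-program machine. It is a toy, not any real instruction set: the LOAD/ADD/STORE/HALT opcodes and the memory layout are invented purely for illustration. The key point is that one memory holds both the program and its data, and the CPU just loops, fetching an instruction from that memory, decoding it, and executing it.

```python
# A toy stored-program machine (illustrative only, not real hardware).
# A single memory holds both the instructions and the data they operate on.

memory = [
    ("LOAD", 10),   # load the value at address 10 into the accumulator
    ("ADD", 11),    # add the value at address 11 to the accumulator
    ("STORE", 12),  # write the accumulator back to address 12
    ("HALT", None),
    None, None, None, None, None, None,
    2,              # address 10: first operand
    3,              # address 11: second operand
    0,              # address 12: result goes here
]

accumulator = 0
program_counter = 0

while True:
    opcode, operand = memory[program_counter]   # fetch the instruction from memory
    program_counter += 1
    if opcode == "LOAD":
        accumulator = memory[operand]           # another trip to memory
    elif opcode == "ADD":
        accumulator += memory[operand]          # and another
    elif opcode == "STORE":
        memory[operand] = accumulator           # write the result back to memory
    elif opcode == "HALT":
        break

print(memory[12])  # prints 5
```

Every modern CPU is, at heart, an enormously faster and more elaborate version of this fetch-decode-execute loop.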


We should remember John von Neumann as the intellectual giant who took Alan Turing’s abstract dreams about computers and computation and turned them into the computer sitting on your desk, the smartphone in your pocket, and the games console children (and some adults!) use to play Pokémon and Call of Duty. Alan Turing played a central part in the Allied victory in WW2, but that is another story...


The Von Neumann architecture is great if you want to surf the web or use application software like a Microsoft package. If you are in the business of producing scalable AI or analysing networks (for example, social networks or utility grids), the wheels rapidly start to fall off the wagon. The reason is that these sorts of problems need to process large quantities of data, often in real time. Grabbing all that information from memory, processing it on the CPU and then pushing it back into memory places a limit on how quickly a conventional computer can process data. Pushing all those electrons around (which represent the ‘1’s and ‘0’s) also consumes power, generates heat, and crucially contributes to greenhouse gas emissions and hence global warming.
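
To put a rough number on that, here is the same toy machine from the sketch above, instrumented to count trips to memory (again purely illustrative, not real hardware). Performing one useful addition costs seven memory accesses: four instruction fetches, two operand loads and one store, all queuing up on the same path between the CPU and the memory.

```python
# The same hypothetical machine as above, with a counter on every memory access.
# One useful addition costs 7 accesses: 4 instruction fetches, 2 loads, 1 store.

memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
accesses = 0
accumulator = 0
program_counter = 0

while True:
    opcode, operand = memory[program_counter]   # instruction fetch
    accesses += 1
    program_counter += 1
    if opcode == "LOAD":
        accumulator = memory[operand]           # operand load
        accesses += 1
    elif opcode == "ADD":
        accumulator += memory[operand]          # operand load
        accesses += 1
    elif opcode == "STORE":
        memory[operand] = accumulator           # write back
        accesses += 1
    elif opcode == "HALT":
        break

print(f"result = {memory[6]}, memory accesses = {accesses}")  # result = 5, memory accesses = 7
```

Real hardware hides some of this traffic with caches, but the shared path between processor and memory remains the limiting factor when the data set is large.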


This effect is known as the Von Neumann Bottleneck. For applications like AI and network analysis, it is a significant problem. The data processing needed to train an AI or to find potential vulnerabilities in a utility grid is truly humongous (extremely complex scientific term there!). To stand any chance of obtaining results in a timely fashion, these computations need to be done concurrently, that is, in parallel. Despite many years of research, and a plethora of false starts, we are still a world away from parallel computing architectures which are low-power, easily scalable and, perhaps most importantly, easy to use.
