Revising the classic computing paradigm and its technological implementations
Today's computing is said to be based on the classic paradigm proposed by von Neumann three-quarters of a century ago. However, that paradigm was justified only for the timing relations of vacuum tubes. Technological development has invalidated the classic paradigm (but not the model!) and has led to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The paper reviews the critical points of the classic paradigm and scrutinizes the confusion surrounding it. It discusses some of the consequences of improper technological implementation, from shared media to parallelized operation. The model is perfect, but it is applied outside its range of validity. The paradigm is extended by providing a "procedure" that enables computing science to handle cases where the transfer time is not negligible compared to the processing time.