Understanding and implementing the brain’s computational paradigm is the grand challenge facing computer researchers. Not only does it promise computational capabilities far beyond those of conventional computers, but its energy efficiency is also truly remarkable. Furthermore, I believe strongly that computer architects and designers in particular have an important set of skills and a perspective that should be applied to meeting this grand challenge.
The brain’s neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. I will describe biologically plausible computational elements based on precise spike timing. Through examples, I will illustrate some features of spike-based temporal computation and how it differs from other brain-inspired approaches. A machine learning architecture based on this model is under development and will be described in some detail. Perhaps the most important feature of this approach is that training times are orders of magnitude shorter than those of conventional machine learning models. Although this is work in progress, it clearly illustrates the application of a computer designer’s perspective to solving the ultimate computing grand challenge, and I hope that it motivates broader participation from computer architects and designers.
James E. Smith is Professor Emeritus with the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison. He received his PhD from the University of Illinois in 1976. He then joined the faculty of the University of Wisconsin-Madison, teaching and conducting research, first in fault-tolerant computing and then in computer architecture. He has been involved in a number of computer research and development projects, both as a faculty member at Wisconsin and in industry.
Prof. Smith has made a number of significant contributions to the development of superscalar processors. These contributions include basic mechanisms for dynamic branch prediction and for implementing precise traps. He has also studied vector processor architectures and worked on the development of innovative microarchitecture paradigms. He received the 1999 ACM/IEEE Eckert-Mauchly Award for these contributions. Today, almost every microprocessor makes use of techniques pioneered by Prof. Smith.
Currently, he is studying computational neuroscience at home along the Clark Fork near Missoula, Montana.