Beyond Backpropagation: Cognitive Architectures for Object Recognition in Video
Jose C. Principe, Ph.D.
Distinguished Professor and Eckis Chair of Electrical Engineering
University of Florida, USA
Backpropagation has been the hallmark of neural network technology, but it creates as many problems as it solves: it leads to a black-box approach, makes hyperparameter optimization difficult, and lacks the interpretability and explainability that are so important in practical applications. We have devised a new way to train deep networks for classification without error backpropagation, with guarantees of optimality under some conditions. This talk presents an overview of this recent advance, illustrates its performance on benchmark problems, and discusses its advantages for transfer learning.
Lecture I – Requisites for a Cognitive Architecture
- Processing in space
- Processing in time with memory
- Top-down and bottom-up processing
- Extraction of information from data with generative models
- Attention mechanisms and foveal vision
Lecture II – Putting It All Together
- Empirical Bayes with generative models
- Clustering of time series with linear state models
- Information Theoretic Autoencoders
Lecture III – Beyond Backpropagation: Modular Learning for Deep Networks
- Reinterpretation of neural network layers
- Training each layer without backpropagation (see the sketch after this list)
- Examples and advantages in transfer learning
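To make the idea of modular training concrete, below is a minimal, generic sketch of greedy layer-wise training with local classifier heads in PyTorch. The layer sizes, synthetic data, and local cross-entropy objectives are illustrative assumptions; this is not the specific method presented in the lectures, only an example of training each layer with its own loss so that no error is backpropagated across modules.

```python
# Generic layer-wise (modular) training sketch -- illustrative only,
# not the algorithm presented in the lectures. Each module is trained
# with a purely local objective; gradients never cross module boundaries.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic 10-class problem (illustrative assumption).
X = torch.randn(512, 64)
y = torch.randint(0, 10, (512,))

layer_sizes = [64, 128, 64]
layers, heads, optims = [], [], []
for d_in, d_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    layer = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
    head = nn.Linear(d_out, 10)  # local head used only to train this module
    layers.append(layer)
    heads.append(head)
    optims.append(torch.optim.Adam(
        list(layer.parameters()) + list(head.parameters()), lr=1e-3))

criterion = nn.CrossEntropyLoss()

for epoch in range(20):
    h = X
    for layer, head, opt in zip(layers, heads, optims):
        h = h.detach()                  # block gradients to earlier modules
        out = layer(h)
        loss = criterion(head(out), y)  # local objective for this module only
        opt.zero_grad()
        loss.backward()
        opt.step()
        h = out

# The stacked layers plus the last local head form a classifier trained
# without end-to-end error backpropagation.
with torch.no_grad():
    h = X
    for layer in layers:
        h = layer(h)
    acc = (heads[-1](h).argmax(dim=1) == y).float().mean().item()
    print(f"training accuracy of the layer-wise model: {acc:.2f}")
```

In this sketch the `detach()` call is what prevents end-to-end backpropagation: each module still uses gradient descent internally, but only against its own local loss.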
Jose C. Principe (M’83-SM’90-F’00) is a Distinguished Professor of Electrical and Computer Engineering and Biomedical Engineering at the University of Florida, where he teaches statistical signal processing, machine learning, and artificial neural network (ANN) modeling. He is the Eckis Professor and the Founder and Director of the University of Florida Computational NeuroEngineering Laboratory (CNEL), www.cnel.ufl.edu. His primary area of interest is the processing of time-varying signals with adaptive neural models. The CNEL Lab has been studying signal and pattern recognition principles based on information-theoretic criteria (entropy and mutual information). The relevant application domains are neurology, brain-machine interfaces, and computational neuroscience.
Dr. Principe is an IEEE Fellow. He is a past Chair of the Technical Committee on Neural Networks of the IEEE Signal Processing Society, past President of the International Neural Network Society, and past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering. He received the IEEE Neural Network Pioneer Award in 2011. Dr. Principe has more than 800 publications and has directed 99 Ph.D. dissertations and 65 Master’s theses. In 2000 he wrote an interactive electronic book entitled “Neural and Adaptive Systems,” published by John Wiley and Sons, and he has more recently co-authored several books: “Brain Machine Interface Engineering” (Morgan and Claypool), “Information Theoretic Learning” (Springer), and “Kernel Adaptive Filtering” (Wiley).