Incremental Predictive Coding: A Parallel and Fully Automatic Learning Algorithm

Published as an arXiv preprint, 2022

Recommended citation: Salvatori, T., Song, Y., Millidge, B., Xu, Z., Sha, L., Emde, C., Bogacz, R., and Lukasiewicz, T. (2022). "Incremental Predictive Coding: A Parallel and Fully Automatic Learning Algorithm." arXiv preprint arXiv:2212.00720. https://arxiv.org/pdf/2212.00720.pdf

Abstract

Neuroscience-inspired models, such as predictive coding, have the potential to play an important role in the future of machine intelligence. However, they are not yet adopted in industrial applications due to limitations such as their lack of efficiency. In this work, we address this by proposing incremental predictive coding (iPC), a variation of the original framework derived from the incremental expectation-maximization algorithm, in which every operation can be performed in parallel without external control. We show both theoretically and empirically that iPC is much faster than the original algorithm developed by Rao and Ballard, while maintaining performance comparable to backpropagation on image classification tasks. This work has general applications in computational neuroscience and machine learning, and specific applications in scenarios where automation and parallelization are important, such as distributed computing and implementations of deep learning models on analog and neuromorphic chips.
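To make the "every operation in parallel" idea concrete, here is a minimal NumPy sketch of one iPC-style step, assuming a standard energy-based formulation of predictive coding in which layer l is predicted from the layer above it via W[l] @ f(x[l+1]). The names (ipc_step, lr_x, lr_w) and the toy dimensions are illustrative, not from the paper. The key point it illustrates: unlike classical predictive coding, which runs many inference steps on the value nodes before each weight update, all value nodes and all weight matrices are updated simultaneously from the same snapshot of the prediction errors.

```python
import numpy as np

def f(x):
    """Elementwise activation."""
    return np.tanh(x)

def df(x):
    """Derivative of the activation."""
    return 1.0 - np.tanh(x) ** 2

def ipc_step(x, W, lr_x=0.05, lr_w=0.005):
    """One incremental step (illustrative): every hidden value node x[l]
    and every weight matrix W[l] is updated at once from the current
    prediction errors, with no separate inference phase."""
    L = len(x)
    # Prediction errors: layer l is predicted from layer l+1.
    e = [x[l] - W[l] @ f(x[l + 1]) for l in range(L - 1)]
    new_x = [x[0]]  # bottom layer clamped to the data
    for l in range(1, L - 1):
        # Gradient descent on the energy 0.5 * sum_l ||e[l]||^2 w.r.t. x[l].
        new_x.append(x[l] + lr_x * (-e[l] + df(x[l]) * (W[l - 1].T @ e[l - 1])))
    new_x.append(x[L - 1])  # top layer clamped to the target (supervised case)
    # Weight update from the same error snapshot (Hebbian-like outer product).
    new_W = [W[l] + lr_w * np.outer(e[l], f(x[l + 1])) for l in range(L - 1)]
    return new_x, new_W

# Toy usage: a 3-layer network, data clamped at x[0], target at x[2].
rng = np.random.default_rng(0)
dims = [10, 32, 4]
x = [rng.normal(size=d) for d in dims]
W = [rng.normal(scale=0.1, size=(dims[l], dims[l + 1])) for l in range(2)]
for _ in range(100):  # training is just the same step, repeated
    x, W = ipc_step(x, W)
```

In the classical scheme, the inner loop over value nodes would run to convergence before each weight update; collapsing the two into a single repeated step is what the incremental-EM view licenses, and it is what removes the need for an external controller to schedule the two phases.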

Download paper here

BibTex

@misc{salvatori2022incremental,
  title={Incremental Predictive Coding: A Parallel and Fully Automatic Learning Algorithm},
  author={Tommaso Salvatori and Yuhang Song and Beren Millidge and Zhenghua Xu and Lei Sha and Cornelius Emde and Rafal Bogacz and Thomas Lukasiewicz},
  year={2022},
  eprint={2212.00720},
  archivePrefix={arXiv},
  primaryClass={cs.NE}
}