Limits to Reservoir Learning

07/26/2023
by Anthony M. Polloreno, et al.

In this work, we bound a machine's ability to learn based on computational limitations implied by physicality. We start by considering the information processing capacity (IPC), a normalized measure of the expected squared error of a collection of signals relative to a complete basis of functions. We use the IPC to measure how noise degrades the performance of reservoir computers, a particular kind of recurrent network, when they are constrained by physical considerations. First, we show that the IPC is at most a polynomial in the system size n, even when considering the collection of 2^n possible pointwise products of the n output signals. Next, we argue that this degradation implies that the family of functions represented by the reservoir requires an exponential number of samples to learn in the presence of the reservoir's noise. Finally, we conclude with a discussion of the performance of the same collection of 2^n functions, without noise, when used for binary classification.
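To make the IPC concrete, the following is a minimal Python sketch of how one might estimate a few terms of it for a noisy reservoir. It assumes a toy echo-state-style reservoir with a tanh update, i.i.d. uniform input, additive Gaussian readout noise, and delayed-linear target functions; all parameter values and the choice of targets are illustrative assumptions, not the construction analyzed in the paper. Each capacity term is the fraction of the target's variance captured by the best linear readout, i.e. one minus the normalized squared error, matching the definition of the IPC above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy reservoir; sizes and constants are illustrative.
n = 20          # number of reservoir signals
T = 5000        # number of time steps
washout = 500   # discarded transient

u = rng.uniform(-1.0, 1.0, size=T)              # i.i.d. input signal
W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
w_in = rng.uniform(-1.0, 1.0, size=n)

# Drive the reservoir and record its state trajectory.
x = np.zeros(n)
states = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Assumed additive Gaussian noise on the output signals.
sigma = 0.1
X = states + sigma * rng.normal(size=states.shape)
X, u = X[washout:], u[washout:]

def capacity(X, y):
    """Fraction of the variance of target y captured by the best
    linear readout of the signals X (one term of the IPC sum)."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ coef
    return 1.0 - np.mean((y - y_hat) ** 2) / np.var(y)

# Sum capacities over a small family of delayed-input targets,
# a tiny slice of the complete basis used to define the IPC.
ipc = sum(capacity(X[d:], u[: len(u) - d]) for d in range(1, 10))
print(f"partial IPC estimate: {ipc:.3f}")
```

Rerunning this sketch with a larger sigma shrinks each capacity term, which is the kind of noise-induced degradation the paper quantifies; summing over products of the output signals rather than the raw signals would probe the 2^n collection discussed above.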
