Anthony Polloreno, Ph.D.


The Impact of Noise on Recurrent Neural Networks III

We're now ready to examine how noise affects our model of recurrent computation: reservoir computing with echo state networks (ESNs). In the previous post, we implemented a GPU-based simulator that applies a simple Gaussian noise model to the reservoir output.
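As a reminder, that noise model is just additive i.i.d. Gaussian noise on each output signal. A minimal sketch of the step (the function name and array layout here are illustrative, not the simulator's actual API):

```python
import numpy as np

def add_readout_noise(outputs: np.ndarray, sigma: float,
                      rng: np.random.Generator) -> np.ndarray:
    """Corrupt reservoir outputs with i.i.d. Gaussian noise of scale sigma.

    outputs has shape (timesteps, num_signals); each entry is perturbed
    independently, matching the simple noise model from the previous post.
    """
    return outputs + sigma * rng.standard_normal(outputs.shape)
```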

Though more realistic systems may exhibit more complex noise characteristics, this basic model is enough to build intuition. As discussed in the earlier posts, we examine not just the raw output signals but all products of those signals, and we evaluate reservoir performance on the NARMA10 task.
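For reference, NARMA10 is a standard nonlinear system-identification benchmark: the target at each step depends on the last ten inputs and outputs. A minimal generator using the usual formulation, with inputs drawn uniformly from [0, 0.5]:

```python
import numpy as np

def narma10(u: np.ndarray) -> np.ndarray:
    """NARMA10 target sequence for an input sequence u with values in [0, 0.5]."""
    y = np.zeros_like(u)
    for t in range(9, len(u) - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * y[t - 9:t + 1].sum()
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return y

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 0.5, size=2000)
y = narma10(u)  # the sequence the reservoir's readout must reproduce
```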

To understand noise effects, we model how these signal products become corrupted. The number of possible products grows exponentially: there is one for every nonempty subset of the output signals, so N signals yield 2^N - 1 products. We group the products by how many terms they contain (their degree) and analyze how likely a product of a given degree is to remain uncorrupted by noise.
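One simple way to make this concrete is to assume the signal-to-noise ratio of a product decays geometrically with its degree: a degree-k product picks up noise from each of its k factors, so its SNR scales like base_snr * decay**k for some decay < 1. Under that illustrative model (a simplification for intuition, not the post's exact analysis), only products up to a cutoff degree remain usable:

```python
from math import comb, floor, log

def useful_product_count(num_signals: int, base_snr: float, decay: float) -> int:
    """Count signal products that remain usable under noise, assuming a
    degree-k product has SNR ~ base_snr * decay**k with 0 < decay < 1."""
    # Largest degree k with base_snr * decay**k >= 1:
    k_max = floor(log(base_snr) / -log(decay))
    k_max = max(0, min(k_max, num_signals))
    return sum(comb(num_signals, k) for k in range(1, k_max + 1))

# Stronger noise (smaller decay) shrinks the cutoff degree, and with it
# the number of useful products: C(N,1) + C(N,2) + ... + C(N,k_max).
print(useful_product_count(num_signals=50, base_snr=100.0, decay=0.5))
```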

This lets us estimate the number of "useful" signals. In an ideal, noiseless setting we would expect exponentially many, but noise caps the degree of the products that survive, which limits the count to something closer to polynomial in the number of output signals.
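To make the counting explicit under the same simplified model: if noise renders every product above some cutoff degree $k^\ast$ useless, the surviving products are the subsets of at most $k^\ast$ of the $N$ output signals,

$$\sum_{k=1}^{k^\ast} \binom{N}{k} = O\!\left(N^{k^\ast}\right),$$

which is polynomial in $N$ with degree $k^\ast$. The noiseless limit $k^\ast \to N$ recovers the exponential count $2^N - 1$.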

We then plot how the effective degree of this polynomial (that is, how many signals survive) varies with the noise strength. In the extreme high-noise regime the ESN can't learn: performance saturates and the effective degree drops to zero. In the opposite limit of zero noise, the ESN may access an exponential number of features, and the polynomial approximation breaks down entirely. Let's dig into the code:
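The code from the post isn't reproduced here, but the following is a minimal NumPy sketch of the experiment as described: drive a random ESN with the NARMA10 input, add Gaussian noise to the reservoir signals, form all products up to a maximum degree, and train a ridge-regression readout. All names and hyperparameters (reservoir size, spectral radius, the ridge penalty) are illustrative choices, not the post's actual values:

```python
import numpy as np
from itertools import combinations

def narma10(u: np.ndarray) -> np.ndarray:
    """NARMA10 target sequence (same recurrence as the earlier sketch)."""
    y = np.zeros_like(u)
    for t in range(9, len(u) - 1):
        y[t + 1] = (0.3 * y[t] + 0.05 * y[t] * y[t - 9:t + 1].sum()
                    + 1.5 * u[t - 9] * u[t] + 0.1)
    return y

def run_esn(u: np.ndarray, n_res: int, spectral_radius: float = 0.9,
            seed: int = 0) -> np.ndarray:
    """Drive a random echo state network with input u; return (T, n_res) states."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_res, n_res))
    W *= spectral_radius / np.abs(np.linalg.eigvals(W)).max()  # set spectral radius
    w_in = rng.uniform(-1.0, 1.0, n_res)
    x, states = np.zeros(n_res), np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + w_in * u_t)
        states[t] = x
    return states

def product_features(signals: np.ndarray, max_degree: int) -> np.ndarray:
    """All products of up to max_degree distinct signals (columns)."""
    cols = [signals[:, list(idx)].prod(axis=1)
            for k in range(1, max_degree + 1)
            for idx in combinations(range(signals.shape[1]), k)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 0.5, 4000)
y = narma10(u)
states = run_esn(u, n_res=10)
split = len(u) // 2

for sigma in (0.0, 1e-3, 1e-2, 1e-1):
    noisy = states + sigma * rng.standard_normal(states.shape)  # Gaussian noise model
    X = product_features(noisy, max_degree=3)
    # Ridge-regression readout trained on the first half of the run.
    ridge = 1e-6
    A = X[:split]
    w = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ y[:split])
    pred = X[split:] @ w
    nmse = np.mean((y[split:] - pred) ** 2) / np.var(y[split:])
    print(f"sigma={sigma:g}  NMSE={nmse:.4f}")
```

Sweeping sigma this way and tracking how performance degrades is what produces the effective-degree plot described above.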

Acknowledgements

Thanks to Alex Meiburg, André Melo, and Eric Peterson for their thoughtful feedback on this post.