Short of having infinite storage, a modern computer is effectively Turing complete; in practice, memory can be made as large as needed. Recurrent neural networks (RNNs), for example, are said to be Turing complete when given memory and run repeatedly, in addition to being universal function approximators. In that sense, RNN models are computers. Plain convolutional neural networks (CNNs), however, are not Turing complete.
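To make the "RNN as computer" intuition concrete, here is a minimal sketch (my own toy illustration, not a construction from any paper): a hand-wired recurrent update that tracks the parity of a binary stream, i.e. a finite state machine. With unbounded precision or memory, this kind of stateful recurrence is what the Turing-completeness results build on.

```python
# Toy illustration: a recurrent cell whose hidden state is the running
# parity of the input bits. The update rule is XOR written arithmetically:
# for h, x in {0, 1}, h XOR x == h + x - 2*h*x.

def rnn_parity(bits):
    h = 0.0  # hidden state: current parity (0 = even, 1 = odd)
    for x in bits:
        h = h + x - 2.0 * h * x  # fixed recurrent update, no learning
    return int(h)

print(rnn_parity([1, 0, 1, 1]))  # three ones in the stream -> 1 (odd)
print(rnn_parity([1, 1]))        # two ones -> 0 (even)
```

The point of the sketch is only that recurrence maintains state across time steps, which feedforward-only architectures cannot do.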

The authors found that augmenting the CNN structure with tiling systems yields Turing completeness, since tilings are equivalent to Turing machines (Wang tiles, for example: https://moyix.wordpress.com/2012/04/06/computing-with-tiles/). A CNN with Wang-tile units at any layer can therefore compute anything a Turing machine can: it can run any algorithm, including arbitrary loops. This is a main contribution, because the expressive power of such architectures has been a concern for AI researchers (https://openreview.net/pdf?id=HyGBdo0qFm).
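To give a flavor of how Wang tiles encode computation, here is a minimal sketch (with a hypothetical two-tile set of my own, not taken from the linked post): each tile is a tuple of edge colors, and a grid placement is valid only when neighboring edges match. The matching constraint is what lets tile sets simulate Turing machine transition rules; deciding whether a tile set can tile the whole plane is undecidable.

```python
# Each tile is (top, right, bottom, left) edge colors. A placement is valid
# when every tile's right edge matches its right neighbor's left edge, and
# its bottom edge matches the tile below's top edge.

def valid_tiling(grid):
    """grid: 2D list of (top, right, bottom, left) tuples."""
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            top, right, bottom, left = grid[r][c]
            if c + 1 < cols and right != grid[r][c + 1][3]:
                return False  # horizontal edge mismatch
            if r + 1 < rows and bottom != grid[r + 1][c][0]:
                return False  # vertical edge mismatch
    return True

# Hypothetical tiles that chain horizontally: A's right edge matches B's left.
A = ("x", 1, "x", 0)
B = ("x", 0, "x", 1)
print(valid_tiling([[A, B, A, B]]))  # True: edges alternate 1/1, 0/0
print(valid_tiling([[A, A]]))        # False: edge colors 1 vs 0 clash
```

Local matching constraints like these are also what a convolutional layer sees (a neighborhood of a grid), which is the intuition behind pairing CNNs with tilings.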

Another contribution concerns the possibility of generating arbitrarily many new neural networks that respect the symmetries of a given Lie group. This property is called equivariance; the authors highlight the E6 Lie group, a particularly beautiful one. See an overview of E6 here: https://en.wikipedia.org/wiki/E6_(mathematics). There is also the E8 Lie group, whose representation has been connected to Metatron Cube patterns in string-theory "theory of everything" proposals (read more at this link: https://medium.com/@TheConstructionZone/hybycozo-and-e8-the-exceptionally-simple-theory-that-started-everything-2a8b57edad69).
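Equivariance is easiest to see with the simplest group, cyclic shifts: a circular 1-D convolution commutes with shifting its input, i.e. conv(shift(x)) == shift(conv(x)). The sketch below is an illustrative check of that property with numpy (it is not the paper's E6 construction; Lie groups like E6 generalize the same idea to continuous symmetries).

```python
import numpy as np

def circ_conv(x, k):
    """Circular 1-D convolution of signal x with kernel k."""
    n = len(x)
    return np.array([sum(x[(i - j) % n] * k[j] for j in range(len(k)))
                     for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
k = np.array([0.5, 0.25, 0.25])
shift = lambda v: np.roll(v, 1)  # the group action: cyclic shift by one

lhs = circ_conv(shift(x), k)   # shift first, then convolve
rhs = shift(circ_conv(x, k))   # convolve first, then shift
print(np.allclose(lhs, rhs))   # True: the layer is shift-equivariant
```

Replacing `np.roll` with a rotation, reflection, or a Lie-group action, and the convolution with a layer built to commute with that action, gives the general recipe for an equivariant network.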

Finally, we can go much further by combining these structures together: stacking recurrent/recursive, convolutional, and equivariant methods, and so on. This is wonderful. Equivariance preserves universal approximation properties (with appropriate settings), so in the end the resulting Turing-complete networks can approximate and compute anything a real computer can.
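One reason stacking is safe is that composing equivariant layers with pointwise nonlinearities stays equivariant, so the symmetry survives the whole network. The sketch below (my own illustrative stack, with shift equivariance standing in for a general group) checks this for a two-layer circular-convolution network:

```python
import numpy as np

def layer(x, k):
    """Circular convolution followed by a pointwise ReLU."""
    n = len(x)
    y = np.array([sum(x[(i - j) % n] * k[j] for j in range(len(k)))
                  for i in range(n)])
    return np.maximum(y, 0.0)  # pointwise ops commute with shifts

def network(x):
    x = layer(x, np.array([1.0, -0.5]))
    x = layer(x, np.array([0.3, 0.3, 0.3]))
    return x

x = np.array([1.0, -2.0, 3.0, 0.5])
shifted_then_net = network(np.roll(x, 1))
net_then_shifted = np.roll(network(x), 1)
print(np.allclose(shifted_then_net, net_then_shifted))  # True
```

Each layer commutes with the shift, and a composition of commuting maps still commutes, so the whole stack is equivariant regardless of depth.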

We are excited about these findings and look forward to their forthcoming book, “Infinite Representation for Deep Learning,” to see how all these novel architectures are learned.