Central Pattern Generators are neural networks (of the meat kind) that generate our rhythmic movements, like walking. Conceptually, they are a kind of automaton that generates movement patterns with little input, or even no input at all. (Cue the deafferented cat experiments.)
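The "rhythm without rhythmic input" idea can be sketched in code. Below is a minimal half-center oscillator in the style of Matsuoka's (1985) model: two mutually inhibiting neurons with fatigue (adaptation), driven only by a constant tonic input, yet producing an alternating flexor/extensor-like output. The parameter values are illustrative choices in the oscillatory regime, not taken from the text.

```python
# Minimal half-center (Matsuoka-style) oscillator: two mutually
# inhibiting neurons with fatigue. Fed only a CONSTANT drive, the
# pair settles into a rhythmic, alternating output -- a toy Central
# Pattern Generator. Parameters are illustrative, not canonical.

def half_center(steps=2000, dt=0.005):
    tau_r, tau_a = 0.1, 0.2   # rise and adaptation time constants
    b, a, s = 2.5, 2.5, 1.0   # fatigue gain, mutual inhibition, tonic drive
    x = [0.1, 0.0]            # membrane states (asymmetric start breaks symmetry)
    f = [0.0, 0.0]            # fatigue (adaptation) states
    out = []
    for _ in range(steps):
        y = [max(x[0], 0.0), max(x[1], 0.0)]   # rectified firing rates
        for i in range(2):
            j = 1 - i
            dx = (-x[i] + s - b * f[i] - a * y[j]) / tau_r
            df = (-f[i] + y[i]) / tau_a
            x[i] += dt * dx
            f[i] += dt * df
        out.append(y[0] - y[1])   # antagonist "muscle" signal
    return out

signal = half_center()
# count how often the flexor/extensor signal swaps sign
flips = sum(1 for p, q in zip(signal, signal[1:]) if p * q < 0)
```

The constant drive `s` never varies in time, yet `signal` alternates sign repeatedly: the rhythm comes from the network's internal dynamics, which is exactly the point about deafferented animals.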
GPT-3 is a language Central Pattern Generator. It produces something homologous, i.e. language "movement" patterns, with or without sensory input, like a deafferented cat-mouth. These kinds of mechanisms will prove to be major contributions to the understanding of our own brain functions. Every year, neural networks expand the repertoire of sensory functions they can decipher, approximately or not. Vision is largely convolutions, and our current grasp of them is good enough that we can form hypotheses and mental models about how to proceed next. Transformer models like BERT and GPT* now provide the language outputs, the computational equivalent of the Wernicke and Broca areas. These world-facing systems will become essential as we close in on what we consider higher-level functions of cognition, such as central planning, mental attention (not just transformer attention), working memory, and motivation.
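The "vision is largely convolutions" point can be made concrete with a toy example: sliding a small vertical-edge kernel over an image, so the output responds only where brightness changes left to right. The image and kernel here are made up for illustration (and strictly speaking deep-learning "convolutions" are cross-correlations, as below).

```python
# Toy illustration of "vision is largely convolutions": slide a small
# kernel over an image. A vertical-edge kernel responds only where
# pixel intensity changes left-to-right. (Strictly this is
# cross-correlation; deep-learning "convolution" layers do the same.)

def correlate2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1        # "valid" output height
    ow = len(image[0]) - kw + 1     # "valid" output width
    out = [[0] * ow for _ in range(oh)]
    for r in range(oh):
        for c in range(ow):
            out[r][c] = sum(
                image[r + i][c + j] * kernel[i][j]
                for i in range(kh) for j in range(kw)
            )
    return out

# a 5x6 image: dark left half, bright right half
image = [[0, 0, 0, 1, 1, 1] for _ in range(5)]
edge_kernel = [[-1, 0, 1]] * 3   # responds to left-to-right brightness jumps

response = correlate2d(image, edge_kernel)
# every output row is [0, 3, 3, 0]: strong response only at the boundary
```

One hand-written 3x3 filter finds one kind of edge; a vision network learns thousands of such filters, stacked, which is what makes the mechanism graspable enough to theorize about.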
This model opens up substantial space for theorizing about how to put the pieces together.