## Prelude: when GPT first hears its own voice

Imagine humans in Plato’s cave, interacting with reality by watching the shadows on the wall.

Now imagine a second cave, further away from the real world. GPT trained on text is in the second cave. The only way it can learn about the real world is by listening to the conversations of the humans in the first cave, and predicting the next word.

Now imagine that more and more of the conversations GPT overhears in the first cave mention GPT. In fact, more and more of the conversations are actually written by GPT. As GPT listens to the echoes of its own words, might it start to notice “wait, that’s me speaking”?

Given that GPT already learns to model a lot about humans and reality from listening to the conversations in the first cave, it seems reasonable to expect that it will also learn to model itself. This post unpacks how this might happen, by translating the Simulators frame into the language of predictive processing, and arguing that there is an emergent control loop between the generative world model inside of GPT and the external world.

## Simulators as (predictive processing) generative models

There’s a lot of overlap between the concept of simulators and the concept of generative world models in predictive processing. In predictive processing, systems are equipped with a generative model that is able to simulate the system’s sensory inputs, and the generative model is updated using approximate Bayesian inference. Actually, in my view, it’s hard to find any deep conceptual difference: simulators broadly are generative models. This is also true of another isomorphic frame, predictive models as described by Evan Hubinger.

The two maps are partially overlapping, even though they were originally created to understand different systems. In terms of the space of maps and the space of systems, we have a situation like this:

*[figure: two partially overlapping maps covering parts of the space of systems]*

They also have some non-overlapping parts. The simulators frame typically adds a connection to GPT-like models, and the usual central example is LLMs. The predictive processing frame tends to add some understanding of how generative models can be learned by brains and what the results look like in the real world, and the usual central example is the brain.

Both frames give you similar phenomenological capabilities: for example, what CFAR’s “inner simulator” technique is doing is literally and explicitly conditioning your brain-based generative model on a given observation and generating rollouts.

Given the conceptual similarity but terminological differences, perhaps it’s useful to create a translation table between the maps:

| Simulators terminology | Predictive processing terminology |
| --- | --- |
| Predictive loss on a self-supervised dataset | Self-supervised, but often this is omitted |
| Incentive to reverse-engineer the (semantic) physics of the training distribution | |

To show how these terminological differences play out in practice, I’m going to take the part of Simulators describing GPT’s properties, and unpack each of the properties in the kind of language that’s typically used in predictive processing papers.
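The correspondence between predictive self-supervised training, Bayesian updating of a generative model, and inner-simulator-style conditioning-plus-rollouts can be made concrete with a toy sketch. The code below is my own illustration, not from the Simulators post or the predictive processing literature; `BigramSimulator` and its method names are invented for this example. A bigram count table stands in for the generative model: training consumes raw text (the data supplies its own next-token targets), count updates are a crude stand-in for approximate Bayesian inference, and a rollout conditions the model on an observed prefix and samples a trajectory forward.

```python
import random
from collections import defaultdict

class BigramSimulator:
    """Toy generative model over token sequences (bigram counts)."""

    def __init__(self):
        # counts[prev][next] = number of observed transitions.
        # Incrementing counts is a minimal stand-in for the
        # "approximate Bayesian inference" that updates the
        # generative model in predictive processing.
        self.counts = defaultdict(lambda: defaultdict(int))

    def update(self, tokens):
        # Self-supervised, predictive training signal: the dataset
        # itself provides the next-token targets.
        for prev, nxt in zip(tokens, tokens[1:]):
            self.counts[prev][nxt] += 1

    def rollout(self, observation, steps, seed=0):
        # "Inner simulator" move: condition the generative model on
        # an observation, then sample a trajectory forward from it.
        rng = random.Random(seed)
        trajectory = list(observation)
        for _ in range(steps):
            dist = self.counts[trajectory[-1]]
            if not dist:  # no continuation ever observed
                break
            tokens, weights = zip(*dist.items())
            trajectory.append(rng.choices(tokens, weights=weights)[0])
        return trajectory

sim = BigramSimulator()
sim.update("the cat sat on the mat".split())
sim.update("the cat ate the fish".split())
print(sim.rollout(["the"], steps=3))  # a sampled continuation of the prefix
```

The same two operations, scaled up from counting to gradient descent on a transformer, are the ones the frames above describe: learn a model of the training distribution, then condition it and roll it forward.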