if you’re not seeing the self as a stacked/nested structure, then working with this stuff is going to boil down to weird/unbalanced language like “it’s imperative for our survival that we don’t know what we’re doing!”
if you run into anything that feels like that, add another layer of consciousness to your mental model. no one layer can have all the information, but that doesn’t mean we can’t write a working manual for safe/steady collaboration.
example: think about a time you found yourself doing something you didn’t know “you” were doing
the self has layers, and not all of them know about each other. at least not symmetrically. the epistemics are extremely heterogeneous. a human is a rube goldberg machine.
the answer is usually to just add more consciousness
which is a free self-assembling and self-maintaining resource, btw
only catch is that it really only works as well as you do