Internalists hold that the content of a mental state is determined by internal physical properties (what is going on in the brain); externalists hold that it is determined at least partly by the external world. I am an internalist, for reasons I have given at length elsewhere. However, the ideas behind externalism have some importance: specifically, that when you are trying to determine what someone else is thinking about, in objective terms, reference to the external world is required. Or is it?
Well, here is a trivial example to show that it is, at least to some degree: a brain may be in the same state, and thus have the same mental state, in both the real world and a simulated world that exactly mimics the real world. This is because, as far as the operation of the brain is concerned, what matters is the input (sensations); the brain doesn’t care how those inputs are generated. Thus whether the ideas in someone’s head are really about real objects, or virtual objects, or whatever, can only be determined by knowing the kind of environment they live in.
But in some ways this example misses the point, because it proceeds on the assumption that what our ideas need to be about is the objects in the world themselves. In one sense this is true: when I talk about brains I mean to talk about brains themselves, whatever they are, not simply my perception of brains. But, in another sense, my talk about trees in a virtual world and trees in the real world is about the same thing, namely whatever is the cause of a certain kind of sensory impression. We could say that there exist equivalence classes of objects, each of which contains objects that give the same sensory impressions to us. For example, one such class would contain real trees, virtual trees, consistent tree hallucinations, etc. A deeper question to ask is whether we can determine if a mental state is about one of these equivalence classes without reference to the external world. This removes the easy counterexample of the virtual world, since the brain in the real world thinking about trees and the brain in the virtual world thinking about trees are thinking about the same equivalence class (in the sense used above) of things.
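The equivalence-class idea above can be made concrete with a toy sketch: objects are grouped not by what they "really" are but by the sensory impressions they produce. Everything here (the object names, the impression labels, the `sensory_impression` function) is a hypothetical illustration, not a claim about how perception actually works.

```python
# Toy model: group objects into equivalence classes by the sensory
# impression they produce. All names here are illustrative stand-ins.
from collections import defaultdict

def sensory_impression(obj):
    """Return the (stand-in) pattern of sensory input an object produces."""
    impressions = {
        "real tree": "tree-shaped visual input",
        "virtual tree": "tree-shaped visual input",
        "consistent tree hallucination": "tree-shaped visual input",
        "real rock": "rock-shaped visual input",
    }
    return impressions[obj]

def equivalence_classes(objects):
    """Group objects whose sensory impressions are indistinguishable to us."""
    classes = defaultdict(set)
    for obj in objects:
        classes[sensory_impression(obj)].add(obj)
    return dict(classes)

classes = equivalence_classes([
    "real tree", "virtual tree", "consistent tree hallucination", "real rock",
])
# Real trees, virtual trees, and tree hallucinations land in one class,
# which is why the virtual-world counterexample no longer applies.
```

The point of the sketch is only that membership in a class is fixed entirely by the impression function, i.e. by what reaches the perceiver, not by the nature of the object itself.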
Now, if we fully understood how a particular brain worked, we should be able to determine which equivalence class of inputs a particular concept is about. We could accomplish this simply by running the brain “backwards”, seeing which inputs lead to that concept being activated. Of course, some might claim that even here we have made reference to the external world, specifically to the sense organs, which we would need in order to determine what the particular inputs correspond to. I am not convinced, however, that the sense organs are really necessary for this process. Consider vision, for example. Dealing only with the brain, we are left with the optic nerve as the direct input for vision. The argument is that given the optic nerve alone we can’t determine what kind of input we are dealing with (for example, it could be smell). But this ignores the fact that in our mind we have certain expectations about how visual input works. We expect that there are certain similarities between seeing an object from a distance and seeing it nearby. We also expect that when we move our eyes our visual input will change in certain well-defined ways. And there are other such regularities. We can read these expectations off from the brain alone, and they narrow down what this sense could be quite well. That is not to say that they narrow it down to exactly our vision. Possibly vision sensitive to different frequencies of light could play the role, or sonar that conveyed information about surface characteristics in place of color. But I would contend that these possible sense organs, constructed to yield basically the same experience, are simply examples of more members of the same equivalence class of input.
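Running the brain “backwards” can also be given a toy rendering: if we had the input-to-concept mapping, recovering what a concept is about is just enumerating the inputs that activate it. The `brain` function below is a deliberately crude stand-in, and the inclusion of a sonar input alongside a visual one illustrates the point that functionally interchangeable senses fall into the same equivalence class.

```python
# Toy sketch of "running the brain backwards": invert a (hypothetical)
# input->concept mapping to find which inputs a concept is about.
def brain(input_pattern):
    """A stand-in mapping from input patterns to activated concepts."""
    if input_pattern in ("tree-shaped visual input", "tree-shaped sonar input"):
        return "TREE"
    return "OTHER"

def run_backwards(concept, possible_inputs):
    """Enumerate the inputs that lead to the concept being activated."""
    return {i for i in possible_inputs if brain(i) == concept}

inputs = [
    "tree-shaped visual input",
    "tree-shaped sonar input",
    "rock-shaped visual input",
]
tree_class = run_backwards("TREE", inputs)
# tree_class is the equivalence class of inputs the TREE concept is about;
# note it includes the sonar input, a different sense organ playing the
# same functional role.
```

Note that nothing in `run_backwards` consults anything outside the mapping itself; that is the analogue of reading the expectations off from the brain alone.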
So, in theory (although not practically), I contend that one could understand the brain without reference to the external world, assuming that we are satisfied knowing only which equivalence class of experiences a thought is about. We could simply demand that we must know what the thought is actually about (which member of the equivalence class) to understand someone’s mind via their brain, but I think this demand is misguided, since if we make it then we don’t know what our own thoughts are really about with complete certainty either (you can never completely rule out the possibility that you are in a simulated world).