enowning
Monday, July 16, 2007
In-der-Blog-sein

Idea Festival asks and Steven Horst responds:
I find Heidegger hard to figure out, personally. But my understanding of what he says that is relevant to philosophy of mind is basically that discursive thought is only a small portion of what goes on in our (broadly) mental lives, and is itself parasitic upon more basic modes of thinking and being like skillful bodily engagement with the world. So trying to understand the mind in its totality in terms of, or on the model of, discursive reasoning is sort of like trying to understand the physical world in terms of rocks and lakes and sofas. There are things that are more or less explainable in those terms, but the explanation is not fundamental. It leaves a lot of things out, and doesn't allow you to understand why the macro-level regularities so often break down.

This has a particular application to the view that the mind is, in its totality, a computer running a symbolic program. Computer programs are themselves modeled upon discursive reasoning: a very regimented form of reasoning, to be sure, but discursive nonetheless. The classic AI/CogSci strategy is to hypothesize that all of our reasoning, including unconscious and intraconscious processes, is underwritten by something program-like, and that means ultimately something modeled on our understanding of conscious discursive reasoning, involving rules and representations, only transposed to another level that the conscious mind cannot access. If Heidegger is right, and the underlying architecture of thought is not well described in such terms, that's a serious problem, as it means that AI and traditional CogSci are importing an interpretive metaphor (rules and representations) that misrepresents its subject matter.
For when Ereignis is not sufficient.

Appropriation appropriates! Send your appropriations to enowning at gmail.com.
