home

Turing's got a Honey

Joe wrote:
This is another fine, fitting Khunoinian torpedoing of minimal rationality.

Joe wrote:
Khuno wrote:
Joe wrote:
When a person affirms - flat-out - that "Thermostats have low-level intentionality," he is neither experimenting nor even conjecturing about a logical possibility.


You don't know what intentionality means...


Nor did you. After I caught you stating, many times, that "Thermostats can think," you qualified it - as a face-saving remark - with the above "low-level intentionality" crack.
Khuno wrote:
In principle, there's no difference. Magnitude of complexity and number of causal operations are all that distinguish a human mind from a thermostat for Dennett.

To Dennett, as to all of the AI people, a machine can be ascribed a mind (an intelligence) only if the machine passes the Turing test.

Let's subject your thermostat to the Turing Test.

Alan Turing:

"I propose to consider the question, 'Can machines think?'"

If a judge holds a conversation with a man and a Honeywell thermostat and can tell them apart, then the thermostat fails the test. It does not think.

What you're saying about machines and the capacity for thought is either false (a thermostat subjected to the Turing test fails it instantly) or trivially true (by elastically redefining "thinking" into a parody). Under the rule of the Turing test, for a machine to think, its language mastery and competence must be indistinguishable from a human's. "Low-level intentionality" does not count as thinking.

Though Honeywell thermostats have been known - on occasion - to be scintillating conversationalists, they, for the most part, go about verbalizing: "It's too hot in here! Turn me down! Turn me down!"
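The failure condition can be made concrete. Here is a toy sketch in Python - not a serious implementation of Turing's imitation game, and all names in it are my own illustrative inventions. A party whose reply never varies with the question is unmasked instantly; that is the sense in which a thermostat fails the test on contact.

```python
# Toy sketch of the Turing test's failure condition.
# All function names are illustrative, not drawn from any real system.

def thermostat(prompt):
    # The thermostat's entire conversational repertoire, as joked above:
    return "It's too hot in here! Turn me down! Turn me down!"

def human(prompt):
    # A genuinely responsive interlocutor varies with the prompt.
    return f"You asked {prompt!r}; here is what I think about that."

def judge_can_tell_apart(party, prompts):
    """A crude judge: if a party's replies never vary with the question,
    it is identified as the machine on the spot."""
    replies = {party(p) for p in prompts}
    return len(replies) == 1

prompts = ["What is irony?", "Describe your childhood.", "Is it warm in here?"]
print(judge_can_tell_apart(thermostat, prompts))  # True: told apart, fails
print(judge_can_tell_apart(human, prompts))       # False
```

A real judge would probe far more subtly, of course; the point is only that zero language competence makes the distinguishing trivial.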


Khuno wrote:
I clearly don't define consciousness as psychological consciousness. Thermostats are psychologically conscious...primitive low level intentionality, but they have representational states.

A thermostat lacks representational states. It does not possess what it takes to represent concepts, objects, states, percepts, truth-values, etc., i.e. a conscious mind. A thermostat is no more than a physio-chemical feedback device. If feedback devices "have representational states", then we would have to classify stars as having far more robust representational states than thermostats. For instance, the gravity inside a star "tries" to contract the star, while the radiant energy pressure generated by thermonuclear reactions inside the star "tries" to blow it apart. In a functioning star, gravity and radiant energy pressure counteract one another - though neither uniformly nor stably over the lifetime of the star. When the fuels available for fusion deplete, the star loses radiant energy pressure, and gravity crunches it down to a fractional cinder of itself. In massive stars, gravity compresses the core until atoms of higher mass number than hydrogen begin to fuse, and once again the diameter of the star swells outward under radiant energy pressure.
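The "pure feedback device" claim can also be made concrete. A minimal sketch in Python, with a hypothetical `Thermostat` class of my own devising: the device's entire "mental life" is one numeric comparison against a setpoint that flips a switch, with no representation of concepts, objects, percepts, or truth-values anywhere in it.

```python
# Minimal sketch of a thermostat as a pure feedback device.
# All names here are illustrative, not taken from any real firmware.

class Thermostat:
    def __init__(self, setpoint, hysteresis=0.5):
        self.setpoint = setpoint      # target temperature
        self.hysteresis = hysteresis  # dead band to prevent rapid cycling
        self.heater_on = False

    def step(self, measured_temp):
        """One feedback cycle: compare a number to a number, flip a switch."""
        if measured_temp < self.setpoint - self.hysteresis:
            self.heater_on = True
        elif measured_temp > self.setpoint + self.hysteresis:
            self.heater_on = False
        # Between the two thresholds, the previous state simply persists.
        return self.heater_on

t = Thermostat(setpoint=20.0)
print(t.step(18.0))  # True: too cold, heater switches on
print(t.step(21.0))  # False: warm enough, heater switches off
```

The device "tries" to reach its setpoint in exactly the sense the star "tries" to contract: the scare quotes mark a metaphor, not a mental state.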

Under this "consciousness pervades throughout the universe" claptrap, stars should be construed as "representing" or "thinking" more profoundly than thermostats could ever aspire to.

Blabbering that a thermostat possesses representational states is a poetic metaphor, designed to help us misunderstand both what a thermostat is and what thinking is.