Machines That Think: From Artificial Intelligence to Artificial Consciousness?

Presentation to The Philosophy Forum September 4, 2011

1.0 What Is Intelligence?
1.1 Classic paper from the American Psychological Association task force: "Intelligence: Knowns and Unknowns" (1995).
1.2 Intelligence quotient (IQ) tests correlate with one another, suggesting the presence of a general intelligence factor (g). IQ correlates well with academic and job performance and with social behaviour. There is also evidence of multiple intelligences, tacit knowledge, and cultural variation, which are not always accounted for in psychometric testing. Genetic and environmental factors are both evident, with the former becoming more prominent with age. Matters such as malnutrition (cf. the Flynn effect), exposure to toxic substances, and early education interventions and schooling all affect the capacity to reach genetic potential.
1.3 "Intelligence is a very general mental capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly, and learn from experience". (Gottfredson et al, Mainstream Science on Intelligence, 1997)

2.0 Boolean and Switching Algebra
2.1 Boolean algebra was developed in 1854 by George Boole in his book An Investigation of the Laws of Thought. Instead of the usual algebra of numbers, Boolean algebra is the algebra of the values 0 and 1 ("no", "yes"), or equivalently of subsets. The operations are conjunction ("and"), disjunction ("or"), and negation ("not"), and combinations thereof.
2.2 This was further elaborated by Claude Shannon in the 1930s, who showed that Boolean algebra could be realised in switching circuits as logic gates; all other Boolean logic gates (i.e., AND, OR, NOT, XOR, XNOR) can be created from a suitable network of NAND or NOR gates alone (Peirce had come to the same conclusion in the 1880s, but his work wasn't published until the 1930s). In programming this is further elaborated with conditional statements (if-then-else, case/switch, do-while, repeat-until, etc.).
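The universality of NAND noted above can be demonstrated directly. A minimal sketch in Python, with truth values standing in for gate signals (the helper names are my own, for illustration):

```python
# NAND is functionally complete: every other Boolean gate can be
# built from a network of NANDs alone.

def nand(a, b):
    return not (a and b)

def not_(a):        # NOT x  ==  x NAND x
    return nand(a, a)

def and_(a, b):     # x AND y  ==  NOT(x NAND y)
    return not_(nand(a, b))

def or_(a, b):      # x OR y  ==  (NOT x) NAND (NOT y)
    return nand(not_(a), not_(b))

def xor(a, b):      # XOR built from four NANDs
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

# Truth-table check against Python's built-in operators
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```

The same construction works with NOR gates, which is why either gate type alone suffices to build any switching circuit.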
2.3 A further development is Bayesian probability (named after Thomas Bayes, 1702–1761), which evaluates the probability of a hypothesis through an iterative process in which the collection of fresh evidence repeatedly modifies an initial probability distribution. Fuzzy set theory (Zadeh, Klaua, 1965) extends bivalent set membership to a continuum, allowing for "fuzzy logic".
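The iterative updating described above can be made concrete: each new piece of evidence turns the current posterior into the next prior. A toy sketch, assuming a hypothetical test with a 90% true-positive rate and a 5% false-positive rate (figures invented purely for illustration):

```python
# Iterative Bayesian updating: the posterior after one observation
# becomes the prior for the next.

def update(prior, likelihood, false_positive_rate):
    """Posterior P(H | positive result) via Bayes' theorem."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

p = 0.01                      # initial belief in hypothesis H
for _ in range(3):            # three independent positive results
    p = update(p, likelihood=0.9, false_positive_rate=0.05)
    print(round(p, 4))        # belief rises with each observation
```

Starting from a 1% prior, three positive results push the probability above 98%, illustrating how repeated evidence reshapes the initial distribution.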

3.0 Moore's Law and Beyond
3.1 Moore's law describes a long-term trend in the history of computing hardware, originally stated as an observation in 1965 and holding true since: the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. This trend has continued for more than half a century and is expected to continue until at least 2015 or 2020. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors, network capacity and especially hard disk capacity (Kryder's law). Kurzweil speculates that some new type of technology (possibly optical or quantum computers) will likely replace current integrated-circuit technology, and that Moore's law will hold true long after 2020.
3.2 Some example metrics:
RAM (price per megabyte): 1957: $411M, 1979: $6,700, 1989: $184, 1999: 78c, 2007: 7.8c
HDD (price per megabyte): 1957: $10K, 1980: $183, 1989: $36, 1999: 2c, 2010: 0.01c
HDD (capacity in gigabytes): 1956: 0.005, 1980: 0.018, 1989: 0.04, 1999: 10.2, 2009: 1500
Processor (transistor count): 1971: 2,300, 1982: 134K, 1993: 3.1M, 2003: 220M, 2011: 2,600M
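The transistor-count figures above can be checked against the two-year doubling period. A quick sketch (the formula is the standard exponential extrapolation; the figures are taken from the metrics above):

```python
# Sanity-check transistor counts against a two-year doubling period:
# count(t) ~= count(t0) * 2 ** ((t - t0) / 2)

def moore(count0, year0, year, period=2.0):
    return count0 * 2 ** ((year - year0) / period)

# Extrapolate from the 1971 figure (2,300 transistors)
for year, actual in [(1982, 134_000), (1993, 3_100_000),
                     (2003, 220_000_000), (2011, 2_600_000_000)]:
    predicted = moore(2300, 1971, year)
    print(year, f"predicted {predicted:,.0f}", f"actual {actual:,}")
```

The simple doubling rule lands within roughly a factor of two of each actual figure across four decades, e.g. about 2.4 billion predicted for 2011 against the 2,600M listed above.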

4.0 Human and Computer Computational Capacity
4.1 The growth in computing processing and storage leads to an expectation of a singularity where machine capacities are greater than human capacities; cf. Hans Moravec, "When will computer hardware match the human brain?": "Based on extrapolation of past trends and on examination of technologies under development, it is predicted that the required hardware will be available in cheap machines in the 2020s."
4.2 Human capacity in processing: "[O]verall human behavior will take about 100 million MIPS of computer power" (Moravec, 1997), or "There are roughly 10^15 synapses operating at about 10 impulses/second, giving roughly 10^16 synapse operations per second" (Merkle, 1989). Human capacity in memory: "Estimates of the number of synapses have been made in the range from 10^13 to 10^15, with corresponding estimates of memory capacity" (Merkle, 1998). Note, however, that the brain is highly redundant in both processing and memory.
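The arithmetic behind these estimates can be laid out directly (numbers taken from the quotes above; the side-by-side comparison of Merkle's and Moravec's figures is my own):

```python
# Merkle's estimate: ~10^15 synapses firing at ~10 impulses/second
synapses = 1e15
rate_hz = 10
merkle_ops = synapses * rate_hz     # = 1e16 synapse operations/second

# Moravec's figure: 100 million MIPS
moravec_mips = 100e6                # million instructions per second
moravec_ops = moravec_mips * 1e6    # = 1e14 instructions/second

# The two estimates differ by two orders of magnitude, reflecting
# different assumptions about how much a synapse event "computes"
ratio = merkle_ops / moravec_ops    # = 100
```

That two-orders-of-magnitude spread is a useful reminder of how rough such brain-capacity figures are.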
4.3 The total energy consumption of the brain is about 25 watts, and it is about 1400 cc in volume. The world's current most powerful supercomputer (at 8.162 petaflops), the K computer, consumes 9.89 MW and consists of 864 cabinets. The K computer cost over 100 billion yen ($1.25 billion US) to design and build. Following Moravec, it is unlikely that machines that cost so much will be used to mimic human activity; instead they will be used for 'number-crunching'.
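The efficiency gap can be made explicit with the figures above (taking Merkle's ~10^16 synapse operations/second as a rough brain "throughput"; that pairing of figures is my own, for illustration, and synapse operations are not directly comparable to floating-point operations):

```python
# Power figures from the text above
brain_watts = 25
k_computer_watts = 9.89e6
k_computer_flops = 8.162e15          # 8.162 petaflops

brain_ops = 1e16                     # Merkle's estimate, ops/second

print(f"Brain:       {brain_ops / brain_watts:.2e} ops/W")
print(f"K computer:  {k_computer_flops / k_computer_watts:.2e} FLOPS/W")
print(f"Power ratio: {k_computer_watts / brain_watts:,.0f}x")
```

On these (admittedly rough) numbers the K computer draws nearly 400,000 times the brain's power for a comparable raw operation count, which underlines the cost argument above.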
4.4 From Kurzweil's The Age of Spiritual Machines (1999), by 2019: a $1,000 personal computer has as much raw power as the human brain; the summed computational power of all computers is comparable to the total brainpower of the human race; computers are embedded everywhere in the environment (inside furniture, jewelry, walls, clothing, etc.). By 2029, a $1,000 personal computer is 1,000 times more powerful than the human brain.

5.0 What Is Consciousness?
5.1 Consciousness is typically described in terms of phenomenological subjectivity: awareness, a sense of self, which is also applied in contemporary medicine as a continuum (from being fully alert and cognisant, to being disorientated, to delirious, to being unconscious and unresponsive). The historical definition suggested social co-knowledge (con- "together" + scire "to know"), suggesting moral reasoning (conscientia, conscience) and language. This original use is still applied in law, where legal responsibility is tied to consciousness.
5.2 A handful of monist idealists, particularly common in some branches of religion, consider that all is consciousness. Some monist physicalist philosophers do not think that consciousness is a valid concept, on the grounds that it implies the mind-body separation proposed by Descartes (res cogitans versus res extensa), and argue that all consciousness can be explained by neuro-physical phenomena (e.g., Daniel Dennett). Others have argued that classical physics cannot explain consciousness and have sought quantum mind theories (e.g., Karl H. Pribram and David Bohm, Roger Penrose).
5.3 There is a strong tie between consciousness and language (in its broadest sense). Medical and legal opinion both agree that assessments of consciousness must include the capacity to engage in communication. A concept of 'self' that is beyond the instinctual is only formulated through language and culture, with the handful of 'feral children' (e.g., the Genie case) serving as evidence. Descartes also argued that the lack of language in animals indicated a lack of access to res cogitans, the realm of thought (although many animals have since been shown to engage in fairly sophisticated communication). A more speculative position was that of Julian Jaynes (The Origin of Consciousness in the Breakdown of the Bicameral Mind, 1976), who argued that in order for an "inner conversation" to occur, a bicameral brain was required.
5.4 Any discussion of the psychological conscious must also include the preconscious and the unconscious (e.g., Sigmund Freud, Carl Jung, Jacques Lacan). Human conscious expression is also modified by embodied internal desires and social mores of acceptability. These may be explicit within the mental world (i.e., available for consideration) or they may become so deeply ingrained as to result in reactions that appear instinctual. The role of unconscious motivations in expression, and the embodiment of the mind, is a chief argument used by Hubert Dreyfus (What Computers [Still] Can't Do, 1972, 1979, 1992).

6.0 Chinese Room Experiment and Philosophical Zombies
6.1 The Chinese Room (John Searle, 1980) and philosophical zombies (David Chalmers, 1996) are thought experiments both based on the notion that consciousness is predicated upon the subject understanding, rather than merely simulating understanding. In the Chinese Room neither the individual nor even the system as a whole understands Chinese characters; they are simply following rules. With philosophical zombies, the zombies are likewise unaware of qualia; they too are simply following rules.
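The rule-following at issue can be caricatured in a few lines: a lookup table that maps input symbols to output symbols, with no representation of meaning anywhere in the program (the "rule book" entries below are invented placeholders, not Searle's actual examples):

```python
# A caricature of Searle's room: a rule book mapping input symbols
# to output symbols. The "operator" (this program) produces fluent
# replies without any step that consults meaning.

rule_book = {
    "你好": "你好！",                       # "hello" -> "hello!"
    "你会说中文吗？": "会，我说得很好。",    # "do you speak Chinese?" -> "yes, very well."
}

def room(symbols: str) -> str:
    # Purely syntactic: match the shape of the input, emit the
    # listed output; anything unrecognised gets a stock reply.
    return rule_book.get(symbols, "请再说一遍。")   # "please say that again."

print(room("你会说中文吗？"))   # a fluent "yes" with zero understanding
```

Searle's point is that no amount of enlarging this table, or replacing it with more sophisticated rules, introduces understanding at any point in the process.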
6.2 Any artificial intelligence that is likewise built on a rules-based system, no matter how complex, will at best be giving a simulation of consciousness, even when tested to expected levels of complexity (e.g., a Turing test). Some philosophers argue that this simulation is sufficient to indicate consciousness itself (e.g., Daniel Dennett, Douglas Hofstadter). Others argue instead that consciousness is epiphenomenal (e.g., Frank Jackson). A critical test is the incapacity of an artificial intelligence to "think outside" its own rules and generate new shared and mutually understood symbolic values (cf. Jürgen Habermas, Communication and the Evolution of Society, 1979, and Karl-Otto Apel, Understanding and Explanation: A Transcendental-Pragmatic Perspective, 1979).