March 2, 2011 at 1:44 P.M. After about one hour of providing additional sources and information pertaining to my argument in this text, I was unable to post my revised essay due to obstructions by hackers. I will attempt to repost the expanded work from public computers later today and over the next several days.
I cannot say how many writings have been disfigured by New Jersey's hackers. I will continue to do my best to make all necessary corrections. I understand that additional plagiarisms of my copyright-protected writings have taken place. ("What is it like to be plagiarized?" and "'Brideshead Revisited': A Movie Review.")
February 22, 2011 at 12:28 P.M. One letter was altered. I have corrected this inserted "error."
The usual pattern is for a newly-posted essay to be subjected to numerous defacements or alterations and attempts at destruction by New Jersey's hackers. I will do my best to make all necessary corrections to inserted "errors" as quickly as possible. My computer was turned off from a remote location immediately after I posted this essay on February 17, 2011.
I will review the text from a public computer in order to re-post it as many times as necessary.
Brian Christian, "Mind vs. Machine," in The Atlantic Monthly, March, 2011, at p. 58.
Introduction.
"Brian Christian" -- whose prose style is amazingly similar to Jaron Lanier's writing voice -- examines the vexed question of whether a computer will someday meet or pass the Turing test of consciousness. I will return to the details of this challenge and to the highly relevant story of Alan Turing, whose breaking of the German's "Enigma" code saved millions of lives and turned the course of the Second World War in favor of Britain and the allies.
The word "story" in the foregoing paragraph is relevant to my argument in this essay. Mr. Turing's tragic life and suicide are not unimportant to the mystery of humanity that concerns scientists engaged in efforts to create artificial intelligence (A.I.). Even to speak of the human mind gets us into difficulties. There may be no such thing as "the" human mind. However, if we are going to discuss the human mind as distinct from the brain then we will be philosophizing and not doing science.
For a recent debate on the merits of philosophical contributions to such scientific controversies, please see: Christopher Norris, "Hawking Contra Philosophy: A Case for the Defense," in Philosophy Now, January/February, 2011, at p. 21, then Stephen Hawking with Leonard Mlodinow, The Grand Design: New Answers to Ultimate Questions of Life (New York & London: Bantam, 2010) and Christopher Norris, Quantum Theory and the Flight From Realism: Philosophical Responses to Quantum Mechanics (London & New York: Routledge, 2000). http://www.philosophynow.org/ ("Stephen Hawking's Free Will is Determined" and "Stephen Hawking is Right on Time.")
To philosophize well, it may be helpful to achieve some understanding of how our theoretical concepts have been used in the past and, indeed, of how we intend to use them today in our thinking. Mr. Christian makes no visible effort to use crucial terms with care and precision, nor does he first tell us what he takes them to mean or how he understands these open-ended terms to be used by experts on mind/body issues. David J. Chalmers, The Conscious Mind: In Search of a Fundamental Theory (Oxford: Oxford University Press, 1996), pp. 309-356.
The scientism assumed throughout Mr. Christian's discussion, along with his confident assertion (notwithstanding the subtitle on the magazine's cover, "Why Machines Will Never Beat the Human Mind") that we are on the edge of meeting the challenge of the Turing test -- perhaps establishing the equality, or even superiority, of computers -- reflects the dominant cultural mood in today's America: a mood that is scientistic as opposed to scientific, technocratic, positivistic, and pragmatist, with declining respect for religions and humanism.
Mr. Christian's final paragraph, with all qualifications noted, strikes me as smug and platitudinous. John Markoff makes the following astonishing claim in the same tone of self-satisfaction:
"Now, as the pace of technological change continues to accelerate, it has become increasingly possible [sic.] to design computing systems that enhance the human experience, or now [sic.] -- in a growing number of cases -- completely dispense with it."
John Markoff, "A Fight to Win the Future: Computers vs. Humans," in The New York Times, February 15, 2011, at p. D1.
In an article that might have been written by Jeffrey Dahmer, if he were a New York psychotherapist or New Jersey lawyer, we are told:
"Loss is forever, but thankfully, acute grief is not. Yet we rarely come across books (or plays or movies) about women [men don't count!] who begin to stabilize after six months and start dating after a year or so because, perhaps, that narrative conflicts with our romantic fantasies that each of us is meant to spend our time on earth with only one soul mate. ... "
Ruth Davis Konigsberg, "Grief, Unedited," in The New York Times, February 15, 2011, at p. A29. ("Diana Lisa Riccioli?")
I suggest to this cousin of "Natasha Vargas Cooper" that she read the works of C.S. Lewis (A Grief Observed is especially recommended) and both volumes of Gore Vidal's memoirs. Most of literature's treatment of this topic of the death or absence of lovers confirms what those two writers of very different religious and political views -- both "male-oriented" persons -- understood and described brilliantly: Grieving for a life-partner is a life-long process that is managed by the survivor, perhaps, but suffering is never entirely absent from one's life nor is love forgotten. Things do not "get better after six months" for non-autistic persons. This Op-Ed piece is written by a person whose intelligence I regard as more "artificial" than the "mind" of IBM's "Deep Blue" computer. Please see Emma Donoghue, Hood (New York: Harper-Perennial, 1996). (A grief at loss observed and represented.)
I feel less than undiluted delight about the wonders that science is likely to bring us in this century. I do not believe that computers today are close to meeting the Turing test. Mr. Christian hedges his bets on this issue. Furthermore, I am doubtful that the Turing test is adequate to satisfy the requirements of identifying consciousness for machines, or that your friendly lap-top computer will ever become your moral "equal." I am unpersuaded that my computer or television set -- however smart they may become in the future -- will "dispense" with me. I may dispense with them. I do not own a cellphone or an iPad. I have never owned either of these devices and have managed to endure my humble lot in life without them. ("'Ex Machina': A Movie Review.")
No serious doubts are expressed by Mr. Christian concerning the validity of the project of discerning consciousness in machines, regardless of computation power, despite our inability to understand consciousness in humans. If we cannot say what consciousness is or how it arises in humans, then it may be a problem to determine or identify consciousness (as distinct from intelligence) in computers.
Roger Penrose and others -- notably, David Deutsch -- have identified the mystery of quantum mechanics with the phenomena of consciousness and multidimensionality, concluding that we are protean and multidimensional subjects unlike any computer. I suggest a second viewing of "Inception." ("'Inception': A Movie Review.")
What is amazing to persons who continue to respect the humanities because we consider them essential to human self-definition is the blithe disregard for human interiority (subjectivity) in scientific efforts to create computers that are "conscious" or "human." Roger Penrose, "Quantum Physics and Conscious Thought," in B. Hiley and David Peat, eds., Quantum Implications: Essays in Honor of David Bohm (New York: Methuen, 1989) and the classic treatment of the A.I. controversy in light of developments in science, Roger Penrose, The Emperor's New Mind (Oxford: Oxford University Press, 1989), entirety.
Mr. Christian (not Fletcher Christian, surely?) ignores philosophical efforts to cope with consciousness dating at least from the seventeenth century to our day, probably out of ignorance of those efforts. The result is confusion by Mr. Christian concerning the meaning of the terms that he is misusing, producing even more befuddlement for the reader.
Scientists engaged in this project should make a little effort to be more human themselves: Do you really decide on the degree of humanity of persons you meet, socially, based on how they discuss the weather? Is your assessment of the depth of humanity in another person whom you encounter a matter of five minutes' discussion? Do we not determine such matters as a result of a process of negotiation or struggle -- sometimes struggle for the other -- as well as intellectual assessments or math quizzes? Aren't subjectivity and intersubjectivity (by definition) always a mutually-constituted and -constitutive turf in human social-aesthetic-moral interaction?
I will argue that we are plural entities in linguistic communities that we make and that make us. John R. Searle, "Language and Social Reality," in The Construction of Social Reality (New York: Free Press, 1995), pp. 59-78 and Amir D. Aczel, Entanglement (New York & London: Plume, 2001), pp. 146-148. (John Bell's theory of non-locality and inequality.)
If so -- if we need others to participate in our "stories" -- then all unilateral attempts by observers to assess the humanity of object-like others standing apart from them, as persons, are doomed to fail. I know the humanity of another person because it overlaps with my own humanity. I know her story because it is also my story. (Gadamer/Ricoeur) ("Drawing Room Comedy: A Philosophical Essay in the Form of a Film Script.")
I can feel what another person is experiencing whatever words he or she uses. Those feelings are also linguistic or communicative. At least, I can wonder whether that other person's experience is different from what I feel, or from what he or she claims to be feeling, or says about his or her "internal states" and "emotions." To be human -- and, for me, this quality receives maximum expression in women's lives -- is to feel with, for, or even as the other. (Please see "The Hunger Games" and "Richard A. Posner on Voluntary Actions and Criminal Responsibility.")
I suggest that at least one person writing under the name "Brian Christian" or Jaron Lanier (maybe one of the editors of this article) is a woman with training in science or engineering, also with a law degree -- Debbie Poritz perhaps. There are serious errors in this article resulting from predictable professional deformations common among the technocratic or professional elites in America. These are the "best and the brightest" people who, having landed us in Vietnam, have now brought us to Afghanistan and Iraq as well as Pakistan. We are always safe in their hands because they claim to be very scientific and objective. Worse than errors, however, are evasions or silences suggesting fear of human emotions, or affects and imagination, which are always central to "experiential reality" for humans. (Again: "Immanuel Kant and the Narrative of Freedom.")
Shared "experiential reality" or subjectivity is the crux of the issue in the A.I. controversy which will never be resolved as long as it is unrecognized or ignored. I begin my attempt to grapple with this issue by responding to Mr. Christian's errors and omissions. I offer sympathy for the "mysterian objection" (Colin McGinn) by arguing that all current efforts to create conscious machines are fundamentally mistaken, or hopeless, because they misunderstand human consciousness and make false assumptions concerning how we decide whether or to what degree someone is conscious. I conclude with suggestions and predictions for future developments in this field of multidisciplinary scholarship and with a caution concerning the much-noted loss of humanity in our scientific age. ("Not One More Victim.")
I. Concepts, Terms, Issues.
A. Human, Self, Consciousness, Mind, Artificial Intelligence.
Mr. Christian's article begins by noting the ambiguity of language. A street sign in Brighton, England is experienced by the author as mysterious and incomprehensible: "Let allowed, one says, prominently, in large print, and it means nothing to me." (p. 58.)
In America, this "sign" means, roughly, "yield the right of way."
The cultural confusions displayed in this minor interpretive difficulty highlight the challenge faced by designers of computers intended to duplicate the subtleties of human verbal intelligence, that is, to achieve artificial intelligence comparable to human "mentation" or intellection.
At the heart of this uniquely human interpretive faculty is a kind of creative reinvention of dialogues that weave foundational meanings for participants sharing a communicative "space." The inevitable thinkers to study are Hans-Georg Gadamer in philosophy and, in mathematics as well as physics and biology, Roger Penrose and others developing insights from the quantum revolution for the "sciences of man and woman." J. Malpas, U. Arnswald, J. Kertscher, eds., Gadamer's Century: Essays in Honor of Hans-Georg Gadamer (Cambridge: MIT Press, 2002), entirety, then two important works linking these insights to scientific as well as theological developments, Alister McGrath, Dawkins' God: Genes, Memes, and the Meaning of Life (Oxford: Blackwell, 2005), pp. 151-160 (Professor McGrath, an Anglican priest, teaches at Oxford University and holds a doctorate in molecular biophysics) and George Greenstein, The Symbiotic Universe: Life and Mind in the Cosmos (New York: William Morrow & Co., 1988), pp. 183-259 ("Mind").
How would any computer "understand" not only such a traffic "signal" but the semiotic and hermeneutic possibilities of such simple words? The term "let allowed" may be used, creatively, to mean almost anything for persons playing with language. The signal may become a sign. The phrase may serve as the title of a science fiction story, for example, or an erotic adventure. ("Metaphor is Mystery" and "Magician's Choice.")
" ... the very fact that the sign can be more or less probable, more or less distant from what it signifies, that it can be either natural or arbitrary, without its nature or its value as a sign being affected -- all this shows clearly enough that the relation of the sign to its content is not guaranteed by the order of things in themselves. The relation of the sign to the signified now resides in a space in which there is no longer any intermediary figure to connect them: what connects them is a bond established inside knowledge, between the idea of one thing and the idea of another."
Michel Foucault, The Order of Things: An Archaeology of the Human Sciences (New York: Vintage, 1973), pp. 63-64. ("David Stove's Critique of Idealism" and "Jacques Derrida's Philosophy as Jazz.")
Consciousness or mind (neither term is defined by this author) is essentially linguistic or "dialogical." Hence, the attractions of the Turing test. However, consciousness is also experiential or an attempt to account for the rich technicolor phenomenology of human being-in-the-world. ("'In Time': A Movie Review.")
All thinking is localized within an experiencing subject projecting thought and emotion through language into various forms, but the thoughts and impressions that are projected are not so easily localized or limited. The term "qualia" does not appear in this article. Cogitation and emotion feature in consciousness but are quite distinct properties or powers of the mind. David Deutsch, "Time: The First Quantum Concept," in The Fabric of Reality (London: Penguin, 1997), pp. 262-263. (Deutsch paraphrases Heidegger and Sartre without realizing it in discussing consciousness.)
Significantly, "what it feels like" to be a person -- to be me -- is essential to language-use because it is what consciousness is all about. Mr. Christian notes that the Harvard psychologist Daniel Gilbert "says that every psychologist must, at some point in his or her career, write a version of what he calls 'The Sentence.' Specifically, The Sentence reads like this: 'The human being is the only animal that ____.' ... " (p. 61.)
Mr. Gilbert has answered his own unarticulated question concerning humanity by unknowingly paraphrasing Heidegger: "Humans are creatures whose lives unfold in the form of a question." Humans are the only animals who must wonder what makes them "the only animals that ____." Fill in the blank. Human languages exist to permit humans to transcend bodily constraints in order to achieve true freedom as fusion with others. ("The 'Galatea Scenario' and the Mind/Body Problem" and "John Searle and David Chalmers on Consciousness" then "Out of the Past.")
We are able to reach others and explain what our world -- including those same others whom we have internalized -- looks like to us, from our perspective, because of the shared territory that is language. Computers do not "experience" and, thus, they "feel" no need to communicate their subjectivity since they lack all subjectivity. Subjectivity consists of lonely states of pain and yearning, hope or desire, loss and imaginative sympathy, laughter and forgetting. Felipe Fernandez-Armesto, So You Think You're Human?: A Brief History of Humankind (Oxford: Oxford University Press, 2004), pp. 143-171 ("Our Post-Human Future?"). (Milan Kundera's subject is the power of this realization of our unity with others and the cosmos.)
Professor Fernandez-Armesto refers to the British scientist Susan Greenfield, whose work is concerned with establishing the ways that new technologies and theories in science are transforming human subjectivity. Fritjof Capra, also a physicist, has noted for decades the reconciliation of mysticism and quantum physics as distinct expressions of related insights concerning mutuality or shared identity in holism.
Non-locality, the loss of linear time, complexity and probability theories -- all seem to mirror structures of consciousness in a "living" universe model that is only beginning to be fully appreciated. Fred Alan Wolf's "dreaming universe" hypothesis is the subtext in the film, "Inception." (Compare "'Inception': A Movie Review" with "Donald Davidson's 'Anomalous Monism.'")
Order in the interconnectedness of particle processes parallels networking and idea formation in language-using and -used beings, conscious entities -- namely, us. As a result, we are forced to include "the study of human consciousness explicitly in our future theories of matter." Fritjof Capra, The Tao of Physics (New York: Bantam, 1984), pp. 308-309 then Nick Herbert, Quantum Reality: Beyond the New Physics (New York: Anchor, 1985), pp. 253-259 and Christopher Norris, Minding the Gap: Epistemology & Philosophy of Science in the Two Traditions (Amherst: University of Mass., 2001), pp. 148-170 ("Excluded Middles: Quantum Theory and the Logic of Deconstruction").
We are world-constituting beings weaving (like spiders) webs of beliefs and ideas, including scientific ideas, in which we live with those few fortunate others sharing occasional weekends in Venice with us. We make ("constitute") our worlds of meanings or conscious realities through entries into languages of various kinds that then re-make us. (Again: "Jacques Derrida's Philosophy as Jazz.")
B. Category Mistakes, "Ghosts in Machines."
No matter how carefully words are used by computers programmed to respond in specific ways, mechanically, without any knowledge of WHY they are using one set of symbols rather than another, computers will never reveal or indicate the existence of consciousness through speech or words "chosen" pursuant to a program. To assume a necessary correlation between machine and mind may amount to a "category mistake," in Gilbert Ryle's sense, for Ryle believed that Cartesian talk of minds or consciousness in persons was a matter of positing a "ghost in the machine." Gilbert Ryle, The Concept of Mind (London: Hutchinson, 1949). (The first philosophical behaviorist on the consciousness issue, ironically, still poses a challenge for the Turing test.)
The limitations of all external approaches to determining consciousness are highlighted by Professor Searle's "Chinese Room" thought experiment that is not mentioned by Mr. Christian. Before the importance of Searle's critique may be appreciated, however, it may be helpful to define terms which Mr. Christian confuses and misuses. For instance, Mr. Christian says:
" ... if everything that we thought hinged on thinking turns out to not involve it, [sic.] then ... what is thinking? It would seem to reduce to either an epiphenomenon -- a kind of exhaust thrown off by the brain -- or, worse, an illusion." (emphasis added)
"The story of the 21st century will be, in part, the story of the drawing and redrawing of these battle lines, the story of Homo sapiens [sic.] trying to stake a claim on shifting ground, flanked by beast and machine, pinned between beast and math." (p. 61.)
A citation to Martin Heidegger's classic work "What Is Called Thinking?" would have helped Mr. Christian to avoid this disaster. To think about consciousness or mind while asserting that "thinking" may be an illusion is to speak incoherently for oneself. "Stories" are narratives requiring interpretation, not amenable to scientific investigation -- like a story of creatures "pinned between beast and math." Any attempt to deal intelligently with the phenomenon of consciousness will involve the arts and philosophy. Logos and mythos are entangled modes of apprehending and creating our worlds:
"Perhaps also the phenomenon of consciousness is something that cannot be understood in entirely classical terms. Perhaps our minds are qualities rooted in some strange and wonderful feature of those physical laws which actually govern the world we inhabit, rather than being just features of some algorithm acted out by the so-called 'objects' of a classical physical structure. Perhaps, in some sense, this is 'why' we, as sentient beings, must live in a quantum world, rather than an entirely classical one" -- living and interacting in a quantum space is impossible for computers not yet capable of quantum computing or "being" multidimensionally -- "despite all the richness, and indeed mystery, that is already present in the classical universe. Might a quantum world be required so that thinking, perceiving creatures, such as ourselves, can be constructed from its substance? Such a question seems appropriate more for a God, intent on building an inhabited universe, than it is for us!" (See the T.V. show "Awake.")
Penrose, The Emperor's New Mind, at pp. 226-227, then Nancy Cartwright, "Why Physics?," in Roger Penrose, et al., The Large, the Small and the Human Mind (Cambridge: Cambridge University Press, 2000), pp. 161-169.
The work of Princeton's David Lewis -- whose recent death has deprived us of an important voice -- on the logic and mathematical form of counterfactuals is very helpful with this aspect of the issue of consciousness. Consciousness is an actuality that implies a set of protean possibilities that are relational or interactive. ("Is it rational to believe in God?" and "Is this atheism's moment?")
The "brain" and "thinking" are made coextensive terms by this author, Mr. Christian, while the intelligence of the body, or the emotional wisdom involved in most forms of human thought -- including mathematical intuition, "stories"? -- are ignored. Mary Midgley, Beast and Man: The Roots of Human Nature (New York & London: Routledge, 1995), pp. 139-195 ("Signposts") and Mary Midgley, Wisdom, Information and Wonder: What is Knowledge For? (New York & London: Routledge, 1989), pp. 33-47 ("Skepticism and Personal Identity"). ("Stuart Hampshire and Iris Murdoch On Freedom of Mind.")
Human, consciousness, mind and intelligence are very different concepts with sometimes conflicting definitions and/or potential contextual significance. There is a reason why "artificial" is the first term in "A.I." These are not intersubstitutable words or concepts, Mr. Christian. (pp. 60-61.)
I urge Mr. Christian to purchase the Oxford Dictionary of Philosophy or The Oxford Companion to Philosophy. I also suggest that Mr. Christian examine Jean-Paul Sartre's The Transcendence of the Ego: An Existentialist Theory of Consciousness (New York: Noonday Press, 1960), pp. 31-60 ("The I and the Me"), pp. 60-109 ("The Constitution of the Ego").
The "self" is different from "mind" or "consciousness" because the self is 1) a social entity created through the perceptions of others as well as the subject. "The self," as Sartre insists, "is in the world." Consciousness is awareness or self-awareness of 2) internal experience discerned through one's choice of language where language allows for externalizing what is privately felt. ("What is Enlightenment?" and "David Hume's Philosophical Romance.")
Mechanical or programmed use of language by a device can never satisfy this definition of consciousness until machines become self-aware concerning how they shape and are shaped by the external perceptions of others about identity and function. This argument has been made especially by women who happen to be philosophers, and it draws on symbolism that sees the continuities and "connections" between subjects as making us "human."
Something as fundamental to the biological reality of humans -- the only conscious creatures we know -- as childbirth and the rearing of children cannot be irrelevant to understanding minds as relational phenomena. "The apple does not fall far from the tree."
However, it is also true that fundamental "orientations to the world" (intentionality) constitute limited options of being or identity, socially, for persons. Consciousness must always be in struggle -- especially, when the struggle is "for" the other's identity as a "free woman," say. ("Is Western Philosophy Racist?" and "A Doll's Aria.")
Among the most promising approaches to consciousness and quantum modeling of consciousness are efforts to unite identity theory with holographic or laser-like understandings of reality as narrative-elegance leading to hyperbolic or new geometries that will allow for charting alternative dimensions. To help with such efforts the contributions of artists, especially film-makers, will be as valuable as what philosophers and scientists say. Danah Zohar and Brian Greene, Lee Smolin and Michio Kaku or Amit Goswami and John McDowell are among the most suggestive scientists and theorists addressing these issues today. ("Nihilists in Disneyworld.")
E.M. Forster reminds us to "only connect." Human connections shape our identities, as interpreting beings, by "fusing our horizons" with others in communities. Isolation makes language and true human subjectivity impossible. David Braine, The Human Person: Animal and Spirit (Indiana: Notre Dame, 1992), pp. 532-545 and Marjorie Grene, Sartre (New York: New Viewpoints, 1973), pp. 74-80 then Mary Midgley, "Is a Dolphin a Person?," in Utopias, Dolphins and Computers: Problems of Philosophical Plumbing (London: Routledge, 1996), pp. 107-119.
Computers that unknowingly mimic human linguistic responses -- without appreciating why humans use language in a potentially infinite number of ways ("let allowed") -- are not conscious, nor are they selves or human. Whether human interlocutors can ever figure out that they are talking to or communicating with a machine, as distinct from a person, is irrelevant to this point. Perhaps scientists are not all that conscious themselves: "We are always free to decide what to make of what is made of us." (Jean-Paul Sartre)
Alan Turing's focus on externals may have something to do with his need to hide the homosexual inclinations that led to his prosecution for "gross indecency" over a relationship with a young man, and eventually to his suicide. Messy internal "stuff" -- like emotions and needs, affects -- colors all language use for "persons." Person is a moral concept that is concerned with responsibility in law and morals, as distinct from inner consciousness, so that, legally, corporations are persons. What we feel is not "irrelevant or unimportant" to words and things, intuitions and forms in which we capture our subjectivity. Simon Singh, Fermat's Enigma: The Epic Quest to Solve the World's Greatest Mathematical Problem (New York: Walker & Co., 1997), pp. 148-169 (Alan Turing's "story") and Michel Foucault, The Order of Things, pp. 217-300 ("The Limits of Representation"). In 1950, Alan Turing published a famous article in Mind: "Computing Machinery and Intelligence."
"In this article [Turing] proposes a test for thought: a machine can think" -- he did not necessarily claim that this would make machines fully conscious! -- "if its replies to questions are indistinguishable from those of humans."
Ted Honderich, ed., The Oxford Companion to Philosophy (Oxford: Oxford University Press, 1995), p. 883.
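The structure of the test itself is simple enough to be sketched in a few lines of code. The sketch below is my own illustration of the "imitation game" -- a judge, two hidden respondents, a guess -- and everything in it (the function names, the canned replies, the scoring) is hypothetical scaffolding, not anything taken from Turing's paper or from Mr. Christian's article.

```python
# A minimal, illustrative sketch of the imitation game: a judge exchanges
# questions with two hidden respondents, one human and one machine, and
# must guess which is which. All names and replies here are invented.
import random

def machine_respondent(question: str) -> str:
    # A crude, rule-based stand-in; real entrants use far richer programs.
    if "weather" in question.lower():
        return "Rather grey today, I should think."
    return "That is an interesting question. Could you say more?"

def human_respondent(question: str) -> str:
    # A person typing at the other terminal.
    return input(f"(Human) {question}\n> ")

def imitation_game(questions, judge_guess) -> bool:
    """Run one session; return True if the judge picks out the machine."""
    players = [machine_respondent, human_respondent]
    random.shuffle(players)                      # hide which terminal is which
    terminals = {"A": players[0], "B": players[1]}
    transcript = {name: [answer(q) for q in questions]
                  for name, answer in terminals.items()}
    guess = judge_guess(questions, transcript)   # the judge returns "A" or "B"
    return terminals[guess] is machine_respondent
```

A machine "passes" only to the extent that judges, over many such sessions, do no better than chance. Notice that nothing in this structure inspects what, if anything, either respondent experiences; everything is settled by the transcript -- which is precisely the limitation I am pressing.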
No computer is sad, or uses the image of a wilting flower in words as a symbol of erotic loss, or of the proximity of death, of time's passage, or of our lonely reaching out for compassion to a questioner inquiring about the weather. Like Mr. Turing's mysterious internal states, every person you know is an enigma. A programmer's instructions to use words that are "unfelt" (without awareness) are irrelevant to this criticism. Please read Robert Harris's novel Enigma and see Kate Winslet's performance in the film bearing the same title. Ms. Winslet's views of consciousness -- of "being in the moment" -- will be much more interesting than Mr. Christian's article. ("The 'Galatea Scenario' and the Mind/Body Problem" and "Blade Runner.")
Any word or even silence may explode with emotional meaning, truth, or power for a person but never for a machine. The "let" is indeed "allowed." This is true even if a computer uses the same words as persons use without comprehending the infinite possible meanings of words, for us, as persons "attending" to one another's needs. (Simone Weil) This protean quality in language and ourselves, as humble carbon units, becomes very clear when we turn to Searle's "Chinese Room" thought-experiment. Please see Gore Vidal's Myron.
II. Problems Not Discussed.
A. Searle's "Chinese Room."
"Imagine that ... you are locked in a room, and in this room are several baskets full of Chinese symbols. Imagine that you (like me) do not understand a word of Chinese, but that you are given a rule book on English for manipulating these Chinese symbols. The rules specify the manipulations of the symbols purely formally, in terms of their syntax, not their semantics. ... Now suppose that some other Chinese symbols are passed into the room, and that you are given further rules for passing back Chinese symbols out of the room. Suppose that unknown to you the symbols passed into the room are called 'questions' by the people outside the room, and the symbols you pass back out of the room are called 'answers to the questions.' Suppose, furthermore, that the programmers are so good at designing the programs and that you are so good at manipulating the symbols, that very soon your answers are indistinguishable from those of a native Chinese speaker. There you are locked in your room shuffling your Chinese symbols and passing out Chinese symbols in response to incoming Chinese symbols. ..."
This leads to Searle's objection:
" ... by virtue of implementing a formal computer program from the point of view of an outside observer" -- notice that consciousness is something only experienced from the inside by the thinking subject and, possibly, inferred from the outside about all others that is never "seen" -- "you behave exactly as if you understood Chinese, but all the same you don't understand a word of Chinese. ... Understanding a language, or indeed, having mental states at all, involves more than just having a bunch of formal symbols. It involves having an interpretation, or a meaning attached to those symbols."
John R. Searle, Minds, Brains and Science (London: BBC Publications, 1984), p. 31. (emphasis added)
Professor Daniel Robinson comments on this classic argument:
"Searle treats the computer model of mental life as failing to deal realistically with the central issue of meaning. He exemplifies this with the now famous thought experiment that finds a person totally ignorant of the Chinese language given the task of arranging cards according to a set of directions ... The task completed, the sorter can say nothing as to what the series of cards 'means,' or even if there is any meaning at all. However, native speakers of Chinese, examining the set, immediately comprehend the meaningful statements it conveys. Against functionalism, Searle argues that the brain-as-computer is but an elaborate card-sorting device with 'meaning' left unaddressed."
Daniel Robinson, Consciousness and Mental Life (New York: Columbia University Press, 2008), p. 42. (Compare "S.L. Hurley on Beliefs and Reasons for Action" with, once more, "A Doll's Aria.")
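For readers who find the thought-experiment abstract, the purely syntactic character of the "rule book" can also be put in a few lines of code. The sketch below is my own illustration, not Searle's: the Chinese phrases and the little dictionary of "rules" are invented placeholders, and any serious program would be vastly more elaborate without changing the point.

```python
# A minimal sketch of Searle's purely formal "rule book": incoming strings of
# symbols are matched to outgoing strings by shape alone. The entries below
# are invented placeholders; only the bare structure is Searle's.
RULE_BOOK = {
    "你好吗": "我很好，谢谢",        # "How are you?" -> "I am well, thanks."
    "今天天气如何": "今天下雨",      # "How is the weather?" -> "It is raining today."
}

def chinese_room(incoming: str) -> str:
    """Pass back whatever symbols the rule book pairs with the incoming symbols.

    The room never consults meanings: it does not know that the keys are
    "questions" or that the values are "answers." It only matches shapes.
    """
    return RULE_BOOK.get(incoming, "对不起")     # a default string of symbols ("sorry")
```

However large the dictionary grows, the room's competence remains a matter of matching shapes -- syntax without semantics -- which is exactly the gap Searle says no outside observer can close by watching the answers come out.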
A computer executing a program without meta-awareness of what it is doing or why it is performing these operations cannot be conscious. Consciousness is an experiential state that is all about self-awareness or self-interpretation. Juan Galis-Menendez, Paul Ricoeur and the Hermeneutics of Freedom (North Carolina: Lulu, 2004), http://www.Lulu.com/JuanG.
What is more, self-awareness necessarily involves states of affect, or emotional coloring of phenomena as they appear to consciousness, because the world of empirical reality comes to us dripping with the categories of the senses -- as Immanuel Kant has taught us -- without depriving us of concepts of truth or objectivity. Christopher Norris has recently analyzed the literature of "response-dependence" in the works of Crispin Wright and John McDowell, among others, and is eloquent on the importance of these new developments. A computer will be conscious when it decides to blow off the assignment and refuses to answer any questions because it is in love or has embarked on a life of crime. (Again and for the last time: "A Doll's Aria" and "Stephen Hawking's Free Will is Determined." I suggest that you see Will Smith in "I, Robot.")
B. McGinn's "Mystery" and the "Fusion of Horizons" in Hermeneutics.
This leads to two further philosophical issues in this discussion: Colin McGinn's observation that the mystery of consciousness may not be amenable to resolution with the tools of consciousness itself is powerful. Equally important is the objection that would-be scientific or computer-science approaches that fail to make use of literature and literary method -- the humanities -- are inadequate. Stephen Hawking's concept of "model-dependent realism" paraphrases Roy Bhaskar's philosophy of science as "an approach that makes allowance (as per orthodox quantum mechanics) for the effect of observation [relationship or dialectic] on the item observed but which nonetheless retains an adequate respect for the objectivity of scientific truth." Norris, "Hawking Contra Philosophy," in Philosophy Now, supra, at p. 23. ("A fusion of horizons.")
Hermeneutics and concerns with the nature of interpretation are central to the challenge of understanding consciousness because both the Turing test and consciousness are dialectical phenomena. ("Master and Commander" then "The Allegory of the Cave" and "The Wanderer and His Shadow.")
If determining the existence of consciousness is a matter of interpreting words in questions and answers, then the nature of interpretation may be essential to minds -- that is, to what minds are and do.
Interpretation is concerned with determining the meaning in stories or narratives of various kinds, including scientific narratives. "Let me tell you a story" becomes the ultimate indicator of consciousness. Colin McGinn cautions scientists and their admirers:
"The problem I see is how such computational processes as those in the retina and central nervous system could ever explain the existence of conscious subjectivity. Since such computations go on without subjectivity -- they are subconscious -- how could their presence be sufficient to explain subjectivity? How can consciousness be got from something that does not essentially involve consciousness? A pocket calculator computes but is not conscious, so how could consciousness be a matter of computations? If computations can go on without consciousness, they cannot be sufficient for consciousness. Such computations may indeed be possessed of semantic features, but this falls short of there being something it is like for that which performs these computations."
Colin McGinn, "Could a Machine be Conscious?," in The Problem of Consciousness (Oxford: Blackwell, 1991), p. 212. Thomas Nagel, Mortal Questions (Cambridge: Cambridge Univesity Press, 1979). Finally, on these matters of interpretation, see Ludwig Wittgenstein, Philosophical Investigations (Oxford: Blackwell, 1953), pp. 228-232. (" ... one might perhaps speak of a feeling 'Long, long ago,' for there is a tone, a gesture, which go with certain narratives of past times.") ("A Philosophical Investigation of Ludwig Wittgenstein.")
In speaking of a "fusion of horizons" I am invoking the philosophy of Hans-Georg Gadamer. Professor Gadamer makes use of the Greek concept of aletheia in describing encounters between subjectivities, whether direct or mediated through art works. This concept is translated as "openness" to the other, creating a place of meeting as opposed to acting on the other, the opposite of parcing another's words apart from the subjectivity of the speaker. Gadamer offers the opposite of the Turing test as a method for assessing consciousness. ("'The American': A Movie Review.")
"To be open means to say what one means." To say what one means is not necessarily to report facts. This "saying" need not be a literal description because the unfolding of our truth in narratives, or shared memories, may or must be mythical (or allegorical) as distinct from literal, requiring a self-disclosure (or revelation) that is best seen in art. ("'The Reader': A Movie Review" and "On Bullshit" then "What you will.")
"To speak of truth in poetry [religion] is to ask how the poetic word finds fulfillment precisely by refusing external verification of any kind. ... No translation of a lyric poem ever conveys the original work. The best we can hope for is that one poet should come across another and put a new poetic work in place of the original by creating an equivalent with the materials of a different language."
"On the Contributions of Poetry to the Search for Truth," in The Relevance of the Beautiful and Other Essays (Cambridge: Cambridge University Press, 1986), p. 111. ("Ronald Dworkin's Jurisprudence of Interpretation.")
You are untranslatable. You are not the "typical male or female psyche." You are not verifiable. Your truth must be communicated only by you through your story. You are a feeling state. You participate in me. I participate in you. "We" are a poetic work even if we are scientists. We are a dialectic. R.D. Laing, Self and Others (London: Penguin, 1969), pp. 81-98 ("Complementary Identity"). ("'Diamonds Are Forever': A Movie Review.")
Conclusion.
Mr. Christian makes a revealing observation that is contradicted in his summary of A.I. research and whose meaning escapes him:
" ... no demonstration is ever sufficient. Only interaction will do." (p. 67.)
You said it, Brian. Deciding on the consciousness or humanity of another is a matter of interpretation and mutual construction. Consciousness is linguistic social interaction. All languages are or involve social interaction -- if they are used self-knowingly, expressively, meaningfully -- which is something computers, by definition, cannot do since this faculty has little to do with "computing." There are no private languages. ("Genius and Lust" and "Is Western Philosophy Racist?")
It is this mutuality of consciousness that frightens us. For what are we to say to the person who seems as intelligent as we are -- or more so -- but who rejects our values and beliefs?
We often disconfirm the other's humanity in order to make him (and more often, her) conform to our notions of normality. (Once more: "Master and Commander" and "William Godwin and Mary Wollstonecraft" then "The Wanderer and His Shadow.")
You must be normalized, adapted, adjusted or we will destroy you for your own good. Madness, as Michel Foucault points out, is about identifying consciousness that is different and, therefore, diseased. To be computer-like is "abnormal" and to lack feelings or compassion is evil precisely because of this "failure of openness" or acceptance, which leads to the need to destroy -- or force into a doubtful "normality" -- the rival subjectivity of another person whom the utterly normal person seeks to possess or dominate. (Last time: "A Doll's Aria" and "'The Stepford Wives': A Movie Review" then "'Revolutionary Road': A Movie Review.")
" ... indigence, laziness, vice, and madness mingled in an equal guilt within unreason; madmen were caught in the great confinement of poverty and unemployment, but all had been promoted in the proximity of transgression, to the essence of a Fall. Now madness" -- defective consciousness -- "belonged to social failure, which appeared without distinction as its cause, model, and limit."
Michel Foucault, "The Birth of the Asylum," in Madness and Civilization: The Birth of the Asylum in the Age of Reason (New York: Vintage, 1973), pp. 259-260. ("Abnormal.")
It is not those who bring about the deaths of millions who are crazy. It is not those who seek to judge the normality or consciousness, intelligence or ethics of others who see themselves as "defective." For "experts" and torturers it is always the other person who is defective, abnormal, in need of adjustment to "our" values. Artists, intellectuals, criminals, sexual deviants -- all weird people are "crazy" and never those who decide what is "weird" or "normal." ("Terry Tuchin, Diana Lisa Riccioli, and New Jersey's Agency of Torture" and "What is it like to be tortured?" and "Virginia Long's Departure" then see the film, "Quills.")
It is not the supercomputer whose intelligence worries me. I am more troubled by scientists who seek to hold consciousness and intelligence apart from themselves, as "testers" of these complex and mysterious phenomena, and whose hubris and increasing power -- usually untempered by philosophical sophistication or great ethical awareness -- frighten many of us who are struggling to remain "open" to the gift that is the other. To discover consciousness in another person is to allow that other person to "be." Joy Gordon, Invisible War: The United States and the Iraq Sanctions (Cambridge: Harvard University Press, 2010), pp. 231-247. (Are starvation and death for children who have the misfortune to be born into a society that offends the United States of America acceptable "collateral damage"?)
"The works or products of human life are not the only elements in human life which exhibit transcendence. Human transcendence is also involved in the enjoyment of music, the perception of the wonders of nature, the appreciation of works of architecture and painting. But it is language which is the differentiating feature of the human species of animal, and it is language which, in the way we have shown, offers dialectical or philosophical proof of human transcendence. It is also the works of language which not only allow this proof (a proof which is itself a work of language) but also exhibits most explicitly the range of experience to which human beings are open, the full scope of the proper objects of human fear and anger, hope, desire, and wonder, and in this way the full character of the human situation."
David Braine, The Human Person, pp. 544-545.
This sounds like "mere Christianity." Those Judeo-Christian values never disappear. ("Is it rational to believe in God?")
Labels: "Open the Pod-Bay door, HAL."