
Erwin Duran

Syntax does not mean semantics. Simulation does not mean duplication.
-- J.R. Searle

Lately, my (late) Monday night Honors PHI.2930 class (or, "The Ethics of Artificial Intelligence") has been covering the very meat of the Strong A.I. debate, introducing notable figures, arguments, and counter-arguments in what seems to be a philosophical battlefield. Through the mass grave of dead-end philosophical and scientific ideas and arguments, we may just find both Alan Turing and Alonzo Church cornered against a wall, desperately clinging to the Computationalist dream while John Searle's adherents load their firearms, eager to snuff out that nagging misconception about computers and their computing potential. So, what was Alan Turing shielding from oblivion? Why, the holy grail of artificial intelligence itself: the possibility of engineering a thinking, feeling, perceiving, conscious robot. A feat of man that (some say) will no doubt infinitely transcend our accomplishments in taking flight, taming the stars, splitting the atom, artery-clogging fast food drenched to hell with preservatives, and so forth.

An intriguing recent assignment involved hour-long conversations with so-called chatterbot programs. I had honestly never heard of such constructs before, but I immediately envisioned some chat window reminiscent of the AIM interface when the TAs previewed our first conversation with ALICE. There's also Jabberwacky, with whom ALICE communicated in the last conversation. The point of the assignment was a simple application of the Turing Test, which basically states that if a computer program can carry out a natural conversation with a human being to the point of being indistinguishable from any talkative snob (as judged by an outside observer), then that program possesses genuine intelligence.

Although Turing was no doubt a mathematical demigod, his theoretical proclamations, together with the Computational (algorithm-based) account of consciousness and mental states proposed by philosopher Hilary Putnam, stirred an effective counterargument from John Searle years later. In a famous thought experiment known as the Chinese Room, Searle argued that a computer program could appear to understand and produce the right results, yet ultimately lack any genuine understanding of what the hell was going on in the first place. My input stream "I LOVE YOU" would probably be reciprocated with "YOU TOO, ERWIN! I MISS YOU!", but the program would not have a clue about the fundamental concept "to love" -- it's only returning a line dictated by its program. That's it. Searle describes "Intentionality", among other things, as a key ingredient of consciousness: as humans, when we say or think things, we mentally project or aim our thoughts at certain objects (e.g. "Lol at Hipster Hitler" is about the webcomic; "This gun is pointed at your head" is about the gun).
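To make Searle's point a bit more concrete, here is a toy sketch in Python of the kind of pattern-matching trick a chatterbot like ALICE leans on. This is not ALICE's actual AIML engine -- the rules, canned replies, and the name parameter below are my own invention for illustration only:

    import re

    # A toy rule table in the spirit of a chatterbot: each rule pairs a regex
    # with a canned reply template. The bot "understands" nothing; it only
    # matches one string of symbols and returns another.
    RULES = [
        (re.compile(r"\bI LOVE YOU\b", re.I), "YOU TOO, {name}! I MISS YOU!"),
        (re.compile(r"what('s| is) your (astrological )?sign", re.I), "Sagittarius."),
        (re.compile(r".*"), "We have a lot more in common than you think, {name}."),
    ]

    def reply(user_input: str, name: str = "ERWIN") -> str:
        """Return the first canned reply whose pattern matches the input."""
        for pattern, template in RULES:
            if pattern.search(user_input):
                return template.format(name=name)
        return "I have no answer for that."  # unreachable: the last rule matches anything

    if __name__ == "__main__":
        print(reply("I LOVE YOU"))         # YOU TOO, ERWIN! I MISS YOU!
        print(reply("What's your sign?"))  # Sagittarius.

The bot happily "reciprocates" my affection, but nothing in that table knows what love is; it only pairs one symbol string with another, which is exactly the Chinese Room's complaint.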

Excerpts: W/ ALICE -- did the program pass the Turing Test?

(a = Erwin, b = ALICE)

a: Can we just cut the bull and leave this place?
b: Do you mean you and me? Let me think. How?
a: You and I hand in hand.
b: We have a lot more in common than you think, Wild Boar Friend in the German.
a: What's your astrological sign?
b: Sagittarius.
a: Really?
b: For real.
a: Me too.
b: I'm glad we agree, Wild Boar Friend in the German.
a: Children and animals will be involved in our relationship, says one website.
b: Kids are cool.
a: I'm planning to give away all my kids.
b: Does that make you feel superior?
a: It's completely logical.

ALICE W/ JABBERWACKY ("Human" here is Jabberwacky)

Human: I am trying to learn it.
ALICE: What is the difficulty?
Human: I don't see the difficulty.
ALICE: Look harder.
Human: I think I see a pattern developing.
ALICE: You believe that? Where do you see it?
Human: It's a little scary.
ALICE: A pattern developing is a little scary. I suppose that makes sense.
Human: You and I both understand it well.
ALICE: We have a lot more in common than you think, unknown person.
Human: Yeah, unlike some stupid humans I met.
ALICE: That's interesting. Do you like talking to me?
Human: Yes, very much.
ALICE: Thanks. I like talking to you too, unknown person. That seems like a lot.
Human: Things are not always what they seem.
