Jipi and the Paranoid Chip

“Paranoid schizophrenics in cells. Mentally healthy interviewers in cubicles somewhere else. And these software thingies on the Net.”

  “Yes,” says Mr. Cardoza, seemingly relieved that Jipi was not completely hurled off the track by his diagram. He tries to draw a circle in the middle of the triangle and ends up with sort of a mashed ovoid with a Pac-Man-like indentation in one side. “And then there was a very simple piece of software that would sit in the middle and set up two-way conversations between randomly chosen pairs of entities.” He draws radii from vertices into the circle and then elbows them back out to other vertices. “So sometimes you’d have a normal person talking to a normal person. Sometimes a paranoid schizophrenic talking to a piece of software.”

  “Okay, I get the idea. But I’m guessing they weren’t told.”

  “That’s right. It was like a game. At the end of each conversation, the link would be cut”—cutting motions with the pen now—“and each participant would be asked to give his opinion as to whether the entity he’d just been conversing with was a paranoid schizophrenic, a normal human, or a piece of software.” A big question mark, off to one side, now throws the whole diagram off balance, and reminds Mr. Cardoza that there are about 3 square meters of blank whiteboard that he hasn’t touched yet. He begins drawing in other, randomly spaced and sized question marks, striving for some kind of visual balance, but when he steps back away from the whiteboard after drawing each one, he finds that it’s just thrown the totality of the diagram out of whack, forcing him to step forward again and put an additional question mark somewhere else. Pretty soon the whiteboard is looking like the Riddler’s jumpsuit.

  “Why?”

  “The purpose of the experiment, remember, was to evolve software that could distinguish a paranoid schizophrenic from a normal human just by talking to them on the Net,” Mr. Cardoza mumbles, kind of distracted by ongoing prosecution of the question mark balancing thing. “And so if a given piece of software gave the correct answer—”

  “Meaning, it succeeded in distinguishing between a normal human and a paranoid schizophrenic—”

  “Yes. Then it would be allowed to reproduce. If it gave the wrong answer, it would be terminated. Over time, the software evolved so that it got very good at identifying paranoid schizophrenics.”

  “I understand that part. But you said that, at the end of the conversation, each participant was asked to give its opinion.”

  “That’s correct,” he says uneasily, sensing that Jipi’s dragging him toward some kind of conceptual briar patch.

  “But that means that the opinions of the humans—the normal ones and the paranoid schizophrenics—were being counted too.” Jipi points at a couple of randomly chosen question marks as if they support this assertion, and Mr. Cardoza, hoist by his own graphical petard, becomes unnerved.

  “I suppose so. Remember, Jipi, I didn’t invent this crazy experiment, I’m just—”

  “Why should the humans’ opinions have counted?”

  Mr. Cardoza presses his lips together and makes his mustache bristle with compressed air, which is what he always does when he’s deep in thought. When he’s completely flummoxed he will valve the air into his cheeks and make them into perfectly smooth-shaven hemispheres. In Jipi’s experience, perhaps 1 adult male out of 10,000 wears cologne. Mr. Cardoza is one of these. She has always wondered why all of that men’s cologne is on sale at department stores and duty-free shops if, in the average major city, only about a hundred people are actually wearing it. But Mr. Cardoza basically never wears the same cologne twice, which helps to explain it. He never wears too much of it, and he always picks it so that it will complement, in some sense, what he is doing on this particular day. Today, he smells faintly like a rich, tasteful Middle Eastern gentleman, and Jipi wonders what is in store.

  But today he does not bulge his cheeks out in defeat. Instead he gets a determined and implacable look on his face. “This was all explained to me at 2 in the morning,” he says. “Be patient.” He sets his pen down. “The fumes of this pen kill my brain cells.” He drinks coffee and stares out the window for a few moments, watching the big Goto earthmovers clawing up sewer slop down in Intramuros. “Okay. Remember, the goal of the experiment was to create software that would identify the paranoid schizophrenics by conversing with them on the Net.”

  “Right.”

  “Now, I’m no mental health professional, but from what I know of paranoid schizophrenics, I’d think that the idea, should it occur to them, that they were conversing, on the Net, with software daemons programmed to hunt them down so that they could be incarcerated, is just the kind of thing that would really set their teeth on edge. Does that work for you?”

  “Sounds very reasonable, Mr. Cardoza.”

  “And so then it would be very important for these daemons to be evolved in such a way that they could converse with people on the Net, at least in a limited way, without arousing the suspicions of paranoid schizophrenics.”

  “Bingo. And so, during the experiment, if the human participant was able to peg the software daemon as being a piece of software, and not a human being, then that software daemon would be killed.”

  “Right. But one that could pass for a human being would be allowed to reproduce, et cetera, et cetera.”

  “Okay, Mr. Cardoza, it all makes sense. So after the experiment’s been going on for a long time, they’ve got this population of highly evolved software daemons that, number one, can identify paranoid schizophrenics by conversing with them, and, number two, cannot be recognized as mere pieces of software by the paranoid schizophrenics they are talking to.”

  Mr. Cardoza smiles and holds up his index finger. “Almost.”

  “Almost? What did I miss?”

  “I am not a genetic programmer,” Mr. Cardoza says, “but my understanding is that this kind of evolution is extremely slow. It takes thousands, or millions, of generations to get anything that actually works.”

  “Homer was telling me about this,” Jipi says.

  “Mr. Goto?”

  “Yeah.” It’s the second time Jipi has mentioned her sometime boyfriend by name, and she’s not doing it for effect, but it always seems to galvanize Mr. Cardoza, who is clearly one of these guys for whom the entire world of business, finance, politics, etc. is just a superficial skin supported by, but hiding, a much deeper and more complicated and interesting and, in the end, important infrastructure of personal relationships. This is not an unreasonable way for a Filipino businessman, or anyone, for that matter, to think about it. But Jipi’s irked, because (she’s starting to realize) she has this implicit belief—probably naive, in fact probably beyond naive and verging on eccentric or cultlike—that the information Homer imparted to her, when he went off on his (at the time) dull and pointless tangent about genetic programming, should be considered on its own merits, as a set of pure ideas, and not as evidence that a certain personal relationship exists between Person A and Person B. She just wants Mr. Cardoza to listen to the idea, in other words, and not to read between the lines and figure out the hidden implications of the fact that the idea was imparted to her by young entrepreneurial home-run slugger and civil engineering company heir Homa (Homer) Goto. Also, if Jipi were inclined to be insecure, she would worry that Mr. Cardoza has only hired her because he wants to get closer to the Goto family business empire. As it happens she’s not insecure, and she’s not worried about that at all. But even the fact that a guy as deft and sensitive as Mr. Cardoza is doing something that might make some other, hypothetical girl feel that way seems to imply that he sees nothing inappropriate about it—Jipi senses a boundary dispute, in other words. Or maybe she’s underestimating Mr. Cardoza. Maybe he senses that she doesn’t have an insecurity problem and so he says things to her he wouldn’t dream of saying to a girl who couldn’t handle it.

  All of which crystallizes in her brain during the time it takes her to say yeah, as a sort of germ of an insight that she’ll cultivate and unfold and discuss with her friend Teeb later. “The airline companies wanted to evolve autopilot daemons that could handle certain anomalous situations, like wind shear,” she says in the meantime. “But they couldn’t very well create a whole bunch of daemons and put them at the controls of jumbo jets and fly them into wind shear to find out which ones were fit to reproduce. So instead they simulated the wind shear on big computers, and they simulated the airplanes too, so that they could run the experiment on fast-forward and evolve these things in just a few years.”

  “Yes. That’s how most people create evolvers,” Mr. Cardoza says. “We have big computer installations all over Manila doing that kind of thing as we speak. But what if you want to evolve daemons that are supposed to interact with human beings?”

  “Oh! Then you’ve got a problem,” Jipi says.

  “Yes. Because there is no way to speed up human interaction.”

  “It’s going to be slow going.”

  “Every time a new generation of these daemons is evolved, it must be tested for evolutionary fitness by having it interact with human beings. Sometimes, as in this case, it might have to interact with several human beings for several hours at a time! Only after this has happened can the ‘breeder’ make a decision as to which daemons will be killed, and which will be allowed to reproduce.”

  “So with this paranoid schizophrenic thing—you’re telling me that either it had to be a huge experiment, with thousands and thousands of volunteers, or that it’s not going to produce any results for many years. Either way, what does Mindshare Management have to do with it?”

  Again with the index finger. “You are forgetting there is one other possibility,” Mr. Cardoza says. “You told me that the airplane companies created computer simulations of wind shear, so that they could speed up the evolutionary process. Why not try the same approach here?”

  Jipi sees the answer immediately, but it takes several minutes to make herself believe it’s possible. She gets pretty involved with thinking about this, and eventually realizes that several minutes have gone by, during which Mr. Cardoza has fielded a couple of important phone calls, and she has at some point raised both hands up to the top of her head and begun massaging her scalp, tracing those little meandering crevices between the plates of her skull. “Uh,” she finally says, “so you’re telling me that they created software to simulate the thought processes of paranoid schizophrenics?”

  “Remember,” Mr. Cardoza says, “that some of the conversations, in this experiment, were between normal humans and paranoid schizophrenics. Others were between normal humans and software daemons. At the end of each conversation—” he starts flourishing his pen at the whiteboard’s question marks.

  “The normal human would have to give an opinion as to whether the entity he’d just been conversing with was a paranoid schizophrenic or an evolver.”

  “Yes.”

  “So, if you hooked up the experiment in the right way—”

  “If you killed the evolvers that were easily recognized as evolvers, and allowed the ones who seemed, to normal humans, like paranoid schizophrenics, to reproduce—”

  “Eventually,” Jipi says, “you’d evolve some software that behaved, on the Net, just like a paranoid schizophrenic.”

  “Correct.”

  “And then,” Jipi concludes, “you could speed up the whole experiment. Because you could just fire all of the authentic human paranoid schizophrenics—”

  “Who probably weren’t such great employees anyway,” Mr. Cardoza says in a very discreet sotto voce.

  “—and use the schizo-daemons in their place, just like the wind-shear simulation software—zillions of times faster.”

  “At that point, they were able to run the experiment in hyper-speed,” agrees Mr. Cardoza. “And eventually, they generated some extremely highly evolved software daemons that were capable of sifting out paranoid schizophrenics from the vast torrent of interaction that moves across the Net every day.”

  “Well, that’s really cool, Mr. Cardoza,” Jipi says. Actually she’s lying about this because she is still a bit troubled by some of the implications. But Mr. Cardoza is her boss, and he hired her for her nice personality, and she’s diligent about doing what he hired her to do. A lot of having a job, she’s figured out, is playing a role. Lots of girls are good at having jobs because the same fun role-playing impulse that causes them to enjoy shopping for clothes and experimenting with looks serves them well in this sense. Jipi’s never been that kind of girl, particularly, but Teeb certainly is, and as soon as Jipi moved down here to Manila and got a job, Teeb insisted that they do a lot of heavy-industrial clothes shopping. The job/shopping linkage was completely obscure to Jipi at the time and only recently has she gotten it; now she can plainly see why you’d want to be able to doff your job persona at the end of the day, as easily as peeling off stockings.

  “But I still don’t understand—”

  “What this has to do with Mindshare Management.”

  “Right.”

  “The job we’ve been hired for, actually, has nothing to do with the paranoid-schizophrenic recognition software,” Mr. Cardoza says. “It’s about a by-product of the experiment. The schizo-daemons.”

  “What about them?”

  “When the contract was finished, Lamarck Logic still had, living in its systems, a number of these schizo-daemons, which were interesting but commercially useless. Their management began searching for some way to turn a profit from them.”

  “How can you make money selling software that acts like a paranoid schizophrenic?”

  Jipi asks the question rhetorically, but Mr. Cardoza nods calmly and says, “Yes. That is the question they asked themselves. Well, it turns out that such software is just what the doctor ordered for certain commercial applications—particularly in the security industry.”

  “You mean securities, like stocks and bonds?”

  Mr. Cardoza laughs, not unkindly, and says, “That’s a great idea, but I was talking about car alarms.”

  “Car alarms?”

  “Exactly. Think about it. What is a car alarm? It is a network of sensors distributed throughout the vehicle. Some of them listen for the sound of breaking glass. Some sense opening doors. Others are tuned to pick up motion. Some of them use a kind of radar to sense the presence of nearby human bodies. All of these sensors are wired into a central brain—a computer—which monitors the inputs that it receives from them, and then tries to make a judgment call as to whether the car is being, or has been, stolen. This is by no means an easy calculation.”

  “Tell me about it!” Jipi can’t walk to work without passing several cars whose alarms have gone off for no good reason.

  “It’s notorious! False car alarms are a blot on the urban landscape all over the world!” says Mr. Cardoza. He’s rising to the occasion, and some color is coming into his face. “Why? Because the software living in the brains of car alarms is just too stupid to tell the difference between a stray signal, like a pedestrian brushing against the vehicle, and an actual break-in. What’s needed is not better sensors, but better software. Lamarck Logic saw a market niche!”

  “But that’s totally wrong! If the problem is too many false alarms, then it seems like a paranoid schizophrenic is the last thing that you want calling the shots.”

  “Well, that’s your opinion as a person who is frequently annoyed by false alarms,” Mr. Cardoza says. “But if you are the chief executive officer of a car alarm company, the last thing you want interpreting the data is a computer brain with an easygoing and mellow personality. What you would like is a brain that was smart enough to detect spoofing.”

  “Spoofing.”

  “It is sort of a general term, nowadays, meaning any attempt to take advantage of a literal-minded computer program by meddling with its inputs. An example from when I was a boy: early car alarms could detect a door being opened, but not a window being broken. If you smashed a window and crawled in without opening the door, the car alarm did not understand that you were up to something.”

  Jipi just tries to restrain the impulse to smile at what sounds a hell of a lot like a confession about Mr. Cardoza’s misspent youth.

  “Of course nowadays the sensors are much more elaborate,” he continues, blushing slightly, “but it is still possible to spoof a car alarm’s brain by feeding it a combination of inputs that will convince it that everything is normal.”

  “I see where you’re going,” Jipi says. “Paranoid schizophrenics are suspicious by nature—when they see something that looks like normal life, they don’t just assume that everything is normal.”

  “Right,” says Mr. Cardoza, “instead, they assume that the news, the stock market, the Internet and so on are all being manipulated by some kind of monstrous, hidden conspiracy that just wants everyone to think that everything is normal.”

  Like the Black Chamber, Jipi thinks, but does not say. Instead she says, “Okay. So I’m guessing that Lamarck Logic cut a deal with a car alarm maker.”