Screenshot of ELIZA running in Emacs

ELIZA is a computer program developed by Joseph Weizenbaum in 1966, intended to demonstrate the possibility of communication between a human and a computer via natural language.

Weizenbaum chose the name ELIZA based on the play Pygmalion by George Bernard Shaw. The program can simulate various conversation partners using scripts. It became known for its superficial simulation of a psychotherapist using the non-directive methods of client-centered psychotherapy according to Carl Rogers.

Weizenbaum wrote the program in MAD-SLIP for an IBM 7094 running the CTSS time-sharing system at the Massachusetts Institute of Technology.

ELIZA can be seen as an early attempt at the Turing test. However, it would not have passed, since a user can quite easily discover that they are communicating with a machine.


In a simple version, ELIZA uses a thesaurus, i.e. a structured dictionary. The entered sentence is scanned for a word that appears in the thesaurus, and the thesaurus is then searched, based on this word, for synonyms, but mostly for generic terms. The program also contains a collection of phrases on various topics, often questions or requests to continue. The following question-and-answer pair serves as an example:

User: "I have a problem with my father."
ELIZA: "Tell me more about your family!"

In this case, the program has applied the mapping, defined in the thesaurus, from father to its generic term family and selected a phrase relating to the family. This simple operation exposes the program's limitation: it does not actually understand the user's input. The keyword principle therefore also leads to the following apparent "dialogue":

User: "War is the father of all things."
ELIZA: "Tell me more about your family!"

If the program cannot find a word it can do anything with, it responds with general fallback phrases such as:

"I haven't understood that yet, can you explain it to me?" or
"I don't understand anything about that, let's talk about another topic."


Weizenbaum justified choosing a psychotherapist as the simulated interlocutor by the fact that such an interlocutor is permitted to display no knowledge of the world without losing credibility. In his article, Weizenbaum illustrates this with an example: if the human utters the sentence "I went on a boat" and the computer replies "Tell me something about boats", the person will not assume that the interlocutor knows nothing about boats.

The communication behavior of test subjects towards the program corresponded to that towards a human interlocutor. Evidently it did not matter much to them whether the respondent at the other end of the line was actually a human or a computer program; all that mattered was that the answers and questions appeared "human". This is the so-called ELIZA effect, which many chatbots exploit today.

Most of the test subjects in the experiments were even convinced that the "interlocutor" actually understood their problems. Even when confronted with the fact that the program they had "talked to" merely converted simple statements into questions according to a few simple rules, certainly without "intelligence", "reason", or "empathy", they often refused to accept this.

Weizenbaum was shocked by the reactions to his program, especially by the fact that practicing psychotherapists seriously believed it would lead to an automated form of psychotherapy. Not least because of these experiences, he became a social critic. This development is the subject of the documentary Plug & Pray, released in 2010.

Further development

In further developed versions, a sentence can also be broken down and analyzed grammatically, e.g. to recognize negations or to distinguish questions from statements. In more complex and capable developments, the thesaurus is usually replaced by an ontology, which allows more complex dependencies to be processed and the conversation history to be taken into account. It may then no longer be necessary to store complete sentences, since an answer can be composed of several sentence fragments and a sentence can be varied accordingly. This enables answer variants such as:

"Tell me more about your family"
"Tell me more about your hobby"
"Tell me more about your job"

With such further developed systems, applications for narrowly delimited subject areas, e.g. timetable information, can now be realized. Even decades after its first development, however, such a system quickly reaches its limits. Within these narrow limits, an appearance of understanding can be produced, provided the system gives a suitable answer to a query. But even these further developments do not achieve real understanding.

At the Google I/O conference in May 2018, an experimental "digital assistant" was presented that can, for example, automatically make restaurant reservations or hairdresser appointments.

Literature


  • Joseph Weizenbaum: ELIZA - A Computer Program For the Study of Natural Language Communication Between Man And Machine. In: Communications of the ACM. Vol. 9, No. 1, January 1966, ISSN 0001-0782.
  • Joseph Weizenbaum: The power of computers and the impotence of reason. 11th edition. Suhrkamp Verlag, Frankfurt 2000, ISBN 3-518-27874-6 (first edition: 1978).
  • Detlef Borchers: A misunderstanding turns 40. In: c't. No. 23, 2006, ISSN 0724-8679, p. 40 ff.
  • Stefan Höltgen, Marianna Baranovska (eds.): Hello, I'm Eliza. 50 years of conversations with computers. 1st edition. Projekt Verlag, Bochum 2018, ISBN 978-3-89733-467-0.


References

  1. Joseph Weizenbaum: The power of computers and the impotence of reason.
  2. Judith Malek-Mahdavi, Jens Schanze: Plug & Pray. Documentary with Joseph Weizenbaum about the consequences of ELIZA. Retrieved November 26, 2014.
  3. Matthias Kremp: Google Duplex is scary good. In: Spiegel online. Retrieved May 14, 2018.