Knowledge technology

Knowledge technology is a name for the combination of a collection of formal methods for the acquisition and use of knowledge, some of them over two thousand years old, with modern methods of knowledge engineering and mathematics.

Subject matter and classification of the subject

The term knowledge technology first appeared in the late 1980s. The discomfort of some computer scientists with the term "artificial intelligence" led to new coinages such as intelligence technology or knowledge technology. For the transition from the information society to the knowledge society, other technologies were required. Some of what was needed had been known for over 2000 years: the Scholastic Dialectic as a functional model and ontologies as a data model. Some weaknesses have been eliminated by new ideas, and the old non-numerical algorithms, such as arborization for searching out solutions capable of consensus, have been automated with computer-aided processes such as resolution and unification, and their performance significantly increased. Since the aim of the old methods was primarily to gain and secure knowledge, the first word component is best defined epistemologically:

Knowledge is any formal or formalized product of cognition.

In the past, formalization was always understood linguistically, on the basis of formal logical figures and rules of inference. These can, however, be transferred to today's mathematical notation without any problems, for example for programming in Prolog. A probabilistic interpretation was unknown and was only added recently. The second word component does not necessarily require automation by information technology, but rather a well-ordered, consistent structure of the formal means used:

Technology here refers to a consistent information architecture made up of objects (ontology) and methods (generally non-numerical algorithms) with which knowledge can be acquired, tested, managed and used. Automation with IT means is possible but not necessary.
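
How directly the old logical figures carry over to Prolog can be shown with a minimal sketch (the predicate names are merely illustrative): the classic syllogism "all men are mortal, Socrates is a man, therefore Socrates is mortal" becomes a two-line logic program.

    % All men are mortal; Socrates is a man.
    mortal(X) :- man(X).
    man(socrates).

    % The query ?- mortal(socrates). succeeds by resolution,
    % exactly as the corresponding scholastic figure concludes.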

Knowledge technology is thus primarily a branch of computer science, with connections to philosophy, linguistics, epistemology and mathematical logic as well as other branches of mathematics, especially probability theory. Its practical use lies above all in supporting knowledge management in every modern organization and providing it with an arsenal of methods. In practice, the constructive project documentation created on this basis proved particularly important: it brought significant progress, especially in the structuring of project management and documentation, and made it possible, even years later, to quickly find and assess the effects of changed conditions on the project and its documentation.

History

In the history of knowledge technology, two phases can be distinguished, separated by about 500 years.

Dialectic - antiquity

The first phase is that of the Scholastic Dialectic. It begins with Socrates († 399 BC) and his Socratic dichotomy: the recognition of the difference between reality as such, the environment of human cognitive systems that is fundamentally accessible only through language, and a person's own reality, the set of that (individual) person's certainties. It is this personal reality that determines a person's behavior; it is what "affects" him.

Pyrrho of Elis († around 270 BC) showed with a simple proof by contradiction that a person fundamentally cannot recognize reality, and thus cannot recognize the truth of a statement. He can only try to bring his own reality as close as possible to reality as such. There is therefore only one reality, but as many personal realities as there are people. Around the same time, ontology was developed as a theory of concepts, building on the theory of categories of Plato and Aristotle.

Dialectic as a method of correct discussion, first mentioned by Plato, therefore does not assume that one of the discussants is in possession of the truth and can or should convince the others of it. On the contrary, the parties should exchange their certainties and try to reach a result that is as close to reality as possible. A compromise is therefore to be avoided as a matter of principle: nobody can know whether the positions the discussants give up in its interest are "true" or "false", so a compromise very likely leads to suboptimal solutions. Only a consensus, in which each discussant modifies his or her certainties according to reason and thus brings them into harmony with those of the discussion partners, can hope to achieve an optimal, i.e. realistic, solution.

This dialectic was an essential part of the statecraft of the Roman Empire and was continued in the Christian Church after the Empire's decline. The syllogisms of formal logic, formulated in natural language, were the essential thought mechanism that governed the discussion.

The largely standardized procedure was to collect the conditions that the individual participants placed on a solution in a list called a vexillum (flag). Contradictions were then resolved by searching for alternatives with a non-numerical algorithm that pointed the way to a solution, called Arbor Porphyriana after its inventor Porphyry of Tyre († c. 301). The procedure largely corresponds to resolution in today's Prolog interpreters; a modern implementation, however, is incomparably more efficient, not only because of the far higher processing power of a computer, but above all because of the parameterization of the terms and the unification of variables that it makes possible.

Scholastic Dialectic - Middle Ages

In the Middle Ages, this school of thought was developed into the Scholastic Dialectic at the great universities (Paris, Oxford, Heidelberg, Prague), epistemologically above all by Thomas Aquinas († 1274) and William of Occam († April 9, 1347 in Munich). Occam used logical methods in many directions. His model of "inner speech", for example, served him to develop a method of conceptual distinction that brought significant progress in the construction of ontologies: according to Occam, a rational person formulates a statement "inwardly" as a proof and then utters only its conclusion. If the discussant can be induced to give the full proof, its middle term defines the central concept that he is actually talking about.

Loss of meaning

At the beginning of the modern era, two weaknesses in the Scholastic Dialectic's application of logic became apparent. One was the idea that every task could be formulated as a well-defined problem, to be solved by listing the conditions in full and applying the logical algorithms of an arborization. Scholasticism had no models of thought ready for the increasing complexity of networked systems and the problems that come with it. These have been developed only recently (for example PlanMan) and used successfully to analyze and master multidimensional problem areas.

The second difficulty was the consistent formalization on the basis of logical statements. Formal logic assumes that every term is either true or false and defines the connectives and, or and not, which, according to the logical figure, i.e. the arrangement of the linked terms, derive a result that is likewise strictly true or false.

However, more than 1500 years earlier, Pyrrho of Elis had proven that certainties cannot be classified as true or false. The whole logical machinery on which Scholastic logic was based was thus more than doubtful. One could hope that, from certainties, it would in turn derive suitable certainties, at least for the individual human being, and thus provide a useful model of at least individual realities.

This hope was not reliable, however. Specialists were indeed able to work with it, and it is still trained and used today, for example by Jesuits. For laypeople without logical training, though, the results were more a matter of luck, especially with longer chains of derivation. Multiple AND operations in particular turned out to be very unreliable.

Further development and use in modern times

The second phase begins around 1980, when the Scholastic Dialectic was rediscovered as a knowledge technology. Ontologies were now to be implemented as a data model, and resolution as the basic functional mechanism of logic programming in Prolog interpreters. Both of the old difficulties surfaced again.

The complexity shows up, among other things, when specific subject areas count between 10,000 and 100,000 terms: with an assumed quadratic complexity (because of the relations), one arrives at 100 million to 10 billion potential relationships. The functional side becomes correspondingly complex if one relies on the full recursion with backtracking that a Prolog interpreter automatically provides to find a solution, without adding or programming any heuristics. And finally, because of the problem of formal logic described above, even the results found with considerable effort and automation were not reliable.

The solution came only when the theory of plausible reasoning, developed by George Pólya around 1960, became known and used. It replaces the truth values true and false with plausibilities, i.e. probabilities between 0 and 1. The logical operators AND, OR and NOT are replaced by the corresponding combinations of probabilities: conjunction corresponds to multiplication, disjunction to addition with subsequent normalization. As a result, AND chains approach probability 0 fairly quickly even with high individual plausibilities, while OR chains, conversely, tend towards probability 1. If the probabilities are known to some extent from statistics, physical models or similar knowledge, the full apparatus of the Scholastic Dialectic is again available to modern knowledge technology.
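
A minimal sketch of this behavior in Prolog, assuming independent terms (the predicate names are illustrative, and the OR combination is rendered here in the common "noisy-or" form of addition with normalization):

    % Plausibility of an AND chain: product of the members.
    and_chain([], 1.0).
    and_chain([P|Ps], Result) :-
        and_chain(Ps, R1),
        Result is P * R1.

    % Plausibility of an OR chain of independent terms:
    % one minus the probability that every member fails.
    or_chain([], 0.0).
    or_chain([P|Ps], Result) :-
        or_chain(Ps, R1),
        Result is 1 - (1 - P) * (1 - R1).

    % ?- and_chain([0.9, 0.9, 0.9, 0.9, 0.9], R).   R = 0.59049.
    % ?- or_chain([0.3, 0.3, 0.3, 0.3, 0.3], R).    R = 0.83193.

Even five conditions of plausibility 0.9 thus leave an AND chain below 0.6, which illustrates why long conjunctions proved so unreliable in informal use.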

Finally, methods are currently being developed to reduce the complexity, on the object side as well as the method side, by means of heuristics and contexts to a degree that modern data processing systems can at least handle.

Consensus and compromise

There are three basic models according to which decisions about action are made:

  1. The dictate. One authority makes the decision and enforces it with all means at its disposal. There are many examples; every fascist system works like this. (Fascistoid is a system in which an authority believes itself to be in possession of the truth and therefore entitled to enforce its ideas.)
  2. The compromise. Compromise means reaching the agreement of two or more parties on a decision, an action to be taken or the toleration of a project, in which both sides have to give something up. The ancient dialectic understood compromise as a formal procedure: the mutual promise of contending parties to submit to the decision of a jointly chosen arbitrator and to accept his verdict, or else to forfeit a sum previously deposited with the arbitrator as a penalty.
  3. The consensus. Consensus also means that two or more parties agree on a decision, an action to be taken or the toleration of a project. Here, however, all those involved part with a good feeling: everyone's interests have been taken into account and the best possible solution found.

Over time, the meanings became differentiated. Today, a compromise is generally understood as an agreement in which each of the parties gives up part of its terms and demands. A consensus, on the other hand, aims to modify, specify and adapt the task, any solution already under consideration, and the conditions listed by the parties involved, so that at least the conditions considered necessary are all met or can be met.

This gave rise to the conviction, already mentioned, that compromises, by giving up ideas and conditions, usually lead to solutions that are suboptimal even for all those involved, while a consensus makes reaching the optimal solution at least likely. Consensus is therefore sometimes referred to as a win-win solution. A procedure similar to the one recommended by knowledge technology on the basis of the Scholastic Dialectic is the Harvard concept for negotiation preparation and conflict resolution.

In the literature on consensus and the path to it, it is striking that modifying the task or the proposed solution is hardly even mentioned. An example can be found in the treatment of differences in the decision-making process under Consensus. For the only method mentioned there, the "systemic consensus principle", the Wikipedia entry is empty. It reduces the procedure to a simple democratic one, without safeguards such as multiple decision-making bodies (chambers), minority protection, arbitrators or courts. Mechanisms for introducing new knowledge and for modifying or enriching the solution are not provided even rudimentarily. Participants who feel disadvantaged are inevitably forced either to compromise or to resort to other democratic means such as demonstrations or refusal.

The ancient as well as the scholastic dialectic therefore used a technically sophisticated, step-by-step procedure that knew no voting, not even an "informational" one. Instead, by enriching the knowledge of all participants, by having each participant define early the conditions he or she places on a solution, with ongoing modification and the possibility of clarification during the entire course of the project, and with the possibility of major changes to the intended solution, it almost always leads, with experienced moderators, to a "real" consensus.

It is important that at no step of the solution should any participant experience a victory or a defeat, since these inevitably force a compensation and thus a compromise. Note that votes are counterproductive for this reason, because they always produce winners and losers. They are also ethically questionable, which is often overlooked: they make the losers of the vote a means for implementing the wishes of the majority and thus clearly violate Kant's practical imperative. Without suitable safeguards, which normal discussions never provide and cannot provide, they therefore violate human dignity. And this holds all the more, of course, for other methods of forcing decisions by means of power or violence.

The list of conditions is the central document of a problem-solving process carried out according to the rules of the Scholastic Dialectic on the basis of modern knowledge technology.

It stands at the beginning of every project or task on which a consensus is to be reached, and is maintained as a central document, constantly updated over the entire course of the project. Extensions, changes and additions to the project must also be incorporated into it; as a rule these generate new conditions, on which consensus must be reached in the same way as on those already adopted. The old conditions, of course, may also change in the process or be merged with new ones.

Condition lists - the vexillum

The list of conditions is usually created during an initial discussion of the new project. The conditions that the participants place on a solution are written on flip charts or metaplan boards beneath a short description of the task (the "thesis"), line by line along a vertical line. This creates the image of a flag, which is why the scholastics called the list a "vexillum"; this name, or its German translation ("flag formation"), has survived to this day. The following example is based on Lay's book:

A wage is only fair if

  • it is paid in the contractually agreed amount,
  • its amount is based on the market value of the work product,
  • its amount is based on the market value of the labor,
  • the recipient can provide appropriately for himself and his family,
  • it also provides for the recipient in old age,
  • its amount does not endanger the continued existence of the company,
  • its amount does not invite the dismissal of the employee,
  • its amount does not endanger the job itself (automation, relocation, ...).

Obviously, even the conditions listed are hardly capable of consensus as they stand, and each of the parties to a wage negotiation, say, could certainly add more, making consensus even harder. The usual recommendation, to get the parties to modify or withdraw their conditions through discussion, generally fails. For this reason advisors usually recommend other decision-making methods, of which unfortunately few are promising; the questionable nature of votes has already been pointed out.

The Scholastic Dialectic, and with it knowledge technology, on the other hand, provides a set of formal-logical methods for classifying, converting, merging and modifying the terms of a vexillum. Either the semantics are left unchanged, or the authors control the procedure and can thus approve or reject it. Occam extended these methods into the ontological domain with his model of "inner speech": his conceptual distinction makes it possible, again in agreement with the participants, to clarify the concepts they use and, where necessary, to sharpen the participants' context ontologies and set them apart from one another in such a way that the oppositions disappear or can easily be eliminated by modifying the conditions, often simply by choosing a more suitable word. Finally, arborization offers a further, likewise formal-logical procedure for determining a consensus-capable set of alternative conditions.

This is made much easier if the individual terms of a vexillum are formally transformed, again naturally with the participation, and only with the consent, of their respective authors. This is an important task of the moderator. It is particularly useful to replace positive formulations with negative ones, because the fulfillment of negative conditions is usually easier to check than that of positively formulated demands: "the bank account must never have a negative balance" is easy to verify, "there must always be sufficient liquidity" is not, if only because the facts of the matter remain unclear.

This is reflected in the theory of formal logical proofs: there are routine standard methods for proving that something is not the case, for example the proof by contradiction, which constructs a contradiction with known or asserted statements. There are, however, no comparably general and broadly applicable methods for proofs of existence, i.e. for the strict proof that an assumed fact is correct.

With the modern means of logic programming, these processes and conversions can be largely automated. The resolution and unification mechanisms built into programming systems such as Prolog considerably extend the power of the ancient algorithms through the binding of logical variables.
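
A minimal sketch of how a fragment of a vexillum like the wage example above might be entered into a Prolog interpreter; the predicate names and simplified conditions are purely illustrative, not a fixed notation of the method:

    :- dynamic endangers_company/1, endangers_job/1.

    % Thesis: a wage is fair if all necessary conditions hold.
    fair_wage(W) :-
        contractually_paid(W),
        matches_market_value(W),
        secures_livelihood(W),
        \+ endangers_company(W),   % negative conditions are
        \+ endangers_job(W).       % particularly easy to check

    % Facts about one concrete wage proposal.
    contractually_paid(proposal1).
    matches_market_value(proposal1).
    secures_livelihood(proposal1).

    % ?- fair_wage(proposal1).
    % Resolution tests each condition in turn; the negated ones
    % succeed because no contrary facts are known (closed world).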

Formal logic processes and techniques

As already said, knowledge technology differs from the usual methods of preparing and making decisions on the one hand through its consistent orientation towards consensus instead of compromise, and on the other through a wide range of scientifically justifiable and in some cases millennia-proven formal-logical and mathematical procedures, which make possible in the first place the modifications, conversions and mergers of the terms of a vexillum, i.e. of the conditions and their assumed types and connections, that a consensus requires.

First, the conditions must be checked for their type, again, of course, in agreement with their authors and the other discussants:

  • Are the conditions necessary for a satisfactory solution of the problem? If they are merely useful, they are crossed out but collected separately.
  • Are there sufficient conditions, i.e. conditions whose fulfillment always fulfills the thesis? These too are collected and can later be used to modify the proposed solution; perhaps far more wishes can be fulfilled with them, at no additional cost, than originally hoped.
  • Are all conditions capable of consent, or can they at least be made so? Sometimes this requires changing or even reformulating the solution thesis. Few methods even take this into consideration, because with them the final goal is already fixed, or at least regarded as fixed, by the previously created specifications. Yet this is precisely the aim: to optimize the solution from the point of view of all those involved.
  • Are all necessary conditions met, or can they be met? As a rule, individual conditions must be replaced by alternatives. The appropriate method is arborization, a non-numerical algorithm going back to Porphyry (died between 301 and 305 in Rome) and named Arbor Porphyriana after him. It has become the standard method of analysis for any problem in which conflicting conditions make it difficult for the parties involved to find a quick consensus on the optimal solution; in the early 1980s its automation, as resolution, became one of the most important basic technologies of "artificial intelligence". A minimal sketch of the search follows this list.
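
The following sketch shows, under purely illustrative names, what such a search looks like when left to the backtracking of a Prolog interpreter: each contested condition is given a list of alternatives, and resolution enumerates combinations until one passes every conflict check.

    % Consensus-capable alternatives per contested condition.
    alternative(payment,  fixed_salary).
    alternative(payment,  salary_plus_bonus).
    alternative(security, pension_scheme).
    alternative(security, insurance).

    % Combinations that proved contradictory in discussion.
    conflict(salary_plus_bonus, pension_scheme).

    % Arborization by backtracking: choose one alternative per
    % condition and reject combinations with known conflicts.
    solution(Pay, Sec) :-
        alternative(payment, Pay),
        alternative(security, Sec),
        \+ conflict(Pay, Sec),
        \+ conflict(Sec, Pay).

    % ?- solution(P, S). enumerates all consensus candidates.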

If not enough alternatives can be found, one can often assign the semantics used by different discussants to different context ontologies by means of a conceptual distinction. This is particularly successful when the participants represent different views of the problem or different parties, such as employees and employers. As a rule this does not resolve the contradictions by itself, but it creates a precondition: those involved now see that they are "not talking about the same thing", and each can more easily accept the other's point of view within its respective domain of application.

The method used here is based on Occam's concept of "inner speech". The discussants are asked for the reasons, not explicitly stated, behind the statements they have made, and these are cast into logical proofs. From the middle term of the proof, the concept and the context that the respective speaker is "really" talking about can then be derived.
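
A minimal sketch of this idea in modern notation, again with illustrative names: if a statement is expanded into the two premises of its "inner" proof, the middle term is the concept that occurs in both premises but not in the conclusion.

    % A statement relates two concepts.
    concepts(stmt(A, B), [A, B]).

    % The middle term occurs in both premises,
    % but not in the conclusion.
    middle_term(Prem1, Prem2, Concl, M) :-
        concepts(Prem1, T1),
        concepts(Prem2, T2),
        concepts(Concl, TC),
        member(M, T1),
        member(M, T2),
        \+ member(M, TC).

    % ?- middle_term(stmt(man, mortal), stmt(socrates, man),
    %                stmt(socrates, mortal), M).
    % M = man: the discussant is "really" talking about man.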

A formal conversion is worthwhile for the sufficient conditions as well, and it makes sense to collect these separately. Since any one of them is enough to justify the project, OR can be assumed as the connective of the list of sufficient conditions. The conversion into AND-linked necessary conditions for insertion into the original vexillum is then a purely formal and, if desired, automatable process: a Prolog interpreter, given the flag as a Prolog program, carries out this conversion automatically, without the human user even becoming aware of it. It consists simply of negating all statements and linking them by a logical AND. Modern logic formalizes this as De Morgan's rule: the negation of a disjunction is the conjunction of the negated terms, and vice versa.
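
As a minimal sketch, with an illustrative term representation, the conversion can itself be written as two Prolog rewrite facts:

    % De Morgan's rules as term rewriting.
    demorgan(not(or(A, B)),  and(not(A), not(B))).
    demorgan(not(and(A, B)), or(not(A), not(B))).

    % ?- demorgan(not(or(cond1, cond2)), X).
    % X = and(not(cond1), not(cond2)).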

Finally, the justification mentioned above for the continued use of formal logic in spite of the Socratic dichotomy should be briefly discussed here: the logical terms and operations are replaced, case by case, by equivalent probabilistic ones. The theory of plausible inductive reasoning was published in two volumes by the Hungarian mathematician George Pólya.

Pólya describes as plausible inferences "proofs" of the following kind:

The victim was poisoned with potassium cyanide.
The defendant obtained potassium cyanide shortly before the murder.
_________________________________________________________________

So the defendant is (presumably) the murderer.

Both premises are particular, so according to the rules of classical logic nothing at all follows from them. This inference is therefore certainly not sufficient as the only piece of evidence. But when several such circumstantial proofs accumulate, a person somehow adds up the probabilities, and at some point they become certainty for him.

Since people thus treat plausibilities as probabilities, Pólya maps them onto probabilities. He defines proof patterns that correspond to the classical proof figures. If these are treated on a basis modeled on the calculus of probability, Pólya's measure of plausibility can be raised ever closer to 1 by combining several different instances of a suitable proof pattern with the same conclusion. Formally correct logical proofs whose premises are only certainties may then also provide plausible justifications for the concluding proposition, whose certainty can in principle be calculated. The dialectical methods based on classical logic, such as the conceptual distinction, therefore remain applicable if only certainties are assumed instead of truths, and these are correctly estimated or, better, calculated according to probability theory.
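
A minimal sketch of this accumulation, assuming independent pieces of circumstantial evidence with invented plausibilities, using the same combination as the or_chain sketch above:

    % Each clue lends the conclusion some plausibility.
    clue(poison_obtained, 0.6).
    clue(motive,          0.5).
    clue(no_alibi,        0.7).

    combine([], 0.0).
    combine([P|Ps], R) :-
        combine(Ps, R1),
        R is 1 - (1 - P) * (1 - R1).

    % Plausibility of the conclusion given all known clues.
    conclusion_plausibility(R) :-
        findall(P, clue(_, P), Ps),
        combine(Ps, R).

    % ?- conclusion_plausibility(R).
    % R = 0.94: three moderate clues already come close to 1.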

Constructive knowledge technology

Knowledge technology is thus the technological support and implementation of epistemology as the basis of all scientifically justifiable processes for the acquisition, verification and use of knowledge. Its central problem has always been verification: the Socratic dichotomy, i.e. the proven fact that a person fundamentally cannot recognize reality, was the core problem for more than two millennia. Epistemology strove to find methods and criteria for whether and how realities could be approximated to reality and how success could be proven, measured or estimated. The famous Faust monologue ("... and see that we can know nothing ...") is the best evidence of the frustration of this endeavor. Because of this lack of success, epistemology remained a fairly theoretical and inconsequential branch of philosophy.

That changed with constructivism, whose leading representatives in the German-speaking area are Rupert Lay, Paul Lorenzen and Paul Watzlawick. The idea is to give up the futile attempt to approximate realities optimally to reality. Instead, insights should serve to construct suitable and useful realities, hence the name. "Suitable" is a technical-economic criterion and "useful" a sociological-ecological one: the results should be both efficient and socially optimal. This approach enabled a far more rational, result-oriented construction of realities and their implementation in technical processes and products.

Constructive management and documentation of technical projects

In contrast to traditional epistemology, this new approach proved surprisingly fruitful and useful, especially in technical and economic practice. The conception and, above all, the long-term documentation of projects of all types and sizes, which for lack of a solid theoretical basis had often begun as a "wish list" and ended in a chaos of unfulfilled and uncheckable conditions, became controllable and maintainable on the basis of condition lists and their systematic transformation, supplemented by a wide variety of records of project progress: every change in conditions, for example due to a change in the law or new requirements, can easily be located in the documentation and, where necessary, taken into account by appropriately adapting the affected parts of the project.

The documentation is understood as a "protocol of the knowledge processes" that led to the implementation decisions and developments. From the condition catalogues and the trees derived from them, together with other constructs such as a hierarchical user model, a semantic network emerges as a superstructure over the essentially unchanged, familiar project documentation. This structure makes it possible to conceive and implement tools for administration, research, problem identification and version management that were previously not feasible.
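
A minimal sketch, with invented predicate and document names, of the kind of research tool this superstructure makes possible: conditions are linked to the decisions and project parts they justify, and a change query walks these links transitively.

    % Semantic network: conditions justify design decisions,
    % decisions are documented in project parts.
    justifies(cond_liquidity,        decision_cash_reserve).
    justifies(cond_liquidity,        decision_credit_line).
    justifies(decision_cash_reserve, module_treasury).

    % A change to X affects everything reachable from it.
    affected(X, Y) :- justifies(X, Y).
    affected(X, Z) :- justifies(X, Y), affected(Y, Z).

    % ?- affected(cond_liquidity, Part).
    % enumerates every project part that must be re-examined.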

This new object model and its practical application in the conception, management and documentation of the most varied plans and projects make constructive knowledge technology the most successful and useful application of knowledge technology to date.

Literature

  • Lay, Rupert: Fundamentals of a Complex Philosophy of Science, Vol. 1: Fundamentals and Logic of Science. Freiburg im Breisgau: Josef Knecht Verlag, 1971.
  • Lay, Rupert: Communication for Managers. Düsseldorf: ECON Taschenbuch-Verlag, 1991. ISBN 978-3-612-21137-8.
  • Lay, Rupert: How to Treat Each Other Sensibly. Düsseldorf: ECON Taschenbuch-Verlag, 1992. ISBN 978-3-430-15935-7.

References

  1. Schnupp, Peter; Leibrandt, Ute: Expert Systems - Not Just for Computer Scientists. Berlin: Springer, 1986.
  2. Geiger, Daniel: Knowledge and Narration - The Core of Knowledge Management. Berlin: Schmidt, 2005, p. 10. ISBN 978-3-503-09085-3; and Mentsch, Dietrich: Text and Image Optimization. Theoretical prerequisites for the practical optimization of print and AV media: comprehensibility research and knowledge technology. In: Antos, Gert; Augst, Gerhard (eds.): Text Optimization. Making texts understandable as a linguistic, psychological and practical problem. Frankfurt am Main / Bern / New York / Paris, 1989, pp. 8-37.
  3. See Schnupp, P.; Nguyen Huu, C. T.: Expert Systems Internship. Heidelberg: Springer, 1987; and Thuy, N. H. C.; Schnupp, Peter: Knowledge Processing and Expert Systems. Oldenbourg, 1989.
  4. Karl Ernst Georges: Comprehensive Latin-German Concise Dictionary. Hannover, 1913 (reprint Darmstadt 1998), Volume 1, col. 1372.
  5. Lay, Rupert: Communication for Managers. Düsseldorf: ECON Taschenbuch-Verlag, 1991; Lay, Rupert: How to Treat Each Other Sensibly. Düsseldorf: ECON Taschenbuch-Verlag, 1992.
  6. Rupert Lay: Ethics for Managers, Methods of Successful Attack and Defense.
  7. http://www.projektwerkstatt.de/hoppetosse/hierarchNIE/reader/entscheid02.html
  8. http://www.slideshare.net/pscheir/ontologie-et-al-begriffsdefinUNGEN-im-kontext-wissensreprsentation
  9. http://www.fh-wedel.de/~si/seminare/ws04/Ausbildung/5.Prolog/LogPro5.htm
  10. http://www.uni-due.de/imperia/md/content/computerlinguistik/benutzermodellierung_fische.pdf
  11. http://plato.stanford.edu/entries/ockham
  12. Pólya, George: Mathematics and Plausible Reasoning. Princeton, NJ: Princeton University Press, 1968, 2 volumes.