Wednesday, July 28, 2010
Following the common definition of ontology by Gruber, perhaps more precisely worded by Guarino, ontologies are formalizations of concepts. Everybody's experience and ample experimental results show that concepts have a radial structure and are influenced by context. The concept of "pet" is different when the context is "running in the garden" (prototype dog or cat), "the name of my daughter's pet when she was 3 was Bella" (stuffed animal), or "Peter's pet is swimming in the pond" (goldfish).
What is the context implied when a concept is formalized in an ontology? Are we tacitly assuming that no context is affecting the concept? Gabora, Rosch and Aerts (2008) posit a "ground state" for a concept, the state the concept is in before any context operates on it; it is clearly different from person to person and more a fiction resulting from the experimental setting in Aerts and Gabora (2005) than an observable concept state.
Ontologies in information science are necessary to explain how a concept is used in an information system application. This application gives the context for the concepts. The ontology is appropriate if it formalizes the concepts as they are used in the application -- more should not be demanded.
A concept in one information system is in a different context than in another one. Understanding the relations between the same concept used in two different applications (i.e., in two different contexts) is difficult; this is the "interoperability problem". The methods described by Gabora, Rosch and Aerts (2008), applying a quantum mechanical framework, may be useful.
An ontology is right if it describes the concept as it is used in the application; an ontology can only be judged in the context of the application. Constructing ontologies outside of an application which fixes the context is not meaningful, and judging an ontology without considering the context is inappropriate; without a fixed context it is not defined what a "good ontology" or a "bad ontology" would be. -- I fear we too often try too hard to produce an Ontology describing reality, not our concepts of reality influenced by context.
Gabora, L.; Rosch, E. & Aerts, D. Toward an Ecological Theory of Concepts. Ecological Psychology, 2008, 20, 84-116.
Aerts, D. & Gabora, L. A Theory of Concepts and Their Combinations I: The Structure of the Sets of Contexts and Properties. Kybernetes, 2005, 34, 167-191.
Aerts, D. & Gabora, L. A Theory of Concepts and Their Combinations II: A Hilbert Space Representation. Kybernetes, 2005, 34, 192-221.
Tuesday, July 27, 2010
Ontologies describe concepts - concepts humans form in their heads of real things, physical or abstract, which exist outside of their heads. Concepts are more or less detailed, as the circumstances require; the real (outside) things are always fully determined: there is no dog without a definite fur color or gender - but a concept of a dog a friend talks about may remain of unspecified color or gender, as long as this is not relevant for the story he is telling.
We construct concepts in reaction to outside stimuli - something we see or a word we hear. Concepts are in a state. If a conversation continues, the concept becomes more specific; for example, the hair color of the dog may be mentioned.
Rosch has pointed out that concepts are not fixed sets of things but have a radial (graded) structure: some exemplars are better examples than others. Robins are "better birds" than penguins or ostriches. The structure of a radial category may change under the influence of a context: if the discussion focuses on Antarctica, penguins become "better" examples of birds.
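The reordering of exemplars under a context can be illustrated with a small sketch. The typicality numbers below are invented for illustration, and the context is modeled simply as a reweighting of exemplars; this is not a claim about Rosch's experimental data, only about the graded structure it shows.

```python
# Toy illustration of graded (radial) category structure: a context
# reweights exemplar typicality and can reorder "better" examples.

def rank(typicality):
    """Return exemplars ordered from best to worst example."""
    return sorted(typicality, key=typicality.get, reverse=True)

# Hypothetical baseline typicality of "bird" exemplars (invented numbers).
bird = {"robin": 0.9, "sparrow": 0.85, "penguin": 0.2, "ostrich": 0.15}

# The context "discussion focuses on Antarctica", modeled as weights that
# boost exemplars compatible with the context and suppress the rest.
antarctica = {"robin": 0.05, "sparrow": 0.05, "penguin": 1.0, "ostrich": 0.1}

in_context = {e: bird[e] * antarctica[e] for e in bird}

print(rank(bird))        # robin ranks first in the neutral context
print(rank(in_context))  # penguin ranks first once Antarctica is the context
```

The point of the sketch is only that the ranking is not a fixed property of the category: the same exemplars reorder as soon as a context operates on them.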
How to model concepts and the effects of contexts on them? Consider additional information as a piece of context which changes the state of the concept. Aerts and Gabora, in "A theory of concepts and their combinations" (2005), suggest that quantum mechanics provides a framework to model the state of concepts and the action of context on that state.
I have constructed concrete models of the concept "pet", following the experimental data they report in the paper. Starting with a weakly determined state of the concept "pet", where the pet can be a dog, a cat, or a goldfish, additional context, e.g. the statement "the pet is running in the garden", makes the state of the concept more determined: goldfish is excluded, dog or cat become more likely. The formal model also identifies absurd combinations of contexts, e.g. "the pet runs in the garden" and "the pet is a fish" - humans in conversation would at this point ask for clarification...
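The behavior described above can be sketched with a classical probability model: a concept state is a distribution over exemplars, and a context reweights and renormalizes it. This is not the Hilbert-space formalism of Aerts and Gabora, only the filtering behavior it is meant to capture; all numbers and weights below are invented.

```python
# Sketch of the "pet" example: a concept state as a probability
# distribution over exemplars, a context as a reweighting of that state.

def apply_context(state, weights):
    """Reweight a concept state by a context and renormalize.
    Returns None when the combination is absurd (total weight zero)."""
    new = {e: p * weights.get(e, 0.0) for e, p in state.items()}
    total = sum(new.values())
    if total == 0.0:
        return None  # no exemplar survives: ask for clarification
    return {e: p / total for e, p in new.items()}

# Weakly determined starting state of "pet" (hypothetical numbers).
pet = {"dog": 0.4, "cat": 0.4, "goldfish": 0.2}

# Context: "the pet is running in the garden" excludes the goldfish.
running = apply_context(pet, {"dog": 1.0, "cat": 1.0, "goldfish": 0.0})
print(running)  # goldfish drops to 0, dog and cat rise to 0.5 each

# Adding "the pet is a fish" on top yields an absurd combination:
print(apply_context(running, {"goldfish": 1.0}))  # None
```

Sequential application of contexts makes the state more and more determined, and a total weight of zero is exactly the moment where a human conversation partner would ask for clarification.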
What have I learned? The confusion between concepts and things in reality is difficult to avoid - we have a natural tendency to talk about the things but mean the concepts. Accepting that concepts have states makes the ontology enterprise much more difficult: when we give the ontology of a concept, which state are we describing?