Friday, December 02, 2005
Center of Excellence Proposal
The Taos Institute
(on the possibilities)
[bead thread on curriculum reform]
Dr. Baru,
We appreciate the response.
I have a role as lead architect for enterprise integration projects, in industry. We see our work as needing to be as informed as possible about leading-edge distributed data exchange.
The issue, as you are aware, is in having real architecture actually in place, so that some objective (social/cognitive/profit) analysis can be done on how the architecture is being used and on what its limitations are, as experienced. In the IC, this is generally discussed as "cognitive engineering", and is related to fields like evolutionary psychology, cultural anthropology, etc.
I am interested to know whether the simple version of your interesting architecture is being used by a community that is not mostly computer scientists, and if so, whether there are objective measures in place for making judgments about the correctness of design features.
You may know that there is a high-level discussion, in standards working groups, about the attachment of a logic, even OWL Lite, to a controlled vocabulary (when properly organized into a hierarchy). An alternative approach is to use Topic Maps to encode the controlled vocabulary and to use Topic Map process models to compute with elements of that vocabulary. But the attraction of RDF and the W3C standards continues, and they are required for most "government work" here in the States.
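To make the idea of "attaching a logic to a controlled vocabulary" concrete, here is a minimal sketch in Python. The vocabulary, its hierarchy, and the single attached inference rule (transitive subsumption over broader-than links) are my own illustrative assumptions, not any group's standard:

```python
# A controlled vocabulary organized into a hierarchy, with one "logic"
# attached: subsumption (transitive broader-than).  Terms and relations
# here are hypothetical, for illustration only.

broader = {
    "merlot": "red_wine",
    "red_wine": "wine",
    "wine": "beverage",
}

def subsumes(general, specific):
    """True if `general` is an ancestor of `specific` in the hierarchy."""
    term = specific
    while term in broader:
        term = broader[term]
        if term == general:
            return True
    return False

print(subsumes("wine", "merlot"))      # True: inferred transitively
print(subsumes("merlot", "beverage"))  # False: subsumption is directional
```

The point of contention in the working groups is not this mechanism itself but whether the inference rules should live inside the vocabulary (as with OWL-annotated hierarchies) or be kept in a separate process model (as with Topic Maps).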
I am interested in your review of CoreTalk, a large-scale data exchange concept based on standardizing around iconic elements that express specific behaviors and that have a minimal binary exchange structure (like EDI). This structure is designed using something that I think of as "string theory", where views of social/business reality are reflected in the design, both of the data exchange and of the behavioral aspect of the icon. Interchange (interoperability) is achieved by a virtual engine, which exists.
A large Macromedia presentation can be downloaded at:
http://www.bcngroup.org/beadgames/safeNet/one.htm
Klausner's work is somewhat related to the Mark 3 knowledge processor from Knowledge Foundations Inc. Dr. Richard Ballard extended Shannon information theory (in the 1970s) by generalizing from the inversion of matrix operations... and established that "information" is a conserved phenomenon, having specific degrees of freedom, which is captured as a "simple" n-ary, < r, a(1), a(2), . . . , a(n) >, where n is dependent on the time and on the set of informational constraints (discovered from a question-answer process).
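On my reading, the n-ary unit < r, a(1), ..., a(n) > could be sketched as an ordinary tuple pairing a relation symbol r with its n arguments. The representation, relation name, and atoms below are my own hypothetical illustration, not Ballard's implementation:

```python
from collections import namedtuple

# Sketch of the n-ary unit < r, a(1), ..., a(n) >: a relation symbol r
# plus an argument tuple whose length n depends on the informational
# constraints in force at the time.  All names here are hypothetical.
NAry = namedtuple("NAry", ["r", "args"])

def make_nary(relation, *atoms):
    return NAry(relation, tuple(atoms))

fact = make_nary("located_in", "event_17", "building_A", "2005-12-02")
print(fact.r)          # located_in
print(len(fact.args))  # n = 3 under this constraint set
```

The essential feature is that n is not fixed in the schema; it varies with the constraints discovered through the question-answer process.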
If the informational constraints are defined using a small set (18) of semantic primitives, then one starts to have an extension of John Sowa's semantic primitives, of work done in the former Soviet Union (as applied semiotics), and of the periodic-table-type primitives of Tom Adi.
(I am not sure why I got into this description of technology, but I have to mention five other groups, to be complete:
1) SchemaLogic technology developed by Breana Anderson
2) The conceptual roll-up technology, NdCore, from Applied Technical Systems
3) The conceptual roll-up technology, Readware
4) Polylogics
5) Ultrastructure (Jeff Long)
as well as my own work on differential and formative ontology.)
None of these technologies uses RDF/OWL or Topic Maps - but there could be translations... as long as the logic was kept separate.
In fact, the separation of function (behavior) from structure (data encoding) is difficult as long as the logic is attached to the controlled vocabulary. Gerald Edelman (Nobel Prize in 1972, for immunology) and others talk about this as "degeneracy" and suggest that this degeneracy is necessary in the production of interpretations of language/reality.
I see no movement at the W3C to recognize this (biological fact?), but I do see the capability of producing and using degeneracy in practice in these "non-standards-based" systems and in the OASIS working groups.
We would be interested in your group's comments, for the record.
Dr Paul S Prueitt