
 

Sunday, December 04, 2005

 

 The BCNGroup Beadgames

National Project →

Challenge Problem →

 Center of Excellence Proposal →

[bead thread on curriculum reform]

 

 

The Taos Institute

(on the possibilities)

 

 

John, Paul (Werbos) and Dick,  (continued from [264])

 

Notes made on the back of Sowa’s two papers referenced at [263], during breakfast at my regular breakfast place in Taos:

 

<picture to be inserted>

 

My work at University of Texas at Arlington is discussed at [264]. 

 

At the core of the difference between discrete and continuous is the notion of “next-to”.  A postulate is required to create the real numbers from the set of axioms (the Peano axioms) used to produce the counting numbers.  The history of mathematics places this postulate at the center of the development of real number analysis.  See Edward Huntington’s “The Continuum and Other Types of Serial Order”, Dover Publications, 1917.  Huntington’s book is small (80 pages) and accessible to any interested reader.  Dedekind’s postulate is that, given two distinct real numbers, there is always a third real number (distinct from the other two) that is “in-between”.
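Stated symbolically, as a minimal sketch in LaTeX notation of only the “in-between” property just described (not Huntington’s or Dedekind’s full axiom systems):

\forall\, a, b \in \mathbb{R}, \quad a < b \;\Longrightarrow\; \exists\, c \in \mathbb{R} \;\text{ such that }\; a < c < b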

 

The development of modern real analysis does in fact stem from the Dedekind postulate and from the work on axiomatics by Cantor.  Alex Zenkin, as Paul Werbos pointed out, questions Cantor on the basis of specific errors.  Dedekind’s postulate can be questioned based on a similar analysis of what one allows as an “induction” in developing the foundations of real number theory.  These limitations on real analysis might not have been ignored during the entirety of the twentieth century, had it not been for the seduction of computing with electric machines.

 

These are not esoteric discussions with no bearing on everyday reality.  Our day-to-day reality, in the United States at least, is strongly governed by computer science; and the limitations of computer science, as discussed by Gödel and others, strongly impact our social reality.  The greatest currently known threats to our social reality are

 

1)       biological, in the form of the easily predicted flu pandemic

2)       the potential of an electromagnetic pulse from a thermonuclear blast that could fry the entire on-line information infrastructure

 

The limitations of formalism must be understood if we are to build resilient information systems that are able to come back on line easily after an attack of the kind described in threat #2.   Threat #1 could be addressed if our science develops an appreciation of the differences between natural (biological) systems and computing architectures.  (see: Prueitt)

 

Freshman college students could understand the difference, if a new curriculum were presented in the fashion I have recommended at LiberalArtsCore.

 

The basic defining notion from real analysis (mathematical topology) is “next to”. 

 

In the language of the Orbs (ontology referential bases), the assertion that two referents “a” and “b” are next to each other is stated using the “syntagmatic” triple

 

<  a, r, b >

 

The n-ary that Ballard uses at the core of the Mark 3 is asserted as:

 

< r, a(1), a(2), . . . , a(n) >

 

where n is dependent on a full enumeration of the degrees of informational freedom.
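As an illustration only, here is a minimal sketch in Python of how the syntagmatic triple and the n-ary might be represented as plain tuples.  The names are hypothetical; this is not the Orb or Mark 3 implementation, just a way to make the two forms concrete.

from typing import Tuple

# A syntagmatic triple < a, r, b >: two referents and a relation symbol.
SyntagmaticUnit = Tuple[str, str, str]   # (a, r, b)

def triple(a: str, r: str, b: str) -> SyntagmaticUnit:
    """Assert that referent a stands in relation r to referent b."""
    return (a, r, b)

# The n-ary form < r, a(1), a(2), ..., a(n) >: one relation over n referents,
# where n depends on the enumeration of degrees of informational freedom.
def n_ary(r: str, *referents: str) -> Tuple[str, ...]:
    """Assert relation r over an ordered tuple of n referents."""
    return (r,) + referents

t = triple("a", "next-to", "b")          # ('a', 'next-to', 'b')
u = n_ary("r", "a1", "a2", "a3")         # ('r', 'a1', 'a2', 'a3')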

 

The relationship “next to” is weaker than the relationship “similar to” or the relationship “co-occurs with”. 

 

Lev Goldfarb should be mentioned here.  His work points out, in a very scholarly fashion, that the category of metrics found in modern real analysis is based on the induction of the integers.  This induction is not the only one that could provide the formal basis for scientific thought. 

 

My conclusions are drawn from decades of living with these types of perplexing issues. 

 

In developing the Orb standard, one elevates the weakest (categorical-abstract) relationship possible and uses it as the default meaning of the “r” in a syntagmatic unit,

 

< a, r, b >

 

This weakest categorical-abstract relationship can be used without either

 

1)       semantics (i.e., the imposition of “meaning” on data or symbols) or

2)       logic

 

Encoding of localized data from measurements of “whatever” can be done using a key-less hash table, as discussed in the Orb Notational Paper.
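The Orb Notational Paper is the authority on the encoding; what follows is only a rough Python sketch, with hypothetical names, of what a key-less scheme might look like: the syntagmatic unit itself is hashed to an address, so no external key, no semantics, and no logic are imposed on the data, and the weakest categorical-abstract relationship serves as the default value of r.

import hashlib

DEFAULT_R = "next-to"   # weakest categorical-abstract relationship as default

class KeylessStore:
    """Sketch of a key-less hash table: the unit itself is hashed to a
    bucket address, so no separate key is imposed on the data."""

    def __init__(self, n_buckets: int = 1024):
        self.buckets = [[] for _ in range(n_buckets)]

    def _address(self, unit) -> int:
        digest = hashlib.md5(repr(unit).encode("utf-8")).hexdigest()
        return int(digest, 16) % len(self.buckets)

    def encode(self, a, b, r=DEFAULT_R):
        unit = (a, r, b)
        bucket = self.buckets[self._address(unit)]
        if unit not in bucket:       # raw co-location only; no interpretation
            bucket.append(unit)
        return unit

    def located_with(self, unit):
        """Return every unit that hashed to the same address."""
        return list(self.buckets[self._address(unit)])

store = KeylessStore()
store.encode("measurement-17", "measurement-18")   # default r = "next-to"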

 

Patterns of these relationships are then stood up as a semiotic system, requiring human reification and interpretation.  I define “ontological reification” to mean “validation of categories of meaning associated or assigned to symbols or data patterns”.

 

At that point it is possible to attach semantic structure and in some cases create EDI (electronic data interchange) standards. 

 

The focused criticism I have made, over the past decade, regarding the W3C semantic web standards processes is that preliminary and real issues related to abstract formalization (mathematics and logic), human communication, and the standardization processes themselves have been ignored while IT consultants trade on the seduction of proposals.  These proposals are marketed as able to solve the problem of mediating human discourse while also solving the data interoperability debacle.  (See the extended discussion starting at [188].)

 

Various commercial systems exist that address the semantic extraction problem.  But the problem is ill posed, because meaning cannot be completely extracted using algorithms.  The “semantic web” standards compound this ill-posed status by attempting to create “upper ontologies” and precise definitions of the meaning of concepts. 

 

Those who really understand the semantic web standards, like James Hendler (at the University of Maryland), could help clarify that Semantic Web technology aims to standardize “syntax” and data structures only.  But there is much less glamour in doing this, when compared with the seduction of defining meaning, and so egotism is selected for.  Professor Hendler has on more than one occasion blackballed proposals written by the BCNGroup, because the alternative that we propose would end support for semantic web standards.  When he does this, he is really manifesting the control that the community of academic computer scientists has over the advance (or hindrance) of computer-based mediation of human communication. 

 

I point out, again, that various commercial systems exist that address the semantic extraction problem.  However, the business claims made in selling these systems are far too strong.  Autonomy Inc’s development and marketing (1996 to the present) of its push-pull technology is one of the best examples where a great deal was promised and much less was delivered.  Dr. Michael Lynch, founder of Autonomy, had a deep and full understanding of the set of problems that he was being supported to address.  But the execution of the design and development was hijacked by the same communities that have been creating W3C standards (one right after another, ad infinitum).  The mistakes are invariant, and yet the lessons, up to now, have not been learned by the agencies or by industry. 

 

Semantic extraction technology, as defined by this marketplace, creates a type of “machine induction”.   To be more precise, a certain human community creates software.  In this community all share a set of misunderstandings about the nature of logic and human intelligence.  It is argued, based on grounded scientific arguments, that these software systems could never be “correct” from the point of view of society.  However, this same software community defines “correct” to be “creates an output”, and thus hijacks the “meaning” of correct.  For example, software systems are incompatible and non-interoperable by design.  

 

Human induction is a far different phenomenon.  The grounding that an informed human has is extensive and not entirely represented by modern science, even now in 2005.  The physical phenomenon involves a perception of structure and the assignment of meaning.  But human awareness also has a root in the present moment – what I call “situational pragmatics”. 

 

One aspect of this physical grounding in the present moment must be the quantum-chemical interface in the physical brain.  I discuss this interface in my chapter four:

 

Grounding The Tri-level Architecture in Neuropsychology and Open Logic

 

I made some additional comments in pencil this morning on my understanding of John Sowa’s use of fiber bundles, category theory, and spreading activation.  I will write these notes up later. 

 

 

Paul Prueitt

Taos Institute