Saturday, August 21, 2004
Minimal architecture for the Anticipatory Web

More complete architecture for the Anticipatory Web

Some of the formal constructions needed by the Anticipatory Web
The Hilbert encoding that is patented by PriMentia works with three concepts.
1) Hilbert Encoding Concept 1: The concept of an ordered set of numbers, which is represented in geometry as a “discrete” line
2) Hilbert Encoding Concept 2: The mapping of that set of numbers to physical memory so that
a. The mapping makes a one-to-one correspondence between each number in the set and a data structure called a hash container.
b. The containers are of equal size, so one can go immediately to the leading memory register of the m-th container simply by knowing where the memory map starts and how large each container is.
c. The containers have a front and a back, with the front containing the number and the back containing space for additional information, including pointers to other registers. (see figure below)
3) Hilbert Encoding Concept 3: The set membership question can be solved in fewer than n steps, where 2^n is larger than the size of the set of numbers. If the memory mapping is done properly, and the executions are minimally coded, then set membership questions can be resolved in n fetch-plus-compare steps.
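The three concepts can be sketched in a few lines of code. This is a minimal illustration of the fixed-size-container idea, not the PriMentia implementation: the container widths, class name, and byte layout here are all assumptions made for the sake of the sketch.

```python
# Illustrative sketch of the key-less hash container map described above.
# The widths (FRONT, BACK) are assumptions, not taken from the patent.
FRONT = 8                      # bytes at the front, holding the number itself
BACK = 24                      # bytes at the back, for pointers / extra data
CONTAINER_SIZE = FRONT + BACK

class ContainerMap:
    """Equal-size containers laid out contiguously: the m-th container starts
    at m * CONTAINER_SIZE, so locating it needs one multiply, no key lookup.
    (Numbers are assumed >= 1 so that an empty, all-zero front never matches.)"""
    def __init__(self, universe_size):
        self.memory = bytearray(universe_size * CONTAINER_SIZE)

    def offset(self, m):
        # leading memory register of the m-th container
        return m * CONTAINER_SIZE

    def store(self, m, extra=b""):
        start = self.offset(m)
        self.memory[start:start + FRONT] = m.to_bytes(FRONT, "little")
        self.memory[start + FRONT:start + FRONT + len(extra)] = extra

    def is_member(self, m):
        # the set membership question: one fetch plus one compare
        start = self.offset(m)
        return self.memory[start:start + FRONT] == m.to_bytes(FRONT, "little")

cmap = ContainerMap(1024)
cmap.store(42, extra=b"ptr")
print(cmap.is_member(42))   # True
print(cmap.is_member(43))   # False
```

The point of the layout is that membership never requires hashing or search: the number is its own address, which is what makes the structure "key-less."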
→ A Systematic Review of all Software Patents ←
simplest data structures for Orb key-less hash
more complicated data structures for the Orb key-less hash
The generalization of the Hilbert encoding will make use of the argument made by Goldfarb and (perhaps Ballard [1]) that the numeric model is not appropriate for information representation. What is needed is the encoding of arbitrary graph constructions, where the notion of “next to” often cannot be resolved in a unique fashion.
The question is proposed as a special project for one of the agencies. Readware and Ontologystream would like to include four or five scholars in this special project.
This non-uniqueness of path routing decisions is a halting condition for serial computer processors and thus has to be constrained by something additional. We constrain that “something additional” to the set convolution kernel.
In very general and simplified terms, a “convolution” is a passage over each element in a set. In classical mathematical integration of a function, a convolution over the function’s domain has the function acting as the kernel. Each element of the domain is passed over, and a multiplication is performed that takes the height of the function’s range element and multiplies it by the infinitesimal. Because this is done over a continuum, the very small products are aggregated into a real number.
For those with a college calculus course in your background, this will trigger some fond memories. The set convolutions required by Orb arithmetic must solve the set membership question at least once for each element of the set, and in some cases might need to solve the set membership problem for each element of the power set over the set of elements.
If the set of atomic elements is small, then this can be done in the Hilbert encoding because of Hilbert Encoding Concept 3 (stated above).
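The shape of such a set convolution can be sketched as follows. Here a plain Python set stands in for the Hilbert-encoded container map, and the kernel is a made-up example; only the control structure (one membership question per element, and the power-set variant) comes from the text above.

```python
from itertools import chain, combinations

def powerset(s):
    """All subsets of s, smallest first."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def set_convolution(elements, reference_set, kernel):
    # A passage over each element; the kernel is handed the answer to one
    # set membership question per element.
    return [kernel(e, e in reference_set) for e in elements]

S = {1, 3, 5}
print(set_convolution(range(6), S, lambda e, member: (e, member)))

# The harder case: a membership question for each element of the power set.
subsets_present = [tuple(ss) for ss in powerset({1, 2, 3}) if set(ss) <= S]
print(subsets_present)
```

Because the power set has 2^n elements, the second form is where the n fetch-plus-compare cost of each individual membership question starts to matter.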
Traversal problem
A complete path traversal is needed on most of the Orb transformations. This path traversal is in fact a form of set convolution, and can have a non-trivial kernel.
In the above figure we illustrate the path traversal problem. In the CCM patent, for example, the path traversal problem is eliminated by using only tree structures in which each node has a single parent. This is effectively the reduction to a 2-ary structure that Ballard has rightfully objected to. At any node there is only one node connected “above,” and thus no halting problem occurs in the traversal “up” trees in the NdCore technology from Applied Technical Systems Inc.
The Orbs are in fact rendered into the more general graph structure pictured above. So the graph traversal task is different.
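The difference can be made concrete with a small sketch. In a general graph a node may have several neighbors "above" it, and cycles are possible, so a naive recursive walk need not halt; the usual constraint is a visited set, which is itself just the set membership question asked once per node. The graph and kernel below are invented for illustration.

```python
# Sketch of graph traversal in the general (non-tree) case. Without the
# visited set, the cycle a -> c -> a would make this loop forever.
def traverse(graph, start, kernel):
    visited, order = set(), []
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:            # the membership question, again
            continue
        visited.add(node)
        order.append(kernel(node))     # apply the kernel to each element once
        stack.extend(graph.get(node, ()))
    return order

g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}   # contains a cycle
print(traverse(g, "a", str.upper))
```

In a single-parent tree the visited check is unnecessary, which is exactly why the CCM-style restriction avoids the halting issue at the cost of expressiveness.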
The decomposition of the Orbs into ordered triples < a, r, b > allows the encoding process to render an Orb construction either as a hash table, which is not optimal, or as a key-less hash table, which is optimal as a solution to the set membership problem.
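One way to see how the triple decomposition connects back to the key-less hash is to pack each < a, r, b > triple into a single integer, so the direct-index membership test applies to triples as well. The packing widths and the use of a Python set as a stand-in for the container map are assumptions of this sketch, not part of the Orb design.

```python
# Sketch: pack an ordered triple <a, r, b> into one integer so triple
# membership reduces to the number membership question. The 8-bit field
# widths are an assumption for illustration.
def pack(a, r, b, width=8):
    return (a << (2 * width)) | (r << width) | b

triples = [(1, 2, 3), (3, 2, 1)]
encoded = {pack(*t) for t in triples}      # stand-in for the container map

def has_triple(a, r, b):
    return pack(a, r, b) in encoded        # one membership question

print(has_triple(1, 2, 3))   # True
print(has_triple(2, 2, 3))   # False
```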
Notice that stratified theory suggests that local encoding of information is a different process than global encoding, so that information can be manipulated locally or globally.
Set of n-aries
As discussed by Ken in a previous bead, the measurement of substructural invariance creates categorical invariances, the cA atoms. These invariances are then extracted from the data flow via a measurement in the context of simple or more complicated rules (again we mention Clear Forest and Text Analysis International Corporation).
The result is both a periodic table of cA atoms and a set of known compounds whose function can be estimated using qualitative structure activity analysis.
This estimation is best performed using both machine-computed data mining and human cognitive priming. The computer can produce poor results and mislead the human, or the human can be unaware of how to interpret what would otherwise be clear information developed purely by algorithmic means.
So one has to be mature about this kind of technology: responsibility has to be assumed for the interpretation, and the algorithms should be transparent and agile so that humans can understand what the computer is actually doing.
[1] I am not certain how Ballard regards the discussions by Goldfarb on the limitations of the numeric model. Ballard’s position is that language has not been particularly effective in making true knowledge transparent to humans, and I certainly agree. Comments by Ballard regarding the limitations of mathematics indicate that Goldfarb, Prueitt and Ballard have commonality in how we see the success of Hilbert mathematics.