Saturday, August 21, 2004
Foundational work in Knowledge Science
This note has three parts.
1) Anticipatory Web technology: a continuation of architectural information about building Anticipatory Web technology from the previous bead [83]
2) House hearing panel: a discussion of planning for an upcoming House hearing panel on information technology evaluation and procurement by the Intelligence Community
3) National Project: continuing discussions about the planning for a National Project to establish a science of knowledge systems
A map of point-to-point Internet transactions during one day
A Systematic Review of all Software Patents
Ontology referential base (Orb) constructions are being developed to store “localized”, or “schema independent”, information in computer memory. The local information is captured by mining processes, or developed directly by a human or human community. The generic form of these encodings is an n-tuple of the form:
< r, a(1), a(2), ..., a(n) >
where r is a relational variable and the set of nodes { a(1), a(2), ..., a(n) } are references to topics. (See the Topic Map standard.) In the current In-memory Ontology referential base (InOrb) technology the n-tuples are expressed as elementary triples having the form:
< a, r, b >
where a and b are locations in a space of structures and r is a non-specific variable. (See the Orb notational paper.) InOrb constructions are viewed with a freeware browser called the SLIP browser (Windows only – with apologies).
SLIP browser over three-letter semantic ontology
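As a sketch only (the actual InOrb implementation is not described in this note), the triple encoding above could be held in memory along the following lines; the class, method, and relation names are hypothetical, and Python is used purely for illustration.

    from collections import defaultdict

    class InMemoryOrb:
        # Minimal sketch of an in-memory Orb: a bag of <a, r, b> elements,
        # where a and b reference topics and r is a non-specific variable.
        def __init__(self):
            self.triples = set()             # all <a, r, b> elements
            self.by_node = defaultdict(set)  # node -> the elements that mention it

        def add(self, a, r, b):
            t = (a, r, b)
            self.triples.add(t)
            self.by_node[a].add(t)
            self.by_node[b].add(t)

        def about(self, node):
            # All <a, r, b> elements in which the node appears.
            return self.by_node[node]

    # One possible (not prescribed) way to fold the n-tuple < r, a(1), ..., a(n) >
    # into elementary triples: relate r to each of its topic references.
    orb = InMemoryOrb()
    for node in ("a1", "a2", "a3"):
        orb.add("r", "refers_to", node)
    print(orb.about("a2"))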
The local information is made “global” via a class of mathematically precise transforms called “convolutions”. A convolution passes very quickly over each element in a set of Orb constructions, and rules/procedures are performed when appropriate. The Hilbert encoding is used to functionally instrument any element of a large class of convolutions over a set of Orb constructions.
Any category of data mining technique can be composed using only in-memory Orb convolutions, so the Orb constructions provide a universal data encoding for any and all data mining techniques.
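As an illustration of these two points, and not as the InOrb implementation itself, the following Python sketch shows one convolution pass: the standard Hilbert-curve index (the xy2d algorithm) orders the pass over a set of triples, and a deliberately trivial rule, counting co-occurring node pairs, stands in for whatever rule or procedure a particular mining technique would apply.

    from collections import Counter

    def hilbert_index(n, x, y):
        # Map a point (x, y) on an n-by-n grid (n a power of two) to its
        # distance along the Hilbert curve (standard xy2d algorithm).
        d = 0
        s = n // 2
        while s > 0:
            rx = 1 if (x & s) > 0 else 0
            ry = 1 if (y & s) > 0 else 0
            d += s * s * ((3 * rx) ^ ry)
            if ry == 0:                      # rotate/flip the quadrant
                if rx == 1:
                    x, y = n - 1 - x, n - 1 - y
                x, y = y, x
            s //= 2
        return d

    def convolve(orb_triples, rule, n=256):
        # One convolution pass: visit each <a, r, b> element in Hilbert-key
        # order and apply the rule; the accumulator holds whatever the rule builds.
        keyed = sorted(orb_triples,
                       key=lambda t: hilbert_index(n, hash(t[0]) % n, hash(t[2]) % n))
        acc = Counter()
        for a, r, b in keyed:
            rule(acc, a, r, b)
        return acc

    # A trivial stand-in for a mining rule: count node pairs that co-occur.
    def co_occurrence(acc, a, r, b):
        acc[(a, b)] += 1

    triples = [("cat", "is_a", "animal"), ("dog", "is_a", "animal"),
               ("cat", "is_a", "animal")]
    print(convolve(triples, co_occurrence))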
Many currently deployed data mining techniques are almost unusable because of proprietary restrictions imposed by the vendor.
Discussions will be held with House committee staff about the BCNGroup’s role in planning one or two panels in support of the investigations into transforming the Intelligence Community (IC).
Our testimony will address three aspects of the problem of improper evaluation of information technology procurements.
1) The past practices, where consulting groups made decisions based on turf protection and/or narrowly defined economic gains to associates
2) The present gather-analyze-use potential of anticipatory web information technologies and the Human-centric Information Production (HIP) paradigm
3) A future model: an informationally transparent Safe Net concept to be used by educators, individual citizens, governments, and businesses at very low or zero cost
More will be placed into this bead as information is made available about the planning processes for the hearings. (last update: 8/22/2004 10:07 AM)
For distribution: a folder with three short brochures on Capitalization, Public Precision, and the Knowledge Sharing Foundation.
(These brochures are printed front and back on a single sheet of paper and then folded to produce an 8 by 3 2/3 inch brochure.)
(The Knowledge Sharing Foundation brochure was added 8/23/2004 10:52 AM.)
Example of testimony to be given to the Congress when asked
The advent of extensive measurement and monitoring of the social discourse is already here, practiced by large corporations hoping to reduce advertising costs by developing anticipatory profiles of the types of customers interested, or potentially interested, in their products. However, these corporations do not yet know how to use systems that are only partially developed within the intelligence community, and as a consequence public monitoring of social discourse is used primarily to identify individuals who are speaking poorly of the corporation or of one of its products. (See the bead thread on the use of intelligence technology to measure social discourse.)
The development and use of Anticipatory Web technology has been governed, in a context of groupthink, by an informal community of MBAs whose comprehension of anticipatory web theory is low and whose desire for short-term profits predictably produces predatory behaviors in the current consulting / think tank environments [71], [72], [73], [74].
The value of anticipatory web information, from real-time examination of the global social discourse to business-to-business and business-to-customer transactions, is in the trillions of dollars. Many positive things may happen as anticipatory technology is better understood and deployed.
One architecture for measuring and reporting the thematic structure of social discourse
To work well, anticipatory web technology has to be fully developed, used properly, and understood by everyone who is touched by the information.
When anticipatory web technology is not fully developed, is used improperly (as it is being used now), and is not understood, then the only way to justify major investment by venture capitalists, including the CIA’s venture capital arm In-Q-Tel, is to use the technology as a big brother watching the social discourse for people who disagree politically or who are saying bad things about products or companies. (See Intelliseek’s marketing materials on their web site.)
The National Project to establish a science of knowledge systems has four components:
1) The development of a K-12 curriculum for the knowledge sciences
2) The use of virtual private networks and multiple user domains to provide a Safe Net Anticipatory Commerce infrastructure
3) The development of Knowledge Sharing Foundation core environments where Human-centric Information Production capabilities are tested and fielded using a super-distribution principle related to use metrics and use based fees
4) The creation of knowledge science academic departments in community colleges and universities
Memetic monitoring technology has many potential abuses, which are not covered by law and which must be addressed in the near future by legislation. However, one benefit of measuring memetic patterns is that the Safe Net can be made free of porn, deceptive advertising, and identity spoofing. The way in which this is done has to be absolutely transparent to anyone who wishes to inquire.
Privacy issues are addressed by not monitoring messages that are explicitly marked as private within a group, provided that every member of the group generating or receiving the message has indicated an awareness of all others in the conversation.
Deep packet inspection of all communications not deemed private is performed using categorical abstractions and a stratified ontology.
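As a sketch only, under assumptions not made in this note (that each message carries an explicit private flag and a record of which group members have acknowledged one another), the privacy rule above might be expressed as follows in Python; all names here are hypothetical, and the ontology-based classification is only a placeholder.

    from dataclasses import dataclass, field

    @dataclass
    class Message:
        sender: str
        group: set                     # everyone generating or receiving the message
        marked_private: bool = False
        acknowledged: dict = field(default_factory=dict)  # member -> members they know are present
        payload: str = ""

    def exempt_from_monitoring(msg: Message) -> bool:
        # Exempt only if explicitly marked private AND every member of the group
        # has indicated awareness of all other members of the conversation.
        if not msg.marked_private:
            return False
        return all(msg.group - {m} <= msg.acknowledged.get(m, set())
                   for m in msg.group)

    def inspect(msg: Message):
        # Inspect only non-private traffic; a real system would classify the payload
        # against categorical abstractions and a stratified ontology (not shown).
        if exempt_from_monitoring(msg):
            return None
        return {"themes": sorted(set(msg.payload.lower().split()))}  # crude stand-in

    msg = Message(sender="alice", group={"alice", "bob"}, marked_private=True,
                  acknowledged={"alice": {"bob"}, "bob": {"alice"}},
                  payload="lunch plans")
    print(inspect(msg))   # None: a fully acknowledged private message is not monitored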