Tuesday, August 24, 2004
National Science Foundation Cutbacks
. . . it is highly unlikely that the NSF will have sufficient funding to support innovative research.
All patriotic Americans agree that sending more billions to Halliburton is far more important than mere science. Only troublemakers would disagree.
John Sowa, August 24, 2004
Commentary by Paul Prueitt
I respectfully claim that the situation is more complex than this, John.
I encourage others to join a principled discussion of what this situation is and what its causes are. Communications on this can be anonymous or authored and signed, and will be posted into this bead thread. Send communications to
For more information on recent proposals to cut NSF’s budget, see:
http://www.cra.org/govaffairs/
For information on additional computer science community opposition to cuts in Total Information Awareness (TIA) programs at DARPA, see:
http://www.cra.org/govaffairs/content.php?cid=19
The BCNGroup’s planning for a National Project establishing a new academic discipline would increase funding in areas where many in the computer science community would find opportunities. Many computer scientists long for a change in the culture within which they must work.
However, the maintenance of a professional community with taxpayer money, for the purpose of maintaining that community, also has to be discussed. Computer science receives perhaps three to six times the funding that mathematics and formal systems receive. Can this be justified on objective criteria?
Has academic computer science become a social welfare state, whose right to high salaries goes unquestioned even while other programs, like properly teaching the freshman mathematics classes, go unnoticed?
I believe that there was $1.3 billion in direct funding of computer science departments last year, but this may be an incomplete number.
1) Has this professional community become too powerful politically?
2) Has this community failed to find a way to simplify computer science so that our society has a decreasing need for programmers?
Cyber security is the primary, though not the only, illustration of the situation. The security problems come from the software industry’s products. The mainstream software community’s response is to create new generations of partial solutions, rather than to move to Open Source code, where the vulnerabilities are radically reduced. The Open Source community flounders because of competitive pressures from the consulting community, whose jobs depend on the products sold today, and who care about neither the long-term nor the systemic problems caused by this cycle of economic reinforcement between the two groups.
The fact that a high percentage of national “secret information” is purely proprietary should raise some eyebrows, but so far this has not happened.
Using Open Source code reduces cyber security problems because it makes effective deep packet inspection possible. Within Open Source development methodologies, most code can be inspected, and thus the bit stream between applications becomes fully understood in most cases. But the trade-off falls on the software industry as it is currently organized around profit making. Experience in the Open Source movement suggests that the means to make a living should come not from writing original code but from using code to do things that are not measured in lines of code produced. A moment’s reflection may be needed to see that the value is obtained indirectly by using software, rather than by creating it (once again) from the beginning.
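To make the deep-packet-inspection point concrete, here is a minimal sketch, not part of the original argument, of how open protocol specifications let anyone decode the bit stream on the wire using nothing but a free language’s standard library. It assumes a Linux host, root privileges, and the published IPv4 header layout (RFC 791); the function names are ours.

    import socket
    import struct

    def parse_ipv4_header(packet: bytes) -> dict:
        # Decode the fixed 20-byte IPv4 header; every field offset is public.
        (version_ihl, tos, total_len, ident, flags_frag,
         ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
        return {
            "version": version_ihl >> 4,
            "header_bytes": (version_ihl & 0x0F) * 4,
            "total_len": total_len,
            "ttl": ttl,
            "protocol": proto,                # 6 = TCP, 17 = UDP
            "src": socket.inet_ntoa(src),
            "dst": socket.inet_ntoa(dst),
        }

    def sniff(count: int = 5) -> None:
        # AF_PACKET (Linux only) delivers whole link-layer frames;
        # 0x0800 selects IPv4 traffic.
        with socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(0x0800)) as s:
            for _ in range(count):
                frame, _ = s.recvfrom(65535)
                print(parse_ipv4_header(frame[14:]))  # skip the 14-byte Ethernet header

    if __name__ == "__main__":
        sniff()

Nothing here depends on any vendor’s goodwill: because the protocol and the tooling are open, the packet stream can be audited by anyone, which is precisely the property that proprietary stacks withhold.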
The BCNGroup founding committee believed that society should move beyond Open Source to something like what Sandy Klausner has suggested.
Non-compatibility problems were to be expected as computer science matured. Over time, the compatibility issues should have been worked out and eliminated. This is not what is happening. See the paper on the Federal Enterprise Architecture.
Pure market forces appear to need non-compatibility to establish competitive advantage. But the market can work if there is no hidden monopoly, as there is in the current system. One has to find some evolutionary pressure that provides compatibility; otherwise society as a whole loses out as software companies constantly create artificial compatibility problems.
Database non-interoperability provides additional illustrations of the failure of pure market forces to create a mature and socially functional computer science.
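A small sketch of what that non-interoperability costs in practice (the vendor schemas and field names below are hypothetical, invented only for illustration): the same customer record, represented incompatibly by two products, forces someone to write and maintain an adapter for every pair of systems.

    from datetime import datetime

    # The same fact, two incompatible representations.
    vendor_a_row = {"cust_id": 1042, "name": "Ada Lovelace", "joined": "08/24/2004"}            # MM/DD/YYYY
    vendor_b_row = {"CustomerKey": "1042", "FullName": "Ada Lovelace", "Joined": "2004-08-24"}  # ISO 8601

    def a_to_b(row: dict) -> dict:
        # Glue code: every field renamed, re-typed, or re-formatted by hand.
        return {
            "CustomerKey": str(row["cust_id"]),
            "FullName": row["name"],
            "Joined": datetime.strptime(row["joined"], "%m/%d/%Y").strftime("%Y-%m-%d"),
        }

    assert a_to_b(vendor_a_row) == vendor_b_row

With n mutually incompatible systems, up to n(n-1) such one-directional adapters must be written and maintained; a shared open standard would cut that to 2n, which is one reason market incumbents have little incentive to provide one.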
The Behavioral Computational Neuroscience Group (BCNGroup) was founded in 1997 with the mission to simplify computer science and to reintegrate computer science back into mathematics. The path to this simplification requires that artificial intelligence paradigms, and DARPA programs such as the cognitive systems program, be more completely peer reviewed by the entire community of scholars, not simply by those who have vested interests in higher funding levels.
Many scholars see the current DARPA cognitive systems programs as pure science fiction. Over the past four decades, DARPA and NSF have invested billions of dollars in direct funding into earlier incarnations of Artificial Intelligence (see http://www.darpa.gov/ipto/ ). The IPTO description promises cognitive systems that:
* will be able to reason, using substantial amounts of appropriately represented knowledge;
* will learn from their experiences and improve their performance over time;
* will be capable of explaining themselves and taking naturally expressed direction from humans;
* will be aware of themselves and able to reflect on their own behavior; and
* will be able to respond robustly to surprises, in a very general way.
The same program managers, for example Dr. Barbara Yoon, who incorrectly defined a “neural network” as nodes and links between nodes in the DARPA Neural Network Study (1987), are still at DARPA peddling this nonsense. We have asked Dr. Yoon to make a statement about this, but her lifelong association with DARPA shields her from testimony. The DARPA managers and the DoD consultants whose professional judgments allow this type of funding to continue treat the scholarly discussions by Penrose, Hameroff, and others with great disdain. There is no open, objective discussion, despite the calls for conferences on this topic.
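For contrast, consider what even the simplest artificial neuron involves beyond “nodes and links.” The sketch below is a standard logistic neuron with a gradient-descent update; it is illustrative only and is not drawn from the 1987 study. The trainable weights, the nonlinear activation, and the learning rule, none of which appear in a nodes-and-links definition, are what make the network a network.

    import math

    def sigmoid(z: float) -> float:
        # The nonlinear activation: absent from a "nodes and links" picture.
        return 1.0 / (1.0 + math.exp(-z))

    def neuron(inputs, weights, bias):
        # Each "link" carries a trainable weight; the "node" sums and squashes.
        return sigmoid(sum(w * x for w, x in zip(weights, inputs)) + bias)

    def train_step(inputs, target, weights, bias, rate=0.5):
        # The learning rule (gradient descent on squared error) is the third
        # ingredient a bare graph of nodes and links does not capture.
        y = neuron(inputs, weights, bias)
        delta = (y - target) * y * (1.0 - y)      # d(error)/d(pre-activation)
        weights = [w - rate * delta * x for w, x in zip(weights, inputs)]
        return weights, bias - rate * delta

    # Nudge a two-input neuron toward outputting 1 for the input (1, 1).
    w, b = [0.1, -0.2], 0.0
    for _ in range(1000):
        w, b = train_step([1.0, 1.0], 1.0, w, b)
    print(round(neuron([1.0, 1.0], w, b), 2))     # approaches 1.0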
Communications from BCNGroup to the Office of the Secretary of Defense (2003)
The Anticipatory Web technology separates the measurement of data invariances from the interpretation of data. There is no confused notion that a computer program “knows what it is doing”. Anticipatory Web design is grounded in cognitive neuroscience and in general systems theory.
According to the Wall Street Journal, the United States economy purchased a little under $500 billion in new hardware and software each quarter last year. And this sum does not include the massive level of consulting expenditure associated with systems management.
And yet alternatives like Open Source software, and the even more economical CoreTalk, struggle to find acceptance. The problem is an economic monopoly that controls the profession of software development and most of the academic computer science community.
The Anticipatory Web information cannot be controlled for third-party financial exploitation, any more than the human mind can be controlled for third-party financial exploitation. The truth-finding mechanisms that are part of the human soul will see the exploitation and discard it, if possible. The Anticipatory Web is not to be owned in the same way as Microsoft’s .NET or Sun’s Java platforms. Again, look at the principles developed for the Knowledge Sharing Foundation.
The computer scientists at DARPA and NSF would have the taxpaying public believe that, given enough funding, we can create computational systems that are self-aware. What about water that flows uphill? When will this nonsense stop? It will stop when our society stands up to those who are using this confusion to maintain a software industry whose time has passed, and yet whose hold on government funding streams remains undisclosed.
Moral principles are also involved. To disclose these moral principles, one needs a clear understanding of the social sciences, the cognitive sciences, and the general principles that the knowledge sciences would bring together. The planning process of the National Project has come to understand that the solution is a K-12 curriculum developed by the best minds in the natural sciences, using the computer science infrastructure developed as the:
The Knowledge Sharing Foundation
First, humans are self-aware. If we treated humans properly, then perhaps we could just use the telephone. Knowledge management is a discipline that attempts to define some aspects of the use of communication systems to “manage” the production and sharing of human knowledge. This discipline is often taught, or certified, without the curriculum informing the student about the issues of complexity in social interaction, though there is at least an attempt to introduce these issues. Reconciliation of cultural differences, for example, is a key element of most knowledge management certifications.
Scholars like I. Prigogine, Robert Rosen, Lev Goldfarb, and Sir Roger Penrose find that mathematics itself needs to be extended: to account for physical complex systems theory and general systems theory, and to provide formalisms that are open to axiomatic perturbation and to new work on the nature of induction and deduction. We find that the human mind is far more intricate than what is captured in Aristotle’s logic, Newton’s physics, or Hilbert’s mathematics.
We want to answer the question, “What is next?”
Why would one feel that classical, Hilbertian mathematics is perfect, when the foundations of mathematics itself tell us that there is a conflict between completeness, whatever that is, and consistency? Why should we feel that Aristotle’s logic is perfect, when the mismatch between normal everyday experience and “rational” argument is evident? Why should we feel that the Newtonian paradigm is perfect as a paradigm for the life sciences, when it does not even fully account for many physical phenomena not involving living systems?
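The conflict invoked here is usually stated as Gödel’s first incompleteness theorem. For reference, a standard textbook formulation (not part of the original post), in LaTeX:

    % Goedel's first incompleteness theorem, standard formulation.
    Let $T$ be a consistent, recursively axiomatizable theory that
    interprets elementary arithmetic. Then $T$ is incomplete: there is a
    sentence $G_T$ in the language of $T$ such that
    \[
      T \nvdash G_T
      \qquad\text{and}\qquad
      T \nvdash \neg G_T .
    \]

In particular, no such theory can be both consistent and complete, which is the precise sense of the “conflict” between the two notions.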
By consistency, one means “rational” in some very well-defined sense. But where does one find this well-defined sense of rational?
There is an alternative, in principle, to keeping computer science support at NSF and DARPA at the levels that have been created by professional organizations now narrowly dedicated to maintaining and increasing funding. The value proposition extends to this community as well, if only they would look beyond current funding practices. The fact that funding for NSF, and for IT programs at NSF, is deemed too high is due in part to the failure of IT to provide full value for past expenditures. The value proposition is not in funding more of the same, but in funding the next great advance in human culture, the “knowledge age”.
We hope that the entire Community of Scholars will make a judgment about the proper levels of funding for computer science. We mean the whole community, not just those who now depend on a funding level that is justified only by past funding levels.
The entire community of scholars has not been asked; but if it were, and we could wrest ourselves away from the pessimism, the scholars would likely vote to invest this money in a stable and free operating system, with stable and free functionality, built on safe and open source code. Right? How much could that cost? $110M?
Is there an end to computer science’s development?
A Systematic Review of all Software Patents
We think so. Let us finish this work. Now the question is, what is next?