We present a design for a class of computers whose instruction sets are based on LISP. LISP, like traditional stored-program machine languages and unlike most high-level languages, conceptually stores programs and data in the same way and explicitly allows programs to be manipulated as data. LISP is therefore a suitable language around which to design a stored-program computer architecture. LISP differs from traditional machine languages in that the program/data storage is conceptually an unordered set of linked record structures of various sizes, rather than an ordered, indexable vector of integers or bit fields of fixed size. The record structures can be organized into trees or graphs. An instruction set can be designed for programs expressed as such trees. A processor can interpret these trees in a recursive fashion and provide automatic storage management for the record structures. We concentrate here on the issues of memory management in such a computer, and the reasons why a layered design strategy is not only desirable and natural but even mandatory. A prototype VLSI LISP microprocessor has been designed and fabricated for testing. It is a small-scale version of the ideas presented here, containing a sufficiently complete instruction interpreter to execute small programs and a rudimentary storage allocator. We intend to design and fabricate a full-scale VLSI version of this architecture in 1979.
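Since the abstract turns on programs and data sharing one representation, a minimal Python sketch may help fix the idea. It illustrates the evaluation model only, not the chip's actual instruction set: programs are trees of linked records, and an interpreter walks them recursively.

```python
# Sketch (not the LISP chip's instruction set): programs and data share one
# representation, linked record structures, interpreted recursively.

class Pair:                      # one linked record (a "cons cell")
    def __init__(self, car, cdr):
        self.car, self.cdr = car, cdr

def evaluate(expr, env):
    """Recursively interpret a program tree built from Pair records."""
    if isinstance(expr, str):            # variable reference
        return env[expr]
    if not isinstance(expr, Pair):       # self-evaluating datum (a number)
        return expr
    op = expr.car
    if op == "quote":                    # program text used directly as data
        return expr.cdr.car
    if op == "if":
        test = expr.cdr.car
        then, alt = expr.cdr.cdr.car, expr.cdr.cdr.cdr.car
        return evaluate(then if evaluate(test, env) else alt, env)
    fn = evaluate(op, env)               # application: evaluate the operator,
    args, rest = [], expr.cdr
    while rest is not None:              # then each argument in the list
        args.append(evaluate(rest.car, env))
        rest = rest.cdr
    return fn(*args)

def lst(*items):                         # helper: build a linked list of Pairs
    result = None
    for item in reversed(items):
        result = Pair(item, result)
    return result

# (if (< x 5) (+ x 1) 0) with x = 3  evaluates to 4
program = lst("if", lst("<", "x", 5), lst("+", "x", 1), 0)
env = {"x": 3, "<": lambda a, b: a < b, "+": lambda a, b: a + b}
print(evaluate(program, env))            # prints 4
```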
Large-scale data mining in both government and private sector applications has raised as yet unsolved problems of privacy, intellectual property, and other controversial uses of information. Neither our legal system (privacy and intellectual property laws) nor information-hiding technologies (secure private computation techniques) offer a means by which rich inferences can be gleaned from data without running afoul of social and legal norms. As an alternative, we are developing system architectures that provide Information Accountability. Accountable systems will assist users in seeking answers to questions such as: Is this piece of data allowed to be used for a given purpose? Is a string of inferences permissible for use in a given context, depending on the provenance of the data and the applicable rules? Information accountability will emerge from the development of three basic capabilities: policy-aware audit logging, a policy language framework, and accountability reasoning tools. A policy-aware transaction log will initially resemble traditional network and database transaction logs, but will also include data provenance, annotations about how the information was used, and what rules are known to be associated with that information. Cryptographic techniques will play an important role in Policy Aware systems, but unlike in current privacy designs, cryptography will serve more to create immutable audit logs and provide verifiable data provenance than to enforce confidentiality or access control.
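As a concrete illustration of what a policy-aware transaction log might contain, here is a minimal Python sketch. All field names and the hash-chain scheme are assumptions for illustration, not details from the paper.

```python
# Minimal sketch of a policy-aware log entry (field names hypothetical):
# beyond the who/what/when of an ordinary database log, each entry carries
# provenance and the rules known to govern the data, so accountability
# reasoning tools can later check whether a use was permissible.

import hashlib, json, time

def log_use(log, data_id, purpose, provenance, applicable_rules):
    entry = {
        "timestamp": time.time(),
        "data_id": data_id,
        "purpose": purpose,              # annotation: how the data was used
        "provenance": provenance,        # where the data came from
        "rules": applicable_rules,       # rules associated with this data
        # a hash chain makes the log tamper-evident, standing in for the
        # immutable-audit-log role the abstract assigns to cryptography
        "prev_hash": log[-1]["hash"] if log else None,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

def audit(log, permitted_purposes):
    """Flag logged uses whose purpose violates the governing rules."""
    return [e for e in log
            if e["purpose"] not in permitted_purposes.get(e["data_id"], [])]

log = []
log_use(log, "ssn:42", "credit-check", "census-2020", ["use-limitation"])
log_use(log, "ssn:42", "marketing", "census-2020", ["use-limitation"])
print(audit(log, {"ssn:42": ["credit-check"]}))   # flags the marketing use
```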
It is hard to build robust systems: systems that have acceptable behavior over a larger class of situations than was anticipated by their designers. The most robust systems are evolvable: they can be easily adapted to new situations with only minor modification. How can we design systems that are flexible in this way? Observations of biological systems tell us a great deal about how to make robust and evolvable systems. Techniques originally developed in support of symbolic Artificial Intelligence can be viewed as ways of enhancing robustness and evolvability in programs and other engineered systems. By contrast, common practice of computer science actively discourages the construction of robust systems.
Classical mechanics is deceptively simple. It is surprisingly easy to get the right answer with fallacious reasoning or without real understanding. To address this problem we use computational techniques to communicate a deeper understanding of Classical Mechanics. Computational algorithms are used to express the methods used in the analysis of dynamical phenomena. Expressing the methods in a computer language forces them to be unambiguous and computationally effective. The task of formulating a method as a computer-executable program and debugging that program is a powerful exercise in the learning process. Also, once formalized procedurally, a mathematical idea becomes a tool that can be used directly to compute results.
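In that spirit, a small sketch (not code from the associated text): once a method of analysis, here leapfrog integration of a pendulum, is written as a program, it is unambiguous and immediately usable as a tool for computing results.

```python
# A method written as a program must be unambiguous and computationally
# effective. The "method" here is leapfrog integration of a pendulum:
#   theta'' = -(g/l) sin(theta)

import math

def integrate(theta, omega, dt, steps, g_over_l=9.8):
    """Leapfrog integration of the pendulum equation of motion."""
    trajectory = [theta]
    omega += 0.5 * dt * (-g_over_l * math.sin(theta))   # half kick
    for _ in range(steps):
        theta += dt * omega                             # drift
        omega += dt * (-g_over_l * math.sin(theta))     # full kick
        trajectory.append(theta)
    return trajectory

# Small oscillations should match the analytic period 2*pi*sqrt(l/g) ~ 2.007 s.
traj = integrate(theta=0.1, omega=0.0, dt=0.001, steps=2007)
print(traj[0], traj[-1])   # theta returns to ~0.1 after one period
```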
This thesis presents a sketch-based interaction system that can be used to illustrate the process of reasoning about an electrical circuit in an educational setting. Recognition of hand-drawn shapes is accomplished in a two-stage process where strokes are first processed into primitives like lines or ellipses, then combined into the appropriate circuit device symbols using a shape description language called LADDER. The circuit is then solved by a constraint-propagation reasoning component. The solution is shown to the user along with the justifications that support each deduction. The level of detail and the speed of the solution playback can be customized to suit a student's particular learning pace. A small user study was conducted to test the performance of the recognition component, which revealed several recognition problems common to almost all of the users' experiences with the system. Suggestions for dealing with these problems are also presented.
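A toy Python sketch of the two-stage pipeline may make the structure concrete. The straightness test and the shape descriptions below are invented stand-ins, not the thesis's recognizer or actual LADDER syntax.

```python
# Toy two-stage recognition: stage 1 classifies raw strokes into geometric
# primitives; stage 2 matches groups of primitives against declarative
# shape descriptions (hypothetical ones, in the spirit of LADDER).

import math

def classify_stroke(points):
    """Stage 1: call a stroke a 'line' if its path length is close to its
    endpoint-to-endpoint distance, else an 'ellipse' (a crude stand-in)."""
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    return "line" if chord > 0 and path / chord < 1.1 else "ellipse"

# Stage 2: a symbol is a named combination of primitives.
DESCRIPTIONS = {
    ("line", "line", "line", "line"): "resistor (box body)",
    ("ellipse", "line", "line"): "AC source",
}

def recognize(strokes):
    primitives = tuple(sorted(classify_stroke(s) for s in strokes))
    return DESCRIPTIONS.get(primitives, "unknown symbol")

circle = [(math.cos(t / 10.0), math.sin(t / 10.0)) for t in range(63)]
leads = [[(1, 0), (2, 0)], [(-1, 0), (-2, 0)]]
print(recognize([circle] + leads))    # -> AC source
```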
Traditional automated synthesis techniques for circuit design are restricted to small classes of circuit functions for which mathematical methods exist. Professor Gerald J. Sussman and his group have developed computer-aided design tools that can be of much broader assistance. The work has developed the idea of analysis by propagation of constraints. Guy L. Steele developed a language to support such programming, Johan de Kleer studied causal and teleological reasoning in the recognition of circuit function from schematics, and Howie Shrobe has worked on constraint satisfaction and the development of an interactive knowledge-based system for substantially supporting VLSI design. Jon Doyle has studied belief revision via truth maintenance and non-monotonic logics, as well as self-conscious adaptive deliberate reasoning programs. Richard Waters, Charles Rich, and Howie Shrobe have developed the idea of a programmer's apprentice.
This paper proposes three research topics within the general framework of Automatic Programming. The projects are designing (1) a student programmer, (2) a robot programmer, and (3) a physicist's helper. The purpose of these projects is both to explore fundamental ideas regarding the nature of programming and to propose practical applications of AI research. The reason for offering this discussion as a Working Paper is to suggest possible research topics which members of the laboratory may be interested in pursuing.
The purpose of this short document is to exhibit how a HACKER-like top-down planning and debugging system can be applied to the problem of the design and debugging of simple analog electronic circuits. I believe, and I hope to establish, that this kind of processing goes on at all levels of the problem-solving process--from specific, concrete applications, like Electronic Design, through abstract piecing together and debugging of problem-solving strategies.
In this dissertation I propose a shift in the foundations of computation. Modern programming systems are not expressive enough. The traditional image of a single computer that has global effects on a large memory is too restrictive. The propagation paradigm replaces this with computing by networks of local, independent, stateless machines interconnected with stateful storage cells. In so doing, it offers great flexibility and expressive power, and has therefore been much studied, but has not yet been tamed for general-purpose computation. The novel insight that should finally permit computing with general-purpose propagation is that a cell should not be seen as storing a value, but as accumulating information about a value. Various forms of the general idea of propagation have been used with great success for various special purposes; perhaps the most immediate example is constraint propagation in constraint satisfaction systems. This success is evidence both that traditional linear computation is not expressive enough, and that propagation is more expressive. These special-purpose systems, however, are all complex and all different, and neither compose well, nor interoperate well, nor generalize well. A foundational layer is missing. I present in this dissertation the design and implementation of a prototype general-purpose propagation system. I argue that the structure of the prototype follows from the overarching principle of computing by propagation and of storage by accumulating information: there are no important arbitrary decisions. I illustrate on several worked examples how the resulting organization supports arbitrary computation; recovers the expressivity benefits that have been derived from special-purpose propagation systems in a single general-purpose framework, allowing them to compose and interoperate; and offers further expressive power beyond what we have known in the past. I reflect on the new light the propagation perspective sheds on the deep nature of computation.
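The central shift, cells accumulating information rather than storing values, can be sketched in a few lines of Python (a toy rendering of the idea, not the dissertation's prototype): cells hold intervals that only ever shrink, and stateless propagators re-run whenever a neighboring cell's information improves.

```python
# Toy propagator network: a cell accumulates information about a value
# (a shrinking interval), and propagators fire when their inputs improve.

class Cell:
    def __init__(self, lo=float("-inf"), hi=float("inf")):
        self.lo, self.hi, self.watchers = lo, hi, []

    def add_info(self, lo, hi):
        """Merge new information; wake watchers only if the interval shrank."""
        new_lo, new_hi = max(self.lo, lo), min(self.hi, hi)
        if (new_lo, new_hi) != (self.lo, self.hi):
            self.lo, self.hi = new_lo, new_hi
            for run in self.watchers:
                run()

def adder(a, b, total):
    """Wire up a + b = total in every direction, as three propagators."""
    def p1(): total.add_info(a.lo + b.lo, a.hi + b.hi)
    def p2(): a.add_info(total.lo - b.hi, total.hi - b.lo)
    def p3(): b.add_info(total.lo - a.hi, total.hi - a.lo)
    for cell, run in ((a, p1), (b, p1), (a, p3), (b, p2),
                      (total, p2), (total, p3)):
        cell.watchers.append(run)

x, y, z = Cell(), Cell(), Cell()
adder(x, y, z)
z.add_info(10, 10)          # learn z exactly
x.add_info(3, 4)            # learn a range for x ...
print((y.lo, y.hi))         # ... and y's interval follows: (6, 7)
```

Note that information flows in whichever direction it can: the adder was never told whether it would be used to compute z from x and y or y from z and x.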
We propose to develop a computer aided design tool which can help an engineer deal with system evolution from the initial phases of design right through the testing and maintenance phases. We imagine a design system which can function as a junior assistant. It provides a total conversational and graphical environment. It remembers the reasons for design choices and can retrieve and do simple deductions with them. Such a system can provide a designer with information relevant to a proposed modification and can help him understand the consequences of simple modifications by pointing out the structures and functions which will be affected by modifications. The designer's assistant will maintain a vast amount of such annotation on the structure and function of the system being evolved and will be able to retrieve the appropriate annotation and remind the designer about the features which he installed too long ago to remember, or which were installed by other designers who work with ...
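A tiny sketch of the bookkeeping such an assistant needs (names and structure invented for illustration, not the proposed system): record the reason for each design choice and which parts depend on it, then answer "what is affected if I change this?"

```python
# Hypothetical design-rationale store: each choice carries its reason and
# the parts that rely on it, so a proposed modification can be answered
# with the affected structures and the original justification.

rationale = {}      # choice -> reason it was made
depends_on = {}     # part   -> set of choices it relies on

def record(choice, reason, parts):
    rationale[choice] = reason
    for part in parts:
        depends_on.setdefault(part, set()).add(choice)

def consequences(choice):
    """Parts affected if this choice is modified, plus why it was made."""
    affected = [p for p, deps in depends_on.items() if choice in deps]
    return rationale[choice], affected

record("bus-width-16", "matches peripheral chip family",
       ["ALU", "memory-interface"])
record("two-phase-clock", "avoids race in latch design", ["ALU"])
print(consequences("bus-width-16"))
# ('matches peripheral chip family', ['ALU', 'memory-interface'])
```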
International Journal of High Speed Electronics and Systems, 1992
The Supercomputer Toolkit is a family of hardware modules (processors, memory, interconnect, and input-output devices) and a collection of software modules (compilers, simulators, scientific libraries, and high-level front ends) from which high-performance special-purpose computers can be easily configured and programmed. Although there are many examples of special-purpose computers (see Ref. 4), the Toolkit approach is different in that our aim is to construct these machines from standard, reusable parts. These are combined by means of a user-reconfigurable, static interconnect technology. The Toolkit's software support, based on novel compilation techniques, produces extremely high-performance numerical code from high-level language input. We have completed fabrication of the Toolkit processor module, and several critical software modules. An eight-processor configuration is running at MIT. We have used the prototype Toolkit to perform a breakthrough computation of scientific importance.
This report outlines the problem of intelligent failure recovery in a problem-solver for electrical design. We want our problem solver to learn as much as it can from its mistakes. Thus we cast the engineering design process in terms of Problem Solving by Debugging Almost-Right Plans (PSBDARP), a paradigm for automatic problem solving based on the belief that creation and removal of "bugs" is an unavoidable part of the process of solving a complex problem. The process of localization and removal of bugs called for by the PSBDARP theory requires an approach to engineering analysis in which every result has a justification which describes the exact set of assumptions it depends upon. We have developed a program based on Analysis by Propagation of Constraints which can explain the basis of its deductions. In addition to being useful to a PSBDARP designer, these justifications are used in Dependency-Directed Backtracking to limit the combinatorial search in the analysis routines. Although the research we will describe is explicitly about electrical circuits, we believe that similar principles and methods are employed by other kinds of engineers, including computer programmers.
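The key bookkeeping, in which every deduction carries the exact set of assumptions it depends on, can be sketched briefly (a toy in Python, not the actual program):

```python
# Each deduced fact records its justification: the exact assumptions it
# depends on. When a contradiction appears, dependency-directed
# backtracking implicates only those assumptions instead of blindly
# undoing recent choices.

class Fact:
    def __init__(self, value, assumptions):
        self.value = value
        self.assumptions = frozenset(assumptions)

def deduce(op, *facts):
    """Combine facts; the result inherits the union of their assumptions."""
    value = op(*[f.value for f in facts])
    deps = frozenset().union(*[f.assumptions for f in facts])
    return Fact(value, deps)

# Assume a supply voltage and a transistor operating region (hypothetical
# circuit values for illustration).
v_supply = Fact(10.0, {"supply-is-10V"})
v_be     = Fact(0.6,  {"Q1-active"})
v_e      = deduce(lambda vs, vbe: vs - vbe, v_supply, v_be)

print(v_e.value, sorted(v_e.assumptions))
# 9.4 ['Q1-active', 'supply-is-10V']
# If v_e later contradicts a measurement, only these two assumptions are
# candidates for retraction.
```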
International Journal of Circuit Theory and Applications, 1980
A major component in the process of design is synthesis, the determination of the parameters of the parts of a network given desiderata for the behaviour of the network as a whole. Traditional automated synthesis techniques are either restricted to small, precisely defined classes of circuit functions for which exact mathematical methods exist, or they depend upon numerical optimization methods in which it is difficult to determine the basis for any of the answers generated and their relations to the design desiderata and constraints. We are developing a symbolic computer-aided design tool, SYN, which can be of assistance to an engineer in the synthesis of a large class of circuits. The symbolic methods produce solutions which are clear and insightful. The dependence of each parameter on the individual design desiderata and circuit constraints can be easily traced. Although a network's equations are linear in the voltages and currents, they are non-linear in the component parameter values. Our synthesis aid is based on analysis by propagation of constraints [3]. This analysis method guides the use of symbolic algebraic methods [4] in combining constraints which describe circuit elements and their interconnections to determine the behaviour of a circuit. In this paper we show how propagation analysis can be inverted to determine constraints on the individual parts from the desired behaviour of the circuit. The method is based on the observation that locally, analysis and synthesis are very similar: the problem of finding the resistance which permits a given current flow at a given potential is equivalent to the problem of determining the current that flows given a resistance and a potential. Our method is successful for several reasons. It does not try to invert a complete analysis. The method of propagation of constraints deals with only a small part of the problem at a time. It is an incremental deductive method which first solves whatever subproblems can be solved easily. After picking off the easiest ... (This paper describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology. Support for the laboratory's research is provided in part by the National Science Foundation under Grant MCS77-04828.)
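The paper's local observation can be shown directly: a single constraint such as Ohm's law runs in any direction, so solving for a part's parameter (synthesis) is locally the same computation as solving for a current (analysis). The sketch below is a minimal illustration, not the SYN system.

```python
# One constraint, V = I * R, run in whichever direction the knowns allow:
# analysis (find I from V and R) and synthesis (find R from V and I) are
# locally the same computation.

def ohm(V=None, I=None, R=None):
    """Fill in whichever of V, I, R is missing from the other two."""
    if V is None:
        return {"V": I * R, "I": I, "R": R}
    if I is None:
        return {"V": V, "I": V / R, "R": R}
    if R is None:
        return {"V": V, "I": I, "R": V / I}
    return {"V": V, "I": I, "R": R}

print(ohm(V=5.0, R=1000.0))   # analysis:  I = 0.005 A
print(ohm(V=5.0, I=0.005))    # synthesis: R = 1000 ohms
```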
Over the next few decades, two emerging technologies, microfabrication and cellular engineering, will make it possible to assemble systems incorporating myriads of information-processing units at almost no cost, provided all units need not work correctly and that there is no need to manufacture precise geometrical arrangements among them. The shift to this technology will precipitate fundamental changes in methods for constructing and programming computers, and in our view of computation itself. Microelectronic mechanical components have become so inexpensive to manufacture that we can anticipate integrating logic circuits, microsensors, actuators, and communications devices on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges and buildings with smart paint that senses and reports on traffic and wind loads and monitors structural integrity. A smart-paint coating on a wall could sense vibrations, monitor the premises for intruders, and cancel noise. Even more striking is the amazing progress in understanding the biochemical mechanisms in individual cells, promising that we'll be able to harness these mechanisms to construct digital logic circuits. To obtain coherent behavior from vast numbers of unreliable microsensors, actuators, and communication devices interconnected in unknown ways, we can apply the lessons of cellular cooperation in biological organisms.
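One way to picture the closing point, coherent behavior from unreliable parts: a toy simulation (an illustration of the premise, not from the article) in which a fifth of the "paint particles" are broken, yet a robust aggregate of their readings still tracks the true signal.

```python
# Toy model: many cheap, unreliable sensing units still yield a useful
# aggregate. Some fraction report garbage; the median survives them.

import random, statistics

def sense(true_strain, n_particles=10_000, failure_rate=0.2):
    readings = []
    for _ in range(n_particles):
        if random.random() < failure_rate:
            readings.append(random.uniform(-100, 100))    # dead or deranged
        else:
            readings.append(true_strain + random.gauss(0, 0.5))  # noisy unit
    return statistics.median(readings)

random.seed(1)
print(sense(true_strain=12.0))   # close to 12 despite 20% failures
```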
With access control and encryption no longer capable of protecting privacy, laws and systems are ... more With access control and encryption no longer capable of protecting privacy, laws and systems are needed that hold people accountable for the misuse of personal information, whether public or secret.
The authors discuss the development of intelligent techniques appropriate for the automatic prepa... more The authors discuss the development of intelligent techniques appropriate for the automatic preparation, execution, and control of numerical experiments.