Great Principles of Computing
From the Great Principles of Computing website by Peter Denning
Computation
- Representations hold information.
- Computation is a sequence of representations.
- Representations can be compressed, but not too much (see the sketch after this list).
- Computations can be open or closed.
- Computations have characteristic speeds of resolution.
- Complexity measures the time or space essential to complete computations.
- Finite representations of real processes always contain errors.
(Source)
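To make the compression principle concrete, here is a minimal Python sketch (not from Denning's site) using run-length encoding: repetitive representations shrink, while dense ones do not, and a counting (pigeonhole) argument shows that no lossless scheme can shrink every input. The function names and example strings are illustrative choices.

```python
# Minimal run-length encoder/decoder illustrating that representations can be
# compressed, but not arbitrarily: repetitive inputs shrink, dense inputs do not.

from itertools import groupby

def rle_encode(text: str) -> list[tuple[str, int]]:
    """Encode a string as (character, run-length) pairs."""
    return [(ch, len(list(run))) for ch, run in groupby(text)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Recover the original string from its run-length pairs."""
    return "".join(ch * count for ch, count in pairs)

if __name__ == "__main__":
    repetitive = "aaaaaaaabbbbcc"   # long runs: compresses well
    dense      = "abcdefgh"         # every run has length 1: no gain
    for s in (repetitive, dense):
        enc = rle_encode(s)
        assert rle_decode(enc) == s          # lossless round trip
        print(s, "->", enc, f"({len(s)} chars vs {len(enc)} pairs)")
```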
Communication
- Information can be encoded into messages.
- Data communication always takes place in a system consisting of a message source, an encoder, a channel, and a decoder.
- Information in a message source places a hard lower bound on the channel capacity needed for accurate reception (Shannon Capacity Theorem).
- Messages corrupted during transmission can be recovered during reception (Error Correction); see the sketch after this list.
- Messages can be compressed.
- Messages can hide information.
(Source)
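As an illustration of error correction, the sketch below uses a toy (3,1) repetition code over a simulated noisy channel: each bit is sent three times and the decoder takes a majority vote, so a single flipped bit per triple is corrected. Real systems use much stronger codes; the flip probability and message here are made-up values.

```python
# Toy (3,1) repetition code: encode each bit three times, decode by majority vote.

import random

def encode(bits: list[int]) -> list[int]:
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits: list[int], flip_prob: float = 0.05) -> list[int]:
    # Flip each bit independently with the given (assumed) probability.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits: list[int]) -> list[int]:
    # Majority vote over each group of three received bits.
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

if __name__ == "__main__":
    random.seed(0)
    message  = [1, 0, 1, 1, 0, 0, 1, 0]
    received = noisy_channel(encode(message))
    print("decoded correctly:", decode(received) == message)
```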
Coordination
- A coordination system is a set of agents interacting within a finite or infinite game toward a common objective.
- The action loop is the foundational element of all coordination protocols.
- Coordination tasks can be delegated to computational processes.
- The protocols of coordination systems manage dependencies of flow, sharing, and fit among activities.
- It is impossible to guarantee the selection of one of several simultaneous or equally attractive alternatives within a preset deadline (Choice Uncertainty Principle).
- All coordination systems depend on solutions to the concurrency control problems of arbitration, synchronization, serialization, determinacy, and deadlock (see the sketch after this list).
(Source)
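The sketch below illustrates the synchronization problem with two Python threads updating a shared counter: a Lock serializes the read-modify-write critical section so no updates are lost. The counts and thread layout are arbitrary, and the unsynchronized outcome is timing-dependent.

```python
# Two threads incrementing a shared counter: without mutual exclusion the
# read-modify-write steps can interleave and lose updates; with a Lock the
# critical section is serialized and the final count is exact.

import threading

def run(increments: int, use_lock: bool) -> int:
    counter = 0
    lock = threading.Lock()

    def worker():
        nonlocal counter
        for _ in range(increments):
            if use_lock:
                with lock:
                    counter += 1
            else:
                counter += 1   # unsynchronized read-modify-write

    threads = [threading.Thread(target=worker) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

if __name__ == "__main__":
    print("with lock:   ", run(200_000, use_lock=True))   # exactly 400000
    print("without lock:", run(200_000, use_lock=False))  # may be less (timing-dependent)
```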
Recollection
- All computations take place in storage systems.
- Storage systems comprise hierarchies with volatile (fast) storage at the top and persistent (slower) storage at the bottom.
- The principle of locality dynamically identifies the most useful data, which can be cached at the top of the hierarchy (see the sketch after this list).
- Thrashing is a severe performance degradation that occurs when parallel computations overload the storage system.
- Access to stored objects is controlled by dynamic bindings between names, handles, addresses, and locations.
- Hierarchical naming systems allow local authorities to assign names that are globally unique in very large name spaces.
- Handles enable sharing by providing unique-for-all-time object identifiers that are independent of all address spaces.
- Data can be retrieved by name or by content.
(Source)
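A minimal least-recently-used (LRU) cache in Python illustrates how locality lets a small, fast store at the top of the hierarchy absorb most accesses. The reference trace and the slow_fetch stand-in for a lower storage level are invented for the example.

```python
# Minimal LRU cache: recently used items are kept in a small fast store,
# so a localized reference trace is served mostly without touching the
# slower level below.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries: OrderedDict[str, str] = OrderedDict()
        self.hits = self.misses = 0

    def get(self, key: str, slow_fetch) -> str:
        if key in self.entries:
            self.entries.move_to_end(key)        # mark as most recently used
            self.hits += 1
            return self.entries[key]
        self.misses += 1
        value = slow_fetch(key)                  # go to the slower level
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)     # evict least recently used
        return value

if __name__ == "__main__":
    cache = LRUCache(capacity=3)
    trace = ["a", "b", "a", "a", "c", "b", "d", "a"]   # made-up reference trace
    for key in trace:
        cache.get(key, slow_fetch=lambda k: k.upper())
    print(f"hits={cache.hits} misses={cache.misses}")
```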
Automation
- Physical automation maps hard computational tasks to physical systems that perform them acceptably well.
- Artificial intelligence maps human cognitive tasks to physical systems that perform them acceptably well.
- Artificial intelligence maps tasks to systems through models, search, deduction, induction, and collective intelligence.
- Models represent processes by which intelligent beings generate their behavior.
- Search finds the subsets of states of a complex system that must participate in the final outcome of a task (see the sketch after this list).
- Deduction locates the outcome of a task by applying rules of logic to move from axioms to provable statements.
- Induction builds models by generalizing from data about a complex task's behavior.
- Collective intelligence exploits large scale aggregation and coordination in networks to produce new knowledge.
(Source)
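To illustrate search, here is a breadth-first search over a small, hypothetical state graph: only the states reachable on a shortest path from start to goal participate in the answer, while the rest of the (potentially huge) space is never expanded.

```python
# Breadth-first search over a state space, returning a shortest path to the goal.

from collections import deque

def bfs_path(graph: dict[str, list[str]], start: str, goal: str) -> list[str] | None:
    """Return a shortest path from start to goal, or None if unreachable."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if state == goal:
            return path
        for nxt in graph.get(state, []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

if __name__ == "__main__":
    # Hypothetical state graph: edges are legal moves between states.
    graph = {"S": ["A", "B"], "A": ["C"], "B": ["C", "G"], "C": ["G"], "G": []}
    print(bfs_path(graph, "S", "G"))   # ['S', 'B', 'G']
```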
Evaluation
- The principal tools of evaluation are modeling, simulation, experiment, and statistical analysis of data.
- Computing systems can be represented as sets of equations balancing transition flows among states.
- A network of servers is a common, efficient representation of computing systems.
- Network-of-server systems obey fundamental laws governing their utilizations, throughputs, queueing, response times, and bottlenecks (see the sketch after this list).
- Resource sharing, when feasible, is always more efficient than partitioning.
(Source)
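The sketch below applies two standard operational laws to made-up numbers for a small network of servers: the Utilization Law U_i = X * D_i ties each server's utilization to system throughput X and per-job service demand D_i, and the server with the largest demand is the bottleneck, capping throughput at 1 / max(D_i). The demands and throughput are assumed values, not measurements.

```python
# Operational analysis of a small network of servers (assumed numbers):
# Utilization Law U_i = X * D_i, bottleneck bound X <= 1 / max(D_i).

demands = {"cpu": 0.005, "disk": 0.030, "network": 0.012}   # seconds per job (assumed)
throughput = 25.0                                           # jobs per second (assumed)

utilizations = {name: throughput * d for name, d in demands.items()}
bottleneck = max(demands, key=demands.get)
max_throughput = 1.0 / demands[bottleneck]

for name, u in utilizations.items():
    print(f"{name:8s} utilization = {u:.0%}")
print(f"bottleneck: {bottleneck}, throughput cannot exceed {max_throughput:.1f} jobs/s")
```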
Design
- Design principles are conventions for planning and building correct, fast, fault-tolerant, and fit software systems.
- Error confinement and recovery are much harder in the virtual worlds of software than in the real world of physical objects.
- The four base principles of software design are hierarchical aggregation, levels, virtual machines, and objects.
- Abstraction, information hiding, and decomposition are complementary aspects of modularity (see the sketch after this list).
- Levels organize the functions of a system into hierarchies that allow downward invocations and upward replies.
- Virtual machines organize software as simulations of computing machines.
- Objects organize software into networks of shared entities that activate operations in each other by exchanging signals.
- In a distributed system, it is more efficient to implement a function in the communicating applications than in the network itself (end-to-end principle).
(Source)
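A small example of information hiding: clients of the BoundedStack class below use only its push/pop/peek abstraction, while the list-based representation stays private and can be replaced without touching client code. The class is a made-up illustration, not code from the source.

```python
# Information hiding: the interface exposes push/pop/peek; the representation
# (a Python list and a capacity field) is kept private behind it.

class BoundedStack:
    def __init__(self, capacity: int):
        self._capacity = capacity     # leading underscore: hidden representation
        self._items: list[int] = []

    def push(self, value: int) -> None:
        if len(self._items) >= self._capacity:
            raise OverflowError("stack is full")
        self._items.append(value)

    def pop(self) -> int:
        if not self._items:
            raise IndexError("stack is empty")
        return self._items.pop()

    def peek(self) -> int:
        return self._items[-1]

if __name__ == "__main__":
    s = BoundedStack(capacity=2)
    s.push(1)
    s.push(2)
    print(s.pop(), s.peek())   # 2 1
```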