A formal system consists of axioms (statements taken to be true without proof) and rules of inference (how statements can be manipulated). New statements (theorems) can be derived from the axioms by applying the rules. You formalize a system when you start defining these statements and rules.
A formal system is consistent when you can’t simultaneously prove and disprove a statement. An inconsistent system would not be very useful, e.g. a weather forecasting system that predicts it will both rain and not rain at the same time.
A formal system is a complete system when every statement in the system can be proved or disproved, thereby allowing you to know everything about the system.
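A rough sketch in standard logical notation, writing F ⊢ φ for “statement φ is provable in system F”:

```latex
% Consistency: no statement is both provable and refutable in F.
\mathrm{Consistent}(F) \iff \neg\,\exists \varphi\; \bigl( F \vdash \varphi \;\wedge\; F \vdash \neg\varphi \bigr)

% Completeness: every statement is either provable or refutable in F.
\mathrm{Complete}(F) \iff \forall \varphi\; \bigl( F \vdash \varphi \;\vee\; F \vdash \neg\varphi \bigr)

% Gödel's first incompleteness theorem: if F is consistent and can express
% basic arithmetic, some sentence G_F is neither provable nor refutable in F.
\exists G_F\; \bigl( F \nvdash G_F \;\wedge\; F \nvdash \neg G_F \bigr)
```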
See also:
- Gödel’s incompleteness theorem, which says that a consistent formal system cannot also be a complete system.
Links to this note
- Cynicism Is the Opposite of Optimism and Pessimism
  Cynics believe that nothing ever changes at a fundamental level. By its nature, cynicism is a dead end where everything stays the same over time.
- Experience Is an Illusion of System Completeness
  We tend to think that experience in a given role is obviously good–it provides pattern matching and intuitions built up from lived events. If you consider experience as a process of formalizing a given profession (i.e. building a formal system), then Gödel incompleteness means there will always be more true yet unprovable statements in the field.
- Gödel Incompleteness for Startups
  An essay that relates Gödel’s incompleteness theorem (along with the Halting Problem) to startup disruption—arguing that all successful startups discover one or more G-statements and extract value by building a formal system around them.
- Circular Specification Problem
  Writing a specification with sufficient detail to know exactly what software one should build is as much work as writing the code itself. In many cases, fully specifying the work beforehand is not possible because we don’t know enough about the problem or the domain to begin with. This is why our codebases are always in a state of flux and never complete systems.
- Knowledge Capture Loops Make for Good Systems
  Real-world systems for operating a complicated process don’t start out as perfectly designed, complete systems. New information reveals itself only after you’ve done it a few times. Failure modes you weren’t aware of become apparent only after the system breaks.
- Gödel’s Incompleteness Theorem
  A formal system that is consistent (never yields a false statement) cannot also be a complete system (one containing all true statements)–there will always be statements that are unprovable yet true (i.e. G-statements).