George Boole, the father of symbolic logic, was born two hundred years ago today. His 1854 treatise, An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities, gave us what we now call Boolean algebra: variables that take on just two values, TRUE and FALSE, and the basic operations AND, OR and NOT. Boole could not have imagined that a century later this logic would form the foundation for digital computers.
Where would computational complexity be without Boolean algebra? We can use AND, OR, and NOT gates in place of Turing machines to capture computation: A polynomial-sized circuit of these gates can simulate any polynomial-time computable algorithm.
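To make that gate picture concrete, here is a minimal Python sketch of a one-bit full adder wired entirely from AND, OR and NOT (the gate helpers and the full_adder name are just illustrative, not anything from the circuit-complexity literature). Chaining n copies of it gives the usual polynomial-size circuit for n-bit addition.

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # x XOR y = (x AND NOT y) OR (NOT x AND y)
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def full_adder(a, b, carry_in):
    """One bit position of binary addition, built from the gates above."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 is binary 10
```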
Stephen Cook analyzed the complexity of determining whether a formula φ of Boolean variables connected by ANDs, ORs and NOTs was a tautology, a mathematical law true under every assignment. Cook showed that every nondeterministic polynomial-time computable problem reduces to checking whether φ is not a tautology, or equivalently whether (NOT φ) is satisfiable. That paper, as you well know, gave us the P v NP problem.
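As a rough illustration of that equivalence (the helper names and toy formulas here are mine, not Cook's construction): φ is a tautology exactly when NOT φ has no satisfying assignment, and the obvious check walks through all 2^n assignments, which is precisely the exponential cost the P v NP question asks whether we can avoid.

```python
from itertools import product

def is_satisfiable(f, num_vars):
    # brute force: try all 2^n assignments
    return any(f(*bits) for bits in product([False, True], repeat=num_vars))

def is_tautology(f, num_vars):
    # phi is a tautology  <=>  (NOT phi) is unsatisfiable
    return not is_satisfiable(lambda *bits: not f(*bits), num_vars)

# toy examples: (x OR NOT x) is a tautology; (x AND NOT x) is unsatisfiable
print(is_tautology(lambda x: x or not x, 1))      # True
print(is_satisfiable(lambda x: x and not x, 1))   # False
```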
So here's to George Boole, whose desire to give mathematics a firm foundation produced simple tools powerful enough to describe everything.
Unfortunately, later work with alternative operators (ternary majority, median algebras, e.g. by Emil Post) seems to have fallen into the memory hole and very little is done about it:
http://www.sciencedirect.com/science/article/pii/S0012365X06004742
Though ternary majority definitely beats AND/OR/NOT for FPGA circuits
http://infoscience.epfl.ch/record/204262
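A small aside on the commenters' point, as a sketch rather than anything taken from the linked papers: the three-input majority operator is itself expressible with AND and OR, and fixing one of its inputs to a constant recovers AND or OR, which hints at why majority-based synthesis can stand in for AND/OR/NOT in circuit design.

```python
def MAJ(a, b, c):
    # true exactly when at least two of the three inputs are true
    return (a and b) or (b and c) or (a and c)

# fixing one input to a constant recovers the classical gates
x, y = True, False
print(MAJ(x, y, False) == (x and y))   # MAJ(x, y, 0) acts as AND
print(MAJ(x, y, True) == (x or y))     # MAJ(x, y, 1) acts as OR
```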