(Notations, non-standard concepts, and definitions used commonly in these investigations are detailed in this post.)
Ferguson and Priest’s thesis
In a brief but provocative review of what they term “the enduring evolution of logic” over the ages, the authors of Oxford University Press’s recently released ‘A Dictionary of Logic‘, philosophers Thomas Macaulay Ferguson and Graham Priest, take to task what they view as the Kant-influenced manner in which logic is taught as a first course in most places in the world:
“… as usually ahistorical and somewhat dogmatic. This is what logic is; just learn the rules. It is as if Frege had brought down the tablets from Mount Sinai: the result is God-given, fixed, and unquestionable.”
Ferguson and Priest conclude their review by remarking that:
“Logic provides a theory, or set of theories, about what follows from what, and why. And like any theoretical inquiry, it has evolved, and will continue to do so. It will surely produce theories of greater depth, scope, subtlety, refinement—and maybe even truth.”
However, it is not obvious whether that is prescient optimism, or a tongue-in-cheek exit line!
A nineteenth century parody of the struggle to define ‘truth’ objectively
For, if anything, the developments in logic since around 1931 have—seemingly in gross violation of the hallowed principle of Ockham’s razor, and its crude, but highly effective, modern avatar KISS—indeed produced a plethora of theories of great depth, scope, subtlety, and refinement.
These, however, seem to have more in common with the cynical twentieth-century emphasis on subjective, unverifiable ‘truth’ than with the concept of an objective, evidence-based ‘truth’ that centuries of philosophers and mathematicians strenuously struggled to differentiate and express.
A struggle reflected so eloquently in this nineteenth century quote:
“When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.”
“The question is,” said Alice, “whether you can make words mean so many different things.”
“The question is,” said Humpty Dumpty, “which is to be master—that’s all.”
… Lewis Carroll (Charles L. Dodgson), ‘Through the Looking-Glass’, chapter 6, p. 205 (1934 ed.). First published in 1872.
Making sense of mathematical propositions about infinite processes
It was, indeed, an epic struggle which culminated in the nineteenth century standards of rigour successfully imposed—in no small measure by the works of Augustin-Louis Cauchy and Karl Weierstrass—on verifiable interpretations of mathematical propositions about infinite processes involving real numbers.
A struggle, moreover, which should have culminated equally successfully in similar twentieth century standards—on verifiable interpretations of mathematical propositions containing references to infinite computations involving integers—sought to be imposed in 1936 by Alan Turing upon philosophical and mathematical discourse.
The Liar paradox
For it follows from Turing’s 1936 reasoning that where quantification is not, or cannot be, explicitly defined in formal logical terms—e.g., the classical expression of the Liar paradox as ‘This sentence is a lie’—a paradox cannot per se be considered to pose serious linguistic or philosophical concerns (see, for instance, the series of four posts beginning here).
Of course—as reflected implicitly in Kurt Gödel’s seminal 1931 paper on undecidable arithmetical propositions—it would be a matter of serious concern if the word ‘This’ in the English language sentence, ‘This sentence is a lie’, could be validly viewed as implicitly implying that:
(i) there is a constructive infinite enumeration of English language sentences;
(ii) to each of which a truth-value can be constructively assigned by the rules of a two-valued logic; and,
(iii) in which ‘This’ refers uniquely to a particular sentence in the enumeration.
Gödel’s influence on Turing’s reasoning
However, Turing’s constructive perspective had the misfortune of being subverted by a knee-jerk, anti-establishment, culture that was—and apparently remains to this day—overwhelmed by Gödel’s powerful Platonic—and essentially unverifiable—mathematical and philosophical 1931 interpretation of his own construction of an arithmetical proposition that is formally unprovable, but undeniably true under any definition of ‘truth’ in any interpretation of arithmetic over the natural numbers.
Otherwise, I believe that Turing could easily have provided the necessary constructive interpretations of arithmetical truth—sought by David Hilbert for establishing the consistency of number theory finitarily—which is addressed by the following paper due to appear in the December 2016 issue of ‘Cognitive Systems Research‘:
What is logic: using Ockham’s razor
Moreover, the paper endorses the implicit orthodoxy of an Ockham’s razor influenced perspective—which Ferguson and Priest seemingly find wanting—that logic is simply a deterministic set of rules that must constructively assign the truth values of ‘truth/falsity’ to the sentences of a language.
It is a view that I expressed earlier as the key to a possible resolution of the EPR paradox in the following paper, which I presented on 26 June at the workshop on Emergent Computational Logics at UNILOG’2015, Istanbul, Turkey:
where I introduced the definition:
A finite set of rules is a Logic of a formal mathematical language \(\mathcal{L}\) if, and only if, it constructively assigns unique truth-values:
(a) Of provability/unprovability to the formulas of \(\mathcal{L}\); and
(b) Of truth/falsity to the sentences of the Theory \(T\) which is defined semantically by the interpretation of \(\mathcal{L}\) over a structure \(\mathcal{S}\).
I showed there that such a definitional rule-based approach to ‘logic’ and ‘truth’ allows us to:
Equate the provable formulas of the first order Peano Arithmetic PA with the PA formulas that can be evidenced as ‘true’ under an algorithmically computable interpretation of PA over the structure \(\mathbb{N}\) of the natural numbers;
Adequately represent some of the philosophically troubling abstractions of the physical sciences mathematically;
Interpret such representations unambiguously; and
Conclude: first, that the concept of infinity is an emergent feature of any mechanical intelligence whose true arithmetical propositions are provable in the first-order Peano Arithmetic; and,
Second, that the discovery and formulation of the laws of quantum physics lies within the algorithmically computable logic and reasoning of a mechanical intelligence whose logic is circumscribed by the first-order Peano Arithmetic.
1. Since, by the Prime Number Theorem, the number of primes \(p \le \sqrt{n}\) is of the order \(\sqrt{n}/\log_e \sqrt{n}\), it would follow that determining a factor of \(n\) requires at least one logical operation for each prime \(p \le \sqrt{n}\), and therefore cannot be done in polynomial time—whence \(P \neq NP\)—IF whether or not a prime \(p\) divides an integer \(n\) were independent of whether or not a prime \(q \neq p\) divides the integer \(n\).
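The operation count in point 1 can be illustrated concretely. The sketch below (my own, with an arbitrarily chosen semiprime) performs one divisibility test per prime \(p \le \sqrt{n}\), which is the cost the argument appeals to:

```python
import math

def primes_upto(limit):
    """Simple sieve of Eratosthenes returning all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for i in range(2, math.isqrt(limit) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def trial_division_factor(n):
    """Return a prime factor of n (or n itself if n is prime),
    counting one divisibility test per prime p <= sqrt(n)."""
    tests = 0
    for p in primes_upto(math.isqrt(n)):
        tests += 1
        if n % p == 0:
            return p, tests
    return n, tests  # no prime <= sqrt(n) divides n, so n is prime

factor, tests = trial_division_factor(10007 * 10009)
print(factor, tests)  # 10007 1230
```

For \(n = 10007 \times 10009\) the search performs 1230 tests, one per prime up to \(\sqrt{n}\); the contention in point 1 is that this count grows faster than any polynomial in the bit-length of \(n\).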
2. Currently, conventional approaches to determining the computational complexity of Integer Factorising apparently appeal critically to the belief:
(i) either—explicitly (see here)—that whether or not a prime \(p\) divides an integer \(n\) is not independent of whether or not a prime \(q \neq p\) divides the integer \(n\);
(ii) or—implicitly (since the problem is yet open)—that a proof to the contrary must imply that, if \(P(n)\) is the probability that \(n\) is a prime, then \(P(n)\) is definable over the set of all integers.
3. If so, then conventional approaches seem to conflate the two probabilities:
(i) The probability of selecting a number that has the property of being prime from a given set \(S\) of numbers;
Example 1: I have a bag containing numbers in which there are twice as many composites as primes. What is the probability that the first number you blindly pick from it is a prime? This is the basis for setting odds in games such as roulette.
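Example 1 can be checked by simulation. In this minimal sketch the bag’s contents are my own illustrative choice; any mix with twice as many composites as primes behaves the same:

```python
import random

random.seed(0)  # reproducible runs

# An illustrative bag with twice as many composites as primes.
primes = [2, 3, 5, 7, 11]
composites = [4, 6, 8, 9, 10, 12, 14, 15, 16, 18]
bag = primes + composites

trials = 100_000
hits = sum(random.choice(bag) in primes for _ in range(trials))
print(hits / trials)  # close to 1/3
```

The empirical frequency settles near 1/3 because the probability is fixed in advance by the known composition of the bag; this is precisely what is unavailable in case 3(ii).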
(ii) The probability of determining that a given integer is prime.
Example 2: I give you a \(k\)-digit combination lock along with an \(n\)-digit number \(N\). The lock only opens if you set the combination to a proper factor of \(N\) which is greater than \(1\). What is the probability that the first combination you try will open the lock? This is the basis for RSA encryption, which provides the cryptosystem used by many banks for securing their communications.
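Example 2 can likewise be computed directly. Since the digit counts are left symbolic above, the lock size and the number in this sketch are my own illustrative stand-ins: a 3-digit lock and \(N = 851 = 23 \times 37\):

```python
N = 851                 # illustrative choice: 23 * 37, both prime
settings = range(1000)  # every 3-digit combination, 000-999

# Combinations that open the lock: proper factors of N greater than 1.
opening = [c for c in settings if 1 < c < N and N % c == 0]
print(opening, len(opening) / len(settings))  # [23, 37] 0.002
```

Unlike Example 1, nothing about the composition of the search space is known in advance here: the chance of success reflects the cost of determining a factor of \(N\), not a pre-given ratio of favourable to unfavourable cases.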
4. In case 3(i), if the precise proportion of primes to non-primes in the given set \(S\) is definable, then clearly the probability \(P\) of picking a prime is definable too.
However, if \(S\) is the set of all integers, and we cannot define a precise ratio of primes to composites in \(S\), but only an order of magnitude such as \(1/\log_e n\), then equally obviously \(P\) cannot be defined in \(S\) (see Chapter 2, p.9, Theorem 2.1, here).
5. In case 3(ii), the following paper proves \(P \neq NP\), since it shows that whether or not a prime \(p\) divides a given integer \(n\) is independent of whether or not a prime \(q \neq p\) divides \(n\):
Not only does it immediately follow that \(P \neq NP\) (see here), but we further have that \(\pi(n) \approx n/\log_e n\), with a binomial standard deviation. Hence, even though we cannot define the probability of selecting a number from the set of all natural numbers that has the property of being prime, \(1/\log_e n\) can be treated as the de facto probability that a given \(n\) is prime, with all its attendant consequences for various prime-counting functions and the Riemann Zeta function (see here).
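The closing claim, that \(1/\log_e n\) behaves as a de facto probability of primality with a binomial standard deviation, can be sanity-checked numerically. A sketch under that binomial model (the bound of \(10^5\) and the two-sigma comparison are my own choices):

```python
import math

def prime_count(n):
    """pi(n): count primes <= n with a simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, math.isqrt(n) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return sum(sieve)

n = 100_000
actual = prime_count(n)

# Treat each i as prime with probability 1/log_e(i); sensible only for
# i >= 3, since 1/log_e(2) exceeds 1. Sum expectations and variances
# as for independent Bernoulli trials.
p = [1 / math.log(i) for i in range(3, n + 1)]
expected = sum(p)
sd = math.sqrt(sum(q * (1 - q) for q in p))

print(actual, round(expected), round(sd))
```

With \(n = 10^5\) the sieve gives \(\pi(n) = 9592\), which lands within two of the modelled binomial standard deviations of the expectation \(\sum 1/\log_e i\), consistent with treating \(1/\log_e n\) as a de facto probability of primality.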