Conjecture: fundamental mathematical group

In summary, the conversation discusses the fundamental elements of mathematics and their relationship to each other. The speaker proposes that a group consisting of non-negative integers and irrational numbers, with the operations of addition and subtraction, is sufficient to define most of mathematics. However, objections are raised regarding the stability of this group and the role of set theory and logic in mathematical definitions. The conversation also touches on the use of numbers, addition, and multiplication in mathematical proofs and the limitations of these operations in representing all of mathematics.
  • #36
SW VandeCarr said:
I'm arguing that they were not generated by a conscious extension of something more basic.
SW VandeCarr said:
Given {0, 1, +} we learned to generate all the other positive integers
Just to be clear, you've changed your position on this, right?
 
  • #37
Hurkyl said:
:confused: There are lots of ways to generate all of the primes, should one want to do such a thing, without even the slightest hint of any controversy surrounding them.

As usual you're literally correct. However, is there an algorithm which can be inductively proven to generate all the primes and only the primes?
 
  • #38
Hurkyl said:
Just to be clear, you've changed your position on this, right?

Yes, as regards forming a group. The positive integers do not form a group under addition and subtraction, but one can generate all the positive integers from N+1=N' where N' is the successor. (Actually I still think the non-negative integers can form a group with a weak form of subtraction that doesn't allow for negative numbers, but I won't argue the point.) If you mean changing my position regarding natural numbers as a non-derivative object class, no. As I said, you need the concept of integer in order to generate more integers.
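The successor construction described above is easy to make concrete. The sketch below is a hypothetical illustration (not code from the thread): starting from 0, repeated application of N' = N + 1 generates the natural numbers in order.

```python
# A hypothetical sketch: generating the natural numbers from 0 and the
# successor operation N' = N + 1, as described in the post above.

def successor(n):
    """Return the successor N' of N."""
    return n + 1

def generate_naturals(count):
    """Return the first `count` natural numbers, starting from 0."""
    n = 0
    result = []
    for _ in range(count):
        result.append(n)
        n = successor(n)
    return result

print(generate_naturals(5))  # -> [0, 1, 2, 3, 4]
```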
 
  • #39
SW VandeCarr said:
Yes, as regards forming a group.
Huh? I wasn't talking about that -- I was talking about the bit I quoted.



SW VandeCarr said:
As usual you're literally correct. However, is there an algorithm which can be inductively proven to generate all the primes and only the primes?
:confused: If an algorithm didn't generate all the primes and only the primes, I wouldn't have called it such.
 
  • #40
jimmysnyder said:
It can't. A set that contains nothing but the empty set is not itself the empty set. It is a non-empty set. Consider the power set of a non-empty set. The power set contains the empty set.

A concept that was developed after 5-10,000 years of mathematical experience is not intuitive. Even someone as prolific as Euler almost certainly never thought of it. I actually tried to explain how something that was nothing wasn't nothing because it contained something that was nothing. The victims of my intellectual assault weren't mathematicians, but they were very bright engineers. My phrasing actually comes from them. Set Theory, at the level of ZFC, is mostly used by mathematicians for mathematicians (and logicians).
 
  • #41
SW VandeCarr said:
A concept that was developed after 5-10,000 years of mathematical experience is not intuitive. Even someone as prolific as Euler almost certainly never thought of it. I actually tried to explain how something that was nothing wasn't nothing because it contained something that was nothing. The victims of my intellectual assault weren't mathematicians, but they were very bright engineers. My phrasing actually comes from them. Set Theory, at the level of ZFC, is mostly used by mathematicians for mathematicians (and logicians).
A group is a set.
 
  • #42
Hurkyl said:
Huh? I wasn't talking about that -- I was talking about the bit I quoted.

OK. Then I haven't changed my position. As an object class the integers are not derivative. It takes the concept of integer and addition to generate the object class (object class in the object-oriented programming sense).

Confused: If an algorithm didn't generate all the primes and only the primes, I wouldn't have called it such.

Are you saying such an algorithm or formula exists?
 
  • #43
jimmysnyder said:
A group is a set.

Of course. What does that have to do with the discussion? The integers are a set. Euler wouldn't have known what you're talking about however.
 
  • #44
SW VandeCarr said:
A concept that was developed after 5-10,000 years of mathematical experience is not intuitive.
The concept has been around since antiquity. It has been studied in philosophy at least as far back as Parmenides. We even have common English words relating to it. Heck, we even have specialized grammar for it.

That the empty type did not appear as a specific object in modern set theory until recently is because modern set theory is a recent invention. :-p
 
  • #45
SW VandeCarr said:
OK. Then I haven't changed my position.
Then you have contradicted yourself -- in the first of the quotes in #36, you state that
they were not generated by a conscious extension of something more basic.​
and in the second quote, you mention how we generate them as a conscious extension of something more basic.
 
  • #46
Hurkyl said:
The concept has been around since antiquity. It has been studied in philosophy at least as far back as Parmenides. We even have common English words relating to it -- e.g. "nothing".

That the empty type did not appear as a specific object in modern set theory until recently is because modern set theory is a recent invention. :-p

Exactly. The idea of nothing is well expressed intuitively with zero, but for some reason, Western mathematics didn't adopt it until the Middle Ages (from India via Arabia). However, {} is not zero. We have 0, {0} and {}. Now the logical non-mathematician might well ask what the difference is between these. I think even Euler might have asked that. I think he might have been even more perplexed by {{},{{}}}.

Now with ZFC, it becomes clear what these expressions mean, but it's not intuitive. There's a reason why teaching math starts with the natural numbers and addition. There's a cognitive order to the way the brain assimilates mathematics. There's a lot of research along these lines and it's not clear if the historical model of mathematical development is the best and most efficient way for introducing new concepts, but I doubt we'll be seeing {} in the first grade classroom anytime soon.
 
  • #47
Hurkyl said:
Then you have contradicted yourself -- in the first of the quotes in #36, you state that
they were not generated by a conscious extension of something more basic.​
and in the second quote, you mention how we generate them as a conscious extension of something more basic.

No. The object class of integers is basic. Historically and, I believe, cognitively, the integers with addition are basic. The objects in the class are generated as described. I've said this several times. There's no contradiction. The integers, I'm saying, were not the conscious extension of something more basic. However the number 112,985 is generated by a process involving integers. All you need to get started is {0,1,+}. Fractions, on the other hand, were a conscious extension of the concept of integers.

In any case, the first quote from post 36 doesn't show the context, and second quote says nothing about conscious extension.
 
  • #48
SW VandeCarr said:
Exactly. The idea of nothing is well expressed intuitively with zero
Well, yes and no. "Zero" captures an aspect of nothingness in a way similar to how "fifty" captures an aspect of United Statehood.

I think even Euler might have asked that. I think he might have been even more perplexed by {{},{{}}}.
And in the prototypical application of set theory, that makes perfect sense -- how often do you have occasion to treat a type of types of types as an object of study? Laypeople have trouble with logic -- let alone metalogic or metametametametalogic! (i.e. fourth-order logic)

Of course, I doubt Euler would have any difficulty understanding that set in various other applications. For example, I doubt Euler would have any difficulty understanding the hierarchy that set describes (depicted below as a graph), or its application as describing a container containing two containers, one empty, and the other containing an empty container.

[tex]
\begin{matrix}
& & \bullet \\
& \swarrow & & \searrow \\
\bullet & & & & \bullet \\
& & & & \downarrow \\
& & & & \bullet
\end{matrix}
[/tex]

The use of this set as denoting the natural number 2 is not meant to help you do arithmetic: it's meant to help you do set theory. The primary application of that construction is to let us use arithmetic facts to do certain kinds of set-theoretic calculations (and it's pretty darned good at it too). Its most familiar application -- to reduce Peano arithmetic to ZFC -- is a technical argument in model theory, nothing more. (Despite the tendency of people to try and read more into it)

That said, I do think it is a rather pleasing construction -- we name the number N by making use of a set of N objects. And we conveniently have a set of N objects at hand: the set of natural numbers from 0 through N-1.
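The construction can be sketched in a few lines (a hypothetical illustration of mine, not part of the original post): 0 is the empty set, and N' = N ∪ {N}, so the set naming N contains exactly the naturals 0 through N-1.

```python
# A sketch of the von Neumann construction discussed above:
# 0 = {}, and N' = N ∪ {N}, built here with nested frozensets.

def von_neumann(n):
    """Return the von Neumann encoding of n as nested frozensets."""
    s = frozenset()                 # 0 = {}
    for _ in range(n):
        s = s | frozenset([s])      # N' = N ∪ {N}
    return s

zero, one, two = von_neumann(0), von_neumann(1), von_neumann(2)
# `two` is {{}, {{}}}, the set discussed in this thread
assert len(two) == 2                # the set naming N has N elements
assert zero in two and one in two   # namely 0 and 1
```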

There's a lot of research along these lines and it's not clear
And the research I recall is that our instinctive concept of counting number starts becoming fuzzy around the number 5, if not earlier. I believe there's even a pretty good case that our instinctive notion of quantity only has three categories: 0, 1, and more than one. (Although I suspect that one is the fault of language, not instinct)
 
  • #49
SW VandeCarr said:
Historically and, I believe, cognitively, the integers with addition are basic...The integers, I'm saying, were not the conscious extension of something more basic.

This appeal to what is intuitive, what is derived, is cognitively flaky, which is why your epistemology would be better founded in the generality of category theory.

Number and addition would be but an example of the more general mathematical dichotomy of object and morphism. The fundamental entity and its space of actions.

And then the animal/infant research would argue that integers and counting are not a basic cognitive act. Although I know many people, including neuroscientists, have made this claim. All those experiments to "prove" that even newborns and chimps can count.

What is basic to brains, to cognition is dichotomisation - the division into figure and ground, event and context. Indeed, object and morphism. Brains find it very easy and natural to find the one among the many, the signal in the noise. Then with effort, the brain can make a succession of dichotomous identifications and carry in working memory the idea of several entities in several locations.

Two, three, and even four can be seen "at a glance". Get up to five or six, seven or eight, and with training people and chimps can make good guesses. Or switch to a second strategy of serial identification - in effect counting by jumping attention across locations. Smart animals with a lot of training (so not natural but socialised and scaffolded by humans) can mimic counting.

So integers are a derived concept if we are talking about the true cognitive basis of our "mathematical" knowledge.

And so is addition. Kids and chimps can be tested by pouring a squat glass of water into a larger taller one. They will think 1 + 0 < 1. There will seem to be less water when it fills a bigger glass.

Again, this is why it is a mistaken enterprise to hope to build the edifice of maths purely by construction from the bottom up using an atomistic entity like an integer and an atomistic action like addition. The "truth" of mathematics lies in the generality that constrains all maths in all its forms. Which is the reason why category theory is a better route to discovering its fundamentals.
 
  • #50
apeiron said:
This appeal to what is intuitive, what is derived, is cognitively flaky, which is why your epistemology would be better founded in the generality of category theory.

I will be happy to defer to you on category theory. I have only the most superficial knowledge of it. Again, I'm not trying to derive math. It's already been done. It's in the historical record of how math was actually developed. I think this has practical application as to how math might be more effectively taught. If I were to use a metalinguistic approach, I would probably choose set theory, but I've already given my reasons why I excluded set theory.

I'm simply asserting, based on the historical record, that integers are fundamental, without attempting to find the neurophysiologic basis for it. All I want to do is distill the record and make it a basis for teaching math. The very poor state of math education in the US needs some innovation; the "new" math of the 1970s was an abject failure. To say a project of using the actual mainstream evolution of mathematics as a model is flaky, misguided or doomed to failure is to say that mathematics up to and including differential equations is a failure. What I've done so far is no less or more rigorous than the standard textbooks on arithmetic, algebra and analysis. I'm not going any further than that.

The integers are a derived concept if we are talking about the true cognitive basis of our "mathematical" knowledge.

I believe it would be a very illuminating and possibly practical project to find the neurophysiologic basis of mathematics, but it's not my project.
 
  • #51
Hurkyl said:
Of course, I doubt Euler would have any difficulty understanding that set in various other applications. For example, I doubt Euler would have any difficulty understanding the hierarchy that set describes (depicted below as a graph), or its application as describing a container containing two containers, one empty, and the other containing an empty container.

[tex]
\begin{matrix}
& & \bullet \\
& \swarrow & & \searrow \\
\bullet & & & & \bullet \\
& & & & \downarrow \\
& & & & \bullet
\end{matrix}
[/tex]

Euler probably invented graph theory with his Konigsberg Bridge problem and the Eulerian circuit.

As far as using a container model for understanding the empty set, one could argue that a container is something. The conceptual problem, as I see it, is that while the idea of a collection is intuitive, the idea of an empty container inside of a container that is otherwise empty appears to contradict the stricture that there is only one empty set.

And the research I recall is that our instinctive concept of counting number starts becoming fuzzy around the number 5, if not earlier. I believe there's even a pretty good case that our instinctive notion of quantity only has three categories: 0, 1, and more than one. (Although I suspect that one is the fault of language, not instinct)

Whatever instinctive notions pre-school children have of number, most are able to quickly learn how to generate longer integer sequences.
 
  • #52
Perhaps I could mount a serious argument if you actually explained why you think the rationals are derived but the irrationals not. Both arose as a "conscious extension of something more basic." Both have examples arising in nature, as I indicated. In post #23 you gave historical reasons for choosing the integers - fine. This fails to explain the irrationals, which as I'm sure you know, historically arose later than the rationals.
 
  • #53
Ravid said:
Perhaps I could mount a serious argument if you actually explained why you think the rationals are derived but the irrationals not. Both arose as a "conscious extension of something more basic." Both have examples arising in nature, as I indicated. In post #23 you gave historical reasons for choosing the integers - fine. This fails to explain the irrationals, which as I'm sure you know, historically arose later than the rationals.

I don't know how to say it any better than I've already said it. Fractions are compositions of integers. The irrationals are not. No one consciously extended the concept of a fraction to the true nature of the irrationals. It took some time before people realized that you couldn't express pi with a fraction. 22/7 was used a lot by the ancients (and it's not a bad approximation), but they knew it wasn't quite right. Fractions were invented. The irrationals were discovered. They did not arise from any conscious extension of fractions. They were discovered when people tried to express them as fractions. They were an unwelcome intrusion into an otherwise well ordered world.
 
  • #54
SW VandeCarr said:
I'm simply asserting, based on the historical record, that integers are fundamental, without attempting to find the neurophysiologic basis for it. All I want to do is distill the record and make it a basis for teaching math.

Sorry, I didn't realize your mission here was paedagogic. You posted in a philosophy forum.

Even so, that would seem only to make it more important to base teaching on the things brains find most easy to grasp.
 
  • #55
SW VandeCarr said:
I don't know how to say it any better than I've already said it. Fractions are compositions of integers. The irrationals are not. No one consciously extended the concept of a fraction to the true nature of the irrationals. It took some time before people realized that you couldn't express pi with a fraction. 22/7 was used a lot by the ancients (and it's not a bad approximation), but they knew it wasn't quite right. Fractions were invented. The irrationals were discovered. They did not arise from any conscious extension of fractions. They were discovered when people tried to express them as fractions. They were an unwelcome intrusion into an otherwise well ordered world.

The distinction you make here is arbitrary. [tex]\sqrt{2}[/tex] is obtained from 2. The process needed to obtain it is different from that to obtain 1/2, but the difference is not fundamental. People found it hard to accept irrationals, but they were a conscious extension of the rational number counting system that allowed people to express ratios in Euclidean geometry that they otherwise couldn't, just as rationals were an extension of integers allowing people to express parts of a whole (which similarly arise as ratios in Euclidean geometry).

In a more modern sense: rationals are needed so that you can divide. Algebraic integers are needed so that you can take square roots. Transcendentals are needed so that you can take limits. They are all extensions of simpler systems, both historically and logically.
 
  • #56
Ravid said:
The distinction you make here is arbitrary. [tex]\sqrt{2}[/tex] is obtained from 2.

The distinction I make is that the rationals can be expressed exactly (as a ratio of integers) and the irrationals can only be approximated. If that's arbitrary, so be it.
 
  • #57
SW VandeCarr said:
The distinction I make is that the rationals can be expressed exactly (as a ratio of integers) and the irrationals can only be approximated. If that's arbitrary, so be it.

[tex]\sqrt 2[/tex] is an exact expression of the square root of two. What I suggest you mean is that rationals can be expressed exactly as finite combinations of integers with finitary (binary) algebraic operations, e.g. division, as opposed to as solutions of an equation or a limit. But then again, [tex]\sqrt[/tex] is a unary algebraic operation on [tex]\mathbb R^+[/tex], so perhaps not even that.
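The contrast can be made concrete with a sketch (mine, not from the post): the continued-fraction convergents of √2 are each expressed exactly as a ratio of integers, yet none of them squares to exactly 2; they only ever approximate the irrational value.

```python
from fractions import Fraction

# Convergents of sqrt(2) from its continued fraction [1; 2, 2, 2, ...]:
# each is an exact ratio of integers, but none equals sqrt(2) itself.

def sqrt2_convergents(count):
    """Return the first `count` convergents p/q of sqrt(2)."""
    p, q = 1, 1
    out = []
    for _ in range(count):
        out.append(Fraction(p, q))
        p, q = p + 2 * q, p + q     # recurrence for the next convergent
    return out

convergents = sqrt2_convergents(5)
print(convergents)                          # 1, 3/2, 7/5, 17/12, 41/29
assert all(c * c != 2 for c in convergents) # no rational squares to 2
```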

The point is that this difference alone is not sufficient to distinguish between invention or discovery. You may say that the rationals and irrationals were constructed (though admittedly by different methods) or that the constructions were simply a realisation of something that was already 'there'. You should be worried about whether the distinctions you make are arbitrary, because if they are you will find your position difficult to defend.
 
  • #58
Ravid said:
[tex]\sqrt 2[/tex] is an exact expression of the square root of two. What I suggest you mean is that rationals can be expressed exactly as finite combinations of integers with finitary (binary) algebraic operations, e.g. division, as opposed to as solutions of an equation or a limit. But then again, [tex]\sqrt[/tex] is a unary algebraic operation on [tex]\mathbb R^+[/tex], so perhaps not even that.

The point is that this difference alone is not sufficient to distinguish between invention or discovery. You may say that the rationals and irrationals were constructed (though admittedly by different methods) or that the constructions were simply a realisation of something that was already 'there'. You should be worried about whether the distinctions you make are arbitrary, because if they are you will find your position difficult to defend.

OK. Instead of all irrationals, suppose I said only the transcendental numbers were not derivative.
 
  • #59
Hurkyl said:
Quoted: If an algorithm didn't generate all the primes and only the primes, I wouldn't have called it such.

You're not talking about a sieve, are you? To me, a sieve is a dumb algorithm which tests all odd numbers not ending in 5. That's not what I meant. I meant a smart algorithm which can produce all the primes and only the primes. Such an algorithm would also tell us the number of primes over any specified interval of natural numbers.
 
  • #60
SW VandeCarr said:
You're not talking about a sieve, are you? To me, a sieve is a dumb algorithm which tests all odd numbers not ending in 5. That's not what I meant. I meant a smart algorithm which can produce all the primes and only the primes. Such an algorithm would also tell us the number of primes over any specified interval of natural numbers.
That's not how the sieve works. It doesn't test any numbers at all and it deals with numbers divisible by 5 in the same way that it treats numbers divisible by any other prime. If it is used to find out if a particular number is prime it is an algorithm. It can't be considered an algorithm to find all prime numbers though since it would take an infinite number of steps to do that and an algorithm, by definition, can only take a finite number of steps.
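jimmysnyder's description of the sieve can be sketched directly (a hypothetical illustration, not code from the thread): it never tests a candidate for divisibility, it strikes out the multiples of each prime it finds, and it treats 5 exactly like every other prime.

```python
# A minimal sketch of the sieve of Eratosthenes: no number is "tested";
# multiples of each prime found so far are simply struck out.

def sieve(limit):
    """Return all primes up to `limit`, inclusive."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False  # strike out, no divisibility test
    return [n for n in range(2, limit + 1) if is_prime[n]]

print(sieve(30))  # -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```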
 
  • #61
SW VandeCarr said:
You're not talking about a sieve, are you? To me, a sieve is a dumb algorithm which tests all odd numbers not ending in 5. That's not what I meant. I meant a smart algorithm which can produce all the primes and only the primes. Such an algorithm would also tell us the number of primes over any specified interval of natural numbers.

What is a "smart" algorithm? Do you mean an algorithm that has polynomial complexity? I can write you an algorithm that can generate all primes (and only primes) and one for telling you how many primes are in a given finite subset of the naturals. You can tell the algorithm to output infinity whenever you enter an infinite subset.
 
  • #62
Focus said:
What is a "smart" algorithm? Do you mean an algorithm that has polynomial complexity? I can write you an algorithm that can generate all primes (and only primes) and one for telling you how many primes are in a given finite subset of the naturals. You can tell the algorithm to output infinity whenever you enter an infinite subset.
So if I enter the infinite subset { 2, 4, 6, ... }, it will return infinity?
 
  • #63
Focus said:
What is a "smart" algorithm? Do you mean an algorithm that has polynomial complexity? I can write you an algorithm that can generate all primes (and only primes) and one for telling you how many primes are in a given finite subset of the naturals. You can tell the algorithm to output infinity whenever you enter an infinite subset.

Then why do we need to estimate the number of primes less than x asymptotically with x/ln(x)?
 
  • #64
SW VandeCarr said:
Then why do we need to estimate the number of primes less than x asymptotically with x/ln(x)?
Because the algorithm is time consuming. An algorithm must complete in a finite number of steps. You would be surprised to learn how large some numbers can be and still be considered finite.
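The trade-off under discussion is easy to check numerically for small x (a hypothetical sketch, not from the thread): counting primes exactly by trial division is slow and exact, while x/ln(x) is instant and asymptotically close.

```python
import math

# Compare the exact prime count pi(x), computed here by naive trial
# division, with the asymptotic estimate x/ln(x) discussed above.

def prime_count(x):
    """Count the primes <= x (fine for small x)."""
    def is_prime(n):
        return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(2, x + 1) if is_prime(n))

for x in (100, 1000, 10000):
    estimate = x / math.log(x)
    print(x, prime_count(x), round(estimate, 1))
```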
 
  • #65
jimmysnyder said:
Because the algorithm is time consuming. An algorithm must complete in a finite number of steps. You would be surprised to learn how large some numbers can be and still be considered finite.

Of course. That's why, de facto, we have no provable formula that will generate all the primes and only the primes.
 
  • #66
SW VandeCarr said:
Of course. That's why, de facto, we have no provable formula that will generate all the primes and only the primes.
Algorithm. You have not proved that there is no algorithm, you have only proved that you think there isn't one.
 
  • #67
SW VandeCarr said:
Of course. That's why, de facto, we have no provable formula that will generate all the primes and only the primes.
No, that's obviously not a proof that such a formula does not exist. We do have an algorithm which generates all primes (sieve of Eratosthenes). It is an algorithm, admittedly not very efficient, but perfectly valid still.

Can you prove to me that there is no polynomial (possibly of very large degree) such that P(n) is the n-th prime? That would be another algorithm. It would suffice to evaluate the polynomial at every integer to get all the primes. It would still take an infinite time, but it would be more efficient than the sieve of Eratosthenes.
 
  • #68
humanino said:
No, that's obviously not a proof that such a formula does not exist. We do have an algorithm which generates all primes (sieve of Eratosthenes). It is an algorithm, admittedly not very efficient, but perfectly valid still.

Can you prove to me that there is no polynomial (possibly of very large degree) such that P(n) is the n-th prime? That would be another algorithm. It would suffice to evaluate the polynomial at every integer to get all the primes. It would still take an infinite time, but it would be more efficient than the sieve of Eratosthenes.

No. I'm saying we don't have an efficient formula to generate the nth prime. I'm not saying one can't possibly exist.
 
  • #69
jimmysnyder said:
Algorithm. You have not proved that there is no algorithm, you have only proved that you think there isn't one.

See my response to humanino. Also, mathematical formulas with numerical outputs and algorithms are implemented the same way, as computational steps.
 
  • #70
humanino said:
Can you prove to me that there is no polynomial (possibly of very large degree) such that P(n) is the n-th prime?
Yes. Let P be a polynomial of degree m such that P(n) is prime for all n.
[tex]P(x) = a_mx^m + ... + a_0[/tex]
Then P(1) = p where p is a prime, so P(1) = 0 (mod p). So for any k,
[tex]P(1 + kp) = a_m(1+kp)^m + ... + a_0[/tex]
[tex] = (a_m + a_mb_m) + ... + (a_1 + a_1b_1) + a_0[/tex]
(where [tex]b_i[/tex] is divisible by p for all i)
[tex]\equiv P(1) \equiv 0 \pmod{p}[/tex]
so P(1 + kp) is divisible by p for every k. If each P(1 + kp) is to be prime, it must equal p itself; but then P(x) - p has infinitely many roots, so P is the constant polynomial p, which produces only the single prime p.
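Euler's polynomial n² + n + 41 gives a concrete instance of this argument (my illustration, not part of the post): it is prime for n = 0 through 39, but taking p = P(1) = 43, the congruence above guarantees that p divides P(1 + p), and indeed P(44) = 2021 = 43 · 47.

```python
# Euler's polynomial P(n) = n^2 + n + 41: prime for n = 0..39, yet with
# p = P(1) = 43 the argument above forces p to divide P(1 + p) = P(44).

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def P(n):
    return n * n + n + 41

assert all(is_prime(P(n)) for n in range(40))  # prime for n = 0..39
assert P(1) == 43 and is_prime(43)
assert P(1 + 43) == 43 * 47                    # P(1 + p) divisible by p
```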
 