Is Zero a Real Concept or Just a Metaphysical Idea?

In summary: The Sumerians were the first to develop a counting system to keep an account of their stock of goods - cattle, horses, and donkeys, for example. The Sumerian system was positional; that is, the placement of a particular symbol relative to others denoted its value. The Sumerian system was handed down to the Akkadians around 2500 BC and then to the Babylonians in 2000 BC. It was the Babylonians who first conceived of a mark to signify that a number was absent from a column; just as 0 in 1025 signifies that there are no hundreds in that number. Although
  • #71
Nice to see you both took so much out of that little post. :rolleyes:

wrongusername said:
From wikipedia:



I think you're the first person I know to claim irrational, uh, numbers, are not numbers :-p

Wikipedia also regurgitates the myth that turbochargers run on "otherwise wasted energy".
 
  • #72
SonyAD said:
It has everything to do with physical quantities. Physical quantities are the ultimate result and pursuit of any worthwhile computation. Even if these physical quantities are virtual (in the case of simulations - 3D games, for instance).

What is there you may wish to compute (or even be able to reliably compute without being able to test your predictions against reality while developing the required math) that is not of the physical world or in semblance of it?

The answer? Insanity. The pointless intellectual perdition modern mathematicians love to indulge themselves in while leaving such basic, fundamental questions as:

"What is the area of overlap between two random triangles?"
or
"What is the clockwise area of a complex polygon?"

Not even addressed.



What proof do you have they are even numbers? That's just a tenet of dogma.

Would you call infinity a number? Do you think it exists? Do you think it makes sense? That which has no value because it is boundless. Even though it has no value (because it is boundless), its value must also be greater at any time in its existence than it had ever been until that point. It must grow. Otherwise how can you accommodate the fact that, no matter how much you keep on churning decimals, you never quite get to its value?

Sounds like religion to me.

You will have to understand that mathematics, or rather 'mathematical activity' is formal manipulation. It does not correspond to any physical fact of nature. Mathematics makes no predictions about reality whatsoever! The formal premises/axioms might have their motivation in our intuition of various concepts or phenomena, but doing mathematics is expanding our calculus of formal 'truth'. Some find this 'insane' activity quite interesting.

However, playing on your premises; it is clearer now than ever how interconnected mathematics and physics are. To say that mathematicians have wandered on a lone road of pointless indulgence in abstract nonsense is further off than you can imagine. Modern physics is quite clearly dependent on much of modern mathematics and makes good use of the 'fallacious concepts' like the basic notions of irrational numbers and infinity.

You seem to have a clear-cut opinion of what a number is and what it is not, but can you define a number? How do you define value? What makes you think that you decide what is allowed to be called a number, and what isn't? What 'numbers' mean is exactly the usage of numbers, and the usage is not bound by anything but convention. What you call 'dogma' I call natural elasticity.

I could very well call infinity a number (of course in the context of e.g. the extended real line), but I don't - but this is a matter of convention. Infinity in the context of the extended real line is not a natural, rational or real number; but there is nothing which hinders the conventional definition of 'number' from including infinity in this case.

SonyAD said:
But, who knows? Maybe irrational numbers are why the universe is expanding, no?
Certainly, no.

EDIT: You will have to explain what you mean by a 'random triangle', :rolleyes: and how they are part of such basic and fundamental questions.
 
Last edited:
  • #73
Jarle said:
You will have to understand that mathematics, or rather 'mathematical activity' is formal manipulation. It does not correspond to any physical fact of nature.

Fascinating, the things I get you to say.

So basic mathematical operations such as addition, subtraction, etc. applied to natural numbers have no rooting in nor relation to anything of encompassing reality? Or what it means to be in a basket?

Adding two apples to three already in the basket does not make five apples in the basket? Or are you going to harp on the definition of an apple?

Jarle said:
Mathematics makes no predictions about reality whatsoever!

Really! It does not predict that 5 kg of concrete added to 5 kg already in the shopping trolley will make for 10 kg of concrete in the trolley?

What is the purpose of mathematics then?

Jarle said:
The formal premises/axioms might have their motivation in our intuition of various concepts or phenomena, but doing mathematics is expanding our calculus of formal 'truth'. Some find this 'insane' activity quite interesting.

Cool. If only anyone were working on stuff remotely, remotely useful to me.

Jarle said:
However, playing on your premises; it is clearer now than ever how interconnected mathematics and physics are.

I only tackle the stuff I can wrap my head around. When someone knocks me out (https://www.physicsforums.com/showpost.php?p=2789257&postcount=15) I scurry on back to my hole as soon as I come to and shut up.

Jarle said:
To say that mathematicians have wandered on a lone road of pointless indulgence in abstract nonsense is further off than you can imagine. Modern physics is quite clearly dependent on much of modern mathematics and makes good use of the 'fallacious concepts' like the basic notions of irrational numbers and infinity.

Not knowing/understanding what something is does not necessarily preclude its use. As intended or natural or otherwise.

Jarle said:
You seem to have a clear-cut opinion of what a number is and what it is not, but can you define a number?

Numbers have definite & definitive values, for starters. These values don't change the closer you look at them.

Jarle said:
How do you define value?

How do you prove an axiom?

Jarle said:
What makes you think that you decide what is allowed to be called a number, and what isn't?

Common sense.

Jarle said:
What 'numbers' mean is exactly the usage of numbers, and the usage is not bound by anything but convention. What you call 'dogma' I call natural elasticity.

The freedom to call a spade a fork.

Jarle said:
I could very well call infinity a number (of course in the context of e.g. the extended real line), but I don't - but this is a matter of convention. Infinity in the context of the extended real line is not a natural, rational or real number; but there is nothing which hinders the conventional definition of 'number' from including infinity in this case.

If you admit infinity (let alone call it a number) then, by implication, you must accept that division by 0 makes sense. And you must share with us the exact value of any given number divided by 0 and how to compute this value.

A triangle being random only means there is no special/particular numerical relation between the lengths of its sides or any special value to its angles.
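
As an aside, the 'clockwise area of a complex polygon' question mentioned above does have a standard answer: the shoelace formula gives a signed area whose sign records the traversal direction. A minimal Python sketch, purely for illustration (for a self-intersecting polygon it returns the winding-weighted area):

```python
def signed_area(points):
    """Signed area of a polygon given as a list of (x, y) vertices.

    Negative for clockwise vertex order, positive for counter-clockwise,
    so the 'clockwise area' is the absolute value of a negative result.
    """
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1   # shoelace cross-product term
    return area / 2.0

# A unit square traversed clockwise gives -1.0
print(signed_area([(0, 0), (0, 1), (1, 1), (1, 0)]))
```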
 
Last edited by a moderator:
  • #74
Reality only intrudes into mathematics at the level of its axioms. But it does intrude there.

Yes, once an axiom is assumed, then all else that follows is formal modelling - free of reality even if it happens to fly in a parallel path to reality.

But the creation of axioms is an informal exercise. You could choose something as a matter of whim, or because it seems "logical", or intuitive. But in fact axioms tend to get chosen because they seem true by some kind of generalised observation of the world. So our view of what is real does intrude at the start of things because all our ideas are ultimately grounded in our experience.

We can get much more specific about this business of getting started. For example, there is Peirce's logic of vagueness and process of abduction.

But we don't need to get that specific to see the central confusion that is being expressed in this thread.

Yes, formal systems based on axioms are no longer part of reality. And yes, axioms are "unreal" statements too. But yes, axioms are justified by something in the end, and it is our general informal impressions of what is true about our experiences of reality.

So numbers can both fail to really exist, while absolutely existing formally. And to the extent that the two worlds fly along in parallel, most people will never think about the essential difference. But philosophically, it matters that there is a gap and a relationship that bridges that gap.
 
  • #75
SonyAD said:
Fascinating, the things I get you to say.

So basic mathematical operations such as addition, subtraction, etc. applied to natural numbers have no rooting in nor relation to anything of encompassing reality? Or what it means to be in a basket?

I never said mathematics has no relation to reality; we use mathematical reasoning in many physical situations. The point you must understand is that mathematics does not correspond to physical reality; mathematics deals with definite formal rules of mathematical concepts which may very well be abstracted from physical situations. That does not mean it is 'rooted' in physical reality.

Natural numbers are such an example. Arithmetic is purely the formal manipulation of the symbols we call numerals according to definite rules, while still being incredibly useful in real situations. It is a critical fact of mathematics that it deals only in formality.

SonyAD said:
How do you prove an axiom?

Do you know what an axiom is?

SonyAD said:
If you admit infinity (let alone call it a number) then, by implication, you must accept that division by 0 makes sense. And you must share with us the exact value of any given number divided by 0 and how to compute this value.

No, that is not an implication. However, division by 0 does make sense! (or more precisely: it can make sense)

http://en.wikipedia.org/wiki/Riemann_sphere
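
As a rough illustration of how such conventions can be written down as formal rules (a toy sketch of my own, not anything taken from the article), division extended to the Riemann sphere might look like this:

```python
# Toy model of division on the Riemann sphere (extended complex plane).
INF = "infinity"   # stand-in symbol for the single point at infinity

def riemann_div(z, w):
    """Division extended to C union {infinity}; 0/0 and inf/inf stay undefined."""
    if z == INF and w == INF:
        raise ValueError("infinity / infinity is undefined")
    if z == INF:
        return INF                 # inf / w = inf for finite w
    if w == INF:
        return 0                   # z / inf = 0 for finite z
    if w == 0:
        if z == 0:
            raise ValueError("0 / 0 is undefined")
        return INF                 # z / 0 = inf for z != 0
    return z / w

print(riemann_div(1, 0))           # -> 'infinity'
print(riemann_div(5, INF))         # -> 0
```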



EDIT: Apeiron makes an important point when he separates the informal process of choosing one's axioms from the formal deduction which takes place afterwards. However, only the latter part is mathematics, or 'mathematical activity' (not ignoring the extreme importance of this process to mathematics).
 
Last edited:
  • #76
Hurkyl said:
You have your facts wrong; the Banach-Tarski paradox is not a logical contradiction.

What it does do is demonstrate vividly that the notion of "measure" does not behave well when applied to a calculation involving sets for which the notion of measure does not apply.
So what axiom, rule, etc., was broken in the Banach-Tarski paradox? Limits are defined as Lebesgue measurable in many cases. What is the difference between "does not behave well" and "logical contradiction"? To say the notion of measure does not apply is not how Lebesgue measures are defined.
 
  • #77
apeiron said:
So numbers can both fail to really exist, while absolutely existing formally.

How do you decide whether a number really exists, rather than just existing formally?
 
  • #78
my_wan said:
So what axiom, rule, etc., was broken in the Banach-Tarski paradox? Limits are defined as Lebesgue measurable in many cases. What is the difference between "does not behave well" and "logical contradiction"? To say the notion of measure does not apply is not how Lebesgue measures are defined.
Part of the definition of a measure is a specification of what sets are measurable (with respect to that measure).

If A, B, C, D, and E are disjoint measurable sets, then:
m(A ∪ B ∪ C ∪ D ∪ E) = m(A) + m(B) + m(C) + m(D) + m(E)

If A is measurable, A' is the transform of A by a Euclidean motion, and m is the Lebesgue measure, then A' is measurable, and:
m(A) = m(A')

Combining these, we can prove:

Theorem: If A, B, C, D, and E are disjoint measurable sets, m is the Lebesgue measure, A' is the transform of A by a Euclidean motion and similarly for B', C', D', and E', and A', B', C', D', and E' are disjoint, then
m(A ∪ B ∪ C ∪ D ∪ E) = m(A' ∪ B' ∪ C' ∪ D' ∪ E')



Note that all of the theorems above have the hypothesis that the sets involved are measurable. Some of the five sets constructed in the proof of the Banach-Tarski paradox are not measurable, and thus do not satisfy the hypothesis of the theorem.

(aside: the Banach-Tarski paradox uses the axiom of choice in the construction, which is why it comes up in discussions over that axiom)
 
  • #79
Jarle said:
How do you decide whether a number really exists, rather than just existing formally?

By prediction and measurement.

The best modern epistemologist IMHO is Robert Rosen who was a mathematical biologist and wrote great stuff on the modelling relation. His last book, Essays on Life Itself, is all about precisely this issue.

So say we are talking about the number 1. We have defined it formally as a model of "one-ness". It exists formally. But how do we find exactly 1 of anything in the real world? We take the model and use it to justify a process of prediction and test.

Is that one apple I see on the table? Well, I might have to walk all round it to be sure a second, or an infinity of apples, is not hidden just behind it, blocked from my line of sight. I might have to reach out and touch it, to be sure it is not a collection of carefully painted beetles holding some apple like formation. Or if we zoom in for a microscopic view, at some point everything dissolves into atoms (whatever they look like) and we can wonder where this exact apple starts and ends. Etc, etc. If you are talking about absolute certainty, then there are in reality an unlimited number of doubts that must be ruled out.

So can we ever really know that some physical thing is an exact example of one-ness, as formally defined? In pragmatic fashion, we can pretty quickly agree that we only have a single apple on the table. All doubt seems trivial. But doubt must always remain because reality does not seem completely measurable.

One, of course, seems the simplest number to relate to reality. Others like pi and infinity remain more troublesome.

And I think there is then another whole level to this particular debate to do with "doubt" within formal models themselves.

Quickly, axiomatic truths are in fact inevitably dichotomistic. You cannot confidently assert one thing without also creating the equally definite possibility of its opposite (thesis and antithesis as Hegel said). So you say discrete, I say continuous. You say local, I say global. You say event, I say context. You say stasis, I say flux. You say determined, I say random.

All basic concepts come in complementary pairs as all crisply definite assertions are symmetry breakings (the breaking of the symmetry of vagueness or ignorance into asymmetric polar opposites).

And so, secretly - it is rarely acknowledged, except by category theory! - all mathematical systems have a fundamentally mixed nature. They must employ both ends of a dichotomy, even if they prefer to suppress awareness of one of the ends.

So with number theory. We have discrete numbers existing on a continuous line. Both aspects are essential to the formal model, but one aspect is suppressed.

For example, the number 1 is just taken as a discrete point (on a continuous line). This is a formal statement that seems to need no further "measuring". It just is.

But say we wanted to check? Well what we are really saying is that 1 is 1.000... Check its location on the line to as many decimal places as we like, and it will be exactly there. But of course, we also know there is a practical issue when it comes to establishing infinite facts. In practice, we can never arrive at a final count. The continuity that we have tried so hard to push out of view is here reasserting itself.

Again, the 1-ness of the number 1 is about the least troubling either in the real world, or within its own world, the realm of axiomatic formal modelling. But even for 1, there is a hidden duality behind the presumed monadic description. Counting appears based on the notion of fundamental discreteness, but for exactly this reason, it is just as much based (formally, axiomatically) on the assumed absence of fundamental continuity (and hence, in practice, by the suppression of what must also exist as part of initial state of possibility, back when axioms were being formed and reality was still intruding).

It is a case of A and not-A. You make a division and you must create two things. Both are equally real. But in your model, you just want to keep things simple and use the A. And suppress any non-A-ness. So 1 is discretely just 1, and you don't have to run round constantly stamping out threatened confusion from 1.00000... sometimes being actually 1.000001, or some other infinitesimal fluctuation.

However, when it comes to irrationals and infinities, people are more aware that the counter-balancing option of continuity is being actively suppressed (suppressed axiomatically). So they will protest and try to re-open the door to fundamental doubt. And if they go all the way back to axiom-formation, they can see doubt is justified - numbers are not real - but also that there was a reason why the formalism went with option A rather than option not-A.

Imaginary numbers are the same. To me, the notion of 2-dimensional numbers, or n-dimensional numbers, seems quite natural. A point is a constraint on a line, but a line is a constraint on a plane, which is a constraint on a volume, etc. So you can play about on the spectrum between the absolutely discrete (a zero-D point) and the absolutely continuous (an infinite, unconstrained, dimensionality).

Again, the fundamental continuity (or its suppression) becomes more obvious and so more troubling with imaginary numbers, but it is there for all numbers in a necessary fashion. To have A, you have to make not-A. To have a figure, you must have ground. To have an event, you must have context.

Which hopefully loops round to my initial points about vagueness, zero and the null set. For zero to be a local absence, it must exist in the context of a global presence. The formalism of set theory wants to throw away the {} along with its contents - treat them as a nothing as well. But it would be less confusing to accept them for what they are, a necessary part of breaking the symmetry of pure possibility. To have thesis, you must also have anti-thesis.

And vagueness then is this realm of the pure unbroken possibility, a state of infinite symmetry. Imagine a place which is neither discrete nor continuous, neither random nor determined, etc, etc. Yet can be divided into these crisp, mutually-defining, polarities.
 
  • #80
Hurkyl said:
Part of the definition of a measure is a specification of what sets are measurable (with respect to that measure).

If A, B, C, D, and E are disjoint measurable sets, then:
m(A ∪ B ∪ C ∪ D ∪ E) = m(A) + m(B) + m(C) + m(D) + m(E)

If A is measurable, A' is the transform of A by a Euclidean motion, and m is the Lebesgue measure, then A' is measurable, and:
m(A) = m(A')

Combining these, we can prove:

Theorem: If A, B, C, D, and E are disjoint measurable sets, m is the Lebesgue measure, A' is the transform of A by a Euclidean motion and similarly for B', C', D', and E', and A', B', C', D', and E' are disjoint, then
m(A ∪ B ∪ C ∪ D ∪ E) = m(A' ∪ B' ∪ C' ∪ D' ∪ E')



Note that all of the theorems above have the hypothesis that the sets involved are measurable. Some of the five sets constructed in the proof of the Banach-Tarski paradox are not measurable, and thus do not satisfy the hypothesis of the theorem.

(aside: the Banach-Tarski paradox uses the axiom of choice in the construction, which is why it comes up in discussions over that axiom)

Ok, this makes sense. My particular perspective is not built from a purely mathematical point of view. It is based more on my utilitarian application of math, and the applicability is not entirely dependent on a mathematician's nose for details.

It seems that, given what you provided, Banach-Tarski intentionally transformed disjoint measurable sets, or different equivalence classes, to impose just the effect it had.
 
  • #81
my_wan said:
It seems that, given what you provided, Banach-Tarski intentionally transformed disjoint measurable sets, or different equivalence classes, to impose just the effect it had.
Right.

People have a habit of focusing on conclusions too much -- they balk when they see a theorem that concludes a single ball can be rearranged into two balls.

However, much of the point of theorem proving is not to prove conclusions, but to derive hypotheses: Banach-Tarski is a vivid demonstration that non-measurable sets fail to have the geometric properties we would like to demand of shapes in three-space -- and that failure can manifest itself even when dealing with the nicest of shapes such as a ball.

(I believe previous demonstrations had involved more convoluted sets, so some might be inclined to intuit that all of the oddities are confined to weird sets, and that as long as you start and end with reasonable shapes everything works out fine)

So the conclusion to be derived from Banach-Tarski is that when doing geometry, you should restrict yourself to measurable sets. This issue rarely comes up in practice, because we already make a habit of working with nice sets -- but it's good to understand the range of applicability of the tools you want to use. (and it is also good to know that you can render non-measurable sets non-existent with an appropriate denial of the axiom of choice)

(There's an adage I like -- you can't claim to understand what something is unless you also understand what it is not)
 
  • #82
apeiron said:
So can we ever really know that some physical thing is an exact example of one-ness, as formally defined?

The quest for an ontological 'one-ness' - the question of whether a physical thing actually possesses the quality of one-ness, or is an example of one-ness - seems to me a completely useless exercise.

The important thing in the situation you brought up is that we 'see' one apple on the table. It is irrelevant for us if there actually are two (one being invisible or hidden). We still only count 1 apple, and behave accordingly. What matters to us is our picture of the situation, and how we think of it. This is where 1 enters reality, through counting the apple. The only ontology of the natural numbers you will find lies in the way we operate with them, so I claim that all numbers exclusively exist formally. It is only through formal use (arithmetic, labeling, enumerating, counting) that natural numbers make sense at all.

apeiron said:
Which hopefully loops round to my initial points about vagueness, zero and the null set. For zero to be a local absence, it must exist in the context of a global presence. The formalism of set theory wants to throw away the {} along with its contents - treat them as a nothing as well. But it would be less confusing to accept them for what they are, a necessary part of breaking the symmetry of pure possibility. To have thesis, you must also have anti-thesis.

I don't understand your insistence that 0 is a local absence; it is a formal character used in various ways. When measuring temperature 0 is just another temperature on the scale, it is not the 'absence of degrees'. It might be an arbitrary lottery number. It can mean 'false'. 0 does not represent some fundamental feature of reality, it is a tool (in non-mathematical usage).

But why do you say we "treat them as a nothing" in set theory? And how can you refer to what they "really are"? In themselves they are not more than symbols on paper, on a screen, or mentally before your eyes.
 
Last edited:
  • #83
Hurkyl said:
Right.

People have a habit of focusing on conclusions too much -- they balk when they see a theorem that concludes a single ball can be rearranged into two balls.

However, much of the point of theorem proving is not to prove conclusions, but to derive hypotheses: Banach-Tarski is a vivid demonstration that non-measurable sets fail to have the geometric properties we would like to demand of shapes in three-space -- and that failure can manifest itself even when dealing with the nicest of shapes such as a ball.

...
My issue came when, it seemed to me, limits could be defined as Lebesgue measurable without specific reference to the class they were measurable wrt, as if it were either an absolute property or not. In that case I didn't see a specific violation in Banach-Tarski. So even though this justifies my view that it inappropriately mixed equivalence classes, this wasn't new, and I was being over-judgmental in thinking Banach-Tarski represented a valid result in the standard formalism. That I can accept.
 
  • #84
Jarle said:
The only ontology of the natural numbers you will find lies in the way we operate with them, so I claim that all numbers exclusively exist formally. It is only through formal use (arithmetic, labeling, enumerating) that natural numbers make sense at all.

I agree that numbers exist only as part of a formal model. But then the argument becomes whether formal models only exist to model reality. A scientist kind of thinks so, therefore sees modelling as a relation with reality (which is where the prediction and measurement must come in). A mathematician may take the different view that once you have invented the realm of the formal model, you can just explore its interior space forever.

Jarle said:
I don't understand your insistence that 0 is a local absence; it is a formal character used in various ways. For example, when measuring temperature 0 is just another temperature on the scale, it is not the 'absence of degrees'. It might be an arbitrary lottery number. It can mean 'false'. 0 does not represent some fundamental feature of reality, it is a tool (in non-mathematical usage). I will say the same of {}.

I was responding to the issue of representing non-existence, nothingness, with a symbol that has a place on the numberline. As a limit on "thingness", clearly zero is not just another number (like 2 or 5). We can see this from the ill behaviour of 0 when we try to divide other numbers by it. 0 was only masquerading as merely another digit.

Counter-examples like temperature seem poorly chosen. The zero is normally set at some significant place. In centigrade, it marks the freezing point of water (the limit of liquid). Then as science really got to know the world, the zero was reset at absolute zero on the kelvin scale - the limit on motion.

Jarle said:
But why do you say we "treat them as a nothing" in set theory? And how can you refer to what they "really are"? In themselves they are not more than symbols on paper, on a screen, or mentally before your eyes.

I'm not really understanding your point now. If you are saying that symbols seem highly arbitrary in relation to what they represent, then yes, this is a standard point in semiosis and other theories of symbol-grounding. It is only if a symbol is as detached as possible from what it is meant to represent that it can freely function as a symbol.

I think the real problem here is that you are taking the naive Saussurian view of symbols and not a Peircean one. In one, the meaning of symbols is just convention. In the other, the meaning has a process of development.

http://www.aber.ac.uk/media/Documents/S4B/sem02.html
 
Last edited by a moderator:
  • #85
apeiron said:
Counter-examples like temperature seem poorly chosen. The zero is normally set at some significant place. In centigrade, it marks the freezing point of water (the limit of liquid). Then as science really got to know the world, the zero was reset at absolute zero on the kelvin scale - the limit on motion.

What I meant was that on the temperature scale 0 functioned as a demarcation, not the absence of anything. My point is that it seems useless, or pointless, to try to "pin down" the meaning of 0. It is often used where it is useful in relation to its properties in e.g. arithmetic, but it is not always so.

apeiron said:
I think the real problem here is that you are taking the naive Saussurian view of symbols and not a Peircean one. In one, the meaning of symbols is just convention. In the other, the meaning has a process of development.
http://www.aber.ac.uk/media/Documents/S4B/sem02.html

I am speaking of mathematics here, and the article did not seem to concern itself with it. It is not my intention to extrapolate my arguments to all of symbolism, so I don't think I am taking the Saussurian view (mainly because I have not heard of it before).
 
Last edited by a moderator:
  • #86
Jarle said:
I am speaking of mathematics here, and the article did not seem to concern itself with it. It is not my intention to extrapolate my arguments to all of symbolism, so I don't think I am taking the Saussurian view (mainly because I have not heard of it before).

Mathematics is a language - a system of words and grammar, symbols related by logic. Or as category theory puts it, objects and morphisms.

As such, it is appropriate when asking "what's it all about" - as you have been doing - to step back to general theories of such activities.

Both Saussure and Peirce agree that symbols must be essentially free in the way you describe to function as logic-related symbols in a formal system. But then they differed in how symbol systems are then related back to the realities they model - which is the live issue in this thread. Saussure saw a simple associative relationship - by convention - whereas Peirce saw a deeper developmental relationship - by pragmatic usage.

Science went with Peirce's pragmatism. Perhaps it is a cultural thing, but mathematicians seem to want to resist the idea that maths has a practical relationship to reality.

Of course, this behaviour is also functional in that it justifies pursuing patterns "for their own sake". Maths can go off into the wilderness of ideas without having to have some immediate purpose. (But, say the mathematicians, look how often this activity turns out to create the patterns that in fact are useful to science's next generation of models.)

So a lot of what I am hearing in your responses sounds like boundary maintenance - attempting to maintain the ingroup/outgroup line. You are one of us if you agree maths is apart from reality, one of them if you think maths is bound to reality.

And in fact, I am one of the other in thinking both things at the same time. Like all languages, maths gains its power by being apart from what it describes (0 can be made to mean anything I like), but exercises this power by then actually describing things (ooh, look, this is what I mean by 0).
 
  • #87
apeiron said:
Of course, this behaviour is also functional in that it justifies pursuing patterns "for their own sake". Maths can go off into the wilderness of ideas without having to have some immediate purpose. (But, say the mathematicians, look how often this activity turns out to create the patterns that in fact are useful to science's next generation of models.)

So a lot of what I am hearing in your responses sounds like boundary maintenance - attempting to maintain the ingroup/outgroup line. You are one of us if you agree maths is apart from reality, one of them if you think maths is bound to reality.

Our discussion is not about the motivation for pursuing mathematics, it is the status of mathematics itself. I can easily understand one motivation being to "model reality", but just as easily I can understand any other motivation. Whatever the reasons might be, and whatever is being studied, mathematicians are always 'doing mathematics' and it is quite important to be clear about what that is. If this is 'boundary maintenance', then it's important.

For example; as seen in this thread, it must be clear that there is a fundamental difference between criticizing the Banach-Tarski paradox for how it breaks with intuition, and, say, criticizing a logical error in the proof. If mathematics somehow ought to relate to reality, the former might be conceived as a relevant critique. It is, as Hurkyl said, an aesthetic or practical appeal. It must be clear that only the latter is relevant to mathematics. That is important boundary maintenance.

Consider the mathematical study of chess, or, say, Rubik's cube. Is this modeling reality, or is it pursuing patterns for their own sake? I can hardly see the difference. Is this (or ought it to be) somehow "less mathematics" than calculating the trajectory of a bullet?

EDIT: I will read the article, it seems interesting.
 
Last edited:
  • #88
Jarle said:
Our discussion is not about the motivation for pursuing mathematics, it is the status of mathematics itself. I can easily understand one motivation being to "model reality", but just as easily I can understand any other motivation.

Hmm, but epistemology is trickier than this. For something to have a status detached from our purposes? - that is exactly the kind of classical presumption that I was challenging here. While at the same time agreeing that maths does strive as its purpose for a maximum degree of such detachment. However, then, the reason for pursuing a formal detachment is only, ultimately, to do a better job of modelling reality.

You can claim that other motivations are possible - such as aesthetics, a commonly cited purpose. But who can even define what aesthetics is really about (and so know that they are successfully pursuing it)? I am arguing that there is only one true purpose for developing any language, and that is to develop a modelling relation between a modeller and the modeled. It does not seem hard to show that is a widely accepted purpose of maths. And while "any purpose" comes as a possibility due to the great detachment of maths from reality (the creative freedom which makes it in the end so powerful as a reality-describing language), in practice, no other purpose really holds up. A putative purpose like aesthetics is rather unintelligible - except as a roundabout way of making a modelling connection back to reality (saying "beauty is truth" therefore beauty is what is most deeply real...if you cared to go out and measure that fact).

Jarle said:
For example; as seen in this thread, it must be clear that there is a fundamental difference between criticizing the banach-tarski paradox for how it breaks with intuition, and, say, criticizing a logical error in the proof. If mathematics somehow ought to relate to reality, the former might be conceived as a relevant critique. It is, as Hurkyl said, an aesthetic or practical appeal. It must be clear that only the latter is relevant to mathematics. That is important boundary maintenance.

The paradox shows a particular method has limits. It models reality quite reasonably across a large domain, but trips up at the final step. And there seems two proper responses to this kind of break-down in a model. First, we should recognise it as a paradox of the model and not of reality (as Hurkyl argued). And second it might suggest that a better model is still possible (and the way to find such a model could lie in going back and re-examining the axioms used, seeing if a better set of assumptions might lead us somewhere different). Getting back in touch with our intuitions about reality, I would argue.

And vagueness is an example of an ancient intuition which (along with dichotomies and hierarchies - the ways a vagueness can develop) has never really been mathematised. It is a path not yet properly explored, even though we have a whole bunch of first steps in that direction, such as Peircean semiotics, chaos theory, hierarchy theory, generative neural networks, etc.
 
  • #89
apeiron said:
I am arguing that there is only one true purpose for developing any language, and that is to develop a modelling relation between a modeller and the modeled.

Are you claiming that each (or most) individual developer of a language has this purpose as his motivation, or are you saying that this is what, regardless of individual motivation, language seems to strive towards? In either case (though the former is arguably wrong), I can't see how it is relevant to the status of the language itself. Actually, I would be wary of talking about the 'true purpose' of anything.

Mathematics is, without regard to extra-mathematical usage and motivations of individual mathematicians, a purely formal, syntactical discipline. The reason for this is not because we want it to be so in order to 'model reality better', it lies in the very nature of mathematics. We see that it is necessarily so when we see how mathematics is done. In the end, it seems that mathematics generally is pretty indiscriminate when it comes to the degree of applicability to physical modeling.

Besides: it is most likely that an individual cannot give a sufficiently complete account of his 'inner motivation' for doing what he does. Maybe this 'inner motivation' is not all that important either, or maybe it doesn't even exist. I would actually suspect that such reasons largely are created to compensate for the apparent lack of definite motivation. I personally can't honestly point to a more definite motivation than "it's interesting".

apeiron said:
Getting back in touch with our intuitions about reality, I would argue.

Perhaps this is what one would want. But such paradoxes are not due to a flaw in the mathematical process itself; the objections always come from outside the discipline. In fact, I think most would agree that such a paradox is a positive thing, it shows us where to be careful. Now we know to restrict ourselves to measurable sets if we want to preserve volume in rigid transformations. By using these sets we can again enjoy our intuitive feel.
 
  • #90
Jarle said:
Are you claiming that each (or most) individual developer of a language has this purpose as his motivation, or are you saying that this is what, regardless of individual motivation, language seems to strive towards? In either case (though the former is arguably wrong), I can't see how it is relevant to the status of the language itself. Actually, I would be wary of talking about the 'true purpose' of anything.

Mathematics is, without regard to extra-mathematical usage and motivations of individual mathematicians, a purely formal, syntactical discipline. The reason for this is not because we want it to be so in order to 'model reality better', it lies in the very nature of mathematics. We see that it is necessarily so when we see how mathematics is done. In the end, it seems that mathematics generally is pretty indiscriminate when it comes to the degree of applicability to physical modeling.

Besides: it is most likely that an individual cannot give a sufficiently complete account of his 'inner motivation' for doing what he does. Maybe this 'inner motivation' is not all that important either, or maybe it doesn't even exist. I would actually suspect that such reasons largely are created to compensate for the apparent lack of definite motivation. I personally can't honestly point to a more definite motivation than "it's interesting".

Perhaps this is what one would want. But such paradoxes are not due to a flaw in the mathematical process itself; the objections always come from outside the discipline. In fact, I think most would agree that such a paradox is a positive thing, it shows us where to be careful. Now we know to restrict ourselves to measurable sets if we want to preserve volume in rigid transformations. By using these sets we can again enjoy our intuitive feel.

OK, to me this is a collection of impressions and feelings rather than a reasoned response. It may be your accurate impression of how the mathematicians you know operate (and it is my impression in general too). But I am trying to talk about what is fundamental in a reasoned fashion.

I have argued that the reason why a language like maths would enjoy any cultural capital is because it achieves a certain valued result. Its formalisms prove themselves to be good at the job of modelling reality. Now individual mathematicians may do maths for other personal purposes, but the general cultural purpose is pretty clear.

The second point is that this connection to reality may be denied within mathematical circles for a reason. Symbol systems have to be detached from what they describe so as to be free to describe them.

Howard Pattee is one of my favourite authors on this.

http://www.google.co.nz/url?sa=t&so...PcsZxT&usg=AFQjCNHouF69kz02eV_eL1CR38AtOqSZ7g
 
  • #91
Jarle said:
I don't understand your insistence on that 0 is local absence; it is a formal character used in various ways. When measuring temperature 0 is just another temperature on the scale, it is not the 'absence of degrees'. It might be an arbitrary lottery number. It can mean 'false'. 0 does not represent some fundamental feature of reality, it is a tool (in non-mathematical usage).

Zero seems to have more than one .. umm .. function.

One of those you defined very well in the above, and I agree: it is in this case a number, a placeholder, even a 'miden' (median). Examples of this other than temperature might be the pH scale (where, I believe, the neutral point is not 0 but 7?). Or you can talk about middle 'C' in the musical scale - again, a different term from zero, but similar in function.

My purpose in my opening post was not to explore that kind of zero (though I'm very pleased that we have, and I've learned a great deal from it) but to ask whether there is the possibility of total absence and, if so, how to describe it.

Edit to add; though I see additional posts now, that may have expanded on this.
 
Last edited:
  • #92
apeiron said:
OK, to me this is a collection of impressions and feelings rather than a reasoned response. It may be your accurate impression of how the mathematicians you know operate (and it is my impression in general too). But I am trying to talk about what is fundamental in a reasoned fashion.

I have argued that the reason why a language like maths would enjoy any cultural capital is because it achieves a certain valued result. Its formalisms prove themselves to be good at the job of modelling reality. Now individual mathematicians may do maths for other personal purposes, but the general cultural purpose is pretty clear.

The second point is that this connection to reality may be denied within mathematical circles for a reason. Symbol systems have to be detached from what they describe so as to be free to describe them.


I think we might need to take a step back and make clear what we are discussing. It's not my intention here to make a rebuttal of your post; I'm just making my points clear.

By talking of the purpose of mathematics I take it you mean the purpose of the use of mathematics. That may very well be modeling (in a very general sense), but let's leave that aside. I think that the main issue has been obscured. This is about the nature of the mathematical process itself, not its application. And certainly, the individual motivation for a mathematician was a major digression, so let's leave that as well.

It is so that 'doing mathematics', that is: expanding and working within your mathematical calculus, is a formal process. It can only happen by means of applying well-defined formal rules. This fact is arguably essential to mathematics. There is an absolute categorical difference between the general usage of mathematics (e.g. modeling a physical situation), and working within the mathematical calculus (mathematical activity; e.g. calculating a value). The former is not bound by any formal rules.

For what are you really doing when you solve an equation to find some number which represents some physical measure? You are applying your mathematical calculus according to definite rules; a strictly formal process.

There must be no confusion of whether the state of mathematical activity is a consequence of anything, as if formality is something we "strive towards" because of some motivation, because it isn't. There is no such thing as a "degree of formality" in mathematics. We might be fooled into thinking so when we are exposed to so-called "informal arguments". However, (valid) informal mathematical arguments always refer to definitive and explicit formal rules, not to anything else (like intuition).

I don't think it's fair to say that anyone is denying the relation between mathematics and physical reality. The issue (IMO) is that the critic and the mathematician are talking about slightly different things. Perhaps it is not clear that extra-mathematical usage (as modeling which includes the relation to reality) and 'mathematical activity' are two categorically different things; they might be mixed together, or treated as one. I argue that it is certainly so that the latter is a purely formal process, while the former absolutely might correspond to what one usually says e.g. about the role of physical intuition.

Actually, I completely agree that extra-mathematical usage is a necessary criterion when choosing one's axioms (which is not a standard view in the formalist perspective), i.e. setting the premises for 'doing mathematics'. But this does not affect the way mathematics is done.

EDIT: I have made a slight edit of my post to make it clearer.
 
Last edited:
  • #93
Jarle said:
Actually, I completely agree that extra-mathematical usage is a necessary criterion when choosing one's axioms (which is not a standard view in the formalist perspective), i.e. setting the premises for 'doing mathematics'. But this does not affect the way mathematics is done.

If this is your point - that a rule based system yields rule based outcomes - then of course I agree. But that seems both obvious and nothing much to do with the OP.
 
  • #94
apeiron said:
If this is your point - that a rule based system yields rule based outcomes - then of course I agree. But that seems both obvious and nothing much to do with the OP.

Well, that's not everything I wrote. It was actually part of the discussion between me and SonyAD.

EDIT: The main issue has been whether mathematics is referential to reality or not. It is my impression that this has been advanced by both you and SonyAD (in some form), so I don't consider my arguments obvious.
 
Last edited:
  • #95
Jarle said:
Well, that's not everything I wrote. It was actually part of the discussion between me and SonyAD.

(In telephone operator voice) I'm sorry, SonyAD isn't here right now, so conversation is now physically impossible. Please leave a message, and try again.
 
  • #96
nismaratwork said:
(In telephone operator voice) I'm sorry, SonyAD isn't here right now, so conversation is now physically impossible. Please leave a message, and try again.

Uh, that's how it became a subject. Hence why it wasn't relevant to OP's original post. Besides, apeiron continued the discussion. I don't see your point.
 
Last edited:
  • #97
Jarle said:
I never said mathematics has no relation to reality; we use mathematical reasoning in many physical situations.

But you said there's no correspondence, I think. Is there a disparate meaning (relation/correspondence)? Here's the exact quote:

Jarle said:
You will have to understand that mathematics, or rather 'mathematical activity' is formal manipulation. It does not correspond to any physical fact of nature.

Which I think is false on its face.

Jarle said:
The point you must understand is that mathematics does not correspond to physical reality;

Parts of it don't. Infinity comes to mind.

Jarle said:
mathematics deals with definite formal rules of mathematical concepts which may very well be abstracted from physical situations.

Doesn't that necessarily imply some sort of correspondence/relation?

Jarle said:
That does not mean it is 'rooted' in physical reality.

How can it not? Why wasn't addition defined as a + b = 1, whatever a and b, or some other nonsensical way? We could have had a different, wonderful, pointless alternative algebra like we have alternative geometries.

You said it doesn't have to make sense (reflect reality as best we can discern it). So one can just make up formal rules and play with them for the hell of it. But to what purpose beyond personal gratification?

Instead of studying problems like the shortest path on (through) a curved surface between two points on (in) that surface, perfectly possible to accommodate within geometry, we forked "non-Euclidean" geometries. Nonsense.

Can't we model or study curved surfaces or curvilinear projection in Euclidean geometry?

[images: http://img819.imageshack.us/img819/9098/panview.png and http://img716.imageshack.us/img716/2969/panview2.png ]

Yeah, interpolation sucks.
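
For what it's worth, the kind of problem raised here - the shortest path between two points on a curved surface - can be set up using nothing but embedded Euclidean coordinates, at least in simple cases. A sketch for the simplest surface, a sphere (the city coordinates and Earth radius are just illustrative values):

```python
from math import radians, sin, cos, acos

def great_circle_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    """Geodesic (shortest-path) distance between two points on a sphere,
    computed with ordinary 3D Euclidean dot products.

    Latitude/longitude in degrees; radius defaults to Earth's mean radius in km.
    """
    def to_vec(lat, lon):
        lat, lon = radians(lat), radians(lon)
        return (cos(lat) * cos(lon), cos(lat) * sin(lon), sin(lat))

    a, b = to_vec(lat1, lon1), to_vec(lat2, lon2)
    dot = sum(x * y for x, y in zip(a, b))
    dot = max(-1.0, min(1.0, dot))      # clamp against rounding error
    return radius * acos(dot)

# London to New York, roughly 5570 km
print(great_circle_distance(51.5, -0.13, 40.7, -74.0))
```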

Jarle said:
Natural numbers are such an example. Arithmetic is purely the formal manipulation of the symbols we call numerals according to definite rules, while still being incredibly useful in real situations.

Numbers are not symbols. Cyphers are symbols. In positional numeral systems numbers are represented as sequences of cyphers. Numbers themselves are not symbols.

They are the result or the possible result of measurement or computations applied to measurements.

Jarle said:
It is a critical fact of mathematics that it deals only in formality.

I don't understand this statement.

Jarle said:
Do you know what an axiom is?

The alternative to circular logic and what keeps syllogisms and dialectics tied to the ground. :)

Jarle said:
No, that is not an implication. However, division by 0 does make sense! (or more precisely: it can make sense)

http://en.wikipedia.org/wiki/Riemann_sphere

I fail to see the purpose of that in place of spherical projection. Just as I fail to see the reasoning and purpose behind complex numbers and the complex plane instead of 2D vectors. But I'm no quantum physicist.
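
For what it's worth, the usual answer is that complex numbers are 2D vectors plus a multiplication rule, and that extra rule - composing rotation with scaling - is what a bare 2D vector space does not give you. A small illustrative sketch (not an attempt to cover the quantum-mechanical use):

```python
import cmath

z = complex(1, 1)                  # the point (1, 1) as a complex number
w = cmath.rect(1, cmath.pi / 2)    # unit complex number: a 90-degree rotation

print(z * w)                       # approximately (-1+1j): (1, 1) rotated a quarter turn
print(abs(z * w), abs(z))          # magnitudes multiply: both approximately 1.414
```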

Jarle said:
EDIT: Apeiron makes an important point when he separates the informal process of choosing one's axioms from the formal deduction which takes place afterwards. However, only the latter part is mathematics, or 'mathematical activity' (not ignoring the extreme importance of this process to mathematics).

Axioms are by definition distinct from and precede the reasoning that follows them.
 
Last edited by a moderator:
  • #98
SonyAD said:
How can it not? Why wasn't addition defined as a + b = 1, whatever a and b, or some other nonsensical way? We could have had a different, wonderful, pointless alternative algebra like we have alternative geometries.

The same way clay isn't fundamentally related to any object you decide to model with the clay.

Mathematics is logical clay.

You said it doesn't have to make sense (reflect reality as best we can discern it). So one can just make up formal rules and play with them for the hell of it. But to what purpose beyond personal gratification?

The purpose is to be able to make models of reality with accurate logical statements; much like the purpose of making clay is for a sculptor to model. Some "claymakers" (mathematicians) DO just investigate formal rules to play with them, even though they don't have a meaningful physical counterpart, but a lot of mathematics is driven directly by observation of the physical world.
 
  • #99
SonyAD said:
But you said there's no correspondence, I think. Is there a disparate meaning (relation/correspondence)?


Pythagorean put it well. Clay can be used to make sculptures of real things, but the clay itself is in no correspondence with what it imitates. The relation is always inferred from the outside. Furthermore, we don't even have to imitate real things at all.

It's no secret that we use mathematics for various purposes like physical modeling and that it is developed for these things, but the important point is, which I have stated several times, that mathematics itself does not correspond to these things. Mathematics is the purely formal development and use of strictly formal rules. It cannot correspond to anything.

However, that mathematics does not correspond to the real world does not imply that we have no motivation for the further development of mathematics, which you seem to suggest. There would be no contradiction if mathematics were used exclusively for physical modeling while also having no correspondence to physical objects and phenomena. How our calculations relate to reality is through an interpretation outside of mathematics.

So no, mathematics is not necessarily merely formal games without potential applications to reality, and this is because we have motivation for extra-mathematical use. That fact does not change the status of mathematics. At all.

SonyAD said:
You said it doesn't have to make sense (reflect reality as best we can discern it). So one can just make up formal rules and play with them for the hell of it. But to what purpose beyond personal gratification?

One can, and one does occasionally, but one does not have to... Often we have a constructive application in mind for our use and development of mathematics. And often we don't; applications will often come as a 'side-effect' of the development of new mathematics, and there are many examples of this.

SonyAD said:
Numbers are not symbols. Cyphers are symbols. In positional numeral systems numbers are represented as sequences of cyphers. Numbers themselves are not symbols.

I never said numbers were symbols; I said numerals were symbols, and they are. And I also said arithmetic is the formal manipulation of these symbols, and I cannot see a single argument against that in your comment.

SonyAD said:
I don't understand this statement.
That mathematics deals only in formality means that the mathematical calculus is used and developed by following strictly formal well-defined rules. It's what I have been saying all along.

SonyAD said:
I fail to see the purpose of that in place of spherical projection. Just as I fail to see the reasoning and purpose behind complex numbers and the complex plane instead of 2D vectors. But I'm no quantum physicist.

As you can see in the link, we can formalize the use of what we call infinity as a symbol tied to certain rules; much like a number. And what you directly addressed - division by zero - can also be formalized as shown. It puzzles me if you cannot see the connection between this and what I said right above the link.

SonyAD said:
How can it not? Why wasn't addition defined as a + b = 1, whatever a and b, or some other nonsensical way? We could have had a different, wonderful, pointless alternative algebra like we have alternative geometries.

We do have many different algebras as well. Some are "useless", in that they have no current obvious application. In 'abstract algebra', addition is defined in many ways for different algebraic systems. There is not 'one' algebra in the same way as there is not 'one' geometry. They are all studies of formalized structures. But they can also all have potential application outside of mathematics. That doesn't make them correspond to whatever they might be used to represent, and it doesn't change the way we use mathematics. The use is always formal, completely rule-governed and without correspondence to physical reality.
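
One concrete example of such an alternative 'addition', sketched below purely for illustration, is the min-plus (tropical) semiring, where 'addition' means taking a minimum and 'multiplication' is ordinary addition; it is completely formal, obeys the familiar laws, and even turns out useful in shortest-path problems:

```python
# Min-plus ('tropical') semiring: an alternative, perfectly formal algebra.
def trop_add(a, b):
    return min(a, b)      # 'addition' is taking the minimum

def trop_mul(a, b):
    return a + b          # 'multiplication' is ordinary addition

# The familiar laws still hold, e.g. distributivity:
a, b, c = 3, 7, 2
assert trop_mul(a, trop_add(b, c)) == trop_add(trop_mul(a, b), trop_mul(a, c))
print(trop_add(b, c), trop_mul(a, b))   # -> 2 10
```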
 
Last edited:
  • #100
Jarle said:
Pythagorean put it well. Clay can be used to make sculptures of real things, but the clay itself is in no correspondence with what it imitates. The relation is always inferred from the outside. Furthermore, we don't even have to imitate real things at all.

Making the classical clay~sculpture distinction is to invoke the substance~form dichotomy. Which is quite correct, except that mathematics is the science of pattern, of form. Indeed, that is why we call it "formal" modelling.

Of course, the trick was to atomise form - break it down into a kind of substance. Which is what the integers originally did, and what information theory does in a more complete way.

So the basic tenet of systems thinking is that systems are composed of the interaction of local substances (which can upwardly construct) and global forms (which can exert top-down constraints).

Maths is an exploration of the space of all possible forms using various representations of localised substance to construct every kind of shape that can be imagined.

This is taken to be a "free" exercise - unrelated to whether the resulting forms, the global patterns, have any correspondence with reality. But "intuition" often provides the global constraints that narrow the pattern-spinning productively. Which is where axioms come in.
 
  • #101
apeiron said:
Maths is an exploration of the space of all possible forms using various representations of localised substance to construct every kind of shape that can be imagined.

Mathematics is not the exploration of anything. At least not literally (it is misleading to say so). It is something we create, and nothing more than what our mathematical calculus has been expanded into through logical inference. Mathematics is more like a symbolic machinery, a collection of algorithms.

We use mathematics to explore things we can imagine, and more tangible things. By postulating certain properties of concepts we can draw conclusions based on our mathematical calculus. It may very well be that this is the 'purpose' of mathematics - if you want to put it that way (and I'll agree with you) - but it is not mathematics itself.
 
  • #102
Jarle said:
Mathematics is not the exploration of anything. At least not literally (it is misleading to say so).

Perhaps you just misunderstand me. I was saying you make the Lego and then combine it every way it can be combined. You are exploring the phase space of the atomistic actions you have created. The terrain is unknown to you, but in some Platonic sense, it already exists. Much like every possible game of chess exists once the rules have been defined.
 
  • #103
The word 'mathematics' can mean different things to different people. Mathematicians are generally referring to the axioms; laymen are generally thinking about the numbers and symbols; scientists are generally referring to the discipline of mathematics as a study.

Personally, I think the axioms are invented. New axioms are discovered, but they are consequences of the original invention.

The symbols are obviously invented, but numbers like pi and e are most definitely discovered.

The discipline itself is obviously invented, but there is both discovery and invention taking place in the field.
 
  • #104
Pythagorean said:
The same way clay isn't fundamentally related to any object you decide to model with the clay.

Mathematics is logical clay.

I disagree. Mathematics had its origins in the need to cost-effectively solve everyday practical problems. Like knowing how many apples you're bartering for a cow, how to divide a given number of loaves of bread equitably (as nearly as possible) among a number of people, computing firing solutions, computing man-hours & labour force requirements, map making, figuring out how many different ways you can arrange stuff, etc.

That is what mathematics is, at its roots. It has its origins in practical necessities. Not pipe dreams about imaginary numbers and such.

Pythagorean said:
The purpose is to be able to make models of reality with accurate logical statements; much like the purpose of making clay is for a sculptor to model. Some "claymakers" (mathematicians) DO just investigate formal rules to play with them, even though they don't have a meaningful physical counterpart, but a lot of mathematics is driven directly by observation of the physical world.

I would say the purpose is to be able to make verifiable, reasonably accurate predictions about reality from reasonably accurate measurements. I don't think anybody really cares how stuff works as long as it does. I think this thread is evidence enough of that.

Do you see people questioning how irrational numbers can denote physical quantities? Nope.

Jarle said:
Pythagorean put it well. Clay can be used to make sculptures of real things, but the clay itself is in no correspondence with what it imitates. The relation is always inferred from the outside. Furthermore, we don't even have to imitate real things at all.

I don't think the clay analogy is very good. At all. When I want to compute how many apples each of 5 people gets from a trolley cart full of them, I already know each one is bound to get at most all of the apples in the cart. How do I know that? Math didn't tell me. It can't tell me.

How do I know no one can get more apples than there were in the cart initially? How do I know I have to divide and not multiply by the number of people? Or add the number of people to the number of apples? Or subtract from?

Nope. Sorry. Maths is just a dumb tool for use in making predictions about reality. It just models reality and does what you tell it to (by analysing the practical problem and deciding what operations to use, how to pipe them). When you tell it to do garbled nonsense the result is pointless.

I know to use division because I know it is the mathematical operation modeled after the action I perform in distributing the apples equitably.

Similarly, I know that by dividing the number of people by the number of apples in the cart I get the number/amount of people each apple gets, after an equitable distribution.
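(For concreteness, that distribution is just integer division with a remainder. A minimal Python sketch - the 17 apples and 5 people are made-up figures, not taken from the thread:)

apples, people = 17, 5                    # made-up example figures
each, left_over = divmod(apples, people)  # integer division with remainder
print(each, left_over)                    # 3 apples per person, 2 apples remaining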

So how is mathematics not firmly rooted in reality? How was it not developed after and for reality (making predictions about it)?

There is nothing beyond that but insanity, as Georg Cantor may have found out if he realized he was going insane.

Jarle said:
It's no secret that we use mathematics for various purposes like physical modeling and that it is developed for these things, but the important point is, which I have stated several times, that mathematics itself does not correspond to these things.

I don't see how you can end on that point. Again, how does addition not correspond to hoarding stuff in reality, for instance?

Jarle said:
Mathematics is the purely formal development and use of strictly formal rules. It cannot correspond to anything.

It is rooted in observations about reality. It corresponds to reality. It went off the rails at some point, when the theoretical eggheads stole it from the engineers of their day.

Jarle said:
However, that mathematics does not correspond to the real world does not imply that we have no motivation for the further development of mathematics, which you seem to suggest.

That is not what I suggest. What I suggest is that mathematicians try to develop practical maths with immediate, fundamental applications once in a while.

And that they try to stop needlessly delving into silliness, like using the complex plane instead of 2D vectors and whatnot.

Jarle said:
How our calculations relate to reality is through an interpretation outside of mathematics.

No. That interpretation took place in the beginning and is what gave us our particular flavour of mathematics, as you might put it, by defining its axioms. Where a + b does not equal 1 regardless of what a and b are, for instance. That interpretation is defining for and integral to mathematics.

It also takes place at the beginning of every new piece of mathematics developed. Like the equations for computing the texture coordinates of a sample point from the texture coordinates at a triangle's vertices, by weighting those coordinates according to each vertex's distance to the sample point.

How could I have known to develop the math necessary for texture mapping, vertex rotations, fish-eye lens projection, etc. on my own from scratch if what you say were true? How is it that they're basically the same as what others came up with long before me (except I don't use the matrix representation), whose work I didn't have access to at the time?
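(Aside for readers: equations of the kind described above - recovering a sample point's texture coordinates from a triangle's per-vertex coordinates - are commonly written with barycentric weights rather than raw distances. A minimal Python sketch under that assumption; the poster's own equations aren't shown in the thread:)

def interpolate_uv(p, tri, uvs):
    # Texture coordinates at sample point p inside a 2D triangle, found by
    # weighting the three corner UVs with barycentric coordinates.
    (x0, y0), (x1, y1), (x2, y2) = tri
    px, py = p
    area2 = (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)  # twice the signed area; non-degenerate triangle assumed
    w1 = ((px - x0) * (y2 - y0) - (x2 - x0) * (py - y0)) / area2
    w2 = ((x1 - x0) * (py - y0) - (px - x0) * (y1 - y0)) / area2
    w0 = 1.0 - w1 - w2
    u = w0 * uvs[0][0] + w1 * uvs[1][0] + w2 * uvs[2][0]
    v = w0 * uvs[0][1] + w1 * uvs[1][1] + w2 * uvs[2][1]
    return u, v

print(interpolate_uv((0.25, 0.25), ((0, 0), (1, 0), (0, 1)), ((0, 0), (1, 0), (0, 1))))  # -> (0.25, 0.25)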

Jarle said:
So no, mathematics is not necessarily merely formal games without potential applications to reality, and this is because we have motivation for extra-mathematical use. That fact does not change the status of mathematics. At all.

What you're saying is basically that people developed imaginary numbers and group theory before the addition and subtraction of natural numbers for bartering. Abelian groups were just floating around in ethereal existence waiting to be plucked by some mathematician with spare time on their hands before anyone had even learned to count.

Jarle said:
One can, and one does occasionally, but one does not have to... Often we have a constructive application in mind for our use and development of mathematics. And often we don't; applications often come as a 'side-effect' of the development of new mathematics, and there are many examples of this.

Yeah. Side effects like using complex numbers and the complex plane instead of 2D vectors. Or a Riemann sphere instead of polar projection.

Jarle said:
I never said numbers were symbols; I said numerals were symbols, and they are. And I also said arithmetic is the formal manipulation of these symbols, and I cannot see a single argument against that in your comment.

This is semantics. I don't know what you mean by numerals but numbers aren't symbols.

Jarle said:
That mathematics deals only in formality means that the mathematical calculus is used and developed by following strictly formal well-defined rules. It's what I have been saying all along.

What strict, formal, well-defined rules did I follow when I developed my sign() function or fish-eye projection on my own?
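(The poster's actual sign() code isn't shown; the conventional form is something like this minimal Python sketch:)

def sign(x):
    # Usual convention: -1 for negative, 0 for zero, +1 for positive.
    return (x > 0) - (x < 0)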

Jarle said:
As you can see in the link, we can formalize the use of what we call infinity as a symbol tied to certain rules, much like a number. And what you directly addressed, division by zero, can also be formalized as shown. It puzzles me that you cannot see the connection between this and what I said right above the link.

To accomplish what? What do you accomplish by your formalisation of 1/0, of infinity? Results based on division by 0, on infinity. By hiding it under an alias you have just postponed the inevitable reckoning until you've done all the calculations you could. In the end, what you're left with is still just as meaningless, because it is still bound up with division by 0 or infinity.

Or you can just make up some arbitrary convention like 1/0 = 2 and go from there. Still an exercise in pointlessness every bit as meaningless for making predictions about reality. Which has been the whole point of math since its inception.
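(For readers following the exchange: one common way such a formalisation is written down, e.g. on the Riemann sphere $\hat{\mathbb{C}} = \mathbb{C} \cup \{\infty\}$, is

$$\frac{z}{0} = \infty \;\; (z \neq 0), \qquad \frac{z}{\infty} = 0 \;\; (z \neq \infty), \qquad z + \infty = \infty, \qquad z \cdot \infty = \infty \;\; (z \neq 0),$$

with $0/0$, $\infty/\infty$, $0 \cdot \infty$ and $\infty + \infty$ left undefined - the convention does not assign those a value.)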

Jarle said:
We do have many different algebras as well. Some are "useless", in that they have no current obvious application. In 'abstract algebra', addition is defined in many ways for different algebraic systems.

Why must we have a myriad of dud algebras instead of a myriad of

sillyAddition69(a,b) = 1
sillyAddition70(a,b) = a+b/2
sillyAddition71(a,b) = (a-1)×b
etc.
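(Written out as ordinary Python functions, just to show they are perfectly definable - a minimal sketch; the names and the a + b/2 precedence are kept exactly as above:)

def silly_addition_69(a, b):
    return 1

def silly_addition_70(a, b):
    return a + b / 2      # i.e. a plus half of b, as written

def silly_addition_71(a, b):
    return (a - 1) * b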

Jarle said:
There is not 'one' algebra in the same way as there is not 'one' geometry.

Of course there is. And you can model and/or contain egghead brain farts inside the one geometry and the one algebra. :)

See above.

Why must I have a whole new (elliptic, hyperbolic) geometry to study curved surfaces (distances on them, angles, etc.)? Can't I model or study curved surfaces in "Euclidean" geometry?

Why do I need the complex plane? Don't I have vectors?

This is exactly what I'm talking about.
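(On the complex-plane-versus-vectors point, the two pictures do line up: multiplying by a unit complex number and applying a 2x2 rotation matrix to a 2D vector give the same answer. A minimal Python sketch with arbitrary example values:)

import cmath, math

angle = math.radians(30)

# Rotate (2, 1) as a complex number: multiply by e^(i*angle).
z = complex(2.0, 1.0) * cmath.exp(1j * angle)

# Rotate (2, 1) as a 2D vector: apply [[cos, -sin], [sin, cos]].
c, s = math.cos(angle), math.sin(angle)
v = (c * 2.0 - s * 1.0, s * 2.0 + c * 1.0)

print(z)   # approximately (1.232 + 1.866j)
print(v)   # approximately (1.232, 1.866)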

Jarle said:
They are all studies of formalized structures. But they can also all have potential application outside of mathematics. That doesn't make them correspond to whatever they might be used to represent, and it doesn't change the way we use mathematics. The use is always formal, completely rule-governed and without correspondence to physical reality.

I disagree.
 
  • #105
apeiron said:
Perhaps you just misunderstand me. I was saying you make the Lego and then combine it every way it can be combined. You are exploring the phase space of the atomistic actions you have created. The terrain is unknown to you, but in some Platonic sense, it already exists. Much like every possible game of chess exists once the rules have been defined.

Yes, I agree; exactly the way chess existed before it was invented. But it's an odd thing to say, isn't it? It should be just as odd to say it about mathematics. But for some reason it isn't. It's quite usual to state that we are discovering and exploring already existing mathematical structures, but that's as weird as saying that a carpenter is exploring the ways of ordering wood in space.

However, I will agree that in certain contexts the word discovery is more suitable than invention, but it must be clear that it really is invention/construction.
 