Mathematics software/advanced calculators and the learning of mathematics.

In summary, the conversation discusses the use of technology and software in solving math problems, and questions the value of learning step-by-step methods in higher education. It also touches on the importance of understanding the process behind solving problems and the potential benefits of using calculators and software. Ultimately, the thread debates the value of learning and mastering a subject, as well as the limitations and drawbacks of relying solely on technology.
  • #36
The whole thread is a pencil-vs-machine argument. What do you mean by "understand the theory of what you are doing on the computer"? You object to those "who do not actually understand the calculations behind mathematics". If a computer tells me
sin(1) ≈ 0.84147098480789650665250232163029899962256306079837106567275170999
I certainly do not understand the calculation behind that. I also do not know how to do it by hand, and I cannot imagine that knowing how would be helpful. I think doing lots of tedious, useless hand calculations results in less discovery.
Many software packages are helpful for different things (a small taste follows the list):
- LaTeX, for typesetting
- Mathematica, for integrals and such
- Magma, for algebra
- Matlab, for numerical work
- Mathics (http://www.mathics.net/), which is quite basic
- Sage, which bundles many helpful packages, including:
  - R, for statistics
  - PARI/GP, for arithmetic
  - LinBox, for linear algebra
  - Maxima, for symbolics
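To give a small taste of what such packages do, here is a minimal sketch using Python's SymPy (a free computer-algebra library, not one of the packages listed above; only the two calls shown are assumed):

Code:
# A small taste of computer algebra, using the free Python library SymPy.
import sympy as sp

x = sp.symbols('x')

# Symbolic integration, the kind of job Mathematica is listed for:
print(sp.integrate(sp.sin(x) * sp.exp(x), x))
# -> exp(x)*sin(x)/2 - exp(x)*cos(x)/2

# Arbitrary-precision evaluation, like the sin(1) value quoted above:
print(sp.sin(1).evalf(65))
# -> 0.84147098480789650665250232163029899962256306079837106567275170999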
 
  • #37
You really don't know how to do sin(1) by hand? That's a bit odd. Interestingly enough, knowing how a calculator obtains an answer is helpful knowledge, if for no other reason than roundoff error. It was a simple roundoff error that allowed the Patriot system to fail and 28 people to get killed. I'd hate to think some engineer thought to himself, "Roundoff errors! Who cares about that?"
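For what it's worth, here is a minimal Python sketch of one classic hand method, the Taylor series (real calculators typically use fancier schemes such as CORDIC; the term count below is arbitrary):

Code:
# Computing sin(1) by the Taylor series:
#   sin(x) = x - x^3/3! + x^5/5! - ...
import math

def sin_taylor(x, terms=10):
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(terms))

print(sin_taylor(1.0))  # 0.8414709848078965
print(math.sin(1.0))    # 0.8414709848078965 -- agrees to double precision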

I think much of Sankaku's point is simply: be aware of the limitations of the technology you use. Graphs can be misleading, computers are known to have bugs, and sometimes certain tools shouldn't be used for certain problems. To figure out workarounds for these problems, you sometimes need a good grasp of theory, or of the "old-fashioned way" of solving a problem, to see whether the computer is giving you a reasonable solution.
 
  • #38
One of the main reasons you need to know how to do it by hand is that someone has to write the software that performs the calculations. A computer can do the work, but who's making the computer work? And who's making sure it's doing it right?

Also, computer calculations come with some error, because floating-point arithmetic stores real numbers with only finite precision.

For example, a computer cannot hold the exact value of π in a floating-point value. The same goes for all irrational numbers.
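A quick Python illustration; the long decimal below is the exact value of the IEEE 754 double nearest π:

Code:
# A double holds the binary64 number nearest pi, not pi itself.
import math
from decimal import Decimal

print(Decimal(math.pi))
# 3.141592653589793115997963468544185161590576171875
# true pi begins 3.14159265358979323846...
# -- the two part ways around the 17th significant digit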

A computer also has no real notion of what is meant by infinity, since it is a finite machine. So anything in physics that uses the concept of infinity cannot be represented exactly in a machine, which introduces further error.
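To be fair, IEEE 754 floats do carry an "infinity" token, but it is only a bookkeeping flag, as a quick Python session shows:

Code:
# IEEE 754 "infinity" is a flag, not the mathematical concept.
inf = float('inf')
print(inf + 1 == inf)  # True -- it absorbs everything
print(inf - inf)       # nan -- no meaningful answer
print(1e308 * 10)      # inf -- mere overflow gets the same label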

Minor errors might not matter much to some people, but scale your calculations up to something larger and they have a much bigger impact.
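A Python illustration of the scaling point: the error in each stored 0.1 and each addition is tiny on its own, but ten million of them add up to something visible:

Code:
# Tiny per-operation errors, scaled up ten million times.
import math
from itertools import repeat

naive = 0.0
for _ in range(10_000_000):
    naive += 0.1
print(naive)   # roughly 999999.9998 -- not 1000000.0

# math.fsum tracks the lost bits and recovers the correct answer:
print(math.fsum(repeat(0.1, 10_000_000)))   # 1000000.0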

There is also the problem that many of the theorems of mathematics and physics are established through proofs. How would a computer prove something when it has no understanding of even basic language?
 
  • #39
No one is arguing that roundoff errors should be very large; that is ridiculous. I know how to compute five or ten digits of sin(1), but not hundreds, and no, it is not the same thing. Even if I were world champion at computing sines by hand, a computer would easily beat me. Doing four-by-four problems in a high school algebra book is not the same as doing million-by-million problems.

Again with the old "someone did a calculation on a computer one time and made an error". The person made the error, not the computer, and people make errors in hand calculations all the time. If a calculation is impossible for a human, that is an accuracy of 0%. In fact, the Patriot failure was the result of a huge blunder, not of any inherent problem with computers. That is the lesson: don't make huge blunders. The Patriot system did not kill anyone; a Scud did. The Patriot failed to save them, which is not the same thing. It should have been able to save them, and that would have been impossible without computers. A pad of paper and a pen would have been entirely useless; that is technology with limitations worth being aware of.
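For scale, here is a back-of-envelope Python version of that blunder, using the figures usually cited from the public GAO report; treat all the numbers as approximate:

Code:
# The Patriot clock bug, roughly: 0.1 s was stored in a 24-bit
# fixed-point register as 209715/2^21, losing about 9.5e-8 per tick.
tick_error = 0.1 - 209715 / 2**21   # error in one stored tick of 0.1 s
ticks      = 100 * 3600 / 0.1       # ticks in ~100 hours of uptime
drift      = tick_error * ticks
print(drift)          # ~0.34 s of accumulated clock drift
print(drift * 1676)   # ~570 m of tracking error at Scud speed (~1676 m/s)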
 
  • #40
I think you're missing the point entirely in some vague attempt to stand fast by your position. I don't think anyone here would argue that you should be able to do ALL problems without a computer, or that a computer cannot help you solve a lot of problems. This isn't a computer-all-the-way or by-hand-all-the-way argument; in fact, there shouldn't be an argument. You should know how things work. You should be aware that computer programs are designed by people, that oversights can occur, and you should be able to see whether an answer is reasonable. You should be aware of the inherent limits of programs.

You seem to believe that people here think you should just do everything by hand. That's not the case at all. The argument is more that you should know it can be done by hand, at least to some extent, and be aware of the technique, if for no other reason than to check reasonableness. Is there any harm in this? Does learning a simple technique really eat up so much of your time that you would rather just be ignorant of it?

If so, then you can carry on your merry way and just believe whatever answer a computer gives, if you believe that's all you need to be successful. I, as someone who does most if not all of my work on computers (a lot of variable-order Markov models), would like to be aware of possible errors in the programming and of its limits. But that might just be me and my need to give a better answer to my boss than simply "uh, this is what the computer gave me."

*As a side note about the Scud: you can get bogged down in terms and phrases if you wish; I don't care to play that game. Clearly the computer didn't make an "error", but blind trust in the machine allowed it to fail. The fact of the matter is that it was a simple error, something that could have and should have been caught earlier. The computer did what it was programmed to do, and sadly it was programmed poorly. Thus this only supports my point: you should be aware of how a program works and of its possible problems.
 
  • #42
π = 10 (base π)
A pad of paper cannot hold the exact value of π as a finite string of digits in base 2, 7, 10, and so on, either. Leaving aside the details, when hand calculations are translated into computer algorithms, the same methods are not used: knowing how to calculate something by hand and knowing how to calculate it with a computer are entirely different things. A computer is a useful tool.
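The base-dependence is easy to play with in Python, which offers exact rationals and base-10 arithmetic alongside binary floats:

Code:
# Whether a number has a finite representation depends on the base.
from fractions import Fraction
from decimal import Decimal

print(Fraction(1, 3))           # 1/3 -- exact as a ratio
print(1 / 3)                    # 0.3333333333333333 -- rounded in base 2
print(Decimal(1) / Decimal(3))  # 0.3333333333333333333333333333 -- rounded in base 10
# (In base 3, the same number would be written exactly as 0.1.)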
 
  • #43
Is anyone saying that a computer isn't useful?
 
  • #44
lurflurf said:
The whole thread is a pencil-vs-machine argument.

No. All your posts are pencil-vs-machine arguments. The rest of us are trying to point to an appropriate balance between the two. I think I am done here.
 
  • #45
I used to worry about whether I was investing too much time in learning how to do math by hand.

Until I took a Mathematical Computing class.

There were plenty of kids in that class who had VASTLY more experience in programming than I did. Most of them had already done projects in Mathematica and just wanted to solidify their knowledge of its ins and outs. I knew next to nothing about programming. But the teacher was notoriously difficult, and he gave us problems that no one managed to do on the first try. Everyone's programs bugged out and did nonsensical things until you dug into them properly.

And that's when I discovered that, although I had little programming experience, all my time with pencil-and-paper math had prepared me better for debugging than anyone in the room. Since I wasn't as comfortable with code, I had Mathematica plot and animate all sorts of visuals representing intermediate steps. I got comfortable creating visuals early and used them to debug, rather than searching through lines of code. I took derivatives by hand and then plotted them to check how many roots I SHOULD be getting, and which ones were missing. At one point we were dealing with a function that behaved so badly that FindRoot couldn't keep up with it as parameters changed, so I had to construct a fairly intricate "guess" function that approximately tracked the behavior of the function while being much simpler computationally. I expanded expressions into Taylor series by hand to see what the computer should be telling me in certain extreme cases.
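Not Mathematica, but here is a toy Python version of that hand-check workflow; the cubic is an arbitrary example, and SciPy's brentq stands in for FindRoot:

Code:
# Hand math predicts the root count; the computer finds the roots;
# the two get cross-checked.
from scipy.optimize import brentq

def f(x):
    return x**3 - 3*x + 1

# By hand: f'(x) = 3x^2 - 3 vanishes at x = -1 and x = 1, and
# f(-1) = 3 > 0 while f(1) = -1 < 0, so f crosses zero exactly three times.
brackets = [(-3, -1), (-1, 1), (1, 3)]
roots = [brentq(f, a, b) for a, b in brackets]
print(roots)   # three roots, just as the hand analysis says there must be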

This may sound unsophisticated to people who know all sorts of impressively accurate numerical techniques, but I produced working programs faster than anyone else I saw working in the lab, and I spent half my time doing things by hand in my notebook. Mostly that was because hand work was the only thing I felt comfortable with, but it seemed to work really well.
 