Your thoughts on numerical analysis teaching

In summary, the author is uncertain how advantageous the traditional low-precision techniques of numerical analysis remain now that arbitrary-precision arithmetic is readily available.
  • #1
nrqed
Hello all,

I may get a contract to teach numerical analysis. I did quite a lot of numerical work during my PhD, but that was a while ago. Now when I look at most books on the topic, I get the feeling that a lot is outdated, and that much of what I knew is outdated as well, because of the possibility of doing arbitrary-precision computations.

I mean that most examples used to illustrate the danger of round-off errors are no longer an issue if one uses "arbitrary precision" (I know, that's not really an honest terminology). I can simply keep a lot of precision in Mathematica and all these examples become completely well behaved (and one can also use arbitrary precision in Python).
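To give a concrete idea of what I mean, here is a minimal Python sketch (using the mpmath package purely as an illustration; any arbitrary-precision library would do):

Python:
# Classic textbook example: (e^x - 1)/x for tiny x suffers catastrophic
# cancellation in double precision, but the naive formula behaves fine
# once the working precision is raised.
import math
from mpmath import mp, mpf, exp   # assumes mpmath is installed

x = 1e-12

naive_double = (math.exp(x) - 1.0) / x   # ~1.0000889..., only ~4 correct digits
good_double = math.expm1(x) / x          # the textbook fix: a better algorithm

mp.dps = 50                              # 50 significant decimal digits
naive_mp = (exp(mpf("1e-12")) - 1) / mpf("1e-12")   # well behaved: 1.0000000000005...

print(naive_double, good_double, naive_mp, sep="\n")

The naive double-precision result is wrong after a handful of digits, while the same naive formula at 50 digits is essentially exact; the question is when that brute-force route is acceptable.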

Now I realize that for large-scale computations, using a huge amount of precision may slow things down a lot, and that truly catastrophic cancellations may still not be saved by arbitrary precision, but I feel uneasy because it seems to me that all the books I have looked at still work within the old paradigm of low-precision calculations. It's not clear to me how advantageous it is to teach more complex algorithms to avoid round-off errors when one can simply force the software to dramatically increase the precision of the numbers used.
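To make the cost side concrete, here is a rough timing sketch (Python with mpmath assumed again; the exact ratio is machine-dependent, but software floats are typically orders of magnitude slower than hardware doubles):

Python:
# The same naive summation done with hardware doubles and with
# 50-digit software floats, timed side by side.
import time
from mpmath import mp, mpf   # assumes mpmath is installed

mp.dps = 50
N = 100_000

t0 = time.perf_counter()
s_float = sum(1.0 / k for k in range(1, N + 1))    # hardware doubles
t1 = time.perf_counter()
s_mp = sum(mpf(1) / k for k in range(1, N + 1))    # 50-digit software arithmetic
t2 = time.perf_counter()

print("double:", s_float, t1 - t0, "s")
print("mpmath:", s_mp, t2 - t1, "s")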

So my questions are:

a) Does anyone know a textbook that is mindful of this fact and teaches a "modern" approach to numerical analysis? One that covers well the pros and cons of increasing the precision versus using better algorithms for different types of applications (interpolation, integration, differentiation, solutions of nonlinear equations)?

b) Any advice from anyone who is teaching that material/taking classes in that topic/using numerical analysis in their work?

Thank you in advance!
 
  • #2
There’s a website https://lectures.quantecon.org/jl/index.html with tutorials on numerical python and Julia that might get you more up to speed on today’s focus.

A lot of numerical work is also moving toward machine learning and deep learning technologies, where numerical methods are used to fine-tune the learning algorithms.
 
  • #3
jedishrfu said:
There’s a website https://lectures.quantecon.org/jl/index.html with tutorials on numerical python and Julia that might get you more up to speed on today’s focus.

A lot of numerical work is also moving toward machine learning and deep learning technologies, where numerical methods are used to fine-tune the learning algorithms.
Thank you for the link. I must mention that I don't have control over the topics that I must cover, and these are: interpolation, numerical integration and differentiation, solutions of nonlinear equations, and ordinary differential equations. So I cannot get into machine learning, neural nets, or that sort of thing. I am interested in a modern viewpoint for these elementary applications. Thank you!
 
  • #4
nrqed said:
I may get a contract to teach numerical analysis. I did quite a lot of numerical work during my PhD, but that was a while ago. Now when I look at most books on the topic, I get the feeling that a lot is outdated, and that much of what I knew is outdated as well, because of the possibility of doing arbitrary-precision computations.
It's been a long while since I had numerical analysis, as well, but I'm not sure that the basics are all that outdated. As you note, large scale computations can be slowed down if super high precision is being used, so there's a tradeoff between calculation time and precision. Having higher precision just moves the uncertainty out several decimal places, but one still needs to be concerned about how accurate the result is. The basic ideas of numerical analysis, including root finding, calculating derivatives, and calculating integrals using several methods are still important, even if the technology has changed from computers with no built-in math coprocessors to ones capable of working with 512 bit numbers.
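As a small illustration of that point (Python with mpmath, just as a sketch): the error of a low-order quadrature rule is discretization error, not round-off, so raising the working precision doesn't touch it; only more points or a better rule does.

Python:
# Composite trapezoid rule for the integral of exp(x) on [0, 1].
from mpmath import mp, mpf, exp   # assumes mpmath is installed

mp.dps = 50                       # far more precision than the method can use
exact = exp(1) - 1

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + k * h) for k in range(1, n)) + f(b) / 2)

for n in (4, 8, 16):
    err = abs(trapezoid(exp, mpf(0), mpf(1), n) - exact)
    print(n, err)                 # error falls like 1/n^2, regardless of mp.dps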
 
  • #5
Mark44 said:
It's been a long while since I had numerical analysis, as well, but I'm not sure that the basics are all that outdated. As you note, large scale computations can be slowed down if super high precision is being used, so there's a tradeoff between calculation time and precision. Having higher precision just moves the uncertainty out several decimal places, but one still needs to be concerned about how accurate the result is. The basic ideas of numerical analysis, including root finding, calculating derivatives, and calculating integrals using several methods are still important, even if the technology has changed from computers with no built-in math coprocessors to ones capable of working with 512 bit numbers.
I agree, but I would like to have some information about the tradeoff between using more sig figs versus improving the algorithm. It seems to me that the books (at least the ones I looked at) are still based on the paradigm of 32-bit precision. I was wondering if there was some book with a more modern approach that would take into account the pros and cons of using higher precision. I have not found one so far.
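The kind of comparison I would like a book to make explicit is, for example (Python with mpmath assumed, purely as a sketch):

Python:
# "More digits" versus "better algorithm" on the classic quadratic
# x^2 - 1e8*x + 1 = 0, whose small root is close to 1e-8.
import math
from mpmath import mp, mpf, sqrt as mpsqrt   # assumes mpmath is installed

b, c = -1e8, 1.0
d = math.sqrt(b * b - 4 * c)

x_naive = (-b - d) / 2                     # cancellation: comes out near 7.45e-9

# Fix 1: better algorithm at the same precision (Vieta: product of roots = c).
q = -(b + math.copysign(d, b)) / 2
x_stable = c / q                           # accurate small root in plain doubles

# Fix 2: brute-force precision with the naive formula (50 digits).
mp.dps = 50
bmp = mpf("-1e8")
x_mp = (-bmp - mpsqrt(bmp * bmp - 4)) / 2  # also accurate, but software-slow

print(x_naive, x_stable, x_mp, sep="\n")

Both fixes work here; what I am missing is a systematic discussion of when each one is the right call.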

Thanks for your input!
 
  • #6
Higher precision comes at a great cost because hardware can’t handle it directly. Software must be used to marshal the number data to the CPU for crunching, and software solutions are always slower.

There may be some chips currently in development to handle arbitrary precision but there must also be a market for them to thrive and I don’t think we’re there yet.
 
  • #7
jedishrfu said:
Higher precision comes at a great cost because hardware can’t handle it directly. Software must be used to marshal the number data to the CPU for crunching, and software solutions are always slower.

There may be some chips currently in development to handle arbitrary precision but there must also be a market for them to thrive and I don’t think we’re there yet.
Point well taken, thank you for your input!
 
  • #8
Here’s a more detailed write-up on arbitrary-precision math:

https://en.m.wikipedia.org/wiki/Arbitrary-precision_arithmetic

There used to be a notion of an extended instruction set for the Honeywell 6000 computer that supported packed-decimal arithmetic. I’m not sure whether this is the scheme used for bignum, though it’s quite likely.

https://en.m.wikipedia.org/wiki/Binary-coded_decimal#Packed_BCD

It worked well in COBOL programs where the math was primarily addition, subtraction, multiplication, and division. Packed decimal was easy to convert to a printable number by a simple additive offset to get each digit’s character code.
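The conversion is literally a nibble-plus-offset operation; a generic sketch in Python (not Honeywell-specific, and ignoring the sign nibble that real packed-decimal formats usually carry):

Python:
# Each packed-BCD byte holds two decimal digits, one per 4-bit nibble;
# adding 0x30 to a nibble gives the ASCII code of that digit.
def packed_bcd_to_string(data: bytes) -> str:
    digits = []
    for byte in data:
        digits.append(chr((byte >> 4) + 0x30))    # high nibble -> ASCII digit
        digits.append(chr((byte & 0x0F) + 0x30))  # low nibble  -> ASCII digit
    return "".join(digits)

print(packed_bcd_to_string(bytes([0x12, 0x34, 0x56])))   # prints 123456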
 
  • #9
nrqed said:
It seems to me that the books (at least the ones I looked at) are still based on the paradigm of 32-bit precision.
I'm not in the market for such books, but the 32-bit limitation seems quite dated. Processors from Intel and AMD have had native support for 64- and 80-bit floating point numbers for many years.

If I were teaching that class, I would look at numerical analysis books with recent publishing dates. Possibly they would discuss the higher precisions available on more modern processors.
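For what it's worth, it's easy to check what a given platform actually provides (NumPy assumed; what longdouble maps to is platform-dependent, e.g. 80-bit extended on typical x86 Linux builds but plain 64-bit double on some others):

Python:
# Report the precision of the floating-point types NumPy exposes on this machine.
import numpy as np

for t in (np.float32, np.float64, np.longdouble):
    info = np.finfo(t)
    print(t.__name__, "-", info.precision, "decimal digits, eps =", info.eps)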
 

FAQ: Your thoughts on numerical analysis teaching

1. What is numerical analysis and why is it important to teach?

Numerical analysis is a branch of mathematics that deals with the development and use of algorithms for solving mathematical problems through numerical approximations. It is important to teach because it provides students with the necessary skills to solve real-world problems that cannot be solved analytically.

2. How is numerical analysis taught in the classroom?

Numerical analysis is typically taught through a combination of lectures, problem-solving sessions, and computer laboratory work. Theoretical concepts are introduced in lectures, while practical skills are developed through problem-solving sessions and hands-on experience with computational tools.

3. What are the benefits of teaching numerical analysis?

Teaching numerical analysis has many benefits, including developing critical thinking skills, improving problem-solving abilities, and preparing students for careers in fields such as engineering, computer science, and finance. It also provides a strong foundation for further study in mathematics and related disciplines.

4. What are some challenges in teaching numerical analysis?

One of the main challenges in teaching numerical analysis is finding the right balance between theory and practice. It can also be difficult to keep up with advancements in computational techniques and software. Additionally, students may struggle with the abstract nature of the subject and the need for strong mathematical foundations.

5. How can numerical analysis teaching be improved?

Numerical analysis teaching can be improved by incorporating more hands-on activities and real-world examples, providing opportunities for students to work with large datasets and use modern software, and emphasizing the importance of mathematical rigor and error analysis. Collaborating with industry professionals can also help keep the curriculum relevant and up-to-date.
