Why backpropagation dominates neural networks instead of interpolation

  • #1
jonjacson
TL;DR Summary
I don't understand what the advantages of backpropagation in a neural network are over classical interpolation/extrapolation methods.
Hi guys,

I was learning machine learning and I found something a bit confusing.
When I studied physics I learned the method of least squares for finding the best parameters for given data. In that case we assume we know the equation and simply minimize the error, so for a straight-line model we compute the best slope and intercept (a sketch of what I mean follows below).
With machine learning we don't know the underlying equation or model, so we use a general method to "fit" the real data as well as we can.
But isn't that what we do with interpolation/extrapolation?
What is so special about a neural network and the backpropagation method that we can't achieve with interpolation?
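To make the least-squares case concrete, here is a minimal sketch (toy numbers I made up, not real data) of fitting a straight-line model:

# Minimal sketch with made-up data: ordinary least squares for y = m*x + c.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])    # roughly y = 2x + 1 plus noise

m, c = np.polyfit(x, y, deg=1)              # best slope and intercept
print(f"slope = {m:.3f}, intercept = {c:.3f}")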
 
  • #2
jonjacson said:
TL;DR Summary: I don't understand what the advantages of backpropagation in a neural network are over classical interpolation/extrapolation methods.

With machine learning we don't know the underlying equation or model
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and a bias). Those parameters are determined by minimizing a definite target function, which is built on top of the network's overall function, itself a composition of the neurons' functions.
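In standard notation (my wording, not from any particular source), a single neuron with inputs $x_1, \dots, x_n$ computes

$$y = \sigma\!\left(\sum_{i=1}^{n} w_i x_i + b\right),$$

where the weights $w_i$ and the bias $b$ are the parameters to be determined and $\sigma$ is a fixed activation function.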
 
  • #3
Hill said:
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and a bias). Those parameters are determined by minimizing a definite target function, which is built on top of the network's overall function, itself a composition of the neurons' functions.
The weights determine how much of one neuron's output feeds into the next ones, but that by itself is not a function.
 
  • #4
jonjacson said:
The weights determine how much of one neuron's output feeds into the next ones, but that by itself is not a function.
Here is one way to define the function (sorry, I'd need to dig to find the source of this paper):

[Attached images: the paper's equations defining the network's function, layer by layer; the images do not reproduce here.]
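Since the attachments may not display, the standard layer-by-layer form (my notation, not necessarily the paper's) is

$$a^{(0)} = x, \qquad a^{(l)} = \sigma\!\left(W^{(l)} a^{(l-1)} + b^{(l)}\right), \quad l = 1, \dots, L, \qquad f(x) = a^{(L)},$$

so the whole network is one parameterized function obtained by composing the neurons' functions, and training chooses the $W^{(l)}$ and $b^{(l)}$.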
 
  • #5
jonjacson said:
With machine learning we don't know the underlying equation or model, so we use a general method to "fit" the real data as well as we can.
But isn't that what we do with interpolation/extrapolation?
What is so special about a neural network and the backpropagation method that we can't achieve with interpolation?
I am not an expert, but here is my understanding:
Backpropagation in a neural network develops weights for the intermediate (hidden) layers so that the network can model a sophisticated decision process. Those intermediate layers are not given data the way interpolation is; they are free to take on whatever values training finds useful. Sometimes they are obscure, and other times we can imagine a meaning for them. The only hard training data are the inputs and outputs at the first and last layers.
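As a concrete (toy) illustration of that training process, here is a minimal sketch of backpropagation on a one-hidden-layer network written from scratch; the architecture, data, and hyperparameters are my own choices, not anything from this thread:

# Minimal sketch: train a tiny 1-16-1 network on y = sin(x) with
# hand-coded backpropagation and plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200).reshape(-1, 1)   # inputs, shape (200, 1)
y = np.sin(x)                                # targets

W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05                                    # learning rate

for step in range(5000):
    # Forward pass: compute the network's prediction
    h = np.tanh(x @ W1 + b1)                 # hidden activations
    pred = h @ W2 + b2                       # output
    err = pred - y                           # residuals for squared-error loss

    # Backward pass: chain rule, layer by layer (this is backpropagation)
    grad_pred = 2 * err / len(x)             # dLoss/dpred
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T                # error propagated to hidden layer
    grad_pre = grad_h * (1 - h**2)           # through the tanh derivative
    grad_W1 = x.T @ grad_pre
    grad_b1 = grad_pre.sum(axis=0)

    # Gradient-descent update of all weights and biases
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print("final mean squared error:", np.mean(err**2))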
 

Related to Why backpropagation dominates neural networks instead of interpolation

1. Why does backpropagation dominate neural networks instead of interpolation?

Backpropagation is favored in neural networks because it computes the gradient of the error with respect to every weight, so the weights can be adjusted iteratively to minimize the error between the predicted output and the actual output. This enables the network to learn complex patterns and relationships in the data, making it more powerful than simple interpolation methods.
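In symbols, the standard gradient-descent update for each weight $w$ is

$$w \leftarrow w - \eta \, \frac{\partial E}{\partial w},$$

where $E$ is the error between predicted and actual outputs and $\eta$ is the learning rate; backpropagation is the procedure that computes $\partial E / \partial w$ efficiently for every weight in the network.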

2. How does backpropagation differ from interpolation in neural networks?

Backpropagation propagates the error gradient backwards through the network to update the weights, while interpolation simply estimates unknown values from known data points without any learned parameters. Backpropagation therefore allows a neural network to learn from data and make predictions on new inputs, while interpolation only provides estimates between the data points it is given.
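For contrast, a minimal sketch of plain interpolation (toy numbers of my own), which involves no training at all:

# Piecewise-linear interpolation between known points; nothing is learned.
import numpy as np

xp = np.array([0.0, 1.0, 2.0, 3.0])        # known x values
fp = np.sin(xp)                             # known y values

print(np.interp(1.5, xp, fp))               # estimate inside the data range
print(np.interp(10.0, xp, fp))              # outside the range, np.interp just
                                            # clamps to the last known value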

3. Can interpolation be used in neural networks instead of backpropagation?

Interpolation can replace a neural network for certain simple tasks, but it is limited in its ability to capture complex patterns and relationships in data. Backpropagation, on the other hand, allows a neural network to adapt and improve its performance over time by adjusting its weights based on the error between predicted and actual outputs.

4. What are the advantages of backpropagation over interpolation in neural networks?

Backpropagation enables a neural network to learn a model from data and then make predictions on new inputs, rather than simply interpolating between known data points. This allows more accurate and flexible modeling of complex relationships in the data, making neural networks more powerful and versatile than interpolation methods.

5. Are there any drawbacks to using backpropagation instead of interpolation in neural networks?

One potential drawback of backpropagation is that it can be computationally intensive and require a large amount of data to effectively train a neural network. Additionally, backpropagation can sometimes suffer from issues like vanishing or exploding gradients, which can make training unstable. However, these drawbacks are often outweighed by the benefits of using backpropagation for training neural networks.
