How to optimize a parameter that is the index of a summation?

In summary: define a single scalar figure of merit that combines the deviations of all six detectors (e.g. the sum of squared relative deviations), then search over the discrete values of ##n_0## by brute force; because the parameter is a discrete index and the sums don't vary smoothly, the standard ##\chi^2##/maximum-likelihood machinery doesn't directly apply.
  • #1
Condereal
TL;DR Summary
Finding the best parameter to satisfy a set of equations, when that parameter is the index of a summation.
Hi everyone!

So, the problem I'm having has more to do with how to pose the problem so I can solve it in software such as Matlab.

I have experimentally measured values ##\varepsilon_{exp}^i## with ##i=1,\cdots,6##, that is, I have 6 detectors.

Then, I know (from a Monte Carlo simulation) a set of values ##\psi^i(n)## with ##n\in\mathbb{N}## for each detector, that satisfy: $$\sum\limits_{n=0}^{n_{max}}\psi^i(n) = \varepsilon_{sim}^i\approx \varepsilon_{exp}^i$$ The thing is, I would like to find a number ##n_0\in\mathbb{N}## such that: $$\varepsilon_{exp}^i-\varepsilon_{sim}^i(n_0)=\varepsilon_{exp}^i-\sum_{n=n_0}^{n_{max}}\psi^i(n)\to 0$$ for all six equations at the same time, that is, for all ##i##. This is an optimization problem, and it screams ##\chi^2##-minimization or maximum-likelihood problem. Can anyone imagine a way of posing this problem in an environment like Matlab?

Every answer will be very much appreciated.
 
  • #2
You'll need some metric for the distance to compare different approximations. Then you can express everything in terms of that metric.

If your sum follows some nice path through your 6-dimensional space (or at least through the one-dimensional "distance" to the target), you could try to approximate it, minimize over the now-real parameter, and then look in the vicinity of that minimum for a local minimum of the discrete problem.

If your sum doesn't follow a nice path then searching through all cases is still an option.
 
  • #3
Hi mfb, thank you for your answer!

Sadly, the paths ##\psi^i(n)## are not nice... I should probably search, by brute force, through all the cases. I did this, but for each detector ##i## independently, setting a tolerance. For example, I move ##n_0##, and if: $$100\frac{\lvert\varepsilon^i_{exp}-\varepsilon^i_{sim}(n_0)\rvert}{\varepsilon^i_{exp}}<t$$ for ##t=5\%##, I choose that particular ##n_0## as my parameter. In that case I obtain an ##n_0## for each ##i##, and they differ from each other. I would like to find the single ##n_0## that minimizes all the deviations at once, but I wouldn't know how to start writing this in a script!
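The independent per-detector tolerance search described above could be sketched like this in Python (a minimal sketch: `eps_exp` and `psi` are made-up placeholder data, not the real measurements):

```python
# Sketch of the per-detector tolerance search: for each detector i,
# take the first n0 whose relative deviation is below the tolerance t.
eps_exp = [0.82, 0.75, 0.90, 0.68, 0.77, 0.85]   # made-up measured values
psi = [[0.01 * (k % 7) for k in range(i, i + 100)] for i in range(6)]  # made-up psi^i(n)

tol = 5.0  # tolerance t, in percent

def eps_sim(i, n0):
    """Partial sum of psi^i(n) from n0 up to n_max."""
    return sum(psi[i][n0:])

# Independent search: a separate n0 for each detector.
n0_per_detector = []
for i in range(6):
    found = None
    for n0 in range(len(psi[i])):
        rel_dev = 100.0 * abs(eps_exp[i] - eps_sim(i, n0)) / eps_exp[i]
        if rel_dev < tol:
            found = n0
            break
    n0_per_detector.append(found)

print(n0_per_detector)  # in general a different n0 for each detector
```

This reproduces the problem noted above: each detector yields its own ##n_0##, with nothing forcing them to agree.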
 
  • #4
In general there won't be an ##n_0## where all the relative deviations are minimal at the same time. You need some definition of "best" before you can try to find the best one. The absolute sum of relative deviations? The absolute sum of the squared relative deviations? The maximal relative deviation? Whatever you like. But you need to define what you want to optimize.
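Once a metric is chosen, the brute-force search over ##n_0## is a one-liner. A minimal sketch, with the sum of squared relative deviations as the (arbitrarily chosen) metric and made-up data for `eps_exp` and `psi`:

```python
# Brute-force search for the single n0 minimizing one combined metric:
# here, the sum of squared relative deviations over all six detectors.
eps_exp = [0.048, 0.050, 0.052, 0.049, 0.051, 0.050]            # made-up measurements
psi = [[0.02 / (n + 1) for n in range(100)] for _ in range(6)]  # made-up psi^i(n)

def eps_sim(i, n0):
    """Partial sum of psi^i(n) from n0 up to n_max."""
    return sum(psi[i][n0:])

def metric(n0):
    """Sum of squared relative deviations across all detectors."""
    return sum(((eps_exp[i] - eps_sim(i, n0)) / eps_exp[i]) ** 2
               for i in range(6))

# Exhaustive search over all candidate n0 (cheap for n_max ~ 100).
best_n0 = min(range(len(psi[0])), key=metric)
print(best_n0, metric(best_n0))
```

Swapping in any of the other metrics (maximal relative deviation, sum of absolute deviations, ...) only changes the `metric` function.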
 
  • #5
Yes, sorry, I expressed the idea incorrectly: I don't want to minimize the difference for each detector separately; rather, a single ##n_0## should give the smallest difference possible as a "compromise" across all the detectors. So, your proposal would be to define a metric like "the absolute sum of relative deviations", and try every ##n_0## until I find the minimum?

On the other hand:
Can this procedure be considered a usual maximum-likelihood technique, or a ##\chi^2##-minimization? In case not, do you think these techniques apply in my case?
I'm only asking because a colleague suggested using a ##\chi^2##-minimization for this problem, but I really don't see how to do that here.

Thank you for your time mfb, your answer was useful.
 
  • #6
Condereal said:
the smallest difference possible which "compromises" for all the detectors.
You'll need to quantify that compromise. There is no way around that, and mathematics alone cannot tell you what will be the best metric for your task.
Condereal said:
Can this procedure be considered a usual maximum-likelihood technique, or a ##\chi^2##-minimization?
It can have some similarity, but with a single parameter, and a dependent variable that doesn't vary smoothly, nothing from that toolbox will help.
 

FAQ: How to optimize a parameter that is the index of a summation?

1. How do I determine the optimal value for the parameter in a summation?

The optimal value for a parameter in a summation can be determined using standard optimization techniques. If the parameter is continuous, this typically means setting the derivative of the summation with respect to the parameter equal to zero and solving. If the parameter is a discrete index, as in this thread, derivative-based methods don't apply directly, and a search over the admissible integer values (exhaustive or heuristic) is used instead.

2. What factors should I consider when optimizing a parameter in a summation?

When optimizing a parameter in a summation, it is important to consider the constraints of the problem, the desired outcome, and the range of values that the parameter can take. Additionally, the complexity of the summation function and the time and resources available may also impact the optimization process.

3. Can I optimize a parameter in a summation without using mathematical techniques?

While mathematical techniques are often the most efficient and accurate methods for optimizing a parameter in a summation, there are some alternative approaches that can be used. These include trial and error, heuristic methods, or using machine learning algorithms to find the optimal value.

4. How do I know if the optimized parameter is the best solution for my problem?

The optimized parameter may not always be the best solution for a problem, as it depends on the specific constraints and goals of the problem. It is important to evaluate the results of the optimization and consider other factors such as practicality, cost, and potential trade-offs. Additionally, sensitivity analysis can be used to test the robustness of the optimized parameter.

5. Can I optimize multiple parameters in a summation simultaneously?

Yes, it is possible to optimize multiple parameters in a summation simultaneously. This is known as multi-objective optimization and involves finding the best compromise between conflicting objectives. It can be achieved using techniques such as Pareto optimization or weighted sum methods.
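The weighted-sum method mentioned above can be sketched as follows (a toy illustration: the two objectives, their optima, and the equal weights are all made up):

```python
# Weighted-sum scalarization: combine two competing objectives into one
# scalar and minimize the combination over a discrete parameter n.
def objective_a(n):
    return (n - 10) ** 2      # made-up objective, favours n = 10

def objective_b(n):
    return (n - 20) ** 2      # made-up objective, favours n = 20

w_a, w_b = 0.5, 0.5           # weights expressing the desired trade-off

best_n = min(range(31), key=lambda n: w_a * objective_a(n) + w_b * objective_b(n))
print(best_n)  # → 15, a compromise between the two individual optima
```

Changing the weights shifts the compromise toward one objective or the other; sweeping over many weight combinations traces out the Pareto front.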
