Multivar Optimization question

  • Thread starter Damascus Road
In summary: the thread asks for the point (x,y,z) on the plane z=3x+2y+1 that is closest to the origin. The original poster recognized this as an optimization problem requiring the minimization of the distance of (x,y,3x+2y+1) from the origin, computed a partial derivative of that distance and set it equal to 0, but was unsure how to proceed. The poster eventually solved the problem after a few days, and asked whether this was the correct forum for the question, since it received zero replies.
  • #1
Damascus Road
Greetings,
I'm working on a problem where I am to find the coordinates of the point (x,y,z) on the plane z=3x+2y+1 that is closest to the origin.

I know that this is an optimization problem, and I believe I have to minimize the distance of (x,y,3x+2y+1) from the origin.

I started by finding the partial derivative, fx, of the magnitude of that vector.

[tex]f_{x}=\frac{20x+12y+6}{2\sqrt{x^2+y^2+(3x+2y+1)^2}}[/tex]

Setting that equal to 0:

[tex]0 =\frac{20x+12y+6}{2\sqrt{x^2+y^2+(3x+2y+1)^2}}[/tex]

now what?
 
Last edited:
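Editor's note: the thread never shows the finish, so here is one way to close the loop. Since a square root is minimized wherever its argument is, it is simpler to minimize the squared distance g(x,y) = x² + y² + (3x+2y+1)², whose partials are linear in x and y. A minimal worked check in Python, using exact rational arithmetic:

```python
from fractions import Fraction

# Minimize g(x, y) = x^2 + y^2 + (3x + 2y + 1)^2, the squared
# distance from the origin to the point (x, y, 3x + 2y + 1).
# Setting both partial derivatives to zero gives a linear system:
#   g_x = 2x + 6(3x + 2y + 1) = 20x + 12y + 6 = 0
#   g_y = 2y + 4(3x + 2y + 1) = 12x + 10y + 4 = 0
# Solve the 2x2 system by Cramer's rule.
a, b, p = Fraction(20), Fraction(12), Fraction(-6)
c, d, q = Fraction(12), Fraction(10), Fraction(-4)
det = a * d - b * c              # 20*10 - 12*12 = 56
x = (p * d - b * q) / det        # -3/14
y = (a * q - p * c) / det        # -1/7
z = 3 * x + 2 * y + 1            # 1/14
print(x, y, z)                   # -3/14 -1/7 1/14
```

The same point follows from projecting the origin onto the plane 3x + 2y - z + 1 = 0 along its normal (3, 2, -1), giving (-3/14, -1/7, 1/14) at distance 1/√14.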
  • #2
Anyone know how to do this?
 
  • #3
I've since solved this, although it took me a few days.
For future reference, was this the correct forum for this question? It got zero replies...
 

FAQ: Multivar Optimization question

What is multivariable optimization?

Multivariable optimization is a mathematical technique used to find the optimal solution for a problem that involves multiple variables. It involves finding the values of these variables that maximize or minimize a given objective function, subject to certain constraints.

What are the applications of multivariable optimization?

Multivariable optimization has various applications in fields such as engineering, economics, finance, and data analysis. It is used to optimize product designs, financial portfolios, and marketing strategies, among others.

What methods are used in multivariable optimization?

There are various methods used in multivariable optimization, including gradient descent, Newton's method, and genetic algorithms. These methods differ in their approach to finding the optimal solution and are used depending on the complexity of the problem.
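As a minimal illustration of the first of these, gradient descent on the squared-distance function from the thread above can be sketched in Python (the starting point, step size, and iteration count are arbitrary choices, not tuned values):

```python
# Gradient descent on g(x, y) = x^2 + y^2 + (3x + 2y + 1)^2,
# the squared distance from the origin to the point (x, y, 3x + 2y + 1).
def grad(x, y):
    # Partial derivatives: g_x = 20x + 12y + 6, g_y = 12x + 10y + 4
    return 20 * x + 12 * y + 6, 12 * x + 10 * y + 4

x, y = 0.0, 0.0    # arbitrary starting point
step = 0.05        # assumed step size; must stay below 2/28 here to converge
for _ in range(500):
    gx, gy = grad(x, y)
    x, y = x - step * gx, y - step * gy

print(round(x, 6), round(y, 6))   # -0.214286 -0.142857, i.e. (-3/14, -1/7)
```

Because g is quadratic, Newton's method would reach the same point in a single step; genetic algorithms are usually reserved for non-smooth or highly multimodal objectives where gradients are unavailable or unhelpful.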

What are the challenges in multivariable optimization?

One of the main challenges in multivariable optimization is dealing with high-dimensional problems, where there are a large number of variables. This can lead to computational complexity and the need for more advanced optimization techniques. Additionally, finding the global optimal solution can be difficult, and sometimes only local optima are found.

How is multivariable optimization different from single-variable optimization?

In single-variable optimization, there is only one variable to consider, and the objective function and constraints are functions of that variable. In multivariable optimization, there are multiple variables, and the objective function and constraints can be functions of each of these variables. This adds complexity to the problem and requires the use of different techniques and algorithms.
