Optimization in Several Variables

In summary: To find the critical points, set both partial derivatives equal to zero and solve for x and y. Since \frac{\partial{F}}{\partial{x}}=2x(2y+1)e^{x^2-y} and \frac{\partial{F}}{\partial{y}}=(1-2y)e^{x^2-y}, the second equation forces y=1/2 and the first then forces x=0, so (0, 1/2) is the only critical point. To determine its nature, use the second derivative test: evaluate the second partial derivatives \frac{\partial^2{F}}{\partial{x^2}}, \frac{\partial^2{F}}{\partial{y^2}}, and \frac{\partial^2{F}}{\partial{x}\partial{y}} at the critical point and form D(x,y)=\frac{\partial^2{F}}{\partial{x^2}}\cdot\frac{\partial^2{F}}{\partial{y^2}}-\left(\frac{\partial^2{F}}{\partial{x}\partial{y}}\right)^2.
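The procedure above can be sanity-checked numerically. Below is a minimal sketch using central differences (the function names, the test point, and the step size h are my own choices, not from the thread):

```python
import math

def F(x, y):
    return (2*y + 1) * math.exp(x**2 - y)

h = 1e-5  # finite-difference step (an assumption; small but not too small)

def Fx(x, y):
    # central-difference approximation of dF/dx
    return (F(x + h, y) - F(x - h, y)) / (2*h)

def Fy(x, y):
    # central-difference approximation of dF/dy
    return (F(x, y + h) - F(x, y - h)) / (2*h)

# second partials as differences of the first differences
def Fxx(x, y):
    return (Fx(x + h, y) - Fx(x - h, y)) / (2*h)

def Fyy(x, y):
    return (Fy(x, y + h) - Fy(x, y - h)) / (2*h)

def Fxy(x, y):
    return (Fx(x, y + h) - Fx(x, y - h)) / (2*h)

cx, cy = 0.0, 0.5  # candidate critical point from solving Fx = Fy = 0
print(Fx(cx, cy), Fy(cx, cy))           # both approximately 0
D = Fxx(cx, cy) * Fyy(cx, cy) - Fxy(cx, cy)**2
print(D)                                # approximately -8/e < 0, a saddle point
```

If the algebra is right, D comes out near -8/e ≈ -2.94, negative, so the second derivative test classifies (0, 1/2) as a saddle point.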
  • #1
soe236
F(x,y) = (2*y+1)*e^(x^2-y)
Find critical point and prove there is only one.
Use second derivative test to determine nature of crit. pt.


I know the procedure for solving it: set the partial derivatives to zero and solve the resulting equations. By the second derivative test, if D>0, then f(a,b) is a local min/max (a local min if f_xx(a,b)>0); if D<0, then (a,b) is a saddle point,
where D=D(a,b)=f_xx(a,b)f_yy(a,b)-f_xy(a,b)^2

I have no idea how to get the partial derivatives and start the problem. Any help will be appreciated, thanks.
 
  • #2
To take a partial derivative, you just treat one of either x or y as a constant, and differentiate with respect to the other one. So for example if the function was f(x,y) = x^2y^2 + x^3, the partial derivative with respect to x is

2xy^2 + 3x^2

and with respect to y

2yx^2
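The example partials above can be confirmed with a quick difference-quotient check (a sketch; the test point and step size are arbitrary choices of mine):

```python
def f(x, y):
    return x**2 * y**2 + x**3

def fx(x, y):
    # the analytic partial from the post: 2xy^2 + 3x^2
    return 2*x*y**2 + 3*x**2

def fy(x, y):
    # the analytic partial from the post: 2yx^2
    return 2*y*x**2

h = 1e-6
x, y = 1.3, -0.7  # arbitrary test point (an assumption for this check)
fx_num = (f(x + h, y) - f(x - h, y)) / (2*h)
fy_num = (f(x, y + h) - f(x, y - h)) / (2*h)
print(abs(fx_num - fx(x, y)), abs(fy_num - fy(x, y)))  # both tiny
```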
 
  • #3
Office_Shredder said:
To take a partial derivative, you just treat one of either x or y as a constant, and differentiate with respect to the other one. So for example if the function was f(x,y) = x^2y^2 + x^3, the partial derivative with respect to x is

2xy^2 + 3x^2

and with respect to y

2yx^2

Well yea I know that.. I meant I did not know how to get the partials specifically for the function I posted above : F(x,y) = (2*y+1)*e^(x^2-y).
Is dF/dx = (2*y+1)*e^(x^2-y)*2*x ? and dF/dy = ?
 
  • #4
Yes. To find dF/dy, use the product rule. What are d(2y+1)/dy and de^(x^2-y)/dy separately?
 
  • #5
HallsofIvy said:
Yes. To find dF/dy, use the product rule. What are d(2y+1)/dy and de^(x^2-y)/dy separately?

d(2y+1)/dy= 2
de^(x^2-y)/dy = -e^(x^2-y)
so by product rule, dF/dy= 2*e^(x^2-y) ?
 
  • #6
The product rule is d(f*g)/dy = (df/dy)*g + f*(dg/dy), not (df/dy)*(dg/dy).
 
  • #7
So dF/dy = (2y+1)*(-e^(x^2-y)) + 2*e^(x^2-y) ?
 
  • #8
soe236 said:
So dF/dy = (2y+1)*(-e^(x^2-y)) + 2*e^(x^2-y) ?
Technically speaking, the two first derivatives are partial derivatives [tex]\frac{\partial{F}}{\partial{x}}[/tex] and [tex]\frac{\partial{F}}{\partial{y}}[/tex], also written as [tex]F_x[/tex] and [tex]F_y[/tex].
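The product-rule result for F_y can be double-checked against a difference quotient (a sketch; the sample point and step size are my own assumptions):

```python
import math

def F(x, y):
    return (2*y + 1) * math.exp(x**2 - y)

def Fy(x, y):
    # product rule: d/dy[2y+1] * e^(x^2-y) + (2y+1) * d/dy[e^(x^2-y)]
    return 2*math.exp(x**2 - y) + (2*y + 1) * (-math.exp(x**2 - y))

h = 1e-6
x, y = 0.4, 1.1  # arbitrary sample point (an assumption for this check)
num = (F(x, y + h) - F(x, y - h)) / (2*h)
print(abs(num - Fy(x, y)))  # tiny, so the product-rule expression agrees
```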
 

FAQ: Optimization in Several Variables

What is optimization in several variables?

Optimization in several variables is a mathematical process that involves finding the highest or lowest value of a function with multiple independent variables. It is used to determine the optimal solution to a problem, such as maximizing profit or minimizing cost.

What are the main methods used in optimization in several variables?

The main methods used in optimization in several variables include gradient descent, Newton's method, and the conjugate gradient method. These methods iteratively adjust the values of the independent variables to approach the optimal solution.
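The iterative idea can be sketched with a minimal gradient-descent loop (the test function, starting point, and step size below are illustrative assumptions, not part of the thread):

```python
# Minimal gradient-descent sketch for f(x, y) = (x-1)^2 + 2*(y+3)^2,
# whose unique minimizer is (1, -3).
def grad(x, y):
    # analytic gradient of the test function
    return (2*(x - 1), 4*(y + 3))

x, y = 0.0, 0.0   # arbitrary starting point
lr = 0.1          # fixed step size (an assumption; too large can diverge)
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr*gx, y - lr*gy  # step against the gradient
print(x, y)       # approaches the minimizer (1, -3)
```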

What is the difference between local and global optimization?

Local optimization refers to finding the optimal solution within a specific region or neighborhood of the independent variables. Global optimization, on the other hand, aims to find the absolute optimal solution, regardless of the initial values of the independent variables.

How is optimization in several variables used in real-world applications?

Optimization in several variables is used in a wide range of real-world applications, such as engineering design, financial analysis, and machine learning. It helps improve efficiency and supports better decisions by finding an optimal solution to a problem.

What are some challenges associated with optimization in several variables?

One of the main challenges of optimization in several variables is the presence of multiple local optima, which can lead to finding suboptimal solutions. Other challenges include determining the appropriate objective function and selecting the most suitable optimization method for a particular problem.
