Are These Two Equations Linearly Independent?

  • Thread starter turin
  • Tags
    Non-linear
In summary, Bezout's theorem states that two polynomials f and g in two variables can have at most deg(f)·deg(g) common solutions, provided they share no non-constant factor in the polynomial ring in which they live.
  • #1
turin
Homework Helper
Given a system of two equations in two variables, x and y:

x + ay = c
x + by^2 = d

I believe this system can be solved uniquely (please correct me if I'm wrong). My question is one of independence: would one be correct in saying that these two equations are linearly independent, even though the second equation is not linear?
 
  • #2
Danger: amateur opinion ahead. Does the set of polynomials in two variables of degree less than or equal to 2 form a vector space? If so, I believe the elements x + ay and x + by^2 are linearly independent (which is pretty easy to check: just consider the equation n(x + ay) + m(x + by^2) = 0). I don't think the "linear" in "linear independence" refers to the kind of linearity (or lack of it) you have in a polynomial such as x + by^2 ;)
 
  • #3
The system may have 0, 1, or 2 solutions, depending on a and b: substituting for x in the second equation gives a quadratic in y (unless b is zero), which may have no real solutions. If a and b are both zero there may be no solutions at all. If there is a solution, it almost certainly isn't unique.
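A minimal SymPy sketch of that substitution (the specific values of a, b, c, d below are illustrative choices of mine, not from the thread), showing the two-solution, double-root, and no-real-solution cases:

```python
from sympy import Eq, solve, symbols

x, y = symbols("x y", real=True)

def real_solutions(a, b, c, d):
    """Solve x + a*y = c, x + b*y**2 = d over the reals."""
    # Eliminating x gives b*y**2 - a*y - (d - c) = 0, a quadratic in y when b != 0.
    return solve([Eq(x + a*y, c), Eq(x + b*y**2, d)], [x, y], dict=True)

print(real_solutions(1, 1, 0, 2))    # two solutions: (x, y) = (-2, 2) and (1, -1)
print(real_solutions(2, 1, 0, -1))   # one (double) solution: (x, y) = (-2, 1)
print(real_solutions(0, 1, 1, 0))    # no real solution: it would need y**2 = -1
```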
 
  • #4
muzza is correct; the linear independence refers to the underlying vector space in which these vectors lie.
 
  • #5
muzza and matt grime,
Thanks very much for not treating me and my question like a couple of idiots. I do appreciate it. :)

I am a little confused about this underlying vector space idea. Is this the same basic idea as treating functions as vectors so that, for instance, the Legendre polynomials are considered as linearly independent vectors?

Actually, now that I think a little harder about it, I don't think that is quite right. The Legendre polynomials are independent by virtue of order, whereas the two equations in my example system have two independent variables. Should I generalize to:

x + ay = c(x,y)
x + by^2 = d(x,y)

and then discuss whether c(x,y) and d(x,y) are linearly independent?

I apologize: I do realize that my question is lacking some element of precision. I just can't put my finger on it (and I suppose the lack of precision is the question).
 
  • #6
The two variables in the functions also have different orders.

The vector space you have here is R[x, y, ..., z], the polynomial ring in (arbitrarily many) variables x, y, ..., z with real coefficients. The two elements you want to consider are x + ay and x + by^2. They are linearly dependent if there are real numbers p and q, not both zero, such that p(x + ay) + q(x + by^2) is the zero function, the function that is identically zero for all x and y. You may check this in several ways: by letting x = 0, y = 1 and then x = 1, y = 0 you find relations among p, q, a and b that must be satisfied, and clearly nothing satisfies them but p = q = 0, i.e. they are linearly independent.
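Here is a short SymPy rendering of that check (my own sketch of the same argument, comparing coefficients rather than plugging in points, and noting the one degenerate case):

```python
from sympy import expand, symbols

x, y, a, b, p, q = symbols("x y a b p q")

f = x + a*y
g = x + b*y**2

combo = expand(p*f + q*g)
print(combo.coeff(x, 1))   # p + q   (coefficient of x)
print(combo.coeff(y, 1))   # a*p     (coefficient of y)
print(combo.coeff(y, 2))   # b*q     (coefficient of y**2)

# For p*f + q*g to be the zero polynomial, all three coefficients must vanish:
#   p + q = 0,  a*p = 0,  b*q = 0.
# Unless a = b = 0 (when f = g = x, trivially dependent), this forces p = q = 0,
# so x + a*y and x + b*y**2 are linearly independent.
```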
 
  • #7
Thanks, matt grime. That was a little on the math-heavy side for me, but I think I get the idea. I'll have to explore "the polynomial ring" when I find the time. Thanks again to both of you.
 
  • #8
Linear independence for two objects is pretty easy: if neither is zero, it just means that neither one is a multiple of the other.

What you use as multipliers determines what kind of linear independence you are using.

We usually use real numbers as multipliers, so independence of x + ay and x + by^2 just means neither is a real multiple of the other.

That is obvious, since multiplying by a real number cannot change a y into a y^2.

But the more interesting question you are concerned with is what does this say about the number of solutions?

Given two equations in two variables, the number of simultaneous solutions can be infinite even if they are independent in this sense.

For example x and xy are independent but share the whole y-axis as common solutions. These have a common factor of x, explaining that fact.

At least if we use complex numbers instead of reals, we can always say that two polynomials share an infinite number of common zeroes only if they have a common irreducible factor. (Then it seems to follow for real numbers as well.)

The answer to the problem of how many common solutions two polynomials in two variables can have is Bezout's theorem: if the two polynomials f, g have no common non-constant factors, then they have at most deg(f)·deg(g) common solutions. Tangential solutions can count as more than one, however, as usual.
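For the system that started the thread, x + ay − c has total degree 1 and x + by^2 − d has total degree 2, so Bezout's bound is 1·2 = 2 common solutions. A quick SymPy check with illustrative values of my own:

```python
from sympy import solve, symbols

x, y = symbols("x y")

# Illustrative values: a = 1, c = 0, b = 1, d = 2.
f = x + y          # total degree 1
g = x + y**2 - 2   # total degree 2

solutions = solve([f, g], [x, y])
print(solutions)                  # two solutions: (x, y) = (-2, 2) and (1, -1)
print(len(solutions) <= 1 * 2)    # True: no more than deg(f)*deg(g), as Bezout predicts
```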

There is a way to calculate the multiplicity of a solution (a, b): it is the vector-space dimension of the local ring R/(f, g), where R is the ring formed from the polynomial ring C[x, y] by allowing as denominators all polynomials not vanishing at the given point (a, b).
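A lower-tech way to see multiplicity at work, using a resultant (which the next paragraph mentions) rather than the local ring, with an example of my own choosing: the line y = 0 meets the parabola y = x^2 tangentially at the origin, and eliminating y leaves x^2, whose double root records that the intersection counts twice.

```python
from sympy import resultant, roots, symbols

x, y = symbols("x y")

f = y - x**2   # parabola y = x**2
g = y          # its tangent line at the origin

# Eliminating y gives a polynomial in x whose roots, counted with multiplicity,
# sit under the common zeros of f and g.
r = resultant(f, g, y)
print(r)             # x**2
print(roots(r, x))   # {0: 2} -- the tangential intersection counts twice
```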

I believe the first proof of this theorem was due to Gauss?, and used Euler's? theory of resultants.
 

FAQ: Are These Two Equations Linearly Independent?

What is non-linear independence?

Non-linear independence refers to the relationship between two or more variables in a non-linear system. It means that the variables are not related by any functional relationship, linear or otherwise, so one cannot be predicted or determined from the others.

How is non-linear independence different from linear independence?

Linear independence means that no variable in a linear system can be expressed as a linear combination of the others. Non-linear independence, on the other hand, refers to variables in a non-linear system, where not only is there no linear combination relating them, but no non-linear relationship either.

Why is non-linear independence important in scientific research?

Non-linear independence is important in scientific research because it allows for a more accurate understanding of complex systems. Many real-world phenomena, such as weather patterns and biological processes, are non-linear and cannot be accurately described using linear relationships. Understanding non-linear independence can lead to more accurate models and predictions.

How can non-linear independence be tested?

Non-linear independence can be tested with statistical methods such as correlation analysis, regression analysis, and non-parametric tests. Note that a plain (Pearson) correlation only measures linear association, so rank-based or other non-parametric methods are often needed to detect or rule out non-linear relationships between variables.
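As a small illustration of that caveat (my own sketch with NumPy/SciPy, not part of the original FAQ): when y is a deterministic but non-linear function of x, the Pearson correlation can be essentially zero, while a simple quadratic regression explains almost all of the variance:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x**2                          # y is completely determined by x, non-linearly

r, _ = pearsonr(x, y)
print(f"Pearson r = {r:.3f}")     # near 0: linear correlation misses the relationship

# A quadratic fit, by contrast, captures it almost perfectly.
coeffs = np.polyfit(x, y, deg=2)
resid = y - np.polyval(coeffs, x)
print(f"Quadratic R^2 = {1 - resid.var() / y.var():.3f}")   # essentially 1
```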

Can two variables be both linearly and non-linearly independent?

Yes, it is possible for two variables to be both linearly and non-linearly independent. This means that they are not related in a linear manner, but there may still be a non-linear relationship between them. This can occur when the relationship between the variables is complex and cannot be accurately described by a linear function.
