Mosaddique
Sep 3, 2009, 10:59 PM
Find the condition that ax^2 + by^2 = 1 should cut a'x^2 + b'y^2 = 1 perpendicularly.
jcaron2
Sep 4, 2009, 11:26 AM
These are, of course, the equations of ellipses centered at the origin. The two ellipses cut each other perpendicularly as one becomes infinitely taller than it is wide and the other infinitely wider than it is tall (i.e., their eccentricities go to 1). This happens when a = 0 and b' = 0, or when a' = 0 and b = 0.
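One way to make "cut perpendicularly" precise: two implicit curves F = 0 and G = 0 are orthogonal at an intersection point exactly when their gradients, which are normal to the curves, are perpendicular there. With F = ax^2 + by^2 - 1 and G = a'x^2 + b'y^2 - 1,
\nabla F \cdot \nabla G = (2ax)(2a'x) + (2by)(2b'y) = 4\left(aa'x^2 + bb'y^2\right)
and this vanishes identically when a = 0 and b' = 0 (or when a' = 0 and b = 0), so every intersection is a right angle.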
In addition to my geometric explanation, you can show this analytically. Let's use the case where a = 0 and b' = 0:
0{\cdot}x^2 + by^2 = 1
y^2 = \frac{1}{b}
y = \pm \sqrt{\frac{1}{b}}
This is simply the equation of two horizontal lines (i.e. an ellipse that's infinitely wide).
Likewise:
a'x^2 + 0{\cdot}y^2 = 1
x^2 = \frac{1}{a'}
x = \pm \sqrt{\frac{1}{a'}}
This is the equation of two vertical lines (i.e. an ellipse that's infinitely tall).
Obviously the two horizontal lines cut the two vertical lines perpendicularly.
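If you'd like a numerical sanity check, here's a minimal Python sketch of the same degenerate case (the particular values of b and a' are made up; any positive values work). It evaluates the gradient condition above at the four points where the two pairs of lines meet:

import numpy as np

b = 2.0        # made-up positive value for the first curve's y^2 coefficient
a_prime = 3.0  # made-up positive value for the second curve's x^2 coefficient

def grad_F(x, y):
    # Gradient of F(x, y) = 0*x^2 + b*y^2 - 1 (first curve, with a = 0).
    return np.array([0.0, 2 * b * y])

def grad_G(x, y):
    # Gradient of G(x, y) = a'*x^2 + 0*y^2 - 1 (second curve, with b' = 0).
    return np.array([2 * a_prime * x, 0.0])

# The degenerate curves are y = +/- sqrt(1/b) and x = +/- sqrt(1/a'),
# which intersect at four points.
for sx in (1, -1):
    for sy in (1, -1):
        x, y = sx / np.sqrt(a_prime), sy / np.sqrt(b)
        dot = grad_F(x, y) @ grad_G(x, y)
        print(f"({x:+.3f}, {y:+.3f}): grad_F . grad_G = {dot}")

All four dot products come out exactly zero, since grad F has no x-component when a = 0 and grad G has no y-component when b' = 0.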