Following on from Benoît Kloeckner's comment above: place the points at $A=(0,0)$ (the origin) and $B=(c,0)$ on the x-axis, so that $|AB|=c$, and let $C=(x,y)$, where we now want to satisfy $|AC|=b$ and $|BC|=a$.
A simple application of the Pythagorean theorem leads to
$$x^2 + y^2 = b^2$$
and
$$(x-c)^2 + y^2 = a^2$$
as the two constraints to be applied.
Expanding the second equation and subtracting the first:
$$x^2 - 2cx + c^2 + y^2 = a^2$$
$$x^2 + y^2 = b^2$$
$$2cx - c^2 = b^2 - a^2$$
$$2cx = b^2 - a^2 + c^2$$
$$x = \frac{b^2 - a^2 + c^2}{2c}$$
Now you can solve for $y$ in terms of $x$: from the first constraint, $y = \sqrt{b^2 - x^2}$ (taking the positive root places $C$ above the x-axis).
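As a sanity check, the construction above can be sketched in a few lines of Python (the function name `triangle_coords` is just illustrative; it assumes the side lengths satisfy the triangle inequality):

```python
import math

def triangle_coords(a, b, c):
    """Place A=(0,0) and B=(c,0), then solve for C=(x,y) from the side lengths.

    Here a = |BC|, b = |AC|, c = |AB|.  If the triangle inequality fails,
    the argument of the square root goes negative and math.sqrt raises.
    """
    x = (b**2 - a**2 + c**2) / (2 * c)
    y = math.sqrt(b**2 - x**2)  # positive root: C above the x-axis
    return (0.0, 0.0), (c, 0.0), (x, y)

# Example: a 3-4-5 right triangle with |AB| = 3, |AC| = 4, |BC| = 5.
A, B, C = triangle_coords(a=5, b=4, c=3)
```

For this example $x = (16 - 25 + 9)/6 = 0$, so $C$ lands at $(0, 4)$ and the right angle sits at $A$, as expected.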
Now simply scale the points $\vec{A}=(0,0)$, $\vec{B}=(c,0)$, and $\vec{C}=(x,y)$ by their respective barycentric coordinates $(u,v,w)$ to get $D=(x_D, y_D)$ as a function of $a,b,c,u,v,w$, then apply the Pythagorean theorem once more to get $d = |\vec{D}| = \sqrt{x_D^2 + y_D^2}$. This last step shouldn't need to be spelled out, but: $\vec{D} = u\vec{A} + v\vec{B} + w\vec{C}$, with the weights normalized so that $u+v+w=1$.
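Putting both steps together, here is a minimal sketch (the name `barycentric_distance` is my own; it normalizes the weights by their sum, so $(u,v,w)$ need not sum to 1):

```python
import math

def barycentric_distance(a, b, c, u, v, w):
    """Distance d = |AD| from vertex A=(0,0) to D = (u*A + v*B + w*C)/(u+v+w).

    Coordinates as derived above: A=(0,0), B=(c,0),
    C=(x, sqrt(b^2 - x^2)) with x = (b^2 - a^2 + c^2)/(2c).
    """
    x = (b**2 - a**2 + c**2) / (2 * c)
    y = math.sqrt(b**2 - x**2)
    s = u + v + w                # normalize the barycentric weights
    xD = (v * c + w * x) / s     # u * A contributes nothing: A is the origin
    yD = (w * y) / s
    return math.hypot(xD, yD)    # d = sqrt(xD^2 + yD^2)
```

For the centroid ($u=v=w$) of the 3-4-5 triangle above ($B=(3,0)$, $C=(0,4)$), $D=(1, 4/3)$ and $d = \sqrt{1 + 16/9} = 5/3$.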