# Proof that a spherical spiral has a constant angle with the meridians?

A spherical spiral is given by x(t) = (cos t/sqrt(1+(at)^2), sin t/sqrt(1+(at)^2), -at/sqrt(1+(at)^2)).

I define a sphere by d(u,v) = (cos u cos v, sin u cos v, -sin v)

so that I can refer to the same point on both the sphere and the spiral by setting u = t, v = arctan(at), because that way cos v = 1/sqrt(1+(at)^2) and sin v = at/sqrt(1+(at)^2).
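As a quick numerical sanity check of this identification, here is a minimal Python sketch (the value of `a` is an arbitrary choice of mine, not part of the problem) confirming that the two parametrizations hit the same point under the substitution:

```python
import math

a = 0.7  # arbitrary nonzero parameter, chosen just for the check

def x(t):
    # the spherical spiral
    s = math.sqrt(1 + (a*t)**2)
    return (math.cos(t)/s, math.sin(t)/s, -a*t/s)

def d(u, v):
    # the sphere parametrization (components as implied by the substitution)
    return (math.cos(u)*math.cos(v), math.sin(u)*math.cos(v), -math.sin(v))

# With u = t and v = arctan(at), both parametrizations give the same point:
for t in (0.0, 0.5, 1.3, -2.0):
    p, q = x(t), d(t, math.atan(a*t))
    assert all(abs(pi - qi) < 1e-12 for pi, qi in zip(p, q))
print("same point on sphere and spiral for all sampled t")
```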

Now, I take the first derivative of x with respect to t to get the tangent, and of d with respect to v to get the tangent along the meridian.

x'(t) = ( [-sin t (1+(at)^2) + a^2 t cos t] / (1+(at)^2)^(3/2),

[cos t (1+(at)^2) - a^2 t sin t] / (1+(at)^2)^(3/2),

[-a(1+2(at)^2)] / (1+(at)^2)^(3/2) )
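Since the question is where the computation goes astray, one low-tech way to double-check a hand-computed derivative is a central-difference approximation. This is only a sketch (`a` and the sample points are my arbitrary choices); the printed values can be compared component by component against the formula above:

```python
import math

a = 0.7  # arbitrary nonzero parameter

def x(t):
    # the spherical spiral
    s = math.sqrt(1 + (a*t)**2)
    return (math.cos(t)/s, math.sin(t)/s, -a*t/s)

def x_prime_numeric(t, h=1e-6):
    """Central-difference approximation of x'(t), component by component."""
    p, m = x(t + h), x(t - h)
    return tuple((pi - mi) / (2*h) for pi, mi in zip(p, m))

# Compare these against the hand-computed components above:
for t in (0.5, 1.0, 2.0):
    print(t, x_prime_numeric(t))
```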

d'(v) = (-cos u sin v, -sin u sin v, -cos v)

Substituting the values of u and v described earlier, we get

d'(v) = ( -at cos t/sqrt(1+(at)^2), -at sin t/sqrt(1+(at)^2), -1/sqrt(1+(at)^2) )
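The same style of numerical check works for the meridian tangent. This sketch (again with an arbitrary `a` of my choosing) verifies both the substituted form of d'(v) and that it has unit length:

```python
import math

a = 0.7  # arbitrary nonzero parameter (my choice, not from the problem)

def d_prime(u, v):
    # tangent along the meridian, as computed above
    return (-math.cos(u)*math.sin(v), -math.sin(u)*math.sin(v), -math.cos(v))

def substituted(t):
    # the claimed form after plugging in u = t, v = arctan(at)
    s = math.sqrt(1 + (a*t)**2)
    return (-a*t*math.cos(t)/s, -a*t*math.sin(t)/s, -1/s)

for t in (0.0, 0.5, 1.5, -2.0):
    dp = d_prime(t, math.atan(a*t))
    assert all(abs(p - q) < 1e-12 for p, q in zip(dp, substituted(t)))
    assert abs(math.hypot(*dp) - 1) < 1e-12  # unit length, so ||d'(v)|| = 1
print("substitution and unit norm check out")
```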

The angle between these two is given by the formula cos(alpha) = (d'(v)/||d'(v)||) · (x'(t)/||x'(t)||)

Calculating, we get

||d'(v)||=1

||x'(t)|| = sqrt( [(1+(at)^2)^2 + a^4 t^2 + a^2 (1+2(at)^2)^2] / (1+(at)^2)^3 )

x'(t) · d'(v) = a/(1+(at)^2)
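To see how the pieces combine, here is a hedged numerical sketch (same arbitrary `a`; the tangent of the spiral is taken by central differences rather than from any hand-derived formula) that evaluates the dot product and the normalized cos(alpha) at several t, which makes it easy to check whether the t really drops out:

```python
import math

a = 0.7  # arbitrary nonzero parameter

def x(t):
    # the spherical spiral
    s = math.sqrt(1 + (a*t)**2)
    return (math.cos(t)/s, math.sin(t)/s, -a*t/s)

def x_prime(t, h=1e-6):
    # numeric tangent of the spiral (central difference)
    p, m = x(t + h), x(t - h)
    return tuple((pi - mi) / (2*h) for pi, mi in zip(p, m))

def d_prime(t):
    # meridian tangent at the matching point u = t, v = arctan(at)
    v = math.atan(a*t)
    return (-math.cos(t)*math.sin(v), -math.sin(t)*math.sin(v), -math.cos(v))

def cos_alpha(t):
    xp, dp = x_prime(t), d_prime(t)
    dot = sum(xi*di for xi, di in zip(xp, dp))
    return dot / (math.hypot(*xp) * math.hypot(*dp))

for t in (0.5, 1.0, 2.0, 4.0):
    print(t, cos_alpha(t))
```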

However, when I plug all this in to calculate cos(alpha), I can't get rid of the t...

I get cos(alpha) = sqrt( [a^2 (1+(at)^2)] / [(1+(at)^2)^2 + a^4 t^2 + a^2 (1+2(at)^2)^2] )

And I get stuck. What's wrong?

