
the Schwartz derivative

Consider the basic linear fractional transformation

w = $\displaystyle \frac{az+b}{cz+d}$ (73)

which is equivalent to the vanishing of the implicit expression
cwz + dw -az -b = 0. (74)

To get a relation between w and z which does not depend on a, b, c, and d, take three derivatives with respect to a still unspecified parameter (the constant b drops out after the first differentiation):
c(wz)' + dw' -az' = 0   (75)
c(wz)'' + dw'' -az'' = 0   (76)
c(wz)''' + dw''' -az''' = 0.   (77)

In equivalent matrix form,
$\left[ \begin{array}{lll}
(wz)' & w' & -z' \\
(wz)'' & w'' & -z'' \\
(wz)''' & w''' & -z'''
\end{array} \right]
\left[ \begin{array}{c} c \\ d \\ a \end{array} \right]
=
\left[ \begin{array}{c} 0 \\ 0 \\ 0 \end{array} \right].$ (78)

The determinant of this matrix, which is a Wronskian, must vanish to get a nontrivial solution. But

\begin{eqnarray*}
\left\vert \begin{array}{lll}
(wz)' & w' & -z' \\
(wz)'' & w'' & -z'' \\
(wz)''' & w''' & -z'''
\end{array} \right\vert
& = & (w'z')^2\left\{
\left[2\frac{z'''}{z'}-3\left(\frac{z''}{z'}\right)^2\right]
-\left[2\frac{w'''}{w'}-3\left(\frac{w''}{w'}\right)^2\right]
\right\}
\end{eqnarray*}


Since this expression vanishes, the factor $(w'z')^2$ can be discarded (unless it happens to vanish everywhere) to leave
$2\frac{w'''}{w'}-3\left(\frac{w''}{w'}\right)^2 = 2\frac{z'''}{z'}-3\left(\frac{z''}{z'}\right)^2,$ (79)

which is a relationship expected to hold after applying a linear fractional transformation to a variable, no matter which linear fractional transformation is used. Moreover, if z itself is taken as the parameter, the right hand side is zero, which leaves the left hand side as a constraint on any function claiming such invariance. Some rearrangement can make the invariant read

\begin{displaymath}\frac{2w'w'''-3(w'')^2}{(w')^2},\end{displaymath}

half of which is called a Schwartz derivative. The symbol for the Schwartz derivative of w with respect to z is $\{w,z\}$, some variants of whose definition are:
$\{w,z\}$ = $\frac{w'''}{w'}-\frac{3}{2}\left(\frac{w''}{w'}\right)^2,$ (80)
  = $\left(\frac{w''}{w'}\right)' -\frac{1}{2}\left(\frac{w''}{w'}\right)^2,$ (81)
  = $\frac{d^2}{dz^2}\left[\ln\frac{dw}{dz}\right] -\frac{1}{2}\left\{\frac{d}{dz}\left[\ln\frac{dw}{dz}\right]\right\}^2.$ (82)
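As a quick check of these formulas against equation (73), take w = (az+b)/(cz+d) as a function of z itself. Then $w'=(ad-bc)/(cz+d)^2$, so that

\begin{displaymath}
\frac{w''}{w'} = \frac{d}{dz}\ln w' = \frac{-2c}{cz+d}, \qquad
\left(\frac{w''}{w'}\right)' = \frac{2c^2}{(cz+d)^2},
\end{displaymath}

and form (81) gives $\{w,z\} = \frac{2c^2}{(cz+d)^2}-\frac{1}{2}\frac{4c^2}{(cz+d)^2}=0$: the linear fractional transformations themselves have a vanishing Schwartz derivative, in agreement with the invariance just derived.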

The Schwartz derivative is not really a derivative (it is a differential invariant): it does not obey the product rule when applied to products, it does not act linearly on sums, and it is furthermore unaffected by nonzero multiplicative constants. In the derivation above, w and z both depended on a parameter with respect to which two differential expressions were identical. When z itself was the parameter, one result was obviously zero, forcing the other to assume the same value; the choice of which of the two plays that role creates an asymmetry by which $\{w,z\}\neq\{z,w\}$.

Nevertheless, the invariant can be calculated for various w's all depending on the same z, for which the following rules and particular results can be verified through calculation (a symbolic check of a few of them is sketched just after the list):

$\left\{\frac{aw+b}{cw+d},z\right\} = \{w,z\},$ (83)
$\left\{w,\frac{az+b}{cz+d}\right\}\frac{(ad-bc)^2}{(cz+d)^4} = \{w,z\},$ (84)
$\{w,z\} = \{w,u\}\left(\frac{du}{dz}\right)^2 + \{u,z\},$ (85)
$\{w,z\} = -\{z,w\}\left(\frac{dw}{dz}\right)^2,$ (86)
$\{w,z\}=0 \;\Rightarrow\; w = \frac{az+b}{cz+d},$ (87)
$\{z^n,z\} = \frac{1-n^2}{2z^2},$ (88)
$\{e^{\lambda z},z\} = -\frac{1}{2}\lambda^2.$ (89)
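These particular results, and the composition rule, are easy to confirm with a computer algebra system. The following sketch, which assumes the Python library sympy is available, checks equations (85), (88) and (89) directly from definition (80):

    import sympy as sp

    z, u, n, lam = sp.symbols('z u n lambda')

    def schwarzian(expr, var):
        """Schwartz derivative {expr, var} as defined in equation (80)."""
        d1, d2, d3 = (sp.diff(expr, var, k) for k in (1, 2, 3))
        return sp.simplify(d3/d1 - sp.Rational(3, 2)*(d2/d1)**2)

    # equation (88): {z**n, z} should equal (1 - n**2)/(2*z**2)
    print(sp.simplify(schwarzian(z**n, z) - (1 - n**2)/(2*z**2)))      # expect 0

    # equation (89): {exp(lambda*z), z} should equal -lambda**2/2
    print(sp.simplify(schwarzian(sp.exp(lam*z), z) + lam**2/2))        # expect 0

    # equation (85), the composition rule, tried on w = u**3 with u = exp(z)
    w_of_u, u_of_z = u**3, sp.exp(z)
    lhs = schwarzian(w_of_u.subs(u, u_of_z), z)
    rhs = schwarzian(w_of_u, u).subs(u, u_of_z)*sp.diff(u_of_z, z)**2 + schwarzian(u_of_z, z)
    print(sp.simplify(lhs - rhs))                                      # expect 0

Any nonzero output would signal a discrepancy with the corresponding rule.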

The last three entries of the list suggest the task of inverting the Schwartz derivative; that is, of finding w(z) for which $\{w,z\}=Q(z)$, for some prescribed function Q. The answer is that one should solve the differential equation

$\displaystyle y''(z)+\frac{1}{2}Q(z)y(z)=0,$     (90)

for which one can expect to obtain two linearly independent solutions $y_1(z)$ and $y_2(z)$. Their quotient
$w(z) = \frac{y_2(z)}{y_1(z)}$     (91)

will then solve the Schwartzian equation. This is a subject better postponed until differential equations, second-order linear differential equations, and Riccati equations have all been introduced. However, the connections can be foreseen in the logarithmic derivative version of the Schwartz derivative, and a few comments can already be made.

For example, the conclusion in equation (87) follows from observing that a solution to

y''(z)=0

could be az+b for two constants a and b, that a second solution could be cz+d with $ad-bc\neq 0$, and the fractional linear quotient follows.
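The case of a constant, but nonzero, Schwartz derivative can be inverted just as directly from equations (90) and (91). Taking $Q(z)=-\lambda^2/2$ gives

\begin{displaymath}
y''(z)-\frac{\lambda^2}{4}\,y(z)=0,
\end{displaymath}

whose solutions $y_1=e^{-\lambda z/2}$ and $y_2=e^{\lambda z/2}$ have the quotient $w=e^{\lambda z}$, recovering equation (89).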

Given the privileged status of exponentials with respect to differentiation, we have already seen in equation (89) that $\{\exp(\lambda z),z\} = -\lambda^2/2$, whereby complex exponentials generate constant Schwartz derivatives. How does this reconcile with the case $\lambda = 0$, which might lead us to think that only constants have a zero Schwartz derivative?

Again we need to consider that the most general form of the solution of the associated Riccati equation is a fractional linear combination of $\exp(\lambda z)$ and $\exp(-\lambda z)$. But $\lambda = 0$ represents the confluent case where the two solutions are $\exp(\lambda z)$ and its derivative with respect to $\lambda$, namely $z\exp(\lambda z)$. The limit gives a fractional linear combination of 1 and z, the Möbius transformation already seen.
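The confluence can be made explicit by expanding the exponentials for small $\lambda$; to first order,

\begin{displaymath}
\frac{A e^{\lambda z}+B e^{-\lambda z}}{C e^{\lambda z}+D e^{-\lambda z}}
\approx \frac{(A+B)+(A-B)\lambda z}{(C+D)+(C-D)\lambda z},
\end{displaymath}

which is already a fractional linear combination of 1 and z, as claimed.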

A somewhat similar discussion could be based on equation (88), where both n=1 and n=-1 give zero Schwartz derivatives, but it suffices to say that these two exceptions are already fractional linear transformations. Aside from that, $\{z^n,z\}$ tends to zero for large z, since its denominator $2z^2$ grows without bound, so it is again possible to wonder about Schwartz derivatives which are small, but not necessarily zero.
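For instance, perturbing the exponent slightly away from 1 already gives a small but nonzero value,

\begin{displaymath}
\{z^{1+\varepsilon},z\} = \frac{1-(1+\varepsilon)^2}{2z^2}
= -\frac{2\varepsilon+\varepsilon^2}{2z^2} \approx -\frac{\varepsilon}{z^2}.
\end{displaymath}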

In turn it could be asked what it means to have a mapping which is almost, but not quite, single valued. The example of the exponential shows that it could introduce a very long periodicity, with an anomaly which concentrates around infinity. Powers put a definite multiple valuedness right at the origin, which of course extends right out to infinity. We could also ask, ``Can single valuedness be assured, at least in a finite region?'' It is a question which can be better answered later on.

That single valuedness is a delicate and unstable concept is already apparent in the implicit polynomial which was used to obtain its properties. At the level of polynomials, any addition whatsoever will create additional roots which may possibly be located far away, but whose vestiges could nevertheless be far reaching. Consider $\varepsilon z^2+z+1=0$, for example. An extremely flat parabola has been added to a straight line; although this hardly changes the existing zero, it introduces a new root near infinity.
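Solving the quadratic and expanding the square root for small $\varepsilon$ makes this explicit:

\begin{displaymath}
z = \frac{-1\pm\sqrt{1-4\varepsilon}}{2\varepsilon}
\approx -1-\varepsilon \quad \mbox{or} \quad -\frac{1}{\varepsilon}+1,
\end{displaymath}

so one root stays near the original zero at z=-1 while the other recedes toward infinity as $\varepsilon$ shrinks.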

