2.3.1 Differentiability

So far we have dealt with concepts which are more or less obviously generalizations of the elementary notions of real analysis. Our ultimate goal, the development of the differential and integral calculus of complex functions, rests upon the notion of complex derivative and this, as we shall see, leads to definitely new insights.

A complex function \(w=f(z)\) is said to be differentiable or monogenic at the point \(\zeta\) if, as \(z\) approaches \(\zeta\), the difference quotient \[\tag{3.10} \frac{f(z)-f(\zeta)}{z-\zeta}\] tends to a limiting value entirely independent of the manner of approach. The limit of the difference quotient is then called the derivative of \(f(z)\) at the point \(\zeta\) and is denoted by \(\left(\frac{d w}{d z}\right)_{z=\zeta}\) or \(f'(\zeta)\). A differentiable function is certainly continuous since the difference \(f(z)-f(\zeta)\) must tend to zero with \(z-\zeta\). Equivalently, we may present the definition in a way reminiscent of the mean value theorem of calculus:

A function \(f(z)\) is differentiable at a point \(\zeta\) if it can be written in the form \[\tag{3.11} f(z)=f(\zeta)+\left[f'(\zeta)+\varepsilon(z, \zeta)\right](z-\zeta)\] where \(\varepsilon(z, \zeta) \rightarrow 0\) as \(z \rightarrow \zeta\); i.e. \(f(z)\) may be approximated by a linear function in a neighborhood of \(\zeta\). If \(f'(z)\) is continuous in a closed bounded subset of the domain of analyticity of \(f(z)\), the function \(\varepsilon(z, \zeta)=\frac{f(z)-f(\zeta)}{z-\zeta}-f'(\zeta)\), being continuous, must be uniformly continuous there, whence we observe in (3.11) that for an arbitrarily preassigned \(\varepsilon>0\) we have \[|\varepsilon(z, \zeta)|<\varepsilon\quad \text{for}\quad |z-\zeta|<\delta(\varepsilon)\] where \(\delta(\varepsilon)\) is completely independent of the choice of \(z\) and \(\zeta\).

We are especially interested in functions which possess a derivative at every point of a domain \(D\) and these we term analytic in \(D\) . Later we call such functions analytic and regular , the word analytic by itself being used more generally to designate functions which are differentiable at all but certain possible exceptional points. For the present, however, we do not make this distinction.

The rules for differentiation of analytic functions are formally the same as for real functions, the proofs carrying over in exact analogy. The derivative of the sum of two analytic functions exists and is the sum of their derivatives, \[\tag{3.12} \frac{d}{d z}[f(z)+g(z)]=f'(z)+g'(z).\] For the product of the analytic functions \(f(z)\) and \(g(z)\) we have \[\tag{3.13} \frac{d[f(z) g(z)]}{d z}=f'(z) g(z)+f(z) g'(z).\] The quotient \(f(z) / g(z)\) is differentiable provided \(g(z) \neq 0\) and, as in the real case, \[\tag{3.14} \frac{d}{d z}\left[\frac{f(z)}{g(z)}\right]=\frac{f'(z) g(z)-f(z) g'(z)}{g(z)^{2}}.\] Finally, an analytic function of an analytic function is analytic; more precisely, if \(w=f(z)\) is differentiable at a point \(\zeta\) and if \(g(w)\) is differentiable at the point \(\xi=f(\zeta)\) then \(h(z)=g(f(z))\) is differentiable at \(z=\zeta\) and has the derivative \[h'(\zeta)=g'(\xi) f'(\zeta).\] From these rules we shall be able to obtain a wide class of analytic functions once we have established the analytic nature of a few basic functions. We shall return to the consideration of special analytic functions in the next section.
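These rules can be illustrated with a small numerical check: the following Python sketch compares difference quotients against the formulas (3.12)–(3.14) and the chain rule. The particular functions, sample point, step size, and tolerances are arbitrary choices for illustration only.

```python
# Numerical sanity check of the differentiation rules via small
# symmetric difference quotients of complex functions.

def dq(f, z, h=1e-6):
    """Symmetric difference quotient (f(z+h) - f(z-h)) / 2h."""
    return (f(z + h) - f(z - h)) / (2 * h)

f = lambda z: z**3
g = lambda z: 1 / (z + 2)
fp = lambda z: 3 * z**2           # known derivatives, for comparison
gp = lambda z: -1 / (z + 2)**2

z0 = 1.0 + 1.0j

# sum rule (3.12), product rule (3.13), quotient rule (3.14), chain rule
assert abs(dq(lambda z: f(z) + g(z), z0) - (fp(z0) + gp(z0))) < 1e-6
assert abs(dq(lambda z: f(z) * g(z), z0) - (fp(z0)*g(z0) + f(z0)*gp(z0))) < 1e-6
assert abs(dq(lambda z: f(z) / g(z), z0)
           - (fp(z0)*g(z0) - f(z0)*gp(z0)) / g(z0)**2) < 1e-6
assert abs(dq(lambda z: g(f(z)), z0) - gp(f(z0)) * fp(z0)) < 1e-5
```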

To this point the theory of complex functions runs a close parallel to that of real functions. A function \(f(z)\) of a complex variable \(z=x+i y\) is thus far nothing more than a pair of real functions \[f(z)=u(x, y)+i v(x, y).\] A series of complex functions converges in accordance with the convergence of the separate series of their real and imaginary parts. Similarly, to say a complex function is continuous is equivalent to saying \(u(x, y)\) and \(v(x, y)\) are both continuous. In general, formal manipulations with complex functions have seemed to lead to results which, formally at least, are similar to those in real function theory. It would be natural to suppose that a complex function is differentiable provided only that its real and imaginary parts are differentiable.

But this is actually the point of departure. The differentiability of \(u(x, y)\) and \(v(x, y)\) does not of itself imply the differentiability of \(f(z)\) as may be seen from a simple example:

Let \(f(z)=x + 2yi\) . Here \(u=x\) and \(v=2 y\) are both differentiable functions of \(x\) and \(y\) . Setting \(z-\zeta=\Delta x+ i\Delta y\) in (3.10) we obtain \[\frac{x + \Delta x + 2(y + \Delta y)i - (x + 2yi)}{\Delta x + i\Delta y} = \frac{\Delta x + 2i\Delta y}{\Delta x + i\Delta y}\] If we let \(z-\zeta\) tend to zero through real values then \(\Delta y=0\) and the difference quotient tends to \(1\) ; but, if we keep \(\Delta x=0\) as \(z\) approaches \(\zeta\) the limit of the difference quotient is \(2\) . So \(f(z)\) is not analytic. It is by the extremely strong requirement that the limit of (3.10) is the same no matter how \(z\) tends to \(\zeta\) that the definition of derivative introduces a completely new factor which is absent in the theory of real functions.
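The two directional limits computed above can be reproduced numerically; in this sketch the sample point \(\zeta=1+i\) and the step size are arbitrary choices.

```python
# For f(z) = x + 2yi the difference quotient (3.10) has different
# limits along different directions of approach.

def f(z):
    return z.real + 2j * z.imag

zeta = 1.0 + 1.0j
h = 1e-8

horizontal = (f(zeta + h) - f(zeta)) / h            # approach through real values
vertical   = (f(zeta + 1j*h) - f(zeta)) / (1j*h)    # approach along the imaginary axis

assert abs(horizontal - 1) < 1e-6   # limit 1
assert abs(vertical - 2) < 1e-6     # limit 2
```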

Let us now investigate what the differentiability of \(f(z)=u(x, y)+i v(x, y)\) implies for \(u\) and \(v\) . We have \[f(z)=f(\zeta)+\left[f'(\zeta)+\varepsilon(z, \zeta)\right](z-\zeta)\] where \(\varepsilon(z, \zeta) \rightarrow 0\) as \(z \rightarrow \zeta\) . Separating real and imaginary parts by writing \(z=x+i y\) , \(\zeta=\xi+i \eta\) , \(f'(\zeta)=A+i B\) and \(\varepsilon=\varepsilon_{1}+i \varepsilon_{2}\) this becomes \begin{align} f(z) - f(\zeta) &= u(x, y)+i v(x, y)-[u(\xi, \eta)+i v(\xi, \eta)]\\ &= (A+\varepsilon_{1})(x-\xi)-(B+\varepsilon_{2})(y-\eta)\\ & \quad +i\left[\left(B+\varepsilon_{2}\right)(x-\xi)+\left(A+\varepsilon_{1}\right)(y-\eta)\right] \end{align} whence \[u(x, y) - u(\xi, \eta) = (A+\varepsilon_{1})(x-\xi)-(B+\varepsilon_{2})(y-\eta)\] and \[v(x, y)-v(\xi, \eta)=\left(B+\varepsilon_{2}\right)(x-\xi)+\left(A+\varepsilon_{1}\right)(y-\eta).\] Since \(\varepsilon_{1}\) and \(\varepsilon_{2}\) tend to zero as \((x, y) \rightarrow(\xi, \eta)\) it follows from these relations that \(A=u_{x}\) , \(B=-u_{y}\) and \(A=v_{y}\) , \(B=v_{x}\) . Hence \(u\) and \(v\) must satisfy a pair of partial differential equations, the famous Cauchy-Riemann equations \begin{align} \tag{3.15} \left\{ \begin{aligned} \dfrac{\partial u}{\partial x}&=\dfrac{\partial v}{\partial y}\\ \dfrac{\partial u}{\partial y}&=-\dfrac{\partial v}{\partial x}. \end{aligned} \right. \end{align} These equations then are necessary conditions that \(f(z)\) be analytic. In fact we have the

Theorem 2.4 . A necessary and sufficient condition that a complex function \[f(z)=u(x, y)+ iv(x, y)\] have a continuous derivative in a domain \(D\) is that the real and imaginary parts possess continuous first partial derivatives satisfying the Cauchy-Riemann equations \begin{align} \tag{3.16} \frac{\partial u}{\partial x}=\frac{\partial v}{\partial y}, \quad \frac{\partial u}{\partial y}=-\frac{\partial v}{\partial x}. \end{align} 

Proof. The proof of sufficiency is simple. We may write \[u(\xi+\Delta x, \eta+\Delta y)-u(\xi, \eta)=\frac{\partial u}{\partial x}\, \Delta x+\frac{\partial u}{\partial y}\, \Delta y+\varepsilon_{1} \sqrt{\Delta x^{2}+\Delta y^{2}}\] and \[v(\xi+\Delta x, \eta+\Delta y)-v(\xi, \eta)=\frac{\partial v}{\partial x}\, \Delta x+\frac{\partial v}{\partial y}\, \Delta y+\varepsilon_2 \sqrt{\Delta x^{2}+\Delta y^{2}}\] where \(\varepsilon_{1}\) and \(\varepsilon_{2}\) tend to zero with \(\sqrt{\Delta x^{2}+\Delta y^{2}}\) . Using the Cauchy-Riemann equations we obtain by straightforward computation \[\frac{f(\zeta+\Delta z)-f(\zeta)}{\Delta z}=\frac{\partial u}{\partial x}+i\, \frac{\partial v}{\partial x}+\frac{\left(\varepsilon_{1}+i \varepsilon_{2}\right)|\Delta z|}{\Delta z}.\] Since \(|\Delta z| / \Delta z\) has modulus one, the last term tends to zero with \(\Delta z\) . We conclude that \(f(z)\) has a derivative, \[f'(z)=\frac{\partial u}{\partial x}+i\, \frac{\partial v}{\partial x}=\frac{\partial v}{\partial y}-i\, \frac{\partial u}{\partial y}.\]
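The formula \(f'(z)=u_x+i v_x=v_y-i u_y\) lends itself to a finite-difference check; in this sketch \(f(z)=z^3\) and the sample point are arbitrary choices.

```python
# Estimate the partial derivatives of u = Re f and v = Im f by central
# differences and compare both expressions for f'(z) with the known derivative.

def f(z):
    return z**3

z0 = 0.5 + 1.2j
h = 1e-6

u_x = (f(z0 + h).real    - f(z0 - h).real)    / (2*h)
v_x = (f(z0 + h).imag    - f(z0 - h).imag)    / (2*h)
u_y = (f(z0 + 1j*h).real - f(z0 - 1j*h).real) / (2*h)
v_y = (f(z0 + 1j*h).imag - f(z0 - 1j*h).imag) / (2*h)

deriv = 3 * z0**2                      # d/dz z^3, for comparison
assert abs(complex(u_x,  v_x) - deriv) < 1e-6   # f' = u_x + i v_x
assert abs(complex(v_y, -u_y) - deriv) < 1e-6   # f' = v_y - i u_y
```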

As an example of an analytic function take \[w=\left(x^{2}-y^{2}\right)+2 x y i.\] Here \(u=x^{2}-y^{2}\) and \(v=2 x y\) are certainly continuously differentiable and since \(u_{x}=2 x\) , \(u_{y}=-2 y\) ; \(v_{x}=2 y\) , \(v_{y}=2 x\) the Cauchy-Riemann equations (3.15) are satisfied. Note that \(u\) and \(v\) are the real and imaginary parts of the function \(w=z^{2}\) and that \(d w / d z=u_{x}+i v_{x}=2 z\) . The derivative of \(w=z^{2}\) could have been obtained without recourse to the Cauchy-Riemann equations by forming the difference quotient and passing to the limit. On the other hand we could not be satisfied with the result obtained by taking the limit in only one direction. So simple a function as \(f(z)=\bar{z}\) does not have a derivative.
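Both halves of this example admit a quick finite-difference test of (3.15); the helper `cr_defect` and the sample point below are illustrative choices.

```python
# The Cauchy-Riemann equations hold for w = z**2 but fail for
# f(z) = conjugate(z), matching the text.

def cr_defect(f, z, h=1e-6):
    """Return (u_x - v_y, u_y + v_x) at z; both are ~0 iff (3.15) holds."""
    u_x = (f(z + h).real    - f(z - h).real)    / (2*h)
    v_x = (f(z + h).imag    - f(z - h).imag)    / (2*h)
    u_y = (f(z + 1j*h).real - f(z - 1j*h).real) / (2*h)
    v_y = (f(z + 1j*h).imag - f(z - 1j*h).imag) / (2*h)
    return u_x - v_y, u_y + v_x

z0 = 0.7 - 0.4j
d1 = cr_defect(lambda z: z**2, z0)
d2 = cr_defect(lambda z: z.conjugate(), z0)

assert all(abs(d) < 1e-6 for d in d1)   # z**2 satisfies (3.15)
assert any(abs(d) > 1 for d in d2)      # conjugate(z) violates it
```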

Although formally the same, the definition of derivative for complex functions leads to a far more restricted class of functions than the differentiable functions of a real variable. From the behavior of a real function in one part of its domain of definition nothing can be said about its behavior elsewhere. On the other hand, as we shall see, the local behavior of an analytic function is intimately related with its behavior in the entire domain of its definition. First we shall establish what is perhaps the most remarkable result of the theory – the fact that the derivative of an analytic function is in turn differentiable and hence that any analytic function is indefinitely differentiable.

2.3.2 Remarks and Examples, Harmonic Functions

Let us consider a number of examples of analytic functions. Take \(f(z)=z^{n}\) . In the form of (3.11) we have \[z^{n}-\zeta^{n}=(z-\zeta)\left[n \zeta^{n-1}+(z-\zeta) R(z, \zeta)\right]\] where \(R(z, \zeta)\) is a polynomial and consequently \((z-\zeta) R(z, \zeta) \rightarrow 0\) as \(z\) tends to \(\zeta\) . Hence \(f(z)=z^{n}\) is differentiable for any finite value \(\zeta\) , i.e. \(f(z)\) is analytic throughout the \(z\) -plane with the derivative \[\tag{3.20} \frac{d}{d z} z^{n}=n z^{n-1}.\] From the rules (3.12) , (3.13) , (3.14) it is immediately clear that any rational function \[R(z)=\frac{a_{0}+a_{1} z+\cdots+a_{m} z^{m}}{b_{0}+b_{1} z+\cdots+b_{n} z^{n}}\] is analytic in the entire \(z\) -plane with the possible exception of the points at which the denominator is zero. In particular, the general linear function is analytic.
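A numerical spot check of (3.20) is immediate; the exponent, sample point, and step size below are arbitrary.

```python
# Compare a symmetric difference quotient of z**n with n * z**(n-1).

n = 7
z0 = 0.9 + 0.3j
h = 1e-7

dq = ((z0 + h)**n - (z0 - h)**n) / (2*h)
assert abs(dq - n * z0**(n-1)) < 1e-5
```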

Another important example is the function \[f(z)=e^{x}(\cos y+i \sin y).\] The Cauchy-Riemann equations (3.15) are plainly satisfied and furthermore \begin{align} f(z_1)f(z_2) &= e^{x_{1}+x_{2}}\left[\cos \left(y_{1}+y_{2}\right)+i \sin \left(y_{1}+y_{2}\right)\right]\\ &=f(z_1 + z_2). \end{align} For real values of \(z\) we have \(y=0\) and the function reduces to \(e^{x}\) . This function provides us with a natural extension of the exponential to a complex variable. We shall see later that it is the only possible analytic extension. Consequently we define \[\tag{3.21} e^{z}=e^{x}(\cos y+i \sin y).\] From \(f'(z)=u_{x} + iv_{x}\) we have the familiar property of the exponential function \[\frac{d}{d z}\left(e^{z}\right)=e^{z}.\] 
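The definition (3.21) and the addition law can be verified against Python's built-in complex exponential; the sample points are arbitrary.

```python
# e^z = e^x (cos y + i sin y), as in (3.21), compared with cmath.exp,
# together with the addition law f(z1) f(z2) = f(z1 + z2).
import cmath
import math

def f(z):
    return math.exp(z.real) * complex(math.cos(z.imag), math.sin(z.imag))

z1, z2 = 0.3 + 1.1j, -0.8 + 2.5j

assert abs(f(z1) - cmath.exp(z1)) < 1e-12
assert abs(f(z1) * f(z2) - f(z1 + z2)) < 1e-12
```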

The function \(e(\theta)=\cos \theta+i \sin \theta\) defined earlier can now be written in the form \(e^{i \theta}\) so that any complex number \(z\) may be described in the polar form \[z=r e^{i \theta}\] where \(r=|z|\) and \(\theta = \operatorname{am}(z)\) is the amplitude (argument) of \(z\) . This suggests the extension of the logarithmic function to complex values: \begin{align} \log z &= \log\left(r e^{i \theta}\right) = \log r+i \theta \\ &=\log \sqrt{x^{2}+y^{2}}+i \arctan \frac{y}{x}.\tag{3.22} \end{align} 

Clearly \[u_{x}=\frac{x}{x^{2}+y^{2}}=v_{y}; \quad u_{y}=\frac{y}{x^{2}+y^{2}}=-v_{x}\] so that \(\log{z}\) is analytic in the entire \(z\) -plane excepting the origin.
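A finite-difference check of these partial derivatives, taking \(u=\log\sqrt{x^{2}+y^{2}}\) and \(v=\arctan(y/x)\) (so that \(u_{x}=v_{y}=x/(x^{2}+y^{2})\) and \(u_{y}=-v_{x}=y/(x^{2}+y^{2})\)); the sample point is arbitrary.

```python
# Verify the Cauchy-Riemann equations for log z away from the origin.
import math

x, y, h = 1.3, 0.6, 1e-6
u = lambda x, y: math.log(math.hypot(x, y))
v = lambda x, y: math.atan2(y, x)

u_x = (u(x + h, y) - u(x - h, y)) / (2*h)
u_y = (u(x, y + h) - u(x, y - h)) / (2*h)
v_x = (v(x + h, y) - v(x - h, y)) / (2*h)
v_y = (v(x, y + h) - v(x, y - h)) / (2*h)

r2 = x*x + y*y
assert abs(u_x - x/r2) < 1e-6 and abs(u_x - v_y) < 1e-6
assert abs(u_y - y/r2) < 1e-6 and abs(u_y + v_x) < 1e-6
```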

Very generally, we have already seen that a power series \[P(z)=a_{0}+a_{1} z+\cdots+a_{n} z^{n}+\cdots\] is analytic in the interior of its circle of convergence and, in fact, possesses derivatives of all orders in that circle, the \(k^{\text{th}}\) derivative being \[P^{(k)}(z)=k ! a_{k}+\frac{(k+1) !}{1 !} a_{k+1} z+\cdots+\frac{(k+n) !}{n !} a_{k + n} z^{n}+\cdots\] 
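The coefficient of \(z^{n}\) in \(P^{(k)}(z)\), namely \((k+n)!/n!\cdot a_{k+n}\), can be spot-checked by differentiating a short polynomial term by term; the coefficients \(a_{0},\dots,a_{5}\) below are arbitrary.

```python
# Differentiate [a_0, a_1, ...] k times term by term and compare with
# the closed-form coefficients (k+n)!/n! * a_{k+n}.
from math import factorial

a = [3.0, -1.0, 2.0, 0.5, -4.0, 1.5]    # arbitrary coefficients a_0..a_5
k = 2

coeffs = a[:]
for _ in range(k):
    # d/dz sum a_n z^n = sum n a_n z^(n-1): scale by the index, drop a_0
    coeffs = [n * c for n, c in enumerate(coeffs)][1:]

formula = [factorial(k + n) // factorial(n) * a[k + n] for n in range(len(a) - k)]
assert all(abs(c - g) < 1e-12 for c, g in zip(coeffs, formula))
```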

In analogy to theorems of the real calculus we have the following results:

Theorem 2.5 . If \(g(z)=a+i b\) is the derivative of an analytic function then the function is uniquely determined to within an arbitrary constant. 

This is a direct consequence of the Cauchy-Riemann equations for if \(f'(z)=a+i b\) then \begin{align} \left\{ \begin{array}{l} u_x = a\\ u_y = -b \end{array} \right. \quad \left\{ \begin{array}{l} v_{x}=b\\ v_{y}=a \end{array}.\right. \end{align} But these equations determine \(u(x, y)\) and \(v(x, y)\) , hence \(f(z)\) , to within an arbitrary constant.

If \(w=f(z)=u(x, y)+i v(x, y)\) furnishes a one-to-one mapping of a domain \(D\) of the \(z\) -plane onto a point set \(D'\) of the \(w\) -plane then the inverse function exists. That is, to each point \((u, v)\) of \(D'\) there corresponds a unique point \((x, y)\) of \(D\) – the point which is mapped on \((u, v)\) . Therefore we may write \[z=x(u, v)+i y(u, v)=g(w).\] 

Theorem 2.6 . If \(w=f(z)=u+i v\) is continuously differentiable at a point \(z\) and if \(f'(z) \neq 0\) then the inverse function \(z=g(w)\) exists and is differentiable with the derivative \[\tag{3.23} g'(w)=\frac{1}{f'(z)}.\] 

Proof. We have by the Cauchy-Riemann equations \begin{align} \Delta=\frac{\partial(u, v)}{\partial(x, y)}&=\begin{vmatrix} u_{x} & u_{y} \\ v_{x} & v_{y} \end{vmatrix}\\ &=u_{x}^{2}+v_{x}^{2}\\ &=\left|f'(z)\right|^{2}\\ &\neq 0. \end{align} Since the Jacobian of the transformation is non-zero and the derivatives are continuous it follows that the mapping \(u=u(x, y)\) , \(v=v(x, y)\) has an inverse. Hence we may write \(x\) and \(y\) as differentiable functions of \(u\) and \(v\) . Using this fact we observe that \begin{align} \begin{array}{ll} \dfrac{\partial u}{\partial u}=u_{x} x_{u}+u_{y} y_{u}=1; & \quad \dfrac{\partial u}{\partial v}=u_{x} x_{v}+u_{y} y_{v}=0\\ \\ \dfrac{\partial v}{\partial u}=v_{x} x_{u}+v_{y} y_{u}=0; & \quad \dfrac{\partial v}{\partial v}=v_{x} x_{v}+v_{y} y_{v}=1. \end{array} \end{align} From these equations we derive \begin{align} \begin{array}{ll} x_{u}=v_{y} / \Delta & x_{v}=-u_{y} / \Delta \\ y_{u}=-v_{x} / \Delta & y_{v}=u_{x} / \Delta \end{array} \end{align} whence \begin{align} \frac{\partial x}{\partial u}&=\frac{\partial y}{\partial v}\\ \frac{\partial x}{\partial v}&=-\frac{\partial y}{\partial u} \end{align} that is, \(x\) and \(y\) satisfy the Cauchy-Riemann equations as functions of \(u\) and \(v\) . The inverse function is therefore analytic and possesses the derivative \begin{align} \frac{d z}{d w} &=x_{u}+ iy_{u}\\ &=\frac{u_{x}-i v_{x}}{\Delta}\\ &=\frac{\overline{f'(z)}}{f'(z) \overline{f'(z)}}\\ &=\frac{1}{f'(z)}. \end{align}
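Formula (3.23) can be illustrated numerically with the exponential, whose local inverse is the logarithm (here \(f'(z)=e^{z}\) never vanishes); the sample point and step size are arbitrary.

```python
# g'(w) = 1/f'(z) for f = exp and g = log near w0 = exp(z0).
import cmath

z0 = 0.4 + 0.9j
w0 = cmath.exp(z0)
h = 1e-7

# difference quotient of the inverse function g = log at w0
g_prime = (cmath.log(w0 + h) - cmath.log(w0 - h)) / (2*h)
assert abs(g_prime - 1 / cmath.exp(z0)) < 1e-6
```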

But the theorems of the differential calculus give no more than this result. Since the Jacobian \(\Delta=\left|f'(z)\right|^{2}\) is not zero it follows that the neighborhood of a point \(\zeta\) in the \(z\) -plane is mapped on a full neighborhood of the point \(f(\zeta)\) in the \(w\) -plane and \(\left|f'(\zeta)\right|^{2}\) is the ratio of magnification of the area. We know, even further, that if \(f'(z) \neq 0\) in a domain \(D\) of the \(z\) -plane then the image \(D'\) in the \(w\) -plane is again a domain with boundary points corresponding to the boundary points of \(D\) . Thus an analytic function preserves domains. We shall see that this simple mapping property has a consequence of wide application: \(|f(z)|\) does not reach a maximum inside any domain of analyticity of \(f(z)\) .

2.3.2.1 Harmonic functions

How much latitude does one actually have in fixing the real and imaginary parts of an analytic function? Once we specify the real part \(u(x, y)\) of an analytic function \(f(z)\) it is clear by the Cauchy-Riemann equations, \[\tag{3.24} v_{x}=-u_{y}(x, y); \quad v_{y}=u_{x}(x, y),\] that the function \(v(x, y)\) is determined to within an arbitrary additive constant. However, we are even restricted in the choice of \(u(x, y)\) . Suppose (we shall prove it later) that the second partial derivatives of \(u\) and \(v\) exist. Then it follows from (3.24) that \[v_{x y}=-u_{y y}; \quad v_{y x}=u_{x x}\] whence we obtain the equation \[\tag{3.25} \Delta u = u_{xx} + u_{yy} = 0.\] Thus \(u\) must satisfy a second order partial differential equation. This equation, called the Laplace or the potential equation, appears frequently in analysis and has vital significance for the applications. The solutions of \(\Delta u=0\) are called variously harmonic functions or potential functions . Now, given a harmonic function \(u\) we may determine a conjugate harmonic function \(v\) by the equations (3.24) and thereby define an analytic function. Conversely, granting the existence of second derivatives, the real and imaginary parts of any analytic function are harmonic. As examples of harmonic functions we already have \[x^{2}-y^{2}, \quad \log \sqrt{x^{2}+y^{2}}, \quad \theta=\arctan \frac{y}{x}, \quad e^{x} \cos y,\] which may easily be verified.
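That each listed function satisfies Laplace's equation can be confirmed with the standard five-point finite-difference Laplacian; the sample point, step size, and tolerance below are arbitrary choices.

```python
# The five-point Laplacian of each harmonic example is ~0 at a sample point.
import math

def laplacian(u, x, y, h=1e-4):
    """Five-point finite-difference approximation of u_xx + u_yy."""
    return (u(x+h, y) + u(x-h, y) + u(x, y+h) + u(x, y-h) - 4*u(x, y)) / h**2

examples = [
    lambda x, y: x*x - y*y,
    lambda x, y: math.log(math.hypot(x, y)),     # log sqrt(x^2 + y^2)
    lambda x, y: math.atan2(y, x),               # theta = arctan(y/x)
    lambda x, y: math.exp(x) * math.cos(y),
]

for u in examples:
    assert abs(laplacian(u, 1.2, 0.7)) < 1e-4
```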