When Differentiation Becomes Kind: A Journey into Generalized Functions
Opening the Door: What Happens When Calculus Breaks?
You’ve probably heard the golden rule of calculus: differentiation is a local operation — the derivative of a function at a point depends only on the function’s values nearby. You might expect, then, that if a sequence of functions behaves well, so do its derivatives. But what if I told you there are sequences of smooth, well-behaved functions whose derivatives absolutely do not behave?
Let’s warm up with a mysterious sequence: \[f_v(x) = \frac{1}{v} \sin(vx)\]
As \(v \to \infty\), these functions shrink and flatten. They converge uniformly to zero.
So far, so tame.
Now differentiate:\[f_v'(x) = \cos(vx)\] Suddenly—bam! Oscillations explode. The derivative sequence doesn't converge classically. It wobbles infinitely fast, like a tuning fork hit too hard.
In classical analysis, we say:
- Differentiation is not continuous with respect to uniform convergence.
But in the world of generalized functions (also called distributions), something beautiful happens:
- Differentiation is continuous.
Differentiation: Reimagined for Generalized Functions
Let’s make this precise.
Suppose you have a sequence \[ f_1, f_2, \ldots, f_n, \ldots
\] of generalized functions that converges to \(f\). Then:\[f_n \to f \Rightarrow f_n' \to f'
\]
Why?
Because in the distributional framework, we define differentiation not pointwise, but via how a function acts on test functions \( \varphi(x) \in K \) (smooth functions with compact support). Specifically:\[\langle f', \varphi \rangle := -\langle f, \varphi' \rangle\] So if \( f_n \to f \), then for any \( \varphi(x) \in K \): \[ \langle f_n', \varphi \rangle = -\langle f_n, \varphi' \rangle \to -\langle f, \varphi' \rangle = \langle f', \varphi \rangle.
\] Voilà — differentiation behaves!
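Before moving on, that defining identity can be sanity-checked numerically. The sketch below (my own check, using SciPy quadrature rather than the SymPy code later in the post) computes both sides of \( \langle f_v', \varphi \rangle = -\langle f_v, \varphi' \rangle \) for \( f_v(x) = \sin(vx)/v \) and an assumed Gaussian test function \( \varphi(x) = e^{-x^2} \):

```python
# Sanity check of <f', phi> = -<f, phi'> for f_v(x) = sin(vx)/v.
# phi(x) = exp(-x^2) is an assumed test function (rapidly decaying Gaussian).
import numpy as np
from scipy.integrate import quad

phi = lambda x: np.exp(-x**2)
phi_prime = lambda x: -2 * x * np.exp(-x**2)

def both_sides(v):
    # Left side: pair the classical derivative cos(vx) with phi directly
    direct, _ = quad(lambda x: np.cos(v * x) * phi(x), -10, 10, limit=300)
    # Right side: never differentiate f_v; move the derivative onto phi
    by_parts, _ = quad(lambda x: -(np.sin(v * x) / v) * phi_prime(x), -10, 10, limit=300)
    return direct, by_parts

for v in [1, 5, 10]:
    direct, by_parts = both_sides(v)
    print(f"v = {v:>2} | <f_v', phi> direct ≈ {direct: .6f} | via -<f_v, phi'> ≈ {by_parts: .6f}")
```

Both columns agree to quadrature precision — the derivative pairing is computed without ever differentiating \( f_v \) at a single point.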
Classical Failure vs. Distributional Grace
Let’s revisit the earlier rogue sequence:
- \(f_v(x) = \frac{1}{v} \sin(vx) \to 0 \)
- \( f_v'(x) = \cos(vx)\) : does not converge classically
But in the distributional sense:\[\langle f_v', \varphi \rangle = -\int_{-\infty}^{\infty} f_v(x)\, \varphi'(x)\, dx = -\frac{1}{v}\int_{-\infty}^{\infty} \sin(vx)\, \varphi'(x)\, dx
\] The factor \(1/v\) (with an assist from the Riemann–Lebesgue lemma) forces this to vanish as
\(v \to \infty\). So, \(f_v' \to 0\) as a distribution!
Even better:
- \(f_v'' = -v \sin(vx) \to 0\)
- \( f_v^{(3)} = -v^2 \cos(vx) \to 0 \)
- \(f_v^{(n)} \to 0 \quad \text{for all } n, \text{ as generalized functions} \)
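A quick numerical sketch of that last claim (my own check, not from the original code): push all \(n\) derivatives onto the test function via \( \langle f_v^{(n)}, \varphi \rangle = (-1)^n \langle f_v, \varphi^{(n)} \rangle \), and the surviving \(1/v\) factor drives everything to zero. The shifted Gaussian \( \varphi(x) = e^{-(x-1)^2} \) is an assumed test function, off-center so no pairing vanishes by mere odd/even symmetry.

```python
# <f_v^(n), phi> = (-1)^n <f_v, phi^(n)> with f_v(x) = sin(vx)/v.
# phi = exp(-(x-1)^2) is an assumed, deliberately asymmetric test function.
import sympy as sp

x = sp.symbols('x')
phi = sp.exp(-(x - 1)**2)

def pairing(n, v_val):
    f_v = sp.sin(v_val * x) / v_val
    # Integrate f_v against the n-th derivative of phi, numerically
    return ((-1)**n * sp.Integral(f_v * sp.diff(phi, x, n), (x, -6, 6))).evalf()

for n in [2, 3]:
    for v_val in [1, 5, 20]:
        print(f"n = {n}, v = {v_val:>2} | <f_v^({n}), phi> ≈ {pairing(n, v_val)}")
```

For each fixed \(n\) the pairings shrink rapidly as \(v\) grows, even though the classical derivatives \(f_v^{(n)}\) blow up like \(v^{n-1}\).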
Python Visualization
Let's visualize the chaos — and its hidden harmony.
Symbolic Computation with SymPy
We approximate the distributional inner products numerically:
import sympy as sp
import numpy as np
import matplotlib.pyplot as plt
# Define symbolic variables
x, v = sp.symbols('x v')
phi_x = sp.exp(-x**2) # Smooth test function
phi_prime_x = sp.diff(phi_x, x) # Derivative of test function
# Define f_v(x) and its derivative
f_v_x = (1/v) * sp.sin(v * x)
f_v_prime_x = sp.cos(v * x)  # shown for reference; the pairing below never touches it
# Compute distributional inner product ⟨f_v', φ⟩ = -⟨f_v, φ'⟩ numerically
for v_val in [1, 5, 10, 20]:
    integrand = f_v_x.subs(v, v_val) * (-phi_prime_x)
    integral_val = sp.integrate(integrand, (x, -5, 5)).evalf()
    print(f"v = {v_val:>2} | ⟨f_v', φ⟩ ≈ {integral_val}")
💡 Try It Yourself! You can directly paste and run this code in your Python environment.
Analysis of Output
The implementation computes the distributional inner product \( \langle f_v', \varphi \rangle \) numerically. The integral values show how the oscillations of \( f_v'(x) = \cos(vx) \) are "smoothed out" when tested against \( \varphi(x) = e^{-x^2} \).
- For \( v = 1 \): \( \langle f_v', \varphi \rangle \approx 1.380 \) — significant interaction between \( f_v' \) and \( \varphi \); the oscillations still have a strong integral presence.
- As \( v \) increases, the integral approaches zero, reinforcing the idea that \( f_v' \to 0 \) in the distributional sense. You can check this by running the same calculation for larger values of \( v \), such as 50 or 100.
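To follow that suggestion without waiting on symbolic integration (a sketch; this switches to SciPy's numerical quadrature, with the same assumed test function \( \varphi(x) = e^{-x^2} \)):

```python
# Pairing <f_v', phi> = -∫ f_v(x) phi'(x) dx for large v, done numerically.
import numpy as np
from scipy.integrate import quad

phi_prime = lambda x: -2 * x * np.exp(-x**2)  # derivative of phi(x) = exp(-x^2)

def pairing(v_val):
    # f_v(x) = sin(v x) / v; the derivative lands on phi, never on f_v
    val, _ = quad(lambda x: -(np.sin(v_val * x) / v_val) * phi_prime(x), -5, 5, limit=500)
    return val

for v_val in [20, 50, 100]:
    print(f"v = {v_val:>3} | <f_v', phi> ≈ {pairing(v_val):.3e}")
```

The values collapse toward zero very fast — for a Gaussian test function the exact pairing decays on the order of \( e^{-v^2/4} \), already far below quadrature precision at \( v = 20 \).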
Visualization with Matplotlib
We plot the classical divergence of \( f_v'(x) \) for various values of \( v \):
# Define function for plotting
def f_prime_v(x, v):
    return np.cos(v * x)

x_vals = np.linspace(-np.pi, np.pi, 400)
# Plot classical derivatives f_v'(x) = cos(vx) for different v
plt.figure(figsize=(8, 5))
for v_val, color in zip([1, 5, 10, 20], ['blue', 'red', 'green', 'purple']):
    plt.plot(x_vals, f_prime_v(x_vals, v_val), label=f'v={v_val}', color=color)
plt.title(r"Classical Divergence of $f_v'(x) = \cos(vx)$")
plt.xlabel("x")
plt.ylabel(r"$f_v'(x)$")
plt.legend()
plt.grid()
plt.show()
💡 Try It Yourself! You can directly paste and run this code in your Python environment.
Graph Interpretation
- The derivative \( f_v'(x) = \cos(vx) \) is displayed for different values of \( v \). The blue curve \( (v = 1) \) has the slowest oscillations.
- As \( v \) increases, the oscillations intensify — matching the mathematical expectation: just as with \( \cos(bx) \), the frequency is set by \( b = v \), so larger \( v \) means faster wobbling and no hope of classical convergence.
Series: Term-by-Term Differentiation is Legal Again
In classical analysis, you often can’t differentiate series term-by-term.
But for generalized functions, you can.
Let \( h_1 + h_2 + \cdots \to g \) as generalized functions. Then:\[h_1' + h_2' + \cdots \to g'\] Even more impressively:
Example 1: Fourier Series Lives Again
Let \(f(x)\) be a periodic function with Fourier series: \[ f(x) = \sum_{n=-\infty}^{\infty} c_n e^{inx} \] Even if the series fails to converge classically, it may still converge distributionally.
And if the integrated version:
\[ F(x) = \sum_{n \neq 0} \frac{c_n}{in} e^{inx} \] converges uniformly to an absolutely continuous function, then you can recover \(f(x)\) (up to the constant term \(c_0\)) via differentiation:\[ f(x) = F'(x) + c_0 = \sum c_n e^{inx} \] Legally. Cleanly. In the world of generalized functions.
import sympy as sp
# Define symbolic variables
x, n = sp.symbols('x n')
# Fourier series representation
c_n = sp.Function('c_n')(n)  # Coefficients of the Fourier series
f_x = sp.Sum(c_n * sp.exp(sp.I * n * x), (n, -sp.oo, sp.oo))
# Integrated version F(x) = ∑ c_n/(i n) e^(i n x)
F_x = sp.Sum(c_n / (sp.I * n) * sp.exp(sp.I * n * x), (n, -sp.oo, sp.oo))
# Differentiate term by term to recover f(x)
f_recovered = sp.diff(F_x, x)
# Display results
print("Fourier Series Representation:")
sp.pprint(f_x)
print("\nIntegrated Version F(x):")
sp.pprint(F_x)
print("\nRecovered Function via Differentiation:")
sp.pprint(f_recovered)
💡 Try It Yourself! You can directly paste and run this code in your Python environment.
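To see Example 1 in action with concrete numbers (my own illustration; the sawtooth example is assumed, not from the post): on \( (0, 2\pi) \), the sawtooth \( F(x) = \pi - x \) has the classically convergent Fourier series \( \sum_{n \ge 1} 2\sin(nx)/n \). Its term-by-term derivative \( \sum_{n \ge 1} 2\cos(nx) \) has no classical limit, yet it converges distributionally to \( F' = -1 \) (plus delta spikes at multiples of \( 2\pi \), which the bump-like test function below stays away from).

```python
# Term-by-term differentiation of the sawtooth series, tested distributionally.
# phi is an assumed Gaussian bump centered at pi, far from the delta spikes.
import numpy as np
from scipy.integrate import quad

phi = lambda x: np.exp(-10 * (x - np.pi)**2)

def pair_derivative_series(N):
    # <S_N', phi> with S_N'(x) = sum_{n=1}^N 2 cos(n x)
    s = lambda x: sum(2 * np.cos(n * x) for n in range(1, N + 1))
    val, _ = quad(lambda x: s(x) * phi(x), 0, 2 * np.pi, limit=400)
    return val

# Distributional limit on the support of phi: <F', phi> = <-1, phi>
target, _ = quad(lambda x: -phi(x), 0, 2 * np.pi)
for N in [5, 20, 80]:
    print(f"N = {N:>2} | <S_N', phi> ≈ {pair_derivative_series(N): .5f}  (target {target:.5f})")
```

The partial pairings settle on \( \langle -1, \varphi \rangle \) even though the cosine series itself never converges pointwise.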
Example 2: Diverging Coefficients? Still Okay.
Even if your series has coefficients growing like powers of \(n\), say: \[ \sum a_n e^{i n x}, \quad \text{with } |a_n| \leq C |n|^k \] this still converges as a generalized function — provided enough differentiations are involved.
How?
Because it can be written as a high-order derivative of a better-behaved series: \[ \sum \frac{a_n}{(in)^{k+2}} e^{i n x} \] whose coefficients now decay like \(1/|n|^2\), so it converges absolutely and uniformly. Then just differentiate \(k+2\) times to get your original series back.
import sympy as sp
# Define symbolic variables
x, n, k = sp.symbols('x n k')
# Define the original divergent Fourier-like series
a_n = sp.Function('a_n')(n)  # Coefficients of the series
f_x = sp.Sum(a_n * sp.exp(sp.I * n * x), (n, -sp.oo, sp.oo))
# Better-behaved series: dividing by (i n)^(k+2) makes the coefficients decay like 1/n^2
g_x = sp.Sum(a_n / (sp.I * n)**(k + 2) * sp.exp(sp.I * n * x), (n, -sp.oo, sp.oo))
# Differentiating (k+2) times multiplies each term by (i n)^(k+2), restoring f
f_restored = sp.Sum((sp.I * n)**(k + 2) * a_n / (sp.I * n)**(k + 2) * sp.exp(sp.I * n * x), (n, -sp.oo, sp.oo))
# Display results
print("Original Fourier Series:")
sp.pprint(f_x)
print("\nBetter-Behaved Series:")
sp.pprint(g_x)
print("\nSeries Restored by (k+2)-fold Differentiation:")
sp.pprint(f_restored)
💡 Try It Yourself! You can directly paste and run this code in your Python environment.
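Here is Example 2 with concrete coefficients (again my own sketch; the choice \( a_n = n \), growing like \( |n|^1 \), is assumed). The series \( \sum_{n \ge 1} n \cos(nx) \) diverges classically, but it equals \( -g''(x) \) for the classically convergent series \( g(x) = \sum_{n \ge 1} \cos(nx)/n = -\ln(2\sin(x/2)) \) on \( (0, 2\pi) \). Pairing partial sums against a bump-like test function, and separately moving both derivatives onto \( \varphi \), shows the two computations agree:

```python
# Divergent series sum n cos(nx) handled as -g'' for g(x) = -ln(2 sin(x/2)).
# phi is an assumed Gaussian bump at pi; phi2 is its second derivative.
import numpy as np
from scipy.integrate import quad

phi  = lambda x: np.exp(-10 * (x - np.pi)**2)
phi2 = lambda x: (400 * (x - np.pi)**2 - 20) * np.exp(-10 * (x - np.pi)**2)

g = lambda x: -np.log(2 * np.sin(x / 2))  # closed form of sum cos(nx)/n on (0, 2*pi)

def pair_partial(N):
    # <sum_{n=1}^N n cos(n x), phi>, computed directly
    s = lambda x: sum(n * np.cos(n * x) for n in range(1, N + 1))
    val, _ = quad(lambda x: s(x) * phi(x), 0, 2 * np.pi, limit=400)
    return val

# Move both derivatives onto the test function: <-g'', phi> = -<g, phi''>
target, _ = quad(lambda x: -g(x) * phi2(x), 1e-9, 2 * np.pi - 1e-9, limit=400)
for N in [5, 20, 80]:
    print(f"N = {N:>2} | partial pairing ≈ {pair_partial(N): .5f}  (target {target:.5f})")
```

The divergent-looking series and the twice-integrated, well-behaved one give the same answer once everything is tested against \( \varphi \) — exactly the mechanism Example 2 describes.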
Final Thoughts: Differentiation with a Broader Mind
The classical tools of calculus often falter near sharp edges—when oscillations grow too wild, when functions blow up, or when convergence teeters on the edge.
Distributions step in to say:
- “You’re not broken — you just need a gentler framework.”
So next time you’re told “you can’t differentiate this series,” or “the derivative doesn’t exist,” pause.
Maybe… it just lives in a different world.
A more generalized one.